Technology

MIT researchers just created a ‘psychopath’ AI named Norman

Written by Muneeb Ahmad · 1 min read

A team of researchers from the prestigious Massachusetts Institute of Technology has turned an artificial intelligence ‘mad’ by feeding it disturbing content.

The news, first reported by Newsweek, states that the MIT scientists developed a Terminator-like ‘psychopath’ AI that interprets everything it sees in a gruesome way. The researchers say the AI, named Norman after the main character of the film Psycho, was developed to make the general public wary of the dark side of AI if it ever ends up in the wrong hands.

Norman was created by feeding an image-captioning algorithm data from an unnamed Reddit community notorious for sharing violent content. Once enough data had been fed in, the AI was shown Rorschach inkblots, the ambiguous images used in a classic psychological test, and the results were not pretty. Where a normal algorithm saw “a black and white photo of a baseball glove”, Norman saw “man is murdered by machine gun in broad daylight”.

Explaining what this means for practitioners, the researchers said the algorithm itself was not responsible for the results; the data fed to it was. In their words: “The data used to teach a machine-learning algorithm can significantly influence its behavior. So when people say that AI algorithms can be biased and unfair, the culprit is often not the algorithm itself but the biased data that was fed to it.”
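The same point can be made with a deliberately tiny sketch. The Python snippet below is not MIT's code; the ToyCaptioner class, the two caption corpora and the inkblot cue words are all invented for illustration. It runs one and the same matching procedure over a neutral caption set and a disturbing one, then shows both the identical ambiguous input.

```python
# Toy illustration (not MIT's actual system): the same procedure, trained on
# different caption sets, describes the same ambiguous input very differently.

def tokenize(text):
    """Lowercase a caption and split it into word tokens."""
    return text.lower().replace(".", "").split()

class ToyCaptioner:
    """Returns the stored caption whose words best overlap the input cue words."""

    def __init__(self, training_captions):
        self.training_captions = training_captions

    def describe(self, cue_words):
        def overlap(caption):
            return len(set(tokenize(caption)) & set(cue_words))
        # Pick the training caption most similar to the ambiguous cue.
        return max(self.training_captions, key=overlap)

# Hypothetical caption corpora standing in for "normal" vs. disturbing data.
neutral_corpus = [
    "a black and white photo of a baseball glove",
    "a small bird sitting on a branch",
    "a vase of flowers on a table",
]
disturbing_corpus = [
    "man is murdered by machine gun in broad daylight",
    "a person lies injured on the ground",
    "a dark figure holds a weapon",
]

# The same ambiguous "inkblot", summarised as vague visual cue words.
inkblot_cues = ["black", "white", "dark", "figure", "photo"]

normal_model = ToyCaptioner(neutral_corpus)
norman_like_model = ToyCaptioner(disturbing_corpus)

print("Neutral data:", normal_model.describe(inkblot_cues))   # benign caption
print("Biased data: ", norman_like_model.describe(inkblot_cues))  # grim caption
```

Nothing about the procedure changes between the two runs; only the captions it learned from do, which is exactly the researchers' point.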

Big names in tech have long feared the point at which AI might become uncontrollable for humans. “Humans will have to merge with machines in order to keep up,” Tesla CEO Elon Musk once said.

This means that if humans ever do face a psychopathic AI, the fault will more likely lie with biased data, and with the people responsible for feeding it that data, than with the algorithm itself. Such an AI could be improved by retraining it on unbiased data at scale. Even in Norman’s case, where highly biased data was fed to the system, its behavior could be restored by diversifying and rebalancing the input data, without necessarily altering the algorithm itself.
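Continuing the hypothetical sketch above (same invented ToyCaptioner, corpora and cue words), the fix described here amounts to nothing more than handing the identical procedure a broader mix of captions:

```python
# Continuing the earlier sketch: "retraining" just means giving the same
# procedure a more diverse, balanced caption set; the algorithm is untouched.
retrained_model = ToyCaptioner(disturbing_corpus + neutral_corpus)
print("Diversified data:", retrained_model.describe(inkblot_cues))
# With the merged corpus the benign baseball-glove caption overlaps the cue
# words best, so the grim description no longer wins.
```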

Images: CNN

Written by Muneeb Ahmad
I love to talk about global tech-happenings, startups, industry, education and economy. Get in touch: muneeb@techjuice.pk.