How intelligent is Alex Jones?

Last week, the organization Media Matters published a clip on the video platform YouTube showing the right-wing conspiracy theorist Alex Jones describing, among other things, a school massacre as an "inside job". YouTube responded immediately and blocked the content - except that what it blocked was the Media Matters compilation exposing Jones' absurd claims.

Cases like these have put YouTube under pressure in recent months: harmless videos are deleted without warning, while conspiracy theories are left standing. Against this background, the transparency report that YouTube published for the first time on Tuesday night is particularly interesting. Almost 8.3 million pieces of content were removed between October and December 2017. Of these, 6.7 million videos were flagged automatically by YouTube's algorithms and subsequently deleted. For three-quarters of these clips, the algorithms struck before anyone could see the content. Since YouTube began using machine learning to identify such material, the share of these videos that never go live, or are seen by only a few viewers, has increased.

The figures show that artificial intelligence can help identify illegal content and remove it quickly. This is a blessing not only for the users who are spared these videos. The machines also relieve the human employees who have to sift through disturbing material ranging from violence to child pornography. But the numbers also illustrate a trend that many experts consider dangerous: computers deciding what may be written, said and posted. If even experienced lawyers disagree on such questions, how are machines supposed to judge them?

Facebook boss Mark Zuckerberg nevertheless believes that artificial intelligence will soon be able to identify fake news and hate speech, and to expose terrorist propaganda and election manipulation. In his view, that point will be reached in five to ten years. By then, one hopes, the machines will be somewhat more intelligent than they are today.