
TikTok: Good mood and censorship

No app was downloaded as often in the past year as TikTok. The video-sharing platform is growing rapidly: in November 2019, TikTok broke the one-billion-user barrier - faster than any other social network before it. The video app and its culture are currently so popular with children and young people that even the Tagesschau now has its own account there.

However, research by netzpolitik.org shows that TikTok can still suppress videos of political protests and demonstrations, and that it determines in a variety of other ways which content is visible and which is not.

Exclusive insight into the moderation

For this research, netzpolitik.org spoke to a source at TikTok, viewed moderation criteria and internal communications, and ran tests with specially created accounts to see how visible videos with China-critical content are on the platform.

TikTok's moderation rules, of which netzpolitik.org was able to see several versions, are remarkably thin and leave wide room for interpretation - even for the moderators themselves. The strategy is nevertheless clear: certain content is given the greatest possible reach, while other content is systematically suppressed.

The hit platform belongs to the Chinese technology company ByteDance. As early as September, the Guardian reported, based on leaked documents, how TikTok censored political statements on the Tiananmen massacre or the independence of Tibet. The protests in Hong Kong, which are currently attracting media attention around the world, are virtually invisible on TikTok amid the selfies and duets - even though the app is also available in Hong Kong.

Beijing moderates at night

The German-language videos on TikTok are moderated from three locations, the source reports: Berlin, Barcelona and Beijing. At the German location, the leadership is Chinese. Work is done in eight-hour shifts, during which around 1,000 tickets are to be processed. That is just under half a minute per ticket - very little time for video content. Wages vary widely, the pressure on individual moderators is high, and the mood in the team is "toxic", the source reports.

According to the source, moderation takes place in three review stages. The first review happens after 50 to 150 video views and is handled in Barcelona. Berlin is responsible for the second review, at 8,000 to 15,000 views, and for the third review at around 20,000 views. At night, German-speaking Chinese staff moderate content from Beijing. TikTok confirmed this to netzpolitik.org.
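The staged review process the source describes can be sketched as a simple routing function. This is purely illustrative: the thresholds and locations come from the article, but the function itself is hypothetical and has nothing to do with TikTok's actual systems.

```python
def review_stage(view_count: int) -> str:
    """Illustrative sketch: which review stage a video would hit,
    per the thresholds the source described to netzpolitik.org."""
    if 50 <= view_count <= 150:
        return "first review (Barcelona)"
    if 8_000 <= view_count <= 15_000:
        return "second review (Berlin)"
    if view_count >= 20_000:
        return "third review (Berlin)"
    # The reported thresholds leave gaps (e.g. 151-7,999 views),
    # where no review stage is described.
    return "no review stage triggered"
```

Note that the reported ranges are not contiguous; the article gives no information about what happens between them.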

In contrast to Facebook's, the moderation rules are kept very concise: according to the source, they fit into a table a few pages long. The instructions that netzpolitik.org was able to see are confusingly vague. This leaves a great deal of room for interpretation: restrictions on content can be construed very broadly, because in many places the rules lack the much clearer character of Facebook's moderation criteria.

Deleting, throttling, pushing

The rules divide unwelcome content into four categories; both the source and the moderation rules that netzpolitik.org was able to see confirm this. Videos that completely violate the platform's requirements are deleted ("deletion"). Other content is marked as "visible to self": the uploader can still see the video, but nobody else can.

A mark of "not for feed" or "not recommend" means that the video no longer appears in the algorithmically curated feed that users see when they open the app. The tag can also put the video at a disadvantage in search and in hashtag findability, the source says. Strictly speaking, such posts are not deleted - but in practice they no longer have an audience.

For unrestricted videos, there are two further levels. Most are tagged "General", though even this content can be regionally blocked or throttled via "Risks" markings.

Videos whose reach the marketing department wants to increase can be pushed with the "Featured" tag. TikTok confirmed only the existence of "deletion", "visible to self" and "risks" to netzpolitik.org. According to the company, the "risks" markings are necessary so that videos do not violate local laws in certain countries.
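Taken together, the visibility tiers described above and by the source form a small taxonomy, which can be summarized in an illustrative sketch. The names and effects are taken from the article; the enum itself is a hypothetical model, not TikTok's terminology or code.

```python
from enum import Enum

class Visibility(Enum):
    """Illustrative model of the visibility tiers described
    in the moderation rules seen by netzpolitik.org."""
    DELETION = "video is removed for fully violating the rules"
    VISIBLE_TO_SELF = "only the uploader can still see the video"
    NOT_FOR_FEED = "excluded from the algorithmic feed"
    NOT_RECOMMEND = "excluded from the feed, disadvantaged in search and hashtags"
    GENERAL = "unrestricted, but 'Risks' tags may geoblock or throttle it"
    FEATURED = "actively pushed by the marketing department for extra reach"

# The source describes a spectrum: only GENERAL and FEATURED content
# reaches a normal or boosted audience; everything else is invisible
# or near-invisible without being formally "deleted".
```

Of these six tiers, TikTok confirmed only "deletion", "visible to self" and "risks" to netzpolitik.org.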

Not only individual videos can be throttled, but entire hashtags as well, the source tells us. Overall, TikTok appears to run a system of promoting and throttling in which certain content becomes visible and viral while other content never takes off and remains invisible. Above all, control over what people see on TikTok lies with the company.

A steered content policy

Protests are generally not welcome on the platform, says the source. Because of TikTok's orientation, there is less protest content than on other platforms to begin with, but often such videos never even make it to the marketing team: they are deleted beforehand, when a moderator at another location such as Barcelona looks at them for the first time. The deletion teams often do not even watch the videos, only individual frames; the audio is checked only on suspicion. TikTok denies that it moderates content on the basis of political orientation.

If content does get through to the marketing team, that team determines the composition of the curated For-You feed, which is algorithmically tailored to each user.

During moderation, the moderators not only mark content for deletion or throttling, they also classify what they see. "That is supposed to help in setting up machine moderation," says the source, who has insight into the moderation process. TikTok denied to netzpolitik.org that keywording is used to train artificial intelligence, but said algorithmic systems are used "for checking when content is posted".

Political criticism used to be excluded from the feed

According to the source, TikTok changed its moderation rules after the Guardian's reporting in September and the criticism that followed. The company explicitly pointed employees to the bad press, the source says. The scale of the September changes remains unique to this day; smaller modifications are more common.

To the Guardian, TikTok had spoken of May; to netzpolitik.org, the company said the significant changes had been made "much earlier".

Until this major change, the moderation rules had ruled out criticism of politics and political systems almost entirely. Anyone who criticized constitutional monarchy, parliamentary systems, the separation of powers or socialist systems, for example, was throttled. Only with the major change was this "political ban" removed from the moderation rules.

At this point we are publishing an excerpt from TikTok's moderation policy (PDF), which records several rules before and after the change. For reasons of source protection, it is a copy rather than the original document.

Demonstrations can still be easily censored

The way the depiction of so-called "controversial events" is throttled has also been changed. Until then, this category generally covered protests, riots and demonstrations; a list also included examples such as the Kurdish, Tibetan and Taiwanese independence movements. Since the change, the depiction of demonstrations and protests is no longer restricted per se.

However, under the current rules, demonstrations can still be marked "not for feed" and throttled with a reference to "possible violent conflicts". TikTok says it does not remove such content - but that does not answer the question asked. Netzpolitik.org had asked about the throttling classification "not for feed", and the versions of the moderation policy available to us never provided for complete removal anyway.

Criticism of public political figures, the police and the military was also banned from the feed. TikTok has since dropped these rules in Germany, which the company confirms.

LGBTQI content is not displayed in many countries

In addition, certain content used to be marked as "Islam Offence". Content tagged with this keyword - such as two men kissing - triggered a geoblock for certain regions. LGBTQI content was particularly affected. According to the Guardian, this rule has been abolished. Or rather: renamed.

Content dealing with sexual orientation is now given the label "Risk 3.4" - the result, as with "Islam Offence" before, is that this content is throttled in Islamic countries. Marking content with a risk - there are many more of these - leads to geoblocking, the source says. TikTok argues that it has to obey local laws.

Protests in Hong Kong are barely visible

Since TikTok belongs to a Chinese company, its handling of the democracy protests in Hong Kong is a particularly good indicator of attempted censorship.

As early as September, we searched the app and the web version for hashtags that are prominent on other networks, such as #hongkongprotest, #freehongkong or #antielab, and found no results, or only very few. Instead, the app showed videos under these hashtags that had nothing to do with the protests. Only after a press inquiry to TikTok in Germany did some videos become visible, including ones we had uploaded with a specially created account.

The company's spokeswoman told netzpolitik.org at the time:

Users are on TikTok because the app gives them a positive, fun experience where they can use their creativity. Short, entertaining videos are what our users mostly upload and watch on TikTok. TikTok's moderation follows our community guidelines and terms of use and does not remove videos related to the Hong Kong protests.

This over-specific statement can mean: our users do not upload anything political, there are hardly any videos about the protests in Hong Kong, and we do not delete them either. However, the statement leaves entirely open whether TikTok systematically disadvantaged videos about Hong Kong through its different levels of visibility and thus rendered them invisible to the public. A search for the hashtag #JoshuaWong in the TikTok app, for example, produced no results in September. The hashtag did not exist at all.

TikTok told netzpolitik.org that today it does not restrict any content about the protests in Hong Kong or about Joshua Wong.

Die Welt am Sonntag recently found in a test that searches for keywords sensitive from the Chinese government's perspective, such as "falungong", "tiananmenmassacre" and "tiananmensquare", yielded no results, or only very few relevant ones. Numerous searches by netzpolitik.org in the app come to a similar conclusion.

"Tame and steered"

For Christian Mihr of Reporters Without Borders, the research by netzpolitik.org confirms fears that TikTok is under the influence of the Chinese state - even though it is a private company. It is part of Beijing's media strategy, he says, to push its totalitarian vision of hand-tame, steered media internationally.

"If the reach of content about protests, for example, is throttled on TikTok, that fits the picture exactly, since protests in Hong Kong or Xinjiang, as well as those abroad, are a taboo topic for the media in China," says Mihr. He criticizes TikTok's approach as a "sign of a great lack of transparency".

Update, 11/25/2019:

After publication of the article, TikTok asked that the following statement be added:

TikTok does not moderate content on the basis of political orientations or sensitivities. Our moderation decisions are not influenced by any foreign government, including the Chinese government. TikTok neither removes videos related to the protests in Hong Kong nor suppresses the reach of such videos. That includes content about activists.

About this research and the sources:

Our knowledge about moderation at TikTok in Germany is based on a conversation of several hours between netzpolitik.org and a source who has insight into the moderation structures and policies. We verified the identity of the source and their employment contract. For reasons of informant protection, we cannot and will not describe the source in more detail.

If you have information or tips on this or other topics, we look forward to hearing from you - preferably encrypted. Do not use work email addresses, telephone numbers, networks or devices for this.

Would you like more critical reporting?

Our work at netzpolitik.org is financed almost exclusively by voluntary donations from our readers. With an editorial staff of currently 15 people, this enables us to cover many important topics and debates of the digital society journalistically. With your support, we can expose even more, conduct investigative research much more often, provide more background - and defend even more fundamental digital rights!

You too can support our work now with your donation.

About the author

Markus Reuter

Markus Reuter covers the topics of digital rights, hate speech and censorship, fake news and social bots, right-wing radicals on the internet, video surveillance, fundamental and civil rights, and social movements. An editor at netzpolitik.org since March 2016. He can be reached at markus.reuter | ett | netzpolitik.org and on Twitter at @markusreuter_

Chris Köver

Chris Köver is a journalist. In her work, she researches the cross-connections between digital technologies and social justice, machine learning and discrimination, surveillance and gender - from an intersectional feminist perspective. Chris reports on all these topics for netzpolitik.org and also moderates the netzpolitik.org podcast from time to time. She gives lectures, moderates panels, gives workshops and is happy to share tips where you can see, read and hear other experts in these areas. Contact: email, OpenPGP, Twitter.
Published 11/23/2019 at 12:05 PM