Three-quarters of reported YouTube content remains online
The law against hate on the internet has been in force since January. The platforms' first reports show that relatively few of the reported items had to be deleted.

Berlin. Politicians long criticized that hate speech, incitement, and fake news were not being deleted from the net consistently enough, even after the large social networks had gradually tightened their procedures. The dissatisfaction finally led to the so-called Network Enforcement Act (NetzDG).

The rules are meant to ensure that illegal posts are removed from the internet faster and more consistently. Since January 1, services such as Facebook, Twitter, and YouTube (Google) must delete clearly criminal content within 24 hours of receiving a notice. In less clear-cut cases, they have one week. Systematic breaches of this obligation carry fines of up to 50 million euros.

Now, as required by law, the platform operators are providing information about their handling of problematic content for the first time. On Friday, the three major network providers published their transparency reports. Notable are the sometimes considerable differences in the number of complaints.

While Facebook registered 886 NetzDG complaints in the first half of the year, flagging a total of 1,704 items of problematic content (one complaint can refer to multiple items), YouTube users reported 214,827 items in the same period, either directly to the platform or indirectly via complaint offices such as jugendschutz.net or the Eco internet industry association. The short message service Twitter received about 265,000 NetzDG complaints.

At Facebook, the number of deletions and blockings is very small. Only 218 NetzDG complaints resulted in consequences: according to the nine-page transparency report of the world's largest online network, a total of 362 items were deleted or blocked. Twitter's ten-page report speaks of nearly 29,000 items deleted or blocked, likewise comparatively few relative to the reported content.

The report from Google's video platform YouTube is much longer and more detailed: across 26 pages, the provider uses statistics to describe its handling of hate comments. The figures suggest that the NetzDG has a much smaller impact on the company's deletion practice than its own rules do.

Arnd Haller, head of Google's legal department in Northern and Central Europe, nevertheless expressed sharp criticism of the NetzDG. “We continue to be concerned that this law forces companies to focus on speed in a review that should be a matter of careful consideration,” said Haller. Facebook's chief lobbyist Richard Allan expressed a similar view.

Oliver Süme, CEO of the Eco Association of the Internet Industry, was even more blunt. “The companies are deleting more than they did a few years ago,” said Süme, “but that is not due to the law; it is due to vastly improved technologies and a heightened awareness of the problems among the companies.” In the overall balance, reports under the NetzDG carry hardly any weight.

Gerd Billen, State Secretary in the Federal Ministry of Justice, expressed satisfaction that the law was showing initial effect: “Nevertheless, we are only just at the beginning.” The NetzDG has also drawn criticism, however. Opponents argue that it tempts platform operators to block borderline content for fear of fines, which could amount to censorship.

At YouTube, of the 214,827 items checked, only 58,297 were removed. That means almost 73 percent of the reported content remained online because, as YouTube explains, it “did not violate our Community Guidelines or any of the offenses listed in the NetzDG”.

Most of the removed content (just under 93 percent) was taken down within 24 hours of receiving the legal complaint. “In most cases, the content was removed worldwide due to a breach of YouTube's Community Guidelines, and not merely blocked locally as unlawful content under the NetzDG.” Removal for Community Guidelines violations accordingly affected 42,121 items, removal under the NetzDG 16,176 items.
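These figures are easy to cross-check. A minimal arithmetic sketch, using only the numbers quoted from the YouTube report above:

```python
# Cross-check of the YouTube figures quoted above.
reported = 214_827           # items reported under the NetzDG
removed_guidelines = 42_121  # removed worldwide (Community Guidelines)
removed_netzdg = 16_176      # blocked locally (NetzDG)

removed_total = removed_guidelines + removed_netzdg
print(removed_total)                          # 58297, as stated in the report
print(f"{removed_total / reported:.1%}")      # 27.1% of reports led to removal
print(f"{1 - removed_total / reported:.1%}")  # 72.9%, i.e. almost 73%, stayed online
```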

Three categories stand out
According to YouTube, most of the removed content fell under “hate speech or political extremism” (24,804), followed by “breach of privacy or insult” (11,428) and “pornographic content” (7,338).

The same three categories also stand out among the complaints overall: about 75,892 items were reported under “hate speech or political extremism”, 45,190 under “breach of privacy or insult”, and 27,308 under “pornographic content”.

At Facebook, items in the categories “insult” (114), “libel” (90), and “incitement to hatred” (74) were most frequently deleted or blocked, in most cases within 24 hours.

At Twitter, most complaints concerned the categories incitement to hatred (9,828), insult (4,495), and the use of symbols of unconstitutional organizations (2,472). Processing took more than 24 hours in only about 600 cases, and more than seven days in just 37 cases.

Because some cases require specific criminal-law expertise, platform providers may seek external advice, for example by engaging a law firm specializing in criminal law or by turning to a regulated self-regulation body as provided for under the NetzDG.

According to its report, YouTube turned to a law firm in just 40 cases. Facebook consulted external legal advisers on 54 of its 886 complaints. Twitter used external help in 107 cases.

Checking content is complex. The number of reviewers may vary depending on the volume of incoming NetzDG complaints, YouTube writes in its report. At present, about 100 reviewers from an external service provider work exclusively on complaints submitted via the NetzDG reporting procedures on YouTube and Google+. To ensure cultural diversity, these reviewers have different professional backgrounds, speak different languages, and come from different age groups.

According to Facebook, the teams handling complaints submitted via the NetzDG form consist of about 65 people, and staffing can be adjusted if the number of complaints rises.

At Twitter, “more than 50 people” process the NetzDG complaints. Since not all reported content is in German, those responsible for handling these complaints also speak other languages, including Arabic, English, French, Greek, Dutch, Italian, Catalan, Polish, Portuguese, Russian, Spanish, and Turkish.

Platforms use multi-stage review procedures
According to Facebook, NetzDG complaints are checked in two stages. First, the reported content is reviewed by the partner company Arvato; if the review finds a violation of the Community Standards, the content is removed. In a second step, all reported content not removed for Community Standards violations undergoes a multi-stage legal review by a “legal team” whose members are qualified to handle the “complexity of the NetzDG complaints”.

At Twitter, according to the short message service, every NetzDG complaint received is first checked against the terms of service and the Twitter Rules. If violations are found, the content is removed from the platform “completely”. “If we find that content violates the NetzDG, it will be withdrawn based on local law in Germany.”
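The two-stage logic that Facebook and Twitter describe, house rules first, local law second, can be illustrated with a short sketch. This is a simplified illustration, not either platform's actual implementation; all function and category names here are hypothetical:

```python
from enum import Enum

class Decision(Enum):
    REMOVE_GLOBALLY = "removed worldwide (platform rules violation)"
    BLOCK_LOCALLY = "blocked in Germany (NetzDG)"
    KEEP_ONLINE = "no violation found"

def review_netzdg_complaint(content, violates_platform_rules, violates_netzdg):
    """Two-stage review as described in the transparency reports.

    Stage 1: check against the platform's own rules (Community
    Standards / Twitter Rules); a hit means worldwide removal.
    Stage 2: only content that passes stage 1 receives a legal
    review under German law; a hit means local blocking only.
    """
    if violates_platform_rules(content):
        return Decision.REMOVE_GLOBALLY
    if violates_netzdg(content):
        return Decision.BLOCK_LOCALLY
    return Decision.KEEP_ONLINE

# Example: a post that breaks the house rules is removed worldwide,
# so it never reaches the (narrower) NetzDG legal review.
decision = review_netzdg_complaint(
    "some reported post",
    violates_platform_rules=lambda c: True,
    violates_netzdg=lambda c: False,
)
print(decision.value)  # removed worldwide (platform rules violation)
```

This ordering is also consistent with YouTube's figures above: because the platforms' own worldwide rules are checked first, global guideline removals (42,121) far outnumber local NetzDG blocks (16,176).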

YouTube reports that its review teams consider the context of reported content, including the video description, other content uploaded to the channel, and metadata (titles, tags, or captions). These contextual clues are important for assessing the intent of the content's creator.

“In addition,” YouTube writes, “our system records the timestamp at which point in the video the content was reported.” Complainants are also asked in the reporting forms to provide the “relevant timestamp”. “This allows our reviewers to focus on the potentially critical sections within a video.”

YouTube also uses special technologies to help identify problematic content, for example to find objects and patterns quickly and at scale in images, videos, and audio recordings. Nonetheless, the company emphasizes that only people can judge context.

“For example, algorithms cannot tell the difference between terrorist propaganda and critical reporting on such organizations, or between inciting content and political satire,” the transparency report says. “That is why people usually have to judge the content, especially in areas where freedom of expression is concerned.”

As an example, YouTube cites a video of a terrorist attack. Uploaded by a news agency or an eyewitness, it could constitute “informative news coverage”. The same video, however, could serve recruitment purposes if the terrorist organization itself uploaded it to glorify the attack or recruit members.

“Informative news coverage would probably be allowed under our worldwide Community Guidelines, while the recruitment of terrorists would violate our Community Guidelines and be removed,” YouTube writes.

The video platform is also confronted with legally irrelevant complaints, which may be one reason the number of reported items is so large while the number of deletions remains comparatively small. For example, a speech by Chancellor Angela Merkel on whether Islam belongs to Germany was reported both as hate speech and as terrorist content because, in the complainants' view, Merkel was endangering the state and leading Germany into catastrophe. Naturally, this complaint had no consequences for the content.
