Scouring hate off Facebook in Germany

BERLIN — Security is tight at this brick building on the western edge of Berlin. Inside, a sign warns: “Everybody without a badge is a potential spy!”

They are the agents of Facebook. And they have the power to decide what is free speech and what is hate speech.

This is a deletion center, one of Facebook’s largest, with more than 1,200 content moderators. They are cleaning up content — from terrorist propaganda to Nazi symbols to child abuse — that violates the law or the company’s community standards.

Germany, home to a tough new online hate speech law, has become a laboratory for one of the most pressing issues for governments today: how and whether to regulate the world’s biggest social network.

Around the world, Facebook and other social networking platforms are facing a backlash over their failures to safeguard user privacy, curb disinformation campaigns and limit the digital reach of hate groups.

In India, seven people were beaten to death after a false viral message on the Facebook subsidiary WhatsApp. In Myanmar, violence against the Rohingya minority was fueled, in part, by misinformation spread on Facebook. In the United States, Congress called Mark Zuckerberg, Facebook’s chief executive, to testify about the company’s inability to protect its users’ privacy.

As the world confronts these rising forces, Europe, and Germany in particular, has emerged as the de facto regulator of the industry, exerting influence beyond its own borders. Berlin’s digital crackdown on hate speech, which took effect on Jan. 1, is being closely watched by other countries. And German officials are playing a major role behind one of Europe’s most aggressive moves to rein in technology companies: strict data privacy rules that take effect across the European Union on May 25 and are prompting global changes.

“For them, data is the raw material that makes them money,” said Gerd Billen, secretary of state in Germany’s Ministry of Justice and Consumer Protection. “For us, data protection is a fundamental right that underpins our democratic institutions.”

Germany’s troubled history has placed it on the front line of a modern tug-of-war between democracies and digital platforms.

In the country of the Holocaust, the commitment against hate speech is as fierce as the commitment to free speech. Hitler’s “Mein Kampf” is available only in an annotated version. Swastikas are illegal. Inciting hatred is punishable by up to five years in jail.

But banned posts, pictures and videos have routinely lingered on Facebook and other social media platforms. Now companies that systematically fail to remove “obviously illegal” content within 24 hours face fines of up to 50 million euros.

The deletion center predates the legislation, but its efforts have taken on new urgency. Every day content moderators in Berlin, hired by a third-party firm and working exclusively on Facebook, pore over thousands of posts flagged by users as upsetting or potentially illegal and make a judgment: Ignore, delete or, in particularly tricky cases, “escalate” to a global team of Facebook lawyers with expertise in German regulation.

Some decisions to delete are easy. Posts about Holocaust denial and genocidal rants against particular groups like refugees are obvious ones for taking down.

Others are less so. On Dec. 31, the day before the new law took effect, a far-right lawmaker reacted to an Arabic New Year’s tweet from the Cologne police, accusing them of appeasing “barbaric, Muslim, gang-raping groups of men.”

The request to block a screenshot of the lawmaker’s post wound up in the queue of Nils, a 35-year-old agent in the Berlin deletion center. His judgment was to let it stand. A colleague thought it should come down. Ultimately, the post was sent to lawyers in Dublin, London, Silicon Valley and Hamburg. By the afternoon it had been deleted, prompting a storm of criticism about the new legislation, known here as the “Facebook Law.”

“A lot of stuff is clear-cut,” Nils said. Facebook, citing his safety, did not allow him to give his surname. “But then there is the borderline stuff.”

Complicated cases have raised concerns that the threat of the new rules’ steep fines and 24-hour window for making decisions encourages “over-blocking” by companies, a sort of defensive censorship of content that is not actually illegal.

The far-right Alternative for Germany party has been quick to proclaim “the end of free speech.” Human rights organizations have warned that the legislation is inspiring authoritarian governments to copy it.

Other people argue that the law simply gives a private company too much authority to decide what constitutes illegal hate speech in a democracy, an argument that Facebook, which favored voluntary guidelines, made against the law.

“It is perfectly appropriate for the German government to set standards,” said Elliot Schrage, Facebook’s vice president of communications and public policy. “But we think it’s a bad idea for the German government to outsource the decision of what is lawful and what is not.”

Richard Allan, Facebook’s vice president for public policy in Europe, put it more simply: “We don’t want to be the arbiters of free speech.”

German officials counter that social media platforms are the arbiters anyway.

It all boils down to one question, said Billen, who helped draw up the new legislation: “Who is sovereign? Parliament or Facebook?”

The deletion center here is run by Arvato, a German service provider owned by the conglomerate Bertelsmann. The agents have a broad purview, reviewing content from a half-dozen countries.

“Two agents looking at the same post should come up with the same decision,” says Karsten König, who manages Arvato’s partnership with Facebook.

When Nils applied for a job at the center, the first question the recruiter asked him was: “Do you know what you will see here?”

Nils has seen it all. Child torture. Mutilations. Suicides. Even murder: He once saw a video of a man cutting a heart out of a living human being. “You see all the ugliness of the world here,” he said.

The issue is deeply personal for Nils. He has a 4-year-old daughter. “I’m also doing this for her,” he said.

The arrival of nearly 1.4 million migrants in Germany has tested the country’s resolve to keep a tight lid on hate speech. The laws against illegal speech were long established, but enforcement in the digital realm was scattershot before the new legislation.

Posts calling refugees rapists, Neanderthals and scum survived for weeks. Many were never taken down. Researchers reported a tripling in observed hate speech in the second half of 2015.

Billen, the secretary of state in charge of the new law, was alarmed. In September 2015, he convened executives from Facebook and other social media sites. A task force for fighting hate speech was created. A couple of months later, Facebook and other companies signed a joint declaration, promising to “examine flagged content and block or delete the majority of illegal posts within 24 hours.”

But the problem did not go away. Over the 15 months that followed, independent researchers, hired by the government, twice posed as ordinary users and flagged illegal hate speech. During the tests, they found that Facebook had deleted only 46 percent and 39 percent of the flagged posts, respectively.

“They knew that they were a platform for criminal behavior and for calls to commit criminal acts, but they presented themselves to us as a wolf in sheep’s clothing,” said Billen.

By March 2017, the German government had lost patience and started drafting legislation. The Network Enforcement Law was born, setting out 21 types of content that are “manifestly illegal” and requiring social media platforms to act quickly.

Officials say early indications suggest the rules have served their purpose. Facebook’s rate of removing illegal hate speech in Germany rose to 100 percent over the past year, according to the latest spot check by the European Union.

At Facebook’s Berlin offices, Allan acknowledged that under the earlier voluntary agreement, the company had not acted decisively enough at first.

“It was too little and it was too slow,” he said. But, he added, “that has changed.”

This article originally appeared in The New York Times.

KATRIN BENNHOLD © 2018 The New York Times
