Wikipedia fixed its swastika problem fast. Why can’t anyone else?

Mira Fox

This article appeared first on Forward. Reposted with permission.

Hate speech is notoriously hard to police online, and nearly every major social media platform has been criticized in recent years for allowing disinformation and hate to proliferate.

Wikipedia, meanwhile, got a hacker’s swastikas off of its site in under five minutes.

On Monday morning, a Wikipedia template was vandalized, affecting a wide range of pages across many unrelated topics. The change to the template caused each affected page to display a red background with an enormous, blaring swastika in the center; the original contents of the page, including the title, were not visible.

Input Magazine reported that Johnny Depp’s page had been turned into a giant Nazi flag at 9:52 a.m., but one minute later, the page had reverted to its usual form. Tweets also pointed out the same swastika effect on pages for Joseph Stalin and philosopher Theodor Adorno, as well as celebrities such as Jennifer Lopez and Madonna, among others.

In a statement to Input, a spokesperson for Wikipedia called the vandalism “a particularly vile action” and said that volunteers have “fixed the vandalism, blocked the account responsible, and will further evaluate the situation to see if additional recourse is needed.” The spokesperson also said that the template had been protected from additional malicious editing.

The reaction time is impressive, given that social media companies such as TikTok have left up antisemitic hate even after it has been reported by multiple users, and their algorithms often fail to spot hate speech. The Center for Countering Digital Hate found that five out of six of its reports of antisemitism to social media platforms were ignored.

Of course, a giant Nazi flag obscuring any actual content about frequently Googled celebrities is more obvious than the antisemitic comments often left on Jewish creators’ videos, which tend to rely on obscure or coded references to ovens, or on certain numbers used by white supremacist groups to signal hatred of Jews. Other tricks, such as intentionally misspelling “Jews” as “Joos” or “J00s,” help hide such comments; clearly, whoever splashed the blaring swastika across Wikipedia was not trying to be so subtle.

Yet it is still notable that Wikipedia was able to address the issue so quickly that many Twitter users doubted the veracity of the screenshots of Wikipedia pages covered by the swastika. Wikipedia addressed its response time in a statement: “Over the years, a number of tools and processes have been developed to quickly spot and revert vandalism on the site. Most vandalism on Wikipedia is corrected within five minutes, as we saw today.”

Trolling on Wikipedia is common, given that the encyclopedia is written and maintained by an open community, and it ranges from harmless pranks to devious attempts to spread disinformation. Because Wikipedia is a primary, and trusted, source of facts for nearly everyone who uses the internet, these edits can be dangerous and far-reaching.

The online encyclopedia developed a rigorous plan to combat this kind of sabotage during the lead-up to November’s election, which likely served it well in its response to this most recent act of vandalism.

Nearly every social media company has been working to combat the spread of disinformation and hate speech. The question remains, however, why no one else has been this successful. Now, at least, we know it’s possible.