As COVID-19 spread rapidly across the world in 2020, people everywhere craved reliable information. A global network of volunteers rose to the challenge, consolidating information from scientists, journalists, and health professionals, and making it accessible to ordinary people.
Two of them live nearly 3,200 kilometers from each other: Dr. Alaa Najjar is a Wikipedia volunteer and doctor who takes breaks during his emergency department shifts to fight COVID-19 misinformation on the Arabic version of the site. Based in Sweden, Dr. Netha Hussain, a clinical neuroscientist and physician, spent her spare time editing COVID-19 articles in English and Malayalam (a language spoken in southwest India), later focusing her efforts on improving Wikipedia articles on COVID-19 vaccines.
Thanks to Najjar, Hussain and over 280,000 volunteers, Wikipedia has become one of the most trusted sources for up-to-date, in-depth knowledge about COVID-19, with nearly 7,000 articles in 188 languages. Wikipedia’s reach and ability to support knowledge sharing on a global scale – whether it’s educating the public about a major illness or helping students study for tests – is possible only through laws that allow its volunteer-led collaborative model to thrive.
As the European Parliament considers new regulations to hold Big Tech platforms accountable for illegal content amplified on their websites and apps through packages such as the Digital Services Act (DSA), it must protect citizens’ ability to collaborate in the service of the public interest.
Lawmakers are right to try to stem the spread of content that causes physical or psychological harm, including content that is illegal in many jurisdictions. As they envision a range of provisions for the final DSA, we welcome some of the items on offer, including requirements for more transparency about how platforms’ content moderation works.
But the current draft also includes prescriptive requirements for how terms of service are to be enforced. At first glance, these measures may seem necessary to curb the harms of social media, prevent the spread of illegal content, and ensure the safety of online spaces. But what happens to projects like Wikipedia? Some of the proposed requirements could shift power further from people to platform providers, stifling digital platforms that operate differently from large commercial platforms.
Big Tech platforms work fundamentally differently from collaborative nonprofit websites like Wikipedia. All articles created by Wikipedia volunteers are available free of charge, without advertising and without tracking our readers’ browsing habits. The incentive structures of commercial platforms maximize profits and time spent on the site, using algorithms that exploit detailed user profiles to target people with the content most likely to influence them. They deploy still more algorithms to automatically moderate content, producing errors of both over- and under-enforcement. For example, computer programs often mistake works of art and satire for illegal content, while failing to grasp the human nuance and context needed to apply the platforms’ actual rules.
The Wikimedia Foundation and country-based affiliates, such as Wikimedia Germany, support Wikipedia volunteers and their autonomy in deciding what information belongs on Wikipedia and what does not. The online encyclopedia’s open publishing model is based on the belief that people should decide what information stays on Wikipedia, guided by rules established and refined by volunteers that center neutrality and reliable sources.
This model ensures that, for any Wikipedia article on any topic, the people who know and care about that topic enforce the rules governing what content is allowed on its page. In addition, our content moderation is transparent and accountable: all conversations between editors on the platform are accessible to the public. It is not a perfect system, but it has largely worked to make Wikipedia a global source of neutral and verified information.
Forcing Wikipedia to operate more like a commercial platform, with a top-down power structure unaccountable to our readers and editors, would arguably undermine the DSA’s genuine public-interest aims by shutting our communities out of important decisions about content.
The Internet is at an inflection point. Democracy and civic space are under attack in Europe and around the world. Now more than ever, we all need to think carefully about how the new rules will foster, not hinder, an online environment that enables new forms of culture, science, participation and knowledge.
Lawmakers can work with public interest communities like ours to develop more inclusive, enforceable, and effective standards and principles. But they shouldn’t write rules with only the most powerful commercial internet platforms in mind.
We all deserve a better and safer Internet. We call on lawmakers to work with collaborators from all sectors, including Wikimedia, to design regulations that empower citizens to improve it, together.