Managing conflict on Wikipedia and internationally
- This article invites readers to join an effort to improve the management of (a) conflict on Wikipedia and through that (b) major societal conflicts, including violent conflicts.
The "Articles on contentious issues" section of the Wikipedia article on "Reliability of Wikipedia" cites research documenting Wikipedia's effectiveness in getting people with very different perspectives to collaborate in producing a narrative of the physical reality they share, even when their constructed realities differ sharply and are a source of conflict.
If we can improve how conflicts are managed in Wikipedia and other Wikimedia Foundation projects, it might help reduce the intensity and lethality of major societal conflicts, as noted in the section on "Wikimedia Foundation and managing conflict" of the Wikiversity article on "International Conflict Observatory". The present article reviews several presentations at the recent Wikimania 2021 that discussed (a) managing conflict on Wikipedia and (b) using Wikimedia Foundation projects to broaden media coverage of societal conflicts.
Relevant presentations from Wikimania:2021:Program
This section summarizes presentations at Wikimania 2021 relating to conflict. This includes managing conflicts between Wikipedia editors and problems with biased editing that interact with conflicting ideologies in the global village.
Managing conflict on Wikipedia
- Wikimania:2021:Submissions/RedWarn and improving Wiki counter vandalism into the 2020's by Ed Englefield & Chlod Alejandro.
- - This presentation says, "Wikipedia:RedWarn is a counter-vandalism tool, ... used by hundreds of English Wikipedia editors to revert problematic edits, warn and report editors, request page protection and perform other moderation and maintenance tasks."
- - CONCERN: What limits the ability of vandals and biased or paid editors to use RedWarn (or other reversion tools like Wikipedia:Twinkle or Wikipedia:Huggle) to further their nefarious ends, demoralize and demotivate honest editors, and drive them away from Wikipedia?
- - Some conflict is inevitable and can be constructive. However, conflicts that go on too long become destructive. It is better to set time limits with a procedure for making a decision: a poor decision within a reasonable time frame is often better than extending a conflict indefinitely.
Problems with biased editing
- Wikimania:2021:Submissions/Come on, It’s Wikipedia, not Westeros: A Brief Introduction to the Wikipedia Conspiracy Theory in Japan by Kitamura Sae
- - The Japanese-language Wikipedia was unfairly accused of conspiracies after a Japanese music therapist and professional writer had her first three edits reverted and then published several articles on this topic. One questionable incident was the Higashi-Ikebukuro runaway-car accident, in which an 87-year-old retired head of the Industrial Science and Technology Agency of the former Ministry of International Trade and Industry caused a traffic accident that killed two people and injured others. After edit wars in the Japanese- and English-language Wikipedia articles on him and on the accident, the Japanese article on him does not mention the accident, and the Japanese article on the accident does not mention him. There are also notable cases of historical negationism, which have made several articles historically inaccurate and politically biased. Regarding these problematic articles, the presenter, Dr Kitamura Sae, quoted Hanlon's razor: "Never attribute to malice that which is adequately explained by stupidity." She admitted that historical negationism and inaccuracy are major problems in the Japanese Wikipedia, but argued they are caused by stupidity and the lack of proper control rather than organised conspiracies. There are only about 40 administrators for the Japanese-language Wikipedia, which contains over 1,270,000 articles. According to the presentation, the Japanese Wikipedia suffers from a serious shortage of administrators and good editors, which leaves it a completely disorganised space.
- - Dr Kitamura Sae said she thought the only solution to the problem of biased editing was recruiting more editors. To this end she organized a "Western History Wikipedia Workshop" on 16 May 2021 at the 71st Annual Meeting of the Japanese Society of Western History.
- Wikimania:2021:Submissions/Cross-wiki ideological conflict and Wikimedia's vision of knowledge equity by User:Deryck Chan.
- - The presenter said, "censorship entrenches ethno-linguistic conflicts."
- - To reduce conflicts, Wikipedia needs to be pro-active in identifying and pushing back against censorship and biased editing, whether by governments or other major organizations like corporations.
- - A relatively minor example is Polish w:pl:Gdańsk vs German w:de:Danzig and the long edit wars that eventually settled on w:en:Gdańsk, even though the city's official name change happened 55 years before Wikipedia was invented.
- - These problems are much more serious in places with repressive governments. For example, the Great Firewall of China dramatically limits what can be seen on the Internet in China, and some who find ways to defeat it spend time in prison. People in China who are allowed to edit Wikipedia promote official government policies, which creates biases in the Mandarin Chinese Wikipedia. Some Hong Kong editors of the Cantonese-language Wikipedia were told they would be reported to the national security police if they continued trying to make an article reflect what could be documented from reliable sources when that conflicted with official government policy. Threats like these are very serious, because it is much easier to be imprisoned for saying or writing the wrong thing in places like China than in, for example, the US.
- - Wikipedia should expand its support for ethno-linguistic groups perceived to be under attack. He offered three competing, not-quite-tongue-in-cheek definitions of a language, each defining it as a "dialect with" something more.
- - Bottom line: The presenter said the Wikimedia movement should be more aggressive in responding to censorship or biased editing by governments or corporations:
- If an authority censors Wikipedia, Wikipedia should censor that authority's state media.
Broadening media coverage of societal conflicts
- Wikimania:2021:Submissions/The experiences of WikiGap and WikiForHumanRights. Working with UN experts on topics for impact by John Cummings, Alex Stinson & Eric Luth.
- - WikiGap is an effort to close gender gaps in the Wikipedias, most of which have roughly three articles about men for every one about a woman.
- - WikiForHumanRights encourages people to write about human rights, environmental health and diverse communities impacted by environmental issues around the world.
- - UNESCO has a Wikimedian in Residence, and the publications branch of the Food and Agriculture Organization (FAO) is working with Wikipedia to improve dissemination of information they produce.
- Wikimania:2021:Submissions/Documenting social movements through the Wikimedia Projects: Argentina, Chile and Colombia experiences by Patricia Díaz-Rubio, w:User:Luisina Ferrante (WMAR), and Juan Carlos Vargas.
- - Wikimedia Argentina, Chile, and Colombia organized dozens of workshops and edit-a-thons that taught volunteers in those countries how to upload photos to Wikimedia Commons, create articles in Wikipedia, and use Wikidata. These activities helped people develop new skills, which they used to upload well over a thousand new images that appeared in more than a thousand articles. The results presented perspectives on political activities in those countries, especially street demonstrations, that differed from what appeared in the traditional media, including documentation of human rights violations such as police violence against demonstrators. These contributions to Wikipedia were viewed by many thousands of people.
- Wikimania:2021:Submissions/Supporting a gentle discussion - Can we raise awareness through research results to bring about change? by Lea Volz.
- - Summarized two years of surveys of contributors to the German-language Wikipedia and invited input from others about how we might make Wikipedia more welcoming and supportive, especially for newcomers.
Evolving legal environment especially regarding "terrorism"
- Wikimania:2021:Submissions/Do Something Doctrine - looking back on the Terrorist Content Regulation in EU by Anna Mazgal.
- - Anna Mazgal said that the Terrorist Content Regulation in the European Union (EU) is bad law: by potentially criminalizing discussions of terrorism, or even eco-terrorism, it may make it difficult to talk rationally about anything that someone in power might want to label "terrorism". It was passed using procedures that circumvented the usual open discussions afforded normal lawmaking in Brussels. For more, see Wikimedia.brussels' blog.
Discussion
As suggested in the section on "Wikimedia Foundation and managing conflict" of the Wikiversity article on "International Conflict Observatory", there may be great opportunities for Wikimedia Foundation projects to help reduce the intensity and lethality of major conflicts.
Presentations at Wikimania 2021 documented many gains in recruiting new volunteers to contribute photos to Wikimedia Commons and create new Wikipedia articles, improving the information available about human rights violations, including violence by law enforcement officers, while also contributing to less contentious efforts like closing the gender gap in Wikipedia and improving the quality of information available about food and agriculture. Wikimedia Foundation efforts in Argentina, Chile, and Colombia seem to have been effective in challenging the narrative presented by traditional media. The backlash from the Chinese government made it clear that honest contributions to Wikipedia can be severely punished. Censorship of Wikipedia in developed countries like Australia, France, Germany, and the United Kingdom shows that these concerns are not limited to less developed countries. Appropriate vigilance by Wikimedians everywhere might reduce the chances that progress on human rights will be reversed, as seems to have happened to much of the "Twitter Revolution", especially the Arab Spring.
The presentation on "Supporting a gentle discussion" seemed not to consider the issue of blatant and vicious ideologues.
- The Wikiversity article on "International Conflict Observatory" suggests we need to invite and encourage more conflict on Wikipedia, similar to the discussion of "Documenting social movements through the Wikimedia Projects: Argentina, Chile and Colombia experiences", summarized above. This could include training not only in the technical aspects of uploading photos to Wikimedia Commons and editing or even creating articles on Wikipedia but also in managing conflict constructively.
- However, people with power often perceive activities like these as threatening. They can be expected to react by paying people to do what they can to destroy the integrity of Wikipedia, as described in the presentation on "Cross-wiki ideological conflict and Wikimedia's vision of knowledge equity". Editors in Hong Kong were told they would be reported to the national security police if they continued trying to make an article reflect what could be documented from reliable sources when that conflicted with official government policies. People who are threatened can use help thinking through the actual risks they face depending on where they live, e.g., in China vs. the US. This training could include ways to help editors seek support. Those who live in places like the US should be encouraged to continue to respectfully confront apparently biased editors, knowing that in doing so they may sometimes help make the world safer for people like Jamal Khashoggi.
- This suggests that it might be wise to change the job descriptions of Administrators and Bureaucrats in Wikimedia Foundation projects so they focus first on identifying and responding appropriately to harassment and only secondarily on the more traditional Admin tasks of deleting inappropriate articles, protecting articles, and blocking people who abuse the rules and purpose of Wikimedia Foundation projects. To accomplish this, it may be wise to organize specialty conferences of administrators and bureaucrats to discuss these issues and develop improved procedures. We do NOT want repeats of the case where the Wikimedia Foundation banned an administrator from editing the English Wikipedia for a year, apparently because he was too abrasive in the actions he took.
EU Regulation 2021/784 on "terrorism" seems truly terrifying, especially if we believe that the "War on terror" and related efforts to fight "terrorism" are, from some perspectives, only the latest marketing wrapper for a policy under which the elites of advanced industrialized nations like the US and France support state terror in countries with repressive governments that have good relations with international businesses backed by those same nations. Key points in this regard are made in the Wikiversity article on "Winning the War on Terror"; such policies are sustained by those who control the money for the media in the different countries. For example:
- Why didn't the mainstream media in the US and its allies demand that the US supply evidence of Osama bin Laden's culpability, as requested by the government of Afghanistan?
- 3,516 traffic fatalities were recorded in the US in the average month of 2001, more than the 2,996 deaths attributed to the terrorist attacks of September 11 of that year. But the US doesn't declare war on traffic accidents.
- Roughly 500 men die in the US each year from breast cancer (ignoring breast cancer in women), and an average of 335 Americans per year drowned in a bathtub, hot tub, or spa between 1993 and 2003, several times the average of 116 Americans per year who succumbed to terrorism between 1970 and 2015 according to the Global Terrorism Database (52 per year excluding 2001). But the US does not declare war on male breast cancer or bathtubs.
- Virtually everyone believes in rule of law, but people with power evidently believe they should be above the law. Jones and Libicki (2008) found that military force was the least effective response to terrorism. Most effective were negotiations, like the 1998 Good Friday Agreement in Northern Ireland, and law enforcement; see the accompanying figure and their report. But rule of law does not sell high-tech weapons nor continued support for repressive governments like that of Saudi Arabia.
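As a quick sanity check, the per-year averages quoted in the bullets above are mutually consistent. This is a minimal sketch using the figures exactly as stated in the text (not re-derived from the Global Terrorism Database itself):

```python
# Figures as quoted above: 116 terrorism deaths per year on average
# over 1970-2015 including 2001, versus 52 per year excluding 2001.
years_total = 2015 - 1970 + 1  # 46 years, inclusive

# Total deaths implied by each average:
total_with_2001 = 116 * years_total          # 5,336
total_without_2001 = 52 * (years_total - 1)  # 2,340

# The difference should be the deaths in 2001 alone.
deaths_2001 = total_with_2001 - total_without_2001
print(deaths_2001)  # 2996, matching the toll attributed to the 9/11 attacks

# Traffic fatalities: 3,516 per average month in 2001 implies an
# annual total of roughly 42,000, about 14x the 9/11 death toll.
annual_traffic = 3516 * 12
print(annual_traffic)  # 42192
```

The two quoted averages thus imply exactly the 2,996 deaths attributed to the September 11 attacks, which supports the internal consistency of the figures.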
Multiple research summaries indicate that the "click economy" is a primary driver of the increase in political polarization that the world has seen since the founding of Facebook in 2004. To the extent that this analysis is appropriate, it suggests a couple of things:
- Requiring all media companies to submit copies of all advertisements, underwriting spots, and clickbait to a central repository like the Internet Archive, with metadata making it readily usable for research, funded by a reasonable tax on advertising revenue.
- Public funding of media, e.g., in proportion to clicks, possibly at a level to match the amount that governments spend on accounting, advertising, and media and public relations.
References
- Seth Jones; Martin C. Libicki (2008), How Terrorist Groups End: Lessons for Countering al Qa'ida, RAND Corporation, ISBN 978-0-8330-4465-5, JSTOR 10.7249/mg741rc, OL 16910145M, Wikidata Q57515305
Notes
- See also "confirmation bias and conflict".
- Around 2007 I was invited to teach a class in China. On the plane on the way there, I was trying to remember who the president of China was. After I arrived and checked into a reasonably high-class hotel suitable for someone invited to teach a high-tech class, I got on the Internet and began to search for "Who is the president of China?" Every way I could think of to find an answer to that question was blocked by the Great Firewall of China. Finally I remembered: the president of China was Hu -- Hu Jintao. It reminded me of the famous Abbott and Costello comedy routine, "Who's on First?"
- Selina Cheng (14 July 2021), "Hong Kong Wikipedia editors take precautions amid fears mainland peers may report users to national security police", Hong Kong Free Press, Wikidata Q108667501
- Lüpke and Storch (2013) are cited in Bradley McDonnell; Andrea Berez-Kroeker; Gary Holton, eds. (2018), Reflections on language documentation 20 years after Himmelmann (PDF), Wikidata Q108667724
- "The European Union Moves to Fight Terrorist Content Online", The Soufan Center, 15 June 2021, Wikidata Q108679526 Regulation (EU) 2021/784 of the European Parliament and of the Council of 29 April 2021 on addressing the dissemination of terrorist content online, Wikidata Q103793218
- The figure for 2017 was 510 per Male Breast Cancer Incidence and Mortality, United States—2013–2017, Centers for Disease Control and Prevention, October 2020, Wikidata Q108680621. The Wikipedia article on "Male breast cancer" previously said, 'The number of annual deaths in the US is about 440 (for 2016 "but fairly stable over the last 30 years").' However, the source for that figure of 440 was a broken link, which could not be found on the Internet Archive.
- Alejandra Fernandez-Morera (26 February 2018), "Someone drowns in a tub nearly every day in America. Experts blame alcohol; others suspect homicide", Seattle Post-Intelligencer, ISSN 0745-970X, Wikidata Q60226981
- Jones and Libicki (2008, p. 19)
- In ancient Athens, Anacharsis said that the laws are like spider webs: Strong enough to hold the little flies but not the big ones.