
Universal Code of Conduct/2021 consultations/Discussion/Report


This page presents a summary of the major themes shared by participants at individual projects and on Meta during the 2021 key questions consultation for Phase 2 of the Universal Code of Conduct. During this period the project team continued its outreach to affiliates, and functionary meetings were attended by Arbitration Committee members, stewards, and global sysops. Because notable insights on enforcement were also gathered during the 2021 local language consultations, this summary covers the communities outside the scope of that outreach. For the original text of the comments summarized here, please see the local consultations and discussions.

The goal of this phase is to outline clear enforcement pathways for the Universal Code of Conduct (UCoC) policy. The project follows the Wikimedia Foundation Board Statement on Community Health (2020) and Statement on Healthy Community Culture, Inclusivity, and Safe Spaces, which holds that a global policy is necessary to "welcome people from every background to build strong and diverse communities" and "break down the social, political, and technical barriers preventing people from accessing and contributing to free knowledge". In order to address "persistent antisocial behavior patterns [that] persist within Wikimedia communities", the code should work towards "the elimination (to the extent practicable) of toxicity and harassment in the Wikimedia movement."

Recognizing that a code of conduct is only effective when supported by the community and those that participate, the project team sought to consult individual users in local community contexts. This report was prepared by facilitators with experience as volunteer contributors on various projects.

Over 247 participants, including administrators, functionaries, Arbitration Committee members, affiliate staff, and both new and established contributors from 16 different projects, took part in these consultations, round-tables, and other ongoing conversations. The summary also includes material from other volunteer-led discussions and initiatives concerning the enforcement of the Universal Code of Conduct.

While the ideas, thoughts, and concerns raised by participants varied widely, common themes could be observed across communities. Participants voiced opinions about the need for and risks of private reporting systems, flaws present in current enforcement and review systems, and the need for context sensitivity, and they expressed a desire for additional support to be made available to volunteers. Many participants felt communities should continue to have a meaningful role in the revision and approval process.

Process

The Universal Code of Conduct project team conducted deep and targeted outreach during the local language consultations, and the objective of this consultation was to give all other individual communities the same opportunity to express how the code should apply locally. The consultation format was a distributed model using a hub-and-spoke or mesh design: discussions would be held on-wiki at individual communities, and summaries of the local discussions would be posted to the centralized discussion page on Meta. The facilitation team also organized round-table discussions about the key questions using an online meeting platform.

The team delivered over 700 mass messages to active projects on 5 April 2021 inviting users to initiate local discussions and link back to the consultation hub. The project facilitators reached out to established users of many projects, covering all major and available languages, before and during the comment period to foster engagement. Individual discussions began on Russian Wikinews, as well as the Mandarin Chinese (zh), English (en), Portuguese (pt), and Russian (ru) Wikipedia projects. The project was also discussed on the Czech (cs) and German (de) Wikipedia community pages. The project team reviewed German, French (fr) and Dutch (nl) community discussions that were concurrent with other language consultations and coordinated with local users to help understand those projects' unique challenges. On 14 April 2021, a talk page on Meta was formatted to display the key questions in the interface language of the user where available, and comments were invited in any language. Available translations included Arabic, Catalan, Chinese, Czech, Dutch, English, French, German, Indonesian, Japanese, Polish, Brazilian Portuguese, Russian, Spanish, Swedish, and Ukrainian. On 17 April 2021, invitations were sent to other English Wikimedia projects to participate in the central discussion, resulting in local engagement from users active on Wikiquote, Wikiversity, and Wikivoyage. On 28 April 2021 an invitation was placed on the German Wikipedia central discussion template, resulting in significant German-language engagement in the Meta discussion venue.

Participation

Participation levels at individual projects ranged from 1 to 48 users, and 247 participants commented during the on-wiki consultation period across 16 projects, including the Meta-Wiki centralized discussion page and concurrent local discussions. English Wikipedia and Meta-Wiki had the highest attendance among the discussions beginning in April, and there was a significant number of comments from German-speaking users. The concurrent French discussions from earlier in the year had the most participation overall. Other languages and Wikimedia projects had smaller yet still productive discussions, and the facilitators were thankful for the highly experienced users who commented in these consultations or attended round-tables, provided impactful feedback, and helped the team understand their communities' positions and sentiment on the code.

Total Count of Participation by each wiki. A bar graph shows the following values: Meta (27); English Wikipedia (41); Chinese Wikipedia (4); Russian Wikipedia (13); Russian Wikinews (5); English Wikiquote (1); English Wikiversity (2); English Wikivoyage (4); German Wikipedia (46); Polish Wikipedia (6); Polish Wikisource (2); Korean Wikipedia (3); Czech Wikipedia (3); Portuguese Wikipedia (1); French Wikipedia (48); Dutch Wikipedia (14).
English, French, and German Wikipedia had the most participants comment during phase 2 when including concurrent community discussions.
Demographics (Overall). A bar graph outlining the user rights of contributing users: Admin/Advanced rights (54); ArbCom (8); None (183)
54 local admins or advanced right holders and 8 arbitration committee members participated in the consultations.

Major themes

This is a high-level overview of the themes that emerged. There were both commonalities and divergences between participants and across projects within these areas. For reference, the original discussions are linked here.

Context sensitivity
Participants frequently pointed out that the code and its enforcement must remain sensitive to the appropriate context and account for local differences in culture and social norms. Several worried that the civility rules might be too harsh or open to weaponization, and most communities expressed concern that overly strict application of civility rules adopted from one culture to another may be divisive or exclusionary, or infeasible for linguistic reasons.
Useful reporting system
Participants generally indicated a need for a more useful reporting system, especially one that can handle interactions spanning years and thousands of edits, and that makes it easier to submit useful reports with the information required to address them effectively. Several English Wikipedia users discussed how and whether to provide a way to flag problematic posts, while a Russian Wikipedia user suggested a feature similar to the upcoming mentorship tools from the Growth team, so that new users can get quick help.
Routing reports
If there is a user reporting mechanism, it will probably send some reports to the Foundation: these would be the "bright-line" reports traditionally handled by Trust and Safety Operations. Meanwhile, most participants from communities with established local governance systems indicated a desire for issues traditionally handled by the community to be addressed locally, with Trust and Safety reserving action for the violations it must handle for legal reasons as the platform provider.
Appeals process
In the context of a system where reports could be sent to local volunteer moderators, or to Foundation staff, users may wish to request an appeal of actions taken or not taken. Many of the participants expressed concern that Foundation Office Actions may not align with local norms, or that local systems can be ineffective or inadequate.
Global concerns
Currently there is no process set out for conducting a project-wide review: several participants suggested a policy would be needed covering the form of these discussions, the timelines they operate under, what outcomes are permissible, and how they will be carried out. A Chinese Wikipedia user indicated support was needed to strengthen local policies and to put local functionaries in place, since there are currently no local CheckUsers following their protective removal by a global body. A user from Portuguese Wikipedia felt that any review of the project must be by local users and conducted from within, with a global body able to help mediate if needed.
Support for volunteers
There were a number of calls for additional support to be provided by the Foundation: additional training in identifying and addressing harassment, software development efforts (especially the resolution of bugs affecting workflows), and mental health support and resources for volunteers. It was also pointed out that peer support often happens only in an ad hoc fashion, so contributors can feel isolated.
Private reports
Participants expressed apprehension about reports being handled privately, especially with regard to giving the accused an understanding of the conduct considered problematic and an opportunity to respond to allegations. Meanwhile, the current pathways available to report problems can be unclear, especially to new users. Even established users can be reluctant to create public reports about inappropriate conduct by other established users who wield expert, referent, or connection power, especially if prior peer moderation efforts have already been unsuccessful.
Abuse response workflows
There have been requests for the Foundation to take legal action more often to curb undesired behaviour when combatting long-term abuse. There is also a need for UCoC enforcement not to negatively affect current on-wiki abuse response workflows, for example by adding additional burdens on volunteer responders to engage with those who may be trolling.
External influence or misconduct
Participants expressed that projects, their content, and their administrative bodies must be protected from capture or undue influence or control by external organizations, governments, or groups of political ideologues coordinating off-project. However, several participants pointed out that sanctioning for off-site behaviour was ill-advised or impossible. Users from several projects expressed concern or confusion as to the extent to which the UCoC sought to affect content.
Community approval/ratification
Participants have indicated a strong desire to have a community approval or ratification process before the new code section is implemented. Some discussions indicated local communities would only enforce sections of the code after they had been approved locally. These views were communicated in the ArbCom Open Letter, and were also expressed outside of the key questions: in general comments at the consultation, at UCoC venues, and in replies to the announcements/invitations.
Community revision process
Contributors have ongoing concerns about the currently-ratified policy text and a desire for a revision process to be built into the new code section, with a process for meaningful input from volunteer participants. Participants from most projects said the code is hard to translate or understand in parts and suggested that a glossary be provided. Several were disturbed by the lack of progress in addressing concerns previously raised.
Open Letter from Arbcoms to the Board of Trustees
Shortly before the launch of the consultations, 50 members of 7 "arbcoms" signed an open letter. Arbitration Committees are typically the highest volunteer user disciplinary body and dispute resolution body on a project. The letter asked for volunteers with certain qualifications to be included on the UCoC enforcement drafting committee and sought a formal process for ratifying the UCoC enforcement system, and an amendment process that ensures communities and individuals have a chance for meaningful input before amendments are adopted.

Individual projects

English Wikipedia

According to English Wikipedia, it "was founded on 15 January 2001 as Wikipedia's first edition and, as of April 2021, has the most articles of any edition, at 6,289,759."

The consultation was advertised at the village pumps, the administrators' noticeboard, the arbitration committee noticeboard, and the centralized discussion template, which provided a link to the consultation on over 5000 pages and highly-trafficked venues. 41 community members participated in the local discussion on English Wikipedia, including 17 users holding administrator or other advanced rights.

The project has an established local governance system with over 1000 administrators, including an Arbitration Committee that has a standing monthly call with Trust & Safety Operations. There are multiple ways to file private reports, including by private email to administrators, the Arbitration Committee, local CheckUsers & Oversighters, or Trust and Safety.

The project also has a mailing list known as functionaries-en that serves as a platform for general discussion among current Arbitration Committee members, advanced permission operators (CheckUser and/or Oversight), all former members of the list in good standing, and other editors with official Wikimedia Foundation status. As a result, the term "functionary" can be confusing, as it might refer to 1) a general class of users who administrate (as used in the UCoC); 2) arbitration committee members (as used globally, in a more restrictive sense); or 3) subscribers to the "functionaries-en" mailing list on English Wikipedia.

EnWiki is a highly-researched project.

Editors responded either by answering the key questions or by leaving general comments, and the participants were mostly experienced users.

English Wikipedia user input

Input has been roughly sorted into themes and condensed. Individual views expressed here may represent one or many: see original full-text of the consultation.

Context sensitivity
  • Mostly comes down to shared community values; a community that does not endorse harassment will self-moderate, though there were disagreements about what form that moderation should take; provide those self-moderators a guideline for interactions so problems are easier to identify; perhaps a separate noticeboard should be available
    • As a counterpoint, it's almost impossible to set out a full rulebook; enforcing a "no personal attacks" rule becomes an exercise in futility
  • It was pointed out that sometimes correct editorial or administrative action can feel like discrimination, failure to engage, or harassment and appropriate attention should be paid to context if a user files reports
  • Meanwhile certain editorial actions can be done with harassing intent yet be given the veil of editorial legitimacy and not addressed by the community
  • It was pointed out that intense editorial scrutiny can be traumatic to contributors and it is important to take the perspective of the other when acting
  • It was also mentioned that for certain traumatic subjects, just one or two participants behaving aggressively can disenfranchise contributors in those topic areas
  • Mediation can be useful; sometimes behaviour can be seen differently depending on the observer (what may be viewed as harassment, could be a frank discussion)
    • There is a suggestion for professional mediators to be available, and a note that the lack of any kind of "content board" for settling content disputes means content disputes will eventually result in conduct disputes as parties become more entrenched in their side of the content disagreement
  • We shouldn't sanction for making good-faith mistakes with pronouns / using singular they
  • Hate speech may be too vague (for example, someone trying to report a user who expressed conservative religious views as making hate speech)
  • Insults should not be taken to include pointing out that a contributor's language fluency is not adequate to contribute to the project, or that a contributor lacks sufficient understanding of the topic they are writing about or of the project scope
  • Hounding should not be taken to include the proper use of Special:Contributions to fix problems that a contributor may have widely created; this may upset the person, which cannot necessarily be helped if the edits are necessary for project integrity
  • The code must be implemented on a contextual basis, taking into account local differences as to economic development, culture, social norms and politics and the prevailing cultural and economic background of the average editor of that wiki
  • Civility rules that are being espoused by the UCoC are too harsh for enwiki, may cause more problems than solutions and will end up pushing people away
    • in a collaborative environment, this means sometimes you have heated arguments and ruffled feathers
    • concern that civility enforcement could actually result in less diverse contributorship from cultures that do not accord with strict views of civility
  • Erroneous reports: a way to ensure the reporting system can handle the volume of reports that will simply be new users not understanding how the normal editing process works; being undone is not harassment
Useful reporting system
  • A way to flag posts that are seen as harassing, a summary of reports could be provided in an anonymized form
    • A concern raised was that the flag capability would generate many reports without the capacity for investigation, especially given potential misuse or abuse of the feature
    • A concern is that a post-flagging system which only calls attention once a threshold is reached could reduce confidence if no action is taken on reports
  • Ask volunteers to rate samples of interactions over time to see if interactions were becoming more positive or negative; as well as a suggestion for ongoing surveys to determine community health
  • Context is important and hard to determine from surveys, and such data are hard to collect and evaluate accurately; for example, fewer reports could mean less confidence rather than less harassment
  • A reporting wizard that helps a user find where a report should go and what evidence would need to be provided; a tool that makes it easier for non-experienced editors to create a proper complaint with evidence (a minimal routing sketch follows this list)
  • Improve the ability for new editors not familiar with norms and reporting mechanisms to get help, e.g. a useful link from the contact us form where people might start
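A reporting wizard of the kind suggested above could be little more than a small decision table that maps the type of issue and the need for privacy onto an existing venue, together with an evidence checklist. The following Python sketch is purely illustrative: the categories, venue names, and evidence hints are assumptions made for this example, not an agreed or existing workflow.

from dataclasses import dataclass
from typing import List

@dataclass
class Route:
    venue: str            # where the report would likely be filed
    evidence: List[str]   # what the reporter should gather first

# Hypothetical mapping of (issue category, needs privacy?) to a suggested venue.
ROUTES = {
    ("vandalism", False): Route("local administrators' noticeboard",
                                ["diffs of the edits in question"]),
    ("harassment", False): Route("local conduct noticeboard or administrators",
                                 ["diffs showing the pattern", "a short timeline"]),
    ("harassment", True): Route("private email to the Arbitration Committee",
                                ["diffs or off-wiki evidence", "who is affected"]),
    ("threats", True): Route("Trust and Safety emergency contact",
                             ["the exact wording and location of the threat"]),
}

def route_report(category: str, needs_privacy: bool) -> Route:
    """Suggest a venue and an evidence checklist for a report."""
    return ROUTES.get(
        (category, needs_privacy),
        Route("a general help venue", ["a short description of the problem"]),
    )

if __name__ == "__main__":
    suggestion = route_report("harassment", needs_privacy=True)
    print("Suggested venue:", suggestion.venue)
    print("Please prepare:", "; ".join(suggestion.evidence))
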
Routing reports
  • The proper pathway to address most harassment is through the community: administrators, escalating to the arbitration committee; this usually works, especially by appeal to common values
  • Harassment may occur over hundreds of interactions, making it hard to identify and address; volunteer time and the appetite to conduct deep investigations like this is limited; accordingly community mechanisms – including arbitration – are often ineffective here; T&S noted as willing but unable to scale
  • Several users pointed out the Foundation has not had good success with private handling of conduct disputes, and expressed concern about a private peer-dependent ticketing system for user conduct reports
  • A committee similar to the Ombuds Commission could be tasked with receiving and triaging reports of abuse, forwarding them to groups on local projects or to T&S depending on the pathways available; a concern was that being triaged by this group into a pathway would prejudice the conclusion towards an action
  • A previous RfC held on the project (89 participants) was closed with the statement that "Editors strongly feel that en-wiki issues should be handled [locally by volunteer users], and only matters that affect the real world ([fear of retaliation] / [safety of reporters]) should be passed to T&S. A better/improved dialogue between ArbCom and the WMF is also desired, with the Foundation and T&S passing along en-wiki-specific information to ArbCom to handle."
Appeals process
  • Separation of administration and content; perhaps a new role is needed to handle behavioural issues, held by people as impartial and detached as possible – even paid employees with no interest in the content whatsoever
  • A comment that the T&S conduct warnings do not provide appropriate guidance or clarity
Global concerns
  • The criteria are unclear when trying to determine that a project is "struggling"; each project should be looked at individually and provided assistance strengthening their local governance as needed
  • The Ombuds Commission is already available for allegations against functionary groups
  • Communities should have a formal means of desysopping where there is no local body or the local body is inadequate, so there would need to be a way to make this determination
  • The Foundation should act through generally accepted mechanisms that already exist, though Stewards are reluctant to take action without strong consensus on Meta; no one is tasked with these problems, so they tend to persist and escalate over time
  • Entire-project reviews shouldn't go to a group but should remain a Meta discussion; however, a policy is needed covering the form of these discussions, the timelines they operate under, what outcomes are permissible, and how they will be carried out
  • Global or Regional ArbComs for communities without a local one: for a referral or opt-in from local communities (or if it is identified as "local body inadequate")
    • a global body could work with stewards and global sysops on smaller projects without established processes
    • consensus needs to be clear, while any opt-in process wouldn't address struggling projects as they won't opt-in
    • a global dispute resolution body would not usually have a reason or leave to operate at enwiki or other larger projects with ArbComs (leaving only the rare cases where improper actions were across several wikis and global in scope)
    • Global policy implementation is hard (e.g. global bot policy implementation), though it may be better than the status quo which hasn't been particularly effective either
Support for volunteers
  • Volunteers should be able to access mental health support and other resources through the Foundation and receive the same support available to staff who are feeling stressed or harassed
  • Contributors are generally "on their own" unless they have developed their own networks; if they come up against opponents with more time, determination, or associates, they will be outmatched
  • Since there is no structured support, people feel isolated or form small groups for informal support – which can lead to groupthink
  • The unofficial Discord server can sometimes be used as a support network
  • While there should be support available, it is difficult to create a 'universal platform' for support since communities have different needs and outlooks; it was also pointed out that decisions shouldn't be made "off-site" in support groups
  • There was a request for far greater support for the accumulated technical debt and generally enhancing administrative tools (especially at Foundation village pump: users concerned with the state of notifications on mobile and apps, noting that the editors lost due to the lack of ability to engage with them will affect contributorship)
  • Foundation-funded anti-harassment workshops and bystander intervention training for community members; creating a sizeable group of contributors who can intervene in situations, de-escalate, and properly respond to allegations of abuse would make the community more resilient in the long term, not only because there would be people able to handle issues properly, but because this group could further train other volunteers to create a self-sustaining cultural institution
  • Proactive reactivity: a response team that does not wait for a report before responding to on-wiki harassment
Private reports
  • Editors are less likely to report off-wiki harassment when they see it, especially when it targets other editors: a dedicated form for reporting off-wiki harassment (as well as other issues that need to be examined in private) would be more approachable than having to find the correct department to email
  • Serious harassment and threats of violence (as issues potentially requiring the involvement of law enforcement) can be dealt with by a central mechanism; projects should be better assisted to deal with specific requirements of "vulnerable people"
  • Some understood that reports for vulnerable people, those involving serious harassment, or threats of violence would be handled privately, but care should be taken to ensure those investigating these incidents are competent, diligent and empathetic people
    • put in place some sort of clear review mechanism – perhaps some kind of committee composed of community members and trained staff – so there's a sense of accountability
  • It is felt important that the person being sanctioned, and the community, can understand what act (harassment, etc.) led to the sanction. Disclosure of the act involved does not require disclosure of the reporter
  • Although there are cases that need to be handled privately, to handle cases privately without justification could result in a lack of due process and unsubstantiated conclusions
  • Not allowing private reports means there won't be many reports at all; generally a user will only report wrongdoing by power users when they are pushed to a breaking point; a mechanism for non-public complaints run by competent users with integrity
  • Demanding transparency does not amount to resistance to criticism, and any step that risks unfairness should be fought against firmly
    • The accused needs to know the evidence and, quite possibly, who the accuser is if not inherently clear, in order to ensure that adequate defences can be made
    • Exceptions should be limited to when there would appear to be an appreciable threat of off-wiki follow-up occurring
    • Private courts, as well as being problematic inherently, would also face the same issue that admins who work heavily in AE generally become more severe over time than the general community
Abuse response workflows
  • Most harassment is relatively straightforward and results in a block, a ban, disabling email and talk page access, range blocks, edit filters, and protection where indicated, and it is felt the Foundation should work with ISPs, legal, PR, tech, the ordinary admins who witness it, and really anyone else they need to, in order to get the [long-term abusive individuals] effectively legally and technically kicked off the site
    • (a) a system to proactively globally block open proxies & VPN endpoints, (b) a framework to request "Office contact" with ISPs whose subscribers commit serious, ongoing, intractable abuse on Wikimedia projects, and most importantly (c) a formal way for admins, stewards, and functionaries on the various projects to work with the WMF to address the issues of long-term, serious abuse
  • Integration with CUwiki? Allow non-CUs to help track abuse without seeing the IP addresses
  • Some way to avoid putting all the counter-abuse techniques out in the open
  • Read plainly, section 3.3 seems to require appropriate discussion or providing an explanation when reverting, blocking, and ignoring ("RBI", the standard response to trolls), which would be undesirable in responding to long-term abuse cases, people disruptively pushing a point-of-view, and most areas that are of interest to Arbcom
    • A belief that those that engage in these behaviours won't follow a code anyway
External misconduct or influence
  • Participants were generally unconvinced that the UCoC could or should try to address off-wiki behaviour. One raised the concern that it was unclear whether an editor could be sanctioned for an off-site UCoC violation in an unrelated matter, and how or whether the UCoC would affect off-platform Wikimedia-specific discussion spaces and their moderation teams
Other comments
  • So-called WP:UNBLOCKABLES were mentioned: power users said to be impervious to sanction or peer moderation who continue to act with impunity in an abusive fashion
  • A contributor's own race and ethnicity may be meaningful to them and should not be dismissed out of hand; the "meaningful distinctions" text should be omitted
  • Language in the harassment section should drop the words "in an effort", as they are subject to interpretation
  • The "Why" is unconvincing: a set of justifications focusing on ensuring Wikimedia covers content from diverse perspectives and maximizing social benefit for editors and readers would be much more convincing than the current text
    • draft does not explain how the CoC will help us accomplish the goals it lays out
    • It should prevent tacit rules from inadvertently protecting inappropriate behavior
  • Bullying is a concern, and scope should include "attacks and harassment"; 'no personal attacks' and 'no harassment' should be elevated to a fundamental principle
    • Contributors are expected to bear recurring personal attacks that don't stop after reasoned requests by seeking dispute resolution, suggesting that such situations can be resolved by working together and focusing on content

Russian Wikipedia

Russian Wikipedia was included in Phase 1 of the Universal Code of Conduct local language consultations (summary report). In March 2020 a dozen active Russian Wikipedia editors opposed the idea of a Universal Code of Conduct or its enforcement. A productive dialogue was opened later that year between the local Arbitration Committee and the Foundation.

The 2021 consultation on Russian Wikipedia saw participation from 13 users including 2 administrators and 2 arbitration committee members. The discussion was organized by a trusted local user who also condensed and translated the community input to aid with facilitation. This is an overview of the points raised; the views expressed here may represent one or many: see point-by-point translations below or original full-text of the consultation.

Project users have ongoing concerns with the policy text and its translatability into Russian: "It is difficult to translate the code word for word. Some words from English in another language are neologisms. Existing words are not suitable, for example, the word harassment requires the disclosure of the meaning to the reader. In translations, we need a place for maneuvers, synchronization of the meaning of a word with everyone, so that we all understand one thing, not different." A glossary was suggested. (Context sensitivity)

Users mentioned a need for very simple reporting mechanisms. They have argued that a person who needs help usually wants to receive it as fast as possible and a simple system could minimize the time needed to report an incident. Users have also said that a simpler mechanism could make it easier to contact administrators if their help is needed. (Useful reporting system)

When it comes to solving problems, one prominent member of the community mentioned that the community requires neutral users, who did not engage in a conflict, to resolve it. They are not sure whether it could be any users or if a special group (like "mediators") should be created. Users mentioned that "interest-groups" (for example Wikiprojects) usually work fine when it comes to resolving disputes between their own members. (Routing reports)

Users have said that it should be possible to report incidents privately involving harassment. In their opinion, it would help with describing the situation without antagonizing the community and the possible opponent. (Private reporting)

When it comes to reacting to unwanted behaviours, one administrator mentioned that any punishments should be tiered and progressive. Any blocks should start at a low level and escalate from there. (Appeals process)

Users have mentioned a need for everyone to be treated equally: "Situations must be considered regardless of who said and in response to what. We are not in kindergarten, there are acceptable ways to deal with attacks. There should be a way to assess a phrase for violations without the evaluator asking WHO and against WHOM whether he likes the breaker or doesn’t like the one against whom the rules are broken." It was also noted that people who exploit the rules and provoke others should be excluded from the community.

Users suggested that "private" channels (like rooms, mailing lists, social media groups and private meetings) are a difficult area to create any rules for. They say that people should be allowed to join such groups, but that it should be discouraged and people should be advised to join official communication channels. The same user also mentioned that it is unjustified to apply sanctions on the Wikimedia projects for behaviour outside of them, and that sanctions on those platforms (where the behaviours took place) should be enough. (External misconduct or influence)

It was argued that any process should be transparent and visible to the members of the community. Some portion of the material could be hidden to provide for the safety of the accused/victim and the people assessing the material, but the judgement and the judges' opinion should be published. (Private reporting / Appeals process)

Users mentioned that any enforcement rules should be made as non-anglocentric as possible. It was also suggested that the people who undertake the process of enforcement should be familiar with the culture and the rules of the community they are engaging with.

Russian Wikipedia user input

A trusted local user translated the answers to the key questions.


This is a short briefing; for a complete understanding of the ideas, it is better to look at the detailed answers in our survey: original full-text of the consultation.

  1. Community support
      • Polls. Sometimes announce a 5-20 question survey. (the foundation has already done this several times Research:Characterizing Wikipedia Reader Behaviour/Demographics and Wikipedia use cases)
      • Button. In the beginning, the user can try to get help via a link/button, and a counter can be made from the event of clicks. It can be difficult to fill out and send a message, so the fact of the first click is important. (you can further compare how many people entered the entry point (button) and how many left / active after a few days)
      • Mini-administrator with peace enforcement tools. It all comes down to requests to participants with administrator rights. If someone wants to communicate neutrally and smooth out social conflicts, he can have his own new group of rights without influencing the versions of the articles.
      • Large pages for reviewing places on a wiki, suggesting steps before requesting, personal message (sometimes it is emotionally easier / safer to write to one person in person than in public places)
      • Contacting Wikipedia and the administrators is scary for new contributors. The participant can act closely – on his own talk page, place the template "I need help" (:Russian "get help" template). It would be helpful for experienced participants willing to respond to be notified quickly immediately after using the template, rather than checking the category occasionally.
  2. Reporting pathways
      • Interactive guide. You choose the reasons, you get the solution (there is a New Articles Wizard and an Image Upload Wizard)
      • ORES. Just knowing that an edit can be recognized by machine learning as bad will help; people will try to avoid this (participants can try to train the AI to determine the toxicity of a message). A rough sketch of querying ORES follows this list.
      • Gradual freezing of the environment. Partial blocking, ban on topic articles, temporary blocking (many participants consider blocking to be something bad, so much so that they even ask to hide it from the logs. Perhaps a mechanism for freezing rights without the old meaning of blocking is needed)
      • ruwikinews faced the threat of violence (iii), they probably have some experience with this
      • Soft and hard pressure and clearly marked boundaries of the permissible.
      • Gradual progressive increase in subsequent punishments. Errors made for the first time can be lightly punished, but it is necessary to bring and acquaint with the norms and probable real punishments. (not very serious in the beginning. Any first block is emotionally serious)
      • Automatic ways of applying restrictions from one account to new accounts created by a participant
      • Communication under the supervision of a third party. The text is interpreted by everyone in different ways and in their own favor. Abstract game situations that can be associated with your case.
  3. Managing reports
      • There should be a simple way to assess toxicity/malice in an action/statement, without referring to the chronology of events and rules, without imagining a psychological portrait of the individual. If the participant is persistently toxic, he should be limited in metapedism
      • Contribution intersection tool https://interaction-timeline.toolforge.org/ . This tool needs a configurable maximum period between edits by one participant and a second participant on the same page/discussion
      • Warnings if new actions of a participant in a topic area/article where a restriction was issued to him (setting filters for a participant)
      • Automatic notifications to administrators about the activation of the situation control filters. If the edits appeared in a prohibited article or a short time after the victim
      • Buy scientific research. (there are specialists with an education in the field of Conflictology, pay them for a scientific article about the processes in the wiki society from their point of view)
      • Public support for opinions. For example, a button template showing the number of people agreeing with a message from one person
      • More supporting initiatives, tools, approaches from the system (extensions to use or add to the wiki)
      • Assessed by the severity of the impact if the harm came back and happened inside the wiki
      • The participant provides links to several cases of investigations
        • Probably, you can specifically write that coordinating attacks on wikis in closed chats is highly undesirable (complicated by the fact that a volunteer whistleblower compromises personal data of his own and others)
        • Systems are acceptable, the log of which can be obtained, but warn that their moderation does not depend on us
        • It is unjustified to punish locally for what was done in another project
  4. Handling reports
      • The user could see the progress of the request through stages, comments (confidentially) and the result at each stage, and not just send a request, wait, and receive a prepared response
      • Rechecks should have access to all the data of the previous process of finding solutions. In the new process, there must be a preliminary decision, which is executed after a short time, if there were no objections (so as not to provoke another appeal cycle)
      • It should look like a common, complex procedure being carried out. If considered narrowly within the framework of one rule, then participants will use this and repeat it with other rules or deliberately avoid mentioning other rules.
      • Protect as necessary, but if there is something that you consider the abuse of the request system, of course, you should notify the local functionaries (checkers, etc.)
      • Access to read them should be controlled and should retain a reference to the initiating request/access motivation. If there were cross-wiki violations, will the data be accessed by local CheckUsers, or will the global authority look at the data on demand?
  5. Global questions
      • Provide a way/form in which participants can express their opinion, by which the global body will see support for the opinion, and itself will remove the participant from problematic instruments. (in practice, it is more difficult for the participants of one wiki to take away a group of rights than to give it)
      • If ordinary participants receive pressure, you should not be afraid to leave the wiki without groups of rights, the wiki will survive with external control (stopping/freezing access to tools is not such a big problem, we can always go through the distribution of rights again)
      • Process automation. Opinions are usually collected slowly. (any means of simplifying the collection and posting of comments are already useful)
      • Separate chat system. There is an elected group of participants that considers a complex request and prepares one solution. They can be given an extension to work on one page with the text of the resolution, in which they can open and close discussion threads in a simple chat and link the thread to a place in the text. (like google docs, but with a visible action log for the wiki)
      • First, ask the communities to check the rules for compatibility because there may be differences
      • Community requests the examination of the draft of the new/old rules for compliance with the Code and the global authority may evaluate ok or not
      • Inviting a member from another community to look at a problem can provide a fresh perspective and illuminate an unknown solution
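
The ORES suggestion under item 2 refers to the Wikimedia machine-learning scoring service. As a rough, non-authoritative illustration of the kind of signal it can provide, the Python sketch below asks ORES how likely a single revision is to be damaging; the wiki code and revision ID are placeholders, and the response is read defensively because the exact payload layout (and the service's long-term availability) should be checked against current documentation.

import requests

ORES_URL = "https://ores.wikimedia.org/v3/scores/{wiki}/"

def damaging_probability(wiki, rev_id):
    """Return the ORES 'damaging' probability for one revision, or None if unavailable."""
    response = requests.get(
        ORES_URL.format(wiki=wiki),
        params={"models": "damaging", "revids": rev_id},
        timeout=10,
    )
    response.raise_for_status()
    data = response.json()
    # Navigate the nested response defensively; fields may be absent on errors.
    score = (
        data.get(wiki, {})
            .get("scores", {})
            .get(str(rev_id), {})
            .get("damaging", {})
            .get("score", {})
    )
    return score.get("probability", {}).get("true")

if __name__ == "__main__":
    probability = damaging_probability("ruwiki", 117000000)  # placeholder revision ID
    if probability is None:
        print("No score available for this revision.")
    else:
        print(f"Estimated probability that the edit is damaging: {probability:.2f}")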

Questions

  • It is difficult to translate the code word for word. Some words from English in another language are neologisms. Existing words are not suitable, for example, the word harassment requires the disclosure of the meaning to the reader. In translations, we need a place for maneuvers, synchronization of the meaning of a word with everyone, so that we all understand one thing, not different. For example, at the beginning of the contract terms are established (e.g. "referred to" in the [1]), or in scientific articles, there is a section on abbreviations and definitions. General comments
  • Some automated tools, such as protection, stabilization via Extension:FlaggedRevs, and rollbacks, make it difficult for volunteers to contribute, so maybe a global body could look at cases where these tools are used forcefully.

Russian Wikinews

A total of five users (including two local administrators) participated in the discussion, providing links to Russian community UCoC Phase 1 discussion, Russian UCoC Policy text discussion, Russian UCoC Draft review discussion as well as the UCoC 2021 consultation on Russian Wikipedia.

The organizing user summarized as follows: "The Russian Wikinews community was not interested in this discussion. Perhaps because if something ain't broke, don't fix it. Users note that the problems raised in UCoC are not relevant to wiki projects now, and problems relevant to projects are not raised there. The Russian-speaking community has expressed this opinion many times (see links above) and regrets that opinion was ignored".

Individual views expressed here may represent one or many: see original full-text of the consultation.

  • There seems to be a dispute regarding the translation of the word "harassment" in the Russian UCoC policy texts, and concerns that the current wording may be used in bad faith by vandals to prevent proper sanctions
  • It was also highlighted that all possible problems seem to have been reduced to "harassment" cases, while the Russian wiki/community faces some major problems which are very different from what the UCoC is addressing
  • More than one user thinks of the Wikimedia Foundation as authoritarian, comparing the WMF to the Russian government, in the sense that they think they don’t actually have a say in this process, even though the Foundation would have them believe that they do
  • A user thinks the Foundation should just go ahead and implement the UCoC policy, instead of going through the difficult path of getting a global consensus on the subject

Chinese Wikipedia

Four participants contributed to the Chinese Wikipedia consultation including a local user who helped to translate the comments made.

Individual views expressed here may represent one or many: see original full-text of the consultation

  • (On handling reports) The Foundation should consider whether the measures would hinder or contribute nothing to anti-vandalism work. Otherwise, it may end up with the same result as the IP masking proposal
  • (On managing reports) Reports obviously violating the current policies may be rejected either manually or automatically with technical measures
  • (Relationship between local and global enforcement body) If a local enforcement mechanism doesn't exist, or exists but cannot function properly (i.e. the local agency fails), a global body should "handle the cases (issues) decisively and as soon as possible with appropriate (and strong enough) measures"
  • (Question 1 of Global Questions) Meta [global body] should improve the related policies, in order to help the local community build a local policy. The alternative to this is to let the global administrators and Stewards handle the cases instead of the local community
  • Comments:
    • It is important to note that the Foundation removed all local checkusers from Chinese Wikipedia due to security concerns
    • In the context of this project, the ability for participants to contribute content neutrally and uphold founding principles should be reinforced; access is restricted from some regions
    • On the level of engagement, it was suggested that the local community generally doesn't pay attention to global affairs, even resisting the Foundation and global efforts or discriminating against those users who do not agree with them: there are ideological differences

Czech Wikipedia

The facilitation team coordinated with the executive director of the local affiliate who provided translations of the key questions and invited local user input. A local arbitration committee member and the affiliate representative pointed out that they had provided input in previous outreach already. A question was raised whether additional support would be needed by the arbitration committee should there be a possible increase in complaints, or how they would envision support from a global structure. Only one other user commented at the thread.

Portuguese Wikipedia

The discussion at Portuguese Wikipedia was a later entry to the consultation, starting on 6 May. The discussion was advertised centrally and several established users were invited to comment. So far, only the organizing community member has participated in the discussion. They provided their thoughts in English on Meta as well:

"For Global Issues: In my opinion, no decision of any kind, including decisions to withdraw verification tools, should occur apart from the home communities. This kind of decision creates a huge schism between the home communities and the global community (Meta), which appears to the home communities as imposing a will from above, a will, moreover, completely foreign to the home communities. Not that there cannot be external auditing bodies, there must be, but audits must inevitably work with the home communities. For cases that require secrecy, let them work with those in the home communities who are able to work with this information. On dispute resolutions, they should follow what was said before in general: they should first of all contact the home community and work together with them, never, ever, separately."

Other English Wikimedia projects

There were 7 volunteers who engaged following outreach to other English Wikimedia projects. Community members either commented on the invitation directly or expressed opinions on the user talk page of a facilitator.

English Wikivoyage

Following an invitation to a local community venue, 4 volunteers engaged in a productive discussion summarized by the team below. Individual views expressed here may represent one or many: see original full-text of the discussion.

The input from Wikivoyage is that it is a smaller, developing community, with local enforcement mechanisms that are functioning adequately. Users mentioned that any problems were generally from cross-wiki disruption rather than abuse originating locally.

  • Harassment is not tolerated on Wikivoyage, so harassment/vandalism/trolling is not an overwhelming issue
  • The majority of the harassment on Wikivoyage is from vandals, trolls, or long-term abusive sockpuppet accounts mostly spilling over from other projects ("cross-wiki abuse")
  • Participants generally found reporting harassment to local administrators was a satisfactory enforcement pathway
  • Sometimes, disagreements can be perceived as harassment if there is a misunderstanding
  • In the context of administrator accountability, participants generally felt administrator reconfirmations would not be a good use of community time, as local administrators have community support

English Wikiquote

One user posted to a facilitator's user talk page, expressing that they found the global policy difficult to translate given the ambiguity or lack of clarity of some of the language used (referencing an example on the talk page of the policy text).

English Wikiversity

Input provided at Wikiversity did not relate directly to that project: a user expressed their view on global bans and the role held by Stewards in enforcement pathways, while another user responded to rebut some of the views expressed.

Meta

Participants from projects without local discussions were invited to provide input in any language at a talk page formatted to display questions in the available languages based on the user's language setting: 27 users participated in this central discussion on Meta, and more than 75% of those responding had significant participation in more than one project, with a third holding advanced user rights on at least one project.

Meta user input (English)

The facilitation team summarized the English-language comments. Individual views expressed here may represent one or many: see original full-text of the consultation.

Community Support
  • A concern was raised on measuring anti-harassment efforts, because the methodology can be easily biased.
  • A user pointed out that users should never be sanctioned until they have been given an adequate opportunity to respond. An example was cited of how ANI-style processes on many projects are easily misused, especially where a group of editors targets a single editor and the process degenerates into mob justice.
  • It was suggested to set up a central committee for each language, consisting partly of community members and partly of Wikimedia Foundation staff, to handle allegations.
  • There is support for public proceedings, but with a clear step-by-step process for how proceedings should go.
  • A point was raised about harassment that may be caused not by interaction with other editors but indirectly by article content. A feedback structure for highlighting divisive and problematic content was suggested.
  • Truly neutral and unbiased editors should be given the authority to judge in harassment cases, especially when an editor is being silenced by arbitrary blocking or sanctions.
Reporting Pathways
  • Having a link in the left column to report misconduct/harassment. This may cause a flood of complaints. However, if enough information were captured in the complaint, a dashboard could be created to act on the data in aggregate.
  • A user suggested involving psychologists in the resolution process.
  • Points were raised on how difficult it currently is to resolve complicated issues, and how most victims eventually resort to just leaving the project/movement.
  • One idea is to implement tools to detect harassment or divisive content before it has been submitted. During the editing process, a tool could be used to give feedback on the author’s language. This might provide an early indication to the contributor that their approach may be inflammatory, prompting them to change their ways before actually publishing. (A toy sketch of this idea follows this list.)
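As a toy illustration of the pre-publish feedback idea in the last bullet, the Python sketch below checks a draft comment against a small, hard-coded pattern list and returns gentle suggestions. The patterns and wording are invented for this example; a real tool would need a vetted, language-aware model rather than a word list.

import re

# Invented example patterns only; a deployed tool would use a trained,
# language-aware model instead of a hard-coded list.
FLAGGED_PATTERNS = {
    r"\bidiot(s)?\b": "Name-calling tends to escalate disputes.",
    r"\bshut up\b": "Consider rephrasing this as a request to stay on topic.",
    r"\byou people\b": "Generalising about a group can read as dismissive.",
}

def review_draft(text):
    """Return feedback messages for any flagged patterns found in a draft."""
    feedback = []
    for pattern, advice in FLAGGED_PATTERNS.items():
        if re.search(pattern, text, flags=re.IGNORECASE):
            feedback.append(advice)
    return feedback

if __name__ == "__main__":
    draft = "Shut up, you people clearly never read the sources."
    for note in review_draft(draft):
        print("Before publishing:", note)
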
Managing Reports
  • A user suggested that rather than considering the reporting structure as a ticketing system where each inquiry must be individually addressed, it could instead be used by an admin to detect high-value areas of correction / improvement.
  • We can reach out to nonprofits that support diversity and inclusion programs within corporations, who may be willing to support with important training materials.
  • Every content-related issue should be addressed by a different set of admins and editors, with zero interest in the problem area, in order to avoid “veto” and domination attempts by a biased group of editors.
  • There were major calls for transparency and accountability, with references to several occasions cited when the WMF had banned users and there were absolutely no ways to find out what their offences were or if the bans were justified.
  • It was also argued that sanctions should always be appealable.
Handling smaller projects struggling with neutrality and conduct issues
  • One solution suggested, especially for projects that have very few administrators, was to have "guest administrators" with decent command of the languages. However, another user highlighted that such admins might not be really effective due to the language barrier and might also be tempted to use the processes they are used to, rather than to check for project-specific guidelines.
  • Another point made was to ensure true inclusivity by not imposing western values and perspectives on projects. It is important to learn and understand the local culture, as conduct and neutrality issues can only be truly addressed if you are capable of fully understanding the situation.

Meta user input (German)

After being advertised via central notice on German Wikipedia, a significant number of German-language responses were offered at the Meta talk page.

One user asked where UCoC reports would go (i.e. a newly created body, the Foundation, a higher appeals body, etc.) and whether the reporting user could choose which body would hear the complaint, expressing that reports should go to native speakers, as there can be subtleties not detectable by non-native speakers which could lead to unfair results. This user also pointed out that if every local decision could be appealed, stable and fair verdicts would be difficult to achieve.

The facilitation team was thankful for a multilingual user who provided the below summary. Individual views expressed here may represent one or many: see original full-text of the consultation.

  • "Nur fürs Protokoll" – Just for protocol, the user thinks WMF dismisses external opinions if they contradict the opinion of WMF. Examples: Fram, Superprotect, Renaming. There is a wish for seeking support by the WMF not only consultation (10 users supported the statement as well).
  • (Global dispute resolution) Users discuss the somehow broken relation between WMF and some communities and are in fear of repeating problems (super-protect, Janneman, Fram). These incidents are named hostile interference.

The users also presented the following points:

  • The UCoC feels forced upon the communities
  • The possibility of rejecting it is missing
  • The WMF should take care of the servers and act only in case of legal problems
  • m:Global arbitration committee might be a good idea, but should evolve bottom-up from the communities, without any influence of the WMF (and thus trusted by the WMF and the communities)
  • some other/higher committee than the arbcoms might be a problem

Concurrent discussions

Some communities had discussed the code at the same time as the other local language consultations. The facilitation team coordinated with users familiar with these projects to help understand the existing community positions.

Dutch Wikipedia

The facilitation team summarized the Dutch Wikipedia community discussion started on 30 January 2021, which involved 14 community members, including 3 administrators and 1 arbitration committee member.

Individual views expressed here may represent one or many: see original full-text of the discussion.

  • Several participants saw the code as interfering with the community, or had issues with the way it was written, or the concept of a Foundation-ratified policy about user behaviour
  • The code should create a positive environment to fulfill the mission of gathering and sharing the sum of all human knowledge; so as many people as possible are able to actively participate in Wikimedia projects and spaces
  • While empathy is important, a requirement to "Practice empathy" is impracticable to enforce
  • It is important for established projects to consider other, younger Wikipedias that may not have all the practical guidelines in place yet.
  • More clarity is needed about who is represented by "We"
  • There are concerns about less autonomy for the Arbitration Committee and each individual community; it does not necessarily follow that if the bad behavior is filtered out, the project will automatically get better
  • Concerns about who will handle the information, including confidentiality (currently only CheckUsers, Oversighters, and Stewards sign the non-public information policy agreement)
  • It was noted that there are users who test the limits of behaviour, despite warnings, and there is no adequate deterrent
  • Confidentiality of reporters is not practical, because explaining the misconduct to the actor usually gives an indication of the target (who will be assumed to be the reporter)
  • While community guidelines are basically in accord with the UCoC, there may be a lack of clarity on the definition of misconduct

French Wikipedia

The facilitation team followed the French Wikipedia discussion that started on 13 February 2021. The discussion page attracted over 700 edits and ran concurrently with the local language consultations. It was the largest and best-attended volunteer-organized discussion, with comments from 48 participants, 8 of whom held administrator or other advanced rights, including 4 arbitration committee members who had signed the open letter.

A clever collaborative effort was undertaken by participants here: the translation of the code was copied to the local project and wikilinks were added to each expectation. This showed where the code was already enforced locally and highlighted, in the form of "red links", where no local policy existed to cover the global policy.

The team coordinated with trusted local users who explained the community's current position and sentiment about the code. Individual views expressed here may represent one or many: see original full-text of the discussion and discussion archive.

An exhaustive comparison of the current rules and recommendations of fr-WP and the UCoC has been carried out. Eleven UCoC requirements are not explicitly reflected in the local rules. Among them, some are perceived as too obvious to need detailing (for example: no sexual harassment).

To the fr-WP community, there are three major problems. The first, which rallies the strongest opposition, is the requirement to "Respect the way contributors call and describe themselves. Some may use specific terms to describe themselves. Out of respect, use these terms when communicating with or about them, when linguistically and technically possible", because classical French has no pronouns or established rules for referring to non-binary people. As it stands, it is likely that this provision of the UCoC would be rejected by a community vote.

The second problem is that the definition of harassment is perceived as too broad ("behavior likely to upset a person") and could open the door to impinging on the editorial content of the encyclopedia.

"The use of symbols, images, categories, tags or other types of content that are intimidating or harmful to others outside of an encyclopedic and informational context of use. This includes putting in place rules on content intended to marginalize or ostracize" is not understood, and (so) perceived as a risk to limit the editorial content of the articles in the encyclopedia.

The fr-WP sysops and its arbitration committee are chosen by the fr-WP community to enforce the rules and recommendations that the community has approved. They have no mandate to sanction specific UCoC requirements. If the WMF wants local officials to enforce the UCoC, it must be approved beforehand by the fr-WP community.

The community suggested that a vote should be organized to accept or reject each of the 11 gaps between the UCoC and the local rules (rather than a single vote on the whole text), and that enough time should be left for discussions.

German Wikipedia

The facilitation team summarized the German Wikipedia community discussion started 1 February 2021, which saw comments from 45 participants.

Individual views expressed here may represent one or many: see original full-text of the discussion.

Can UCoC help address personal attacks?

  • For larger projects, several community members said it would not be a major improvement over the local behavioural policy, feeling that sufficient rules are already in place
  • Some community members pointed out that it could be helpful for smaller wikis
  • The idea of global ground rules for interpersonal behavior, a kind of basic consensus for all projects, is quite sensible even if redundant with existing local guidelines
  • The fact that contributors work together successfully is largely due to existing fairly broad rules that capture broad consensus on generally accepted etiquette
  • The UCoC may possibly help; diversity through universality is important: a commitment to Wikipedia as a place where diverse opinions, backgrounds, interests, and genders can find space and feel welcome, in order to obtain a broad, objective presentation of article content; the project can only survive if it strives towards diversity and creates the conditions that make this possible
  • One user found the global policy helpful, pointing out that there is nothing in the local policy related to referencing another contributor's gender or disability

Is the proposed course of action legitimate?

  • Established projects can usually solve their problems themselves without being mothered by the Foundation; the CoC is more for small projects whose communities can't handle the problems
  • It was acknowledged that a problem with personal attacks was communicated by the community, but whether the solution is appropriate should be decided by the community itself, not by a legal department, the Board of Trustees, or a strategy group
  • The UCoC could only be enforced by people who understand the language: if the WMF wanted to act "against the will of the community", it would first of all have to employ people with appropriate language knowledge
  • Some small projects do not have an actionable "community" at all: the smallest projects sometimes have only a handful of users, and if one of them is the source of the problem (especially an administrator or bureaucrat, as on Croatian Wikipedia), the others may be quite helpless
  • The language barrier will make it difficult to enforce all cases on smaller projects
  • It will be hard for the Foundation to assess discussions in other languages, especially considering the availability and limits of machine translation, to determine whether the complaints raised have any merit at all
  • Several users were clearly opposed to the Foundation intervening in the enforcement process, referencing concerns about the "Fram" case on English Wikipedia

Can the Universal Code of Conduct be accepted in this way?

  • Opinion was about evenly divided; opposition centred on concerns about the "top-down" approach from the Foundation

Additional material from LLC projects

Polish and Korean were primarily covered in the 2021 local language consultation. Additional information for consideration is below.

Polish Wikipedia

The participants have strong beliefs that issues should be handled locally; see the local language summary for additional details. Functionaries and older members of the community are apprehensive of intervention from the global community or the Wikimedia Foundation, which would not be welcomed. That said, the community is prepared to change its regulations and rules to fit the global requirements.

Polish Wikipedia has a local Arbitration Committee; however, the members have not signed non-disclosure agreements, possibly due to a belief that disclosure of identity is required.

  • Some users mentioned their disapproval of the policy allowing functionaries and on-wiki enforcement systems to take into consideration actions users have taken off-wiki
    • Connecting on-wiki and off-wiki accounts or behaviour is problematic; a wiki account should not be blocked/locked/banned unless there is certainty about the connection
    • Users mentioned that informal groups usually have looser rules and fewer regulations than formal groups; the UCoC should differentiate between these spaces
    • All users agreed that any meetings, groups, or trips should have clearly defined rules about whether the UCoC applies to their behaviour and what actions can be undertaken based on it

Korean Wikipedia

The Korean Wikipedia community expressed that, in general, there is no problem with handling cases locally (by local administrators); see the local language summary for additional details. However, this should not be the only kind of reporting body, since many wikis like Korean Wikipedia struggle to handle violations by administrators and other critical cases.

  • On Korean Wikipedia, administrators tend to be very reluctant to get involved in complex matters. There were concerns that some behaviours cannot be stopped at the volunteer level: even when an administrator takes administrative action against a perpetrator, the perpetrator often continues the inappropriate behavior, so administrators cannot handle every case that violates the UCoC.
    • This is why, during the earlier local language consultation (LLC), there was consensus that Trust & Safety needs to be involved in serious behavioural violations.
  • Running any arbitration or UCoC enforcement committee on Korean Wikipedia has proven very ineffective.
    • The Arbitration Committee on Korean Wikipedia operated from December 2011 to March 2017, when it was suspended due to ineffective operation and a lack of volunteers.
    • While the ArbCom was in place, it did not bring any positive effect to the community; some cases only brought controversy.
    • Lack of people willing to volunteer: few people were willing to serve while the committee was running. Organizing a committee takes a lot of time and effort, and ultimately the lack of volunteers was the main reason the arbitration committee system was suspended.

Judging from the case of the Korean Wikipedia ArbCom, Korean Wikipedia and many other medium- or small-sized local communities will have difficulty organizing their own committees.

  • A question was raised about how the UCoC can be implemented at this point, while no global enforcement pathways have been approved by the Board. After an exchange of opinions on KakaoTalk (a messaging platform), the local language consultation team discussed how to answer and replied that "It’s solidly on local judgement until an enforcement proposal created by the drafting committee is approved by the Board".

Next steps

The Universal Code of Conduct Phase 2 drafting committee has started work on designing enforcement guidelines for a comprehensive community review, scheduled in the project timeline for July to September 2021. The project team continues to seek thoughts and ideas from the communities through open round-table discussions and other ongoing outreach.