Time vs Money within the Wikimedia movement


By Aaron Shaw (US), Celio Costa (Brazil), Stanislav Kozlovskiy, Balaji, Lionel Scheepmans (Belgium)

In the frame of CivilServant's Wikimedia studies/Summit Stockholm 2019.

[Figures: active editors on English and French Wikipedia over time; editor cohort longevity matrix (en); Wikimedia Foundation financial development (multilanguage); Wikimedia Foundation expense evolution by category in US dollars.]

Contextual introduction and frame

Starting questions

  • Do fundraising donation campaigns on Wikimedia projects reduce contributions of edits and time to the projects?
  • Could a time-donation campaign (with no monetary component) improve participation (editing) on Wikimedia projects?

(There may be a tradeoff between the two forms of donation.)

Goal

What is a goal or challenge your community cares about?

1. This experiment aims to increase participation and retention of newcomers.

2. We also aim to have a better understanding of how to motivate people to contribute to the projects.

3. Involve communities of experienced editors in a positive, welcoming process that includes and accompanies newcomers on Wikimedia projects.

Theory of Change

What forces might influence in the direction toward your goal?

Asking people to edit rather than give money may mobilize them to participate: in volunteer settings, people who donate money may reduce, or never begin, their volunteer activity (Ouedghiri, 2010, p. 38).

What forces might influence in a direction away from your goal?

Project designs and communities may not be ready to welcome donations of time under good conditions; frustrated time-donors may not want to repeat the experience.

Intervention

What intervention (tool, strategy, innovation) might help reach that goal?

We propose to manipulate a fundraising campaign banner to solicit donations of time [and|or] money, with potential recruitment via social media campaigns (e.g., a “#Time4wiki” hashtag).

1. Sitenotice banner

2. Page like https://meta.wikimedia.org/wiki/Wiki4Women

Special sitenotice banner campaign in some language editions (or to some individuals) in which we replace requests for monetary donations with requests for donations of time & participation. Explicitly focus on asking people to donate time.

The manipulated banner could link to a “welcome” page like Wiki4Women that can route new volunteers to simple tasks suitable for newcomers.

3. Hashtag #time4wiki — social media campaign to mobilize contributions.

We would need to pilot-test specific banner designs to determine the exact messaging and design features for the eventual test, but the banners must not differ too much from the usual ones, so as not to complicate the comparison with the control group.

What control might you compare the intervention to?

Other language editions or individuals would have the “normal” fundraising campaign banners.

Unit of Assignment

What might be the unit of random assignment? (remember that a larger sample is better)

A language edition of WP? Individuals within a language edition? Regional IP-block of individuals within a language edition?

Measure

What is one way to measure and compare between the intervention and the control if the intervention helps achieve that goal?

Outcome (dependent) variables we might care about:

For new and anonymous participants:

  • Number of new accounts
  • “Survival” of new contributors recruited during each campaign
  • Dollars donated
  • Edit session length
  • Edit quality

For experienced editors:

  • Talk interactions/tone with newcomers
  • Edits
  • Survival
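The “survival” measures above would need an operational definition. A minimal sketch of one possible definition, in Python (the one-week/60-day thresholds and the function name are illustrative assumptions, not part of the proposal):

```python
from datetime import datetime, timedelta

def surviving(first_edit: datetime, edit_times: list,
              window_days: int = 60) -> bool:
    """Illustrative retention metric: a contributor 'survives' if they
    edit again more than one week after their first edit, but within
    the observation window (default 60 days)."""
    start = first_edit + timedelta(days=7)
    end = first_edit + timedelta(days=window_days)
    return any(start <= t <= end for t in edit_times)

# Example: a newcomer who returns on day 10 counts as surviving;
# one who only edits on day 0 does not.
first = datetime(2019, 8, 1)
print(surviving(first, [first, first + timedelta(days=10)]))  # True
print(surviving(first, [first]))                              # False
```

The thresholds would have to be chosen (and pre-registered) before the experiment; different communities may warrant different windows.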

Potential Risks

Risks to individuals

If a “welcome center” is not well designed, newcomers may become lost (or worse) while trying to figure out how to help edit. If we fail to give them a welcoming, easy, smooth editing experience, they may never come back.

Risks to the community

An overwhelming influx of inexperienced newcomers or vandals who need more help than anyone can provide.

Lost money from donations that don’t happen.

Community members complaining about the use of the site notice. Some projects/communities may have very specific/restrictive rules about how it can be used and may reject this experiment design. Previous campaigns have enabled logged in users to adjust a setting so as not to see any campaign banners.

What might you do to reduce or manage those risks?

  • Run the campaigns at different times to make sure fundraising goals are not undermined.
  • Show different banners to logged-in users and not logged-in users.
  • Prepare the welcome-page design carefully.

Community Consent

What is the best way to seek community review & consent in your community?

Outreach through individual liaisons via village pumps, mailing lists, Facebook, and Telegram groups; some communities prefer one or several of these channels over others. Eventually, an announcement (and ideally consent and agreement) needs to happen on the wiki (probably at the village pump or on a separate page).

During the campaign there should be a page or link to a way to submit complaints/comments.

Gotchas

What other challenges or obstacles (technical, community, etc) would need to be managed and resolved for this study to happen?

If we randomly assign different banners to individuals within the same project, it would be hard to guarantee that the same person never sees both banners, unless assignment is partitioned in time. It might be possible to manage this by randomizing at the IP-block level (i.e., each block of IPs gets only one banner) or by shifting assignment at specific points in time.
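IP-block randomization can be made deterministic so that everyone in a block always gets the same condition across requests. A minimal sketch in Python, assuming a /24 block size and a per-campaign salt (both illustrative choices, not part of the proposal):

```python
import hashlib
import ipaddress

def banner_for(ip: str, salt: str = "time4wiki-pilot",
               prefix: int = 24) -> str:
    """Deterministically assign a banner condition per IP block.
    All IPs in the same /24 block hash to the same condition, so a
    visitor never sees both banners (barring IP changes)."""
    block = ipaddress.ip_network(f"{ip}/{prefix}", strict=False)
    digest = hashlib.sha256(f"{salt}:{block}".encode()).hexdigest()
    return "time" if int(digest, 16) % 2 == 0 else "money"

# Two addresses in the same /24 block get the same banner:
print(banner_for("203.0.113.5") == banner_for("203.0.113.200"))  # True
```

Hashing with a salt rather than storing an assignment table avoids keeping per-visitor state; changing the salt reshuffles blocks for a new campaign. Mobile users who hop between IP blocks would still break the guarantee.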

Measuring donations at the individual level is difficult, given that the Foundation doesn’t collect or share much of that data.

Country-specific campaigns may be different in lots of ways already and happening at different times of the year. WMF cannot accept donations in some countries! Also, money has very different values in different places.

Also, different communities have different welcome centers and newcomer systems/sites/practices that would need to be accommodated in a study like this.

If we run the study at the language-edition or community level some communities may be more/less supportive and that could limit the size of the study.

A study like this would require close collaboration with the WMF fundraising/advancement team.

Moderators and Mediators

Moderators: might the intervention work differently for some people and groups? If so, explain here:

People without the skills or time to contribute may not be able to edit.

Low income populations may be systematically different.

Mediators: What would be an outcome to test your theory of change? (remember, it's really hard to do this)