Wikimedia Education Greenhouse/Unit 3 - Module 1
Why Monitoring?[edit | edit source]
Listen to Vasanthi Hargyono explain the basics of a monitoring plan through this video. As you watch the video, reflect on the following questions:
- Is this practice relevant for your Wikimedia education project?
- Have you done this or a similar process before?
Additionally, you can find the transcript of this video at this link.
Monitoring for continuous improvement[edit | edit source]
The continuous improvement loop: As mentioned earlier, monitoring and course correction efforts happen at different times during the development of your project. Page 2 of the Monitoring and Evaluation document by the organization WRAP presents a sample flow of monitoring efforts, which can be expressed with the following formula:
Set monitoring objectives -> develop indicators -> collect data -> analyze data -> take actions
This might look a bit too abstract or full of technical and obscure terminology. Don't worry, we will be looking at this more closely through some examples further in the lesson. But before we start planning any monitoring and evaluation activity, we should ask ourselves these three questions:
- WHO are we doing this for?
- WHAT is going to be monitored or evaluated?
- HOW is the monitoring and evaluation activity to be done?
Having the answers to these questions will help focus our efforts and create an effective monitoring and evaluation plan that will respond to our project's needs and vision.
In your own words....[edit | edit source]
On their Virtual Knowledge Centre to End Violence Against Women and Girls, UN Women states that:
|"At the programme level, the purpose of monitoring and evaluation is to track implementation and outputs systematically, and measure the effectiveness of programmes. It helps determine exactly when a programme is on track and when changes may be needed. Monitoring and evaluation forms the basis for modification of interventions and assessing the quality of activities being conducted.
Monitoring and evaluation can be used to demonstrate that programme efforts have had a measurable impact on expected outcomes and have been implemented effectively. It is essential in helping managers, planners, implementers, policy makers and donors acquire the information and understanding they need to make informed decisions about programme operations. Monitoring and evaluation helps with identifying the most valuable and efficient use of resources. It is critical for developing objective conclusions regarding the extent to which programmes can be judged a “success”. Monitoring and evaluation together provide the necessary data to guide strategic planning, to design and implement programmes and projects, and to allocate, and re-allocate resources in better ways.
Source: UN Women - Why is monitoring and evaluation important?
This text speaks about monitoring and evaluation in general terms, but we also want to hear from you. Based on what you have learned in this module so far (and on your previous experience), why are monitoring and course correction important for your Wikimedia education project? Post your answers in the Discuss section of this page!
Why, who, and what?[edit | edit source]
Think about the Wikimedia education project you have been developing in the past modules. If you are just now joining the online course, think of one Wikimedia education project you have developed in the past, one you are currently developing, or one that you want to develop in the future. Reflect on the following questions:
- Why is monitoring and course correction important for your Wikimedia education projects?
- Who are you doing this for?
- How would you conduct monitoring activities?
You can reply to the three questions above using the Discuss section of this page or document them on your Course Portfolio.
How can we monitor our activities?[edit | edit source]
Below you can find two different examples of Wikimedia education projects and their potential monitoring activities. Choose one example, read the information provided, and select a recommended monitoring action the team can take in this case. Post the action you would recommend on the Discuss section and compare your answers with those of other participants.
[Example A] Wikidata training for teachers[edit | edit source]
You might remember from previous modules that our team of Wikimedians (Alex, Laura, and Isaac) started a Wikidata training program for teachers. They identified that teachers have a basic understanding of Wikidata, and the team's ultimate goal is that teachers improve their education practice by becoming active contributors to and users of Wikidata (impact). In particular, they focused on two main outcomes:
- Teachers can comfortably conduct SPARQL queries on Wikidata to create additional didactic resources for their classes, and
- Teachers become advocates for the important role of Wikidata in the OER field.
The team is conducting WikiCafé gatherings (two per month) for the teachers to grow their skills and knowledge in a comfortable and supportive environment.
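To make the first outcome concrete, here is a minimal sketch of the kind of SPARQL query a teacher might learn to write at a WikiCafé. The property and item identifiers (P31 "instance of", Q6256 "country", P36 "capital") are real Wikidata IDs, but the lesson itself prescribes no specific query; this is only an illustrative example, composed here as a Python string so it can be copied into the query editor.

```python
# Illustrative only: a didactic SPARQL query a teacher might build after a
# few WikiCafé sessions -- listing countries and their capitals from Wikidata.
# The identifiers are real (P31 = "instance of", Q6256 = "country",
# P36 = "capital"), but this specific query is an assumption for illustration.
query = """
SELECT ?countryLabel ?capitalLabel WHERE {
  ?country wdt:P31 wd:Q6256 ;   # ?country is an instance of "country"
           wdt:P36 ?capital .   # ...and has a capital
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en" . }
}
LIMIT 10
"""

# Teachers can paste the query text at https://query.wikidata.org/ to run it.
print(query)
```

A query like this could double as a classroom resource: swapping Q6256 for another class of items lets teachers generate topic-specific lists for their own subjects.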
The team wants to start monitoring their activities because they believe it's important to learn how the project is developing and where some adjustments need to be made (the "why"). They want to use the information for their own documentation purposes, and they plan to share their findings with other members of their Wikimedia community (the "who").
What monitoring activities would you recommend for this team? Choose one from the options below:
- Monthly quizzes for the teachers to see how much they are learning about Wikidata, and a final focus group to understand their impressions about Wikidata.
- A 10-minute rapid-fire quiz at the beginning of each WikiCafé to learn how teachers are feeling and the main challenges they have faced practicing on their own, and a SPARQL query demonstration after 3 WikiCafés.
- A 5-minute anonymous survey at the end of each WikiCafé to learn how teachers are feeling and the main challenges they are facing, and observation guides the volunteers can use during each session to see which participants are not advancing at the same pace with the SPARQL queries.
[Example B] Wikipedia in the classroom[edit | edit source]
Our team of Wikimedians (Alex, Laura, and Isaac) has started developing another project idea. They have identified that local education actors have very strong feelings against the use of Wikipedia in the classroom, even though the national education policy encourages school districts to provide teacher development training on OERs. Their ultimate goal is that policy makers and school administrators in their country view Wikipedia as a valuable resource in school (impact). In particular, they want to focus on two main outcomes:
- School administrators promote Wikipedia training for their teachers throughout the school year, and
- Policy makers understand the value of Wikipedia as an OER.
They will be conducting three information seminars for policy makers and school administrators called "WikiMeets", and they hope to schedule two follow-up sessions with at least 40% of attendees to agree on a Wikipedia education project in their schools. The team wants to design their monitoring plan because they believe it's important to learn how to improve this pilot as it develops (the "why"). They want to use the information for their own documentation purposes and to improve the sessions they have planned; they don't plan to share their findings with the wider Wikimedia community yet, only with members of the team (the "who").
What monitoring activities would you recommend for this team? Choose one from the options below:
- A quick survey after the first WikiMeet to gather first impressions and questions from the attendees that can be incorporated in the second session, and another quick survey after the second WikiMeet to see if the impressions changed.
- An observation guide to keep track of the attitudes and facial expressions of the attendees during the first meeting.
- In-depth interviews with policy makers/school administrators that decided not to incorporate Wikipedia in their schools to understand their reasons for not wanting to participate further.
Creating a monitoring plan[edit | edit source]
How do we know what to monitor?[edit | edit source]
As you can see from the previous examples, we started thinking about relevant monitoring activities based on the impact and outcomes that the team has set for their project. That is to say: their logic models. This is because monitoring activities are an extension of your project planning stage. Having monitoring (and evaluation - M&E) activities in place supports the successful development of your project and the achievement of your long-term goal. According to the University of Wisconsin-Extension, our logic model guides us in five main aspects: what to focus on, what questions to ask, what indicators to set, the timing of our interventions, and the data collection methods and tools we will use.
Having a well-structured and clear logic model will make it easier for us to decide what to focus on for our evaluation purposes (because we can't evaluate everything), to create relevant and critical questions to monitor, to determine the evidence we need to collect to learn if we are achieving our goals, and to know when and how to collect all this information.
Going back one more time to our logic model: remember the examples we saw in the previous units? You can identify the elements of monitoring and evaluation in them.
Remember that the logic model (objective tree) you developed for your project in the previous unit might look different from these examples - even the names of the sections. However, it contains the same main elements and information:
- activities (outputs)
- objectives (short and intermediate term outcomes)
- impact (long-term outcomes)
So how can you use your logic model to create your monitoring (& evaluation) plan? Let's find out in the next section!
How do we organize our monitoring activities?[edit | edit source]
As you saw earlier, monitoring activities are conducted at a regular and frequent pace while the project is developing. They can be formal or informal, and they help us decide if we need to make mid-course corrections in our project. The information collected through monitoring activities is usually only utilized by the main internal stakeholders of a project. After reflecting on the purpose of our monitoring activities (the "why"), identifying the people who will use the information collected (the "who"), and reviewing our logic model to focus on the relevant aspects we need to monitor (the "what"), we move on to how we will conduct our monitoring activities.
There is no single model for designing a monitoring plan, but below you can find some general recommendations and examples:
- Define the indicators: Monitoring process and outcome indicators will provide us with a wider perspective of how our project is developing. Process indicators help us monitor if our activities are being implemented according to the plan. Outcome indicators help us monitor if our activities are helping us achieve the outcomes of our project. For example, in developing this Wikimedia & Education Greenhouse online course (among other things) we are paying attention to the number of participants that engage in the course (process indicator) and relevance of the contents and tools provided for Wikimedia education projects (outcome indicator).
- Define the data collection methods: You have decided what aspects to monitor during your project's development, now it's time to figure out how you will gather the information needed to track these indicators. The methods and tools you will use to collect this information will likely be diverse - we will look at them more closely in the next section. Continuing with the example of this course: to know the number of participants that engage in the course we use the statistics tools available on Moodle, and to know the relevance of the contents and tools provided for Wikimedia education projects we rely on online surveys.
- Define the timeline: Once you have decided what you want to monitor and how you will collect the relevant data, it is time to decide how often you will be collecting this information. This will vary depending on your available resources (time, staff, data collection tools, etc.) and the duration of your project. To finish with this online course's example: to know the number of participants that engage in the course we review the statistics in Moodle either monthly or quarterly, and to know the relevance of the contents and tools provided for Wikimedia education projects we rely on online surveys once per unit.
Finally, all this information is not collected automatically. The last consideration to keep in mind is who will be responsible for conducting these monitoring activities.
We can use a simple spreadsheet to consolidate a monitoring plan that follows these recommendations. Let's go back to the WikiCafé project from our favorite group of Wikimedians; this is what their monitoring plan might look like:
| Outcomes (Objectives) | Indicators | Data Collection Method | Timing | Owner |
| --- | --- | --- | --- | --- |
| Teachers become advocates for the important role of Wikidata in the OER field | Number of teachers that can explain in simple words the value of Wikidata in the OER field | Interview guide | After the 4th session | Isaac |
| Teachers can comfortably conduct SPARQL queries on Wikidata to create additional didactic resources for their classes | Number of teachers that can follow the instructions to conduct a SPARQL query on Wikidata | Observation sheet | After the 3rd, 4th, and 5th sessions | Laura |
In some cases, to monitor the achievement of our outcomes we might need to pay attention to more than one indicator, and we might need to use more than one data collection method. In these cases, it can also be helpful to add a "data analysis method" column to your matrix. This will help you differentiate and better organize how you will process the information you are collecting.
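The monitoring matrix above is just structured data, so a team comfortable with a little scripting could keep it in machine-readable form and check it automatically. Here is a minimal sketch (not an official template; the field names are our own choice) of the WikiCafé plan as a list of Python dictionaries, with a completeness check that every row answers the what, how, when, and who:

```python
# A minimal sketch of the WikiCafé monitoring matrix as plain data.
# The field names ("outcome", "indicator", ...) are illustrative choices,
# not part of any prescribed Wikimedia template.
monitoring_plan = [
    {
        "outcome": "Teachers become advocates for the important role of "
                   "Wikidata in the OER field",
        "indicator": "Number of teachers that can explain in simple words "
                     "the value of Wikidata in the OER field",
        "method": "Interview guide",
        "timing": "After the 4th session",
        "owner": "Isaac",
    },
    {
        "outcome": "Teachers can comfortably conduct SPARQL queries on "
                   "Wikidata to create additional didactic resources",
        "indicator": "Number of teachers that can follow the instructions "
                     "to conduct a SPARQL query on Wikidata",
        "method": "Observation sheet",
        "timing": "After the 3rd, 4th, and 5th sessions",
        "owner": "Laura",
    },
]

# Completeness check: every monitored outcome needs an indicator (what),
# a collection method (how), a timing (when), and an owner (who).
required = {"outcome", "indicator", "method", "timing", "owner"}
for row in monitoring_plan:
    missing = required - row.keys()
    assert not missing, f"Row is missing fields: {missing}"

owners = sorted({row["owner"] for row in monitoring_plan})
print(f"{len(monitoring_plan)} monitored outcomes; owners: {owners}")
```

The same structure works just as well in an actual spreadsheet; the point is that each row must be complete before the plan is usable.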
Reflection time: Which part of the monitoring plan seems the easiest to decide on?
- The indicators
- The data collection methods
- The outcomes
- The timing
- The owner
Course Portfolio Assignment: The start of a monitoring plan for your Wikimedia education initiative[edit | edit source]
It's time to start building a monitoring matrix for your Wikimedia education project! Follow the steps below:
Step 1: Review the logic model you created in Unit 2 (if you have not completed that module yet, take some time to do so and create a logic model for your project idea following that lesson) or another logic model you have previously created.
Step 2: Choose two outcomes (objectives) to focus your monitoring activities on.
Step 3: Complete the simple spreadsheet presented previously with information corresponding to your project. Remember, the indicators will come from your selected outcomes.
Step 4: Include this monitoring plan in the document/space where you have been developing your idea and share the link here!
References[edit | edit source]
- WRAP, "Monitoring and Evaluation" guide - http://www.wrap.org.uk/sites/files/wrap/Monitoring%20and%20Evaluation.pdf
- UN Women, "Why is monitoring and evaluation important?", www.endvawnow.org. Retrieved 2020-06-30.
- UNESCO, "On Target: A Guide for Monitoring and Evaluating Community-Based Projects" - https://www.goethe.de/resources/files/pdf125/evaluation-manual-unesco1.pdf
- University of Wisconsin-Extension, "Enhancing Program Performance with Logic Models", lmcourse.ces.uwex.edu. Retrieved 2020-06-30.