Learning and Evaluation/Evaluation reports/2013/Edit-a-thons
This is the Program Evaluation page for edit-a-thons. It currently contains information based on data collected in late 2013 and will be updated on a regular basis. For additional information about this first round of evaluation, please see the overview page.
This page reports data from 28 program leaders on a total of 46 edit-a-thons, 20 of which were mined for additional data. Program leaders who responded about edit-a-thons were a mix of people associated with chapters (staff/volunteers) and individuals who work "solo" without chapters/affiliates in the movement. Some were associated with organizations, such as GLAMs or educational institutions.
Key lessons include:
- Edit-a-thons are popular! They were the most commonly reported program among program leaders who responded to our survey: 28 program leaders reported 26 edit-a-thons, the highest number for any program in this first round of data collection. We were also able to pull additional data on 20 more events to fill in some areas where we were missing data.
- Edit-a-thons have four priority goals with a primary focus on increasing contributions, skill sets, recruitment and perceptions about Wikimedia projects.
- Many program leaders aren't tracking participant usernames, which makes it a challenge to track participants' retention and contributions. Through improved program design, better tools, and sharing among program leaders of what works, we hope to make username tracking a standard practice for edit-a-thons.
- Most program leaders are tracking budget and donated resources, but most aren't tracking staff and volunteer hours, which are also critical inputs for evaluation. We aim to work with program leaders to make tracking staff and volunteer hours easier.
- Edit-a-thons rely more on donated resources than any other reported program with meeting space being the most commonly donated resource.
- Reported events averaged almost 24,000 characters of new text each, which works out to roughly three printed pages of text added per hour of editing.
- Edit-a-thons generate meaningful amounts of content regardless of event size or cost, and the more participants, the more content is produced.
- Budget size doesn't appear to have an effect on the amount of content produced; events with small budgets can be just as productive as events with large budgets. We also learned that having staff support doesn't necessarily mean a more productive or impactful event.
- Events with lots of new editors can be just as productive as events with lots of experienced editors.
- Out of 328 new editors who attended the reported edit-a-thons, only three were still actively editing six months after the event. Experimenting with edit-a-thon series, surveys, follow-up, and more proactive program design is worth trying in order to learn whether these events can retain new editors - or we may instead see a change in the primary goals selected by program leaders.
- Qualitative research, based on interviews with new editors who edited at an edit-a-thon but did not edit afterwards, suggests that edit-a-thon series, more frequent events with follow-up, and improved training could help retain new editors. We hope to work with program leaders to experiment with these ideas and to support them in evaluation.
- Early indicators show that experienced editors might edit more on average during the event than they do in an average day. They also appear to edit more on average after the event than they did before the event. We plan on researching this further to learn more about the impact of edit-a-thons on experienced editors.
Planning an edit-a-thon? Check out some process, tracking, and reporting tools in our portal and find some helpful tips and links on this resource page.
Program basics and history
Edit-a-thons are outreach events that bring together Wikipedians, and those interested in editing Wikipedia, to do just that: edit Wikipedia in a collaborative setting. These events, which generally last between 3 and 5 hours, provide a social environment for new and experienced editors to edit together, often about a specific subject matter. Many events take place in educational and cultural facilities, such as libraries, museums, and universities, while others take place in office buildings, homes, or public venues such as cafes and restaurants. Sometimes edit-a-thons are combined with training sessions, in which experienced Wikipedians teach participants the basic "how-to's" of editing, followed by an editing session. Other events might include a backstage tour of the cultural institution hosting the event, followed by editing, or simply have participants start editing upon arrival at the venue.
An early editathon for a wikiproject was proposed on the English Wikipedia in February 2004, and Jimbo proposed a library-based Editing Weekend in mid-2004. Neither was pursued at the time.
In 2009, Australia's Powerhouse Museum hosted what is believed to be the first GLAM edit-a-thon. Wikipedians were given a tour of the museum, met with curators, were able to take photographs, and improved and created articles related to the Powerhouse and its collections. The Wikipedian who coordinated the Powerhouse event, Wittylama, later organized another event at the British Museum, which featured a tour of the museum and a contest focused on improving the article about the Hoxne Hoard. Called the "Hoxne Challenge," it has since served as an inspiration for edit-a-thons in partnership with cultural institutions. One of the first events to use the "editathon" name was the British Library Editathon in January 2011 (with the more recent emphasis on multiple topics), and the pace of such events increased over the following years.
Data report
Response rates and data quality/limitations
- Edit-a-thons are popular! 28 program leaders reported 26 edit-a-thons. But program leaders need to track participant usernames better; we had to pull additional data to supplement the lack of username reporting.
- Edit-a-thons were the most frequently self-reported program type – 26 edit-a-thons were reported directly through the survey by program leaders. However, many program leaders did not track participant usernames, which would have allowed contributions made before, during, and after the event to be tracked. Aside from the 26 edit-a-thons that were self-reported, our team pulled data on 20 additional English Wikipedia edit-a-thons for which public records of participants were available on wiki.[1]
A total of 46 edit-a-thons held between February 2012 and October 2013 were reviewed for this report. As with all program data reviewed in this report, the data were often partial or incomplete; please refer to the notes, if any, in the bottom left corner of each graph below.
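Where participant usernames are recorded publicly, this kind of supplemental mining can be approximated with the public MediaWiki API. The sketch below is only a minimal illustration of that idea, not the tooling used for this report (the report's metrics came from WikiMetrics, see the notes); the username and date window are hypothetical placeholders.

```python
import requests

API = "https://en.wikipedia.org/w/api.php"

def count_mainspace_edits(username, start, end):
    """Count article-namespace edits by `username` between two ISO timestamps.

    A minimal sketch using the public MediaWiki API (list=usercontribs).
    It ignores continuation, so it undercounts users with more than
    500 edits in the window.
    """
    params = {
        "action": "query",
        "list": "usercontribs",
        "ucuser": username,
        "ucstart": start,    # earlier timestamp (because ucdir=newer)
        "ucend": end,        # later timestamp
        "ucdir": "newer",
        "ucnamespace": 0,    # article namespace only
        "uclimit": 500,
        "format": "json",
    }
    data = requests.get(API, params=params).json()
    return len(data["query"]["usercontribs"])

# Hypothetical participant and event-day window, for illustration only.
print(count_mainspace_edits("ExamplePparticipant",
                            "2013-06-01T00:00:00Z", "2013-06-02T00:00:00Z"))
```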
Report on the submitted and mined data
Priority goals
- According to program leaders, edit-a-thons have four priority goals.
We asked program leaders to select their priority goals for edit-a-thons. We provided 18 priority goals, with an additional 19th option to report "other" goals as well, and they could select as many or as few as they saw fit. 13 program leaders chose between seven and 18 priority goals.[2] Our team noted four standout goals that appeared as priorities among the reporting program leaders (see table below):
Inputs
- In order to learn more about the inputs that went into planning edit-a-thons, we asked program leaders to report on:
- The budget that went into planning and implementing the edit-a-thon
- The hours that went into planning and implementing the edit-a-thon
- Any donations that they might have received for the event: a venue, equipment, food, drink, giveaways, etc.
Budget
- The majority of program leaders were able to report budgets for edit-a-thons.
- In the survey, budget was reported for 16 edit-a-thons. While 7 of these budgets were reported as zero dollars, the other 9 ranged from $10.00 US to $750.00 US (with an average budget of $359.99).[3][4]
- While 62% of direct reports that came in through our survey included budget, this represents only 35% of all edit-a-thons reviewed in this report, because we didn't have budget numbers for the edit-a-thons we mined.
Hours
- Most program leaders were unable to provide data for how many staff hours went into implementing edit-a-thons, but a little over half were able to submit data for volunteer hours.
Staff and volunteers put the following time into implementing edit-a-thons, according to respondents:
- 39% of program reports included staff hours, which ranged from 0 to 200 hours with an average of 3 hours.[5]
- 54% of program reports included volunteer hours, which ranged from 2 to 215 hours with an average of 15 hours.[6]
- Total hours (staff and volunteer hours combined) ranged from 9 to 300 hours with an average of 15 hours.[7]
Donated resources
- Edit-a-thons rely more on donated resources than any other reported program.
Program leaders reported use of donated resources for their edit-a-thons more than any of the other programs reviewed in the survey. In most cases, edit-a-thons were held using donated meeting space (85%) and materials or equipment (58%), while donations of food (46%) and prizes/giveaways (23%) were also noted for some events (see Graph 2).
Outputs
- We also asked about two outputs in this section:
- How many hours did the edit-a-thon last?
- How many people participated in the edit-a-thon?
Event length
- The average edit-a-thon is five hours long.
Edit-a-thons lasted from 2 to 24 hours with an average of 5 hours (see Graph 1).[8]
Participation
- The average edit-a-thon has 16 participants.
Out of all edit-a-thons included in this report, the number of participants ranged from 2 to 74 with an average of 16 participants.[9]
Outcomes
- The majority of program leaders reported on content production. Reported events averaged almost 24,000 characters of added text in total, with participants adding a median of 0.7 printed pages of text each.
The majority of program leaders were able to provide us with the amount of new content that was added to Wikipedia's article namespace during the event.[11] Events produced an average of 23,993 characters, with the most productive event reporting 157,586 characters of text added during the event.[12]
In order to make the metrics easier to understand, we converted "characters" into "printed pages," assuming that one printed page equals 1,500 characters. Printed pages of text produced during edit-a-thons ranged from 0.1 to 8.6 pages per event hour with an average of 2.9,[13] and from 0.1 to 3.8 pages of text per participant with an average of nearly three-quarters of a page (0.7 pages) per participant.[14]
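Under that conversion convention (1,500 characters per printed page, with one byte treated as one character per the notes), the per-event rates follow by simple division. A minimal sketch using the appendix figures for Report ID 18 (90,532 bytes added over a 7-hour event with 46 participants):

```python
CHARS_PER_PAGE = 1500  # the report's "printed page" convention

def pages_added(chars_added):
    """Convert characters (treated as equal to bytes here) to printed pages."""
    return chars_added / CHARS_PER_PAGE

# Figures for one event from the appendix tables (Report ID 18):
chars, event_hours, participants = 90532, 7, 46

pages = pages_added(chars)
print(round(pages, 1))                # ≈ 60.4 printed pages for the event
print(round(pages / event_hours, 1))  # ≈ 8.6 pages per event hour (the reported high)
print(round(pages / participants, 1)) # ≈ 1.3 pages per participant
```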
- Reported events averaged three media uploads each; about half had zero or an unknown number of media uploads.
Regarding image uploads, the average was 3 uploads per event, with 11 edit-a-thons reporting zero or no known uploads. The largest reported number of uploads was 85.[15]
- Content produced at events varies, and even smaller events (in participants or budget) can produce lots of content. Events with more participants generally produce more content, but the cost of the event doesn't necessarily affect participation or content production rates.
We were able to analyze dollars spent against content produced for only 5 edit-a-thons, because only those 5 reported both the budget and the amount of content added during the event. Using the budgets and characters added at those five events, we determined how much one added "printed page" cost. The average cost for one printed page of content at these five events was $17.15 US (see Graphs 3 and 4).[16]
Graph 3: Dollars to pages. This bubble graph depicts how much content gets produced relative to budget and participation. The size of each bubble, and the number it is labeled with, shows how many total printed pages were produced at the event. The amount of content created varies: more participants generally means more content, but the amount of money spent on the event does not relate to the number of participants or the amount of content produced. Even events with small or zero budgets were productive in terms of content.
Graph 4: Dollars to pages. This box plot depicts the distribution of how many dollars were invested per printed page added during the event. As illustrated by the long vertical line running from the low of $7.08 to the high of $153.44, results were highly variable. However, the middle 50% of reports, around the median of $17.15, ranged from $8.86 to $23.28 per page of text added, a narrower range that better represents what is typical for these events.
- During edit-a-thons, it takes a little over one hour to produce one page of content.
We also wanted to know how many hours it took to produce one page of content, based on the hours that were put into implementing an event. Reports of staff and volunteer hours input into edit-a-thons were available for 11 of the events reviewed.[17] Using those hours and the amount of characters added during those 11 events, we calculated that it took an average of 1.11 hours to produce one page of content. The smallest amount of time was 0.25 hours and the largest was 4.5 hours per page of content[18] (see Graphs 5 and 6).
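Both the dollars-per-page figure above and the hours-per-page figure here are simple ratios of inputs to printed pages added. A minimal sketch, using the appendix figures for Report ID 54 ($469.40 budget, 10 staff and 15 volunteer hours, 30,240 bytes added); the report's graphs summarize these per-event ratios with medians and quartiles.

```python
CHARS_PER_PAGE = 1500

# Report ID 54 from the appendix tables:
budget_usd = 469.40
staff_hours, volunteer_hours = 10, 15
bytes_added = 30240

pages = bytes_added / CHARS_PER_PAGE                       # ≈ 20.2 printed pages
print(round(budget_usd / pages, 2))                        # ≈ 23.28 dollars per page
print(round((staff_hours + volunteer_hours) / pages, 2))   # ≈ 1.24 hours per page
```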
Graph 5: Hours to pages. As illustrated in the bubble chart, a higher number of text pages added was not always associated with a higher number of participants, and more hours put into implementation did not necessarily relate to more participants or more content added for the 11 edit-a-thons for which data are available.
Graph 6: Hours to pages. This box plot depicts the distribution of how many hours were invested in edit-a-thon events per page created (by bytes added). Reports were highly varied, ranging from 0.25 to 4.5 hours per page of text added. However, the middle 50% of reported values, around the median of 1.1 hours, ranged from 0.63 to 2.25 hours, a much closer approximation of the typical range for these events.
- Hourly productivity at edit-a-thons varies widely, regardless of event size and length. Edit-a-thons with lots of new editors can also be just as productive as those with lots of experienced editors.
We also looked into how productive new editors were at the events, to see whether the budget and hours put into implementing the edit-a-thons were helping new editors be productive at the event (see Graph 7).
- The most commonly reported production data for edit-a-thons concerned article creation/improvement, followed by media uploads and article quality, respectively.
The majority (72%) of the edit-a-thons reported included how many pages were created or improved on wiki during the event. In total, 620 pages were created or improved during the 46 edit-a-thons! It was also reported that 334 images or media files were added, 81 of which were added to project pages. 7 of the edit-a-thons reported producing Good Articles, totaling 51. Two of the edit-a-thons reported 4 Featured Articles coming out of the event (see Graph 8).
Recruitment and retention of new editors
- For the majority of the 46 edit-a-thons reported, we were able to pull data about retention. Out of 328 new editors who attended these edit-a-thons, only three were actively editing 6 months after the event.
We also wanted to learn about the retention of active editors after they attended an edit-a-thon. In the survey, we asked program leaders to report the retention of their active editors 3 and 6 months after the end of the event. A retained active editor is considered a Wikipedia editor making an average of five or more edits a month. Out of the 46 edit-a-thons reported on, 37 (80%) had reached or passed the 3-month mark after their event end date. 29 of the 46 (63%) had passed their 6-month mark, so we were able to gather retention data only for those edit-a-thons.
In total, the reported edit-a-thons attracted 328 new editors (36% of 906 total participants). The number of active editors (5+ edits/month) at 6-month follow-up was reported for 27 of the edit-a-thons (59%); 15 of the events (33%) had not yet reached the 6-month follow-up point.[19] Only 18 reports (39%) provided a separate count of active editors at 6-month follow-up for new editors. Active editor retention rates for new users were most often 0% (83% of those reports).[20] Across all edit-a-thons reported, the total number of new user participants was 328, of which only 3 were active editors at 6-month follow-up, or 1.4% of the new users who had reached the 6-month follow-up window (see Graph 9).
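To make the 1.4% figure concrete: it is computed against the new editors whose events had already reached the 6-month follow-up window (110 of the 328 had not yet reached it; see the note). A minimal sketch of that arithmetic:

```python
new_editors_total = 328
not_yet_eligible  = 110   # new editors at events that had not reached 6-month follow-up
retained_active   = 3     # new editors still making 5+ edits/month at 6 months

eligible = new_editors_total - not_yet_eligible   # 218 new editors eligible for follow-up
retention_rate = retained_active / eligible       # ≈ 0.014
print(f"{retention_rate:.1%}")                    # 1.4%
```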
Replication and shared learning
- Edit-a-thon program leaders are proactive at producing materials, blogs, online resources, and other information related to their events, which can help others implement their own events.
Finally, we asked program leaders to share with us how replicable their edit-a-thons could be, and what types of shared learning resources were produced for and after the event. This allows us to learn whether the reporting program leaders considered themselves experienced in implementing edit-a-thons, which would perhaps allow them to help others design and implement edit-a-thons of their own. We are also able to learn how program leaders and others (e.g., chapters, press, bloggers) were covering the events, and whether resources were available for others to use in producing their own events.
For the 46 edit-a-thons reported by program leaders, we learned that the majority of program leaders (96%) are experienced at producing edit-a-thons and could help others conduct their own. The majority (62%) reported having blogs or other online information available for others to learn more about the event. A smaller number of program leaders reported that they developed brochures and/or printed materials (27%) or guides and instructions on how to contribute to Wikipedia for event participants (23%) (see Graph 10).
Summary, suggestions, and open questions
How does the program deliver against its own goals?
Branded by some as the "Swiss Army knife" of the Wikimedia movement, edit-a-thons are expected to serve a number of different goals. We didn't have sufficient quantitative data on whether edit-a-thons "increased the positive perception of Wikipedia" or to what extent edit-a-thons "increased the editing skills of newcomers". For this reason, we focused our quantitative analysis on whether and to what extent edit-a-thons (a) increased the amount of content on Wikipedia and (b) retained newcomers who were taught how to edit during this type of event.
First, we looked at the amount of new content created by edit-a-thon participants. For 30 edit-a-thons, we knew the total number of characters added during the event (through either self-reported or mined data). In order to make these numbers easier to grasp, we assumed that 1,500 characters correspond to one printed page. The median number of printed pages produced per person during the aforementioned edit-a-thons was 0.7 (with a high of 3.8 and a low of 0.1).
Then, we looked into the retention rate for edit-a-thons. Out of 328 edit-a-thon participants who created new user accounts during the events, only 3 (1.4% of those who had reached the 6-month follow-up point) qualified as "active editors" half a year after the event.
In order to understand the low retention rate better, we took a sample of people who had created new user accounts during edit-a-thons and asked them via email why they stopped editing. Here are the two most common reasons:
- lack of time ("A full-time job takes up much of my time" / "other priorities just got in the way" / "have not found the time" / "time is pretty restricted so it might well be that I would never have the time to actually contribute")
- no obvious editing opportunities ("I just don't know where to start in Wikipedia" / " when using Wikipedia […] I didn't spot anything that I felt compelled to edit instantly.")
Answers like "I would hope to get back into editing in the future, although now I might need a refresher course", "The longer I do not do it, the more it goes to the back of my mind, and the harder it will be to remember how to do it again" and "I haven't edited mainly because I wouldn't be able to remember how to do it!" indicate that a series of events or follow-ups might serve the learning needs of the participants better than one-off events. Also, edit-a-thon organizers might want to look more in-depth into whether the training they offer is effective (one participant shared: "the training we got during the event […] wasn't straightforward").
Although we have some early indicators that edit-a-thons lead to an increase of editing frequency in the article namespace among long-term Wikipedians in the weeks after the event, this area needs further investigation. Given the fact that edit-a-thons have become very popular across language versions over the last two years, it also seems to be worthwhile to investigate on a more general level what draws Wikimedians to this event type. We think that edit-a-thons might have an impact that goes beyond the mere fact that they increase Wikipedia's article quality and quantity. With early indicators that Wikipedians edit the article namespace more after the event and the fact that edit-a-thons have become so popular, they might play a role in increasing long-term editors' motivation by strengthening the social ties of our community.
How does the cost of the program compare to its outcomes?
Five out of eight edit-a-thon organizers reported that they ran their events with zero event budget and without any staff costs. In three of these cases, implementers also reported the amount of content that was added to Wikipedia's article namespace: 15, 16, and 20 printed pages (at 1,500 characters per page). This means that staff support and a budget are not necessarily preconditions for an increase in content on Wikipedia:
Budget | Staff time @ salary/hour | Total monetary cost | Content added[21] (in printed text pages) | Monetary cost per printed page |
---|---|---|---|---|
0 | 0 | 0 | 15 | $0.00 |
0 | 0 | 0 | 16 | $0.00 |
0 | 0 | 0 | 20 | $0.00 |
Also, we had two self-reported data sets at our disposal for which edit-a-thon organizers reported an event budget that was not zero and the number of staff hours that were needed to plan and execute the event. The following table gives an overview of how much money was spent on these edit-a-thons, how much content got added to Wikipedia's article namespace and how much money was spent per printed page (at 1,500 characters per page):
Budget | Staff time @ salary/hour | Total monetary cost | Content added[21] (in printed text pages) | Monetary cost per printed page |
---|---|---|---|---|
$469 | 10 hrs @ $20[22] | $669 | 20 | $33.45 |
$300 | 35 hrs @ $20[22] | $1,000 | 34 | $29.41 |
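The "Total monetary cost" and "Monetary cost per printed page" columns follow directly from the stated assumptions: budget plus staff hours valued at an estimated $20 per hour, divided by the printed pages added. A minimal sketch reproducing the two rows above:

```python
HOURLY_RATE = 20.0  # estimated salary per hour, per note 22

def cost_per_page(budget, staff_hours, pages_added):
    """Return (total monetary cost, cost per printed page of text added)."""
    total_cost = budget + staff_hours * HOURLY_RATE
    return total_cost, total_cost / pages_added

print(cost_per_page(469, 10, 20))   # (669.0, 33.45)
print(cost_per_page(300, 35, 34))   # (1000.0, 29.41...)
```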
How easily can the program be replicated?
Over the last couple of years, edit-a-thons have been documented widely through blog posts and chapters' quarterly/annual reports. And the general concept of having Wikipedians spend some editing time together at the same physical location is easy to replicate.[23] What seems to be missing, though, is an in-depth analysis of what makes some edit-a-thons more successful than others. It would include numbers that indicate to what extent the specific goals of the events have been met (e.g. retention rate, number of articles improved and created, etc.). Ideally, it would also include qualitative data (e.g. a survey among participants who were new to editing, measuring to what extent their editing skills improved, whether their attitude towards Wikipedia changed, and whether their expectations were met). Based on such an analysis, people around the world would be better able to build on the learning of others, and edit-a-thon outcomes could improve over time.
Next steps
- Next steps in brief
- Increased tracking of detailed budgets and hour inputs by program leaders
- Increased tracking of usernames and event dates by program leaders
- Pre- and post-edit-a-thon surveys to understand more about what participants know going into, and coming out of, the event, to see if it is meeting priority goals around Wikimedia movement education and skill improvement.
- More experimentation in edit-a-thon design including experimentation with edit-a-thon series, invitations, personal follow-up after the event, etc.
- Exploring opportunities to determine the value of edit-a-thons through Return on Investment analysis
- Investigation of the effect of edit-a-thons on the editing habits and productivity of experienced editors before, during, and after events.
- Next steps in detail
As with all of the programs reviewed in this report, it is key that efforts are made toward properly tracking and valuing programming inputs in terms of budgets and hours invested, as well as tracking usernames and event dates for proper monitoring of user behaviors. Further investigation of expectations and efforts directed toward the other goal priorities (i.e., increasing skills for editing and increasing awareness of Wikimedia projects), and the development of strategies for measuring such potential impacts, will be important next steps, as will efforts toward valuations for future Return on Investment analysis.
External resources
- How to run an edit-a-thon on the English Wikipedia.
Appendix
Summative data table: Edit-a-thons (raw data)
 | Percent reporting | Low | High | Mean | Median | Mode | SD |
---|---|---|---|---|---|---|---|
Non-zero budgets | 20% | $10.00 | $750.00 | $332.97 | $359.99 | — | $239.98 |
Staff hours | 39% | 0.00 | 200.00 | 16.33 | 3.00 | 0.00 | 46.64 |
Volunteer hours | 54% | 2.00 | 215.00 | 28.36 | 15.00 | 15.00 | 43.85 |
Total hours | 54% | 9.00 | 300.00 | 40.12 | 15.00 | 15.00 | 67.90 |
Donated meeting space | 85%[24] | Not applicable (frequency of selection only) | |||||
Donated materials/equipment | 58%[25] | Not applicable (frequency of selection only) | |||||
Donated food | 46%[26] | Not applicable (frequency of selection only) | |||||
Donated prizes/give-aways | 23%[27] | Not applicable (frequency of selection only) | |||||
Participants | 100% | 2 | 74 | 20 | 16 | 9 | 2 |
Dollars to participants | 35% | $0.00 | $39.47 | $7.14 | $3.50 | $0.00 | $10.40 |
Input hours to participants | 54% | 0.33 | 7.50 | 2.16 | 1.42 | 1.00 | 1.96 |
Bytes added | 65% | 991 | 157586 | 29862 | 23993 | — | 32576 |
Dollars to text pages (by byte count) | 11% | $7.08 | $153.44 | $41.96 | $17.15 | — | $62.66 |
Input hours to text pages (by byte count) | 20% | 0.25 | 4.50 | 1.52 | 1.11 | — | 1.38 |
Photos added | 67% | 0 | 85 | 11 | 3 | 0 | 22 |
Dollars to photos | 24% | Not appropriate metric calculation for program type | |||||
Input hours to photos | 26% | Not appropriate metric calculation for program type | |||||
Pages created or improved | 72% | 1.00 | 71.00 | 18.79 | 15.00 | 18.00 | 15.42 |
Dollars to pages created/improved | 30% | $0.00 | $32.73 | $6.55 | $1.00 | $0.00 | $9.40 |
Input hours to pages created/improved | 37% | 0.42 | 15.00 | 2.40 | 1.40 | none | 3.40 |
Unique photos used | 48% | 0 | 17 | 4 | 2 | 0 | 5 |
Dollars to photos used (non-duplicated count) | 22% | Not appropriate metric calculation for program type | |||||
Input hours to photos used (non-duplicated count) | 24% | Not appropriate metric calculation for program type | |||||
Good article count | 52% | 0 | 30 | 2 | 1 | 0 | 7 |
Featured article count | 48% | 0 | 2 | 0 | 0 | 0 | 1 |
Quality image count | 0% | Not applicable (none reported) | |||||
Valued image count | 0% | Not applicable (none reported) | |||||
Featured picture count | 0% | Not applicable (none reported) | |||||
3 month retention[28] | 89% | 3% | 100% | 35% | 35% | 40% | 18% |
6 month retention[29] | 93% | 12% | 73% | 35% | 34% | — | 15% |
Percent experienced program leader | 96%[30] | Not Applicable (frequency of selection only) | |||||
Percent developed brochures and printed materials | 27%[31] | Not Applicable (frequency of selection only) | |||||
Percent blogs or online sharing | 62%[32] | Not Applicable (frequency of selection only) | |||||
Percent with program guide or instructions | 23%[33] | Not Applicable (frequency of selection only) |
Bubble graph data
Budget | Number of participants | Bubble size: number of printed pages added to Wikipedia's article namespace |
---|---|---|
$300.00 | 20 | 34 |
$500.00 | 74 | |
$0.00 | 13 | |
$0.00 | 9 | |
$0.00 | 26 | 28 |
$750.00 | 19 | 5 |
$0.00 | 36 | 20 |
$427.38 | 46 | 60 |
$30.00 | 15 | |
$10.00 | 2 | |
$0.00 | 31 | 16 |
$359.99 | 21 | 21 |
$0.00 | 30 | 29 |
$0.00 | 4 | 15 |
$469.40 | 40 | 20 |
$150.00 | 19 |
Participants | Hours | Bubble size: number of printed pages added to Wikipedia's article namespace | Hours per Page |
---|---|---|---|
20 | 65 | 34 | 1.92 |
40 | 300 | 105 | 2.86 |
26 | 15 | 28 | 0.55 |
19 | 22 | 5 | 4.50 |
36 | 51 | 20 | 2.50 |
46 | 15 | 60 | 0.25 |
31 | 31 | 16 | 2.00 |
21 | 15 | 21 | 0.72 |
30 | 15 | 29 | 0.52 |
4 | 17 | 15 | 1.11 |
40 | 25 | 20 | 1.24 |
Percent new accounts | Number of participants | Bubble size: number of printed pages added to Wikipedia's article namespace per hour |
---|---|---|
80% | 20 | 8 |
58% | 40 | 4 |
32% | 74 | |
38% | 13 | |
74% | 38 | |
71% | 17 | |
0% | 7 | |
78% | 9 | |
80% | 25 | |
33% | 26 | 4 |
38% | 13 | 2 |
53% | 19 | 2 |
33% | 36 | 5 |
50% | 4 | |
0% | 9 | 3 |
11% | 46 | 9 |
20% | 15 | |
17% | 12 | 2 |
18% | 17 | 3 |
13% | 16 | 1 |
50% | 2 | |
3% | 31 | 4 |
33% | 21 | 3 |
33% | 30 | 4 |
33% | 6 | 0 |
0% | 2 | |
50% | 4 | |
25% | 4 | 2 |
73% | 11 | 4 |
29% | 35 | 2 |
18% | 40 | 3 |
25% | 24 | 3 |
29% | 17 | 1 |
More data
Note: the Report ID is a randomly assigned ID variable in order to match the data across the inputs, outputs, and outcomes data tables.
Report ID | Budget | Staff hours | Volunteer hours | Donated space | Donated equipment | Donated food | Donated prizes |
---|---|---|---|---|---|---|---|
56 | $300.00 | 35 | 30 | Yes | Yes | ||
11 | 200 | 100 | Yes | Yes | |||
14 | $500.00 | 0 | 215 | Yes | |||
25 | $0.00 | 0 | 15 | Yes | Yes | Yes | |
23 | 0 | 30 | Yes | Yes | Yes | Yes | |
28 | |||||||
33 | 0 | 14 | Yes | Yes | |||
43 | $0.00 | 0 | 16 | Yes | Yes | Yes | |
50 | 0 | 40 | Yes | Yes | Yes | Yes | |
13 | $0.00 | 15 | Yes | Yes | Yes | ||
104 | |||||||
15 | $750.00 | 15 | 7 | Yes | Yes | Yes | |
22 | $0.00 | 0 | 51 | Yes | |||
17 | 7 | 4 | Yes | Yes | |||
108 | |||||||
18 | $427.38 | 15 | Yes | Yes | |||
19 | $30.00 | 15 | Yes | Yes | Yes | ||
109 | |||||||
110 | |||||||
111 | |||||||
26 | $10.00 | 15 | Yes | Yes | |||
31 | $0.00 | 0 | 31 | Yes | |||
32 | $359.99 | 15 | Yes | Yes | |||
39 | $0.00 | 15 | Yes | Yes | Yes | Yes | |
113 | |||||||
40 | 7 | 2 | Yes | Yes | |||
46 | 7 | 4 | Yes | Yes | |||
49 | $0.00 | 0 | 17 | ||||
106 | |||||||
123 | |||||||
54 | $469.40 | 10 | 15 | Yes | Yes | ||
115 | |||||||
119 | |||||||
55 | $150.00 | 20 | Yes | Yes | |||
114 | |||||||
122 | |||||||
105 | |||||||
121 | |||||||
118 | |||||||
116 | |||||||
59 | 7 | 2 | Yes | Yes | Yes | ||
117 | |||||||
120 | |||||||
63 | |||||||
64 | |||||||
65 | 6 | 6 |
Report ID | Event length (hours) | Number of participants | Number of new user accounts | Total bytes added | Text pages (by bytes added) | Dollars per page of bytes | Hours per page of bytes |
---|---|---|---|---|---|---|---|
56 | 4 | 20 | 16 | 50798 | 33.87 | $8.86 | 1.9 |
11 | 24 | 40 | 23 | 157586 | 105.06 | 2.9 | |
14 | 18 | 74 | 24 | ||||
25 | 3.5 | 13 | 5 | ||||
23 | 7 | 38 | 28 | ||||
28 | 3 | 17 | 12 | ||||
33 | 6 | 7 | 0 | ||||
43 | 6 | 9 | 7 | ||||
50 | 7 | 25 | 20 | ||||
13 | 7.75 | 26 | 41259 | 27.51 | 0.6 | ||
104 | 4.5 | 13 | 5 | 13560 | 9.04 | ||
15 | 3 | 19 | 10 | 7332 | 4.89 | $153.44 | 4.5 |
22 | 4 | 36 | 12 | 30594 | 20.4 | ||
17 | 5 | 4 | 2 | ||||
108 | 2 | 9 | 0 | 8802 | 5.87 | ||
18 | 7 | 46 | 5 | 90532 | 60.35 | $7.08 | 0.3 |
19 | 5 | 15 | 3 | ||||
109 | 4 | 12 | 2 | 13863 | 9.24 | ||
110 | 6 | 17 | 3 | 28551 | 19.03 | ||
111 | 4.5 | 16 | 2 | 5298 | 3.53 | ||
26 | 2 | 2 | 1 | ||||
31 | 4 | 31 | 1 | 23251 | 15.5 | ||
32 | 7 | 21 | 31480 | 20.99 | $17.15 | 0.7 | |
39 | 7.75 | 30 | 43085 | 28.72 | 0.5 | ||
113 | 4 | 6 | 2 | 1692 | 1.13 | ||
40 | 5 | 2 | 0 | ||||
46 | 5 | 4 | 2 | ||||
49 | 10 | 4 | 1 | 22953 | 15.3 | 1.1 | |
106 | 4 | 11 | 8 | 24734 | 16.49 | ||
123 | 8.5 | 35 | 10 | 29105 | 19.40 | ||
54 | 7.5 | 40 | 7 | 30240 | 20.16 | $23.28 | 1.2 |
115 | 5.5 | 24 | 6 | 26707 | 17.80 | ||
119 | 3.5 | 17 | 5 | 5156 | 3.44 | ||
55 | 6 | 19 | 8 | ||||
114 | 6.5 | 39 | 16 | 79334 | 52.89 | ||
122 | 4.5 | 11 | 3 | 9387 | 6.26 | ||
105 | 8 | 10 | 2 | 41040 | 27.36 | ||
121 | 2 | 10 | 4 | 4468 | 2.98 | ||
118 | 6 | 9 | 4 | 991 | 0.66 | ||
116 | 4 | 10 | 6 | 2486 | 1.66 | ||
59 | 5 | 9 | 4 | ||||
117 | 5 | 15 | 1 | 20950 | 13.97 | ||
120 | 5 | 13 | 6 | 2046 | 1.36 | ||
63 | 7 | 37 | 29 | 48580 | 32.39 | ||
64 | 2 | 35 | 17 | ||||
65 | 3.5 | 6 | 6 |
Report ID | Number photos/media added | Number unique photos used | Number pages created or improved | Number good articles | Number featured articles | 3 month active editor rate | 6 month active editor rate | 6 month retention (new contributors only) rate |
---|---|---|---|---|---|---|---|---|
56 | 0 | 0 | 20 | 55% | 55% | |||
11 | 46 | 7 | 71 | 0 | 0 | 18% | 18% | |
14 | 20 | 30 | 2 | n/a | n/a | n/a | ||
25 | 0 | 7 | 0 | 0 | 31% | n/a | n/a | |
23 | 10 | 16 | 0 | 3% | n/a | n/a | ||
28 | 16 | 2 | n/a | n/a | ||||
33 | 1 | 1 | 10 | 100% | n/a | n/a | ||
43 | 7 | 7 | 9 | n/a | n/a | n/a | ||
50 | 84 | 17 | 18 | n/a | n/a | n/a | ||
13 | 1 | 1 | 34 | 35% | 27% | |||
104 | 18 | 12 | 14 | 0 | 0 | 46% | 46% | 0% |
15 | 2 | 32% | 21% | |||||
22 | 1 | 33 | 1 | 0 | 19% | 19% | 0% | |
17 | ||||||||
108 | 0 | 0 | 44% | 56% | ||||
18 | 0 | 36 | 26% | 24% | ||||
19 | 6 | 15 | 1 | 0 | 40% | 40% | ||
109 | 3 | 3 | 6 | 0 | 0 | 42% | 50% | 0% |
110 | 3 | 3 | 18 | 0 | 0 | 35% | 41% | 0% |
111 | 0 | 4 | 0 | 0 | 25% | 25% | 0% | |
26 | 0 | 1 | 0 | 0 | 50% | n/a | n/a | |
31 | 24 | 1 | 0 | 58% | 58% | 0% | ||
32 | 5 | 5 | 11 | 38% | n/a | n/a | ||
39 | 2 | 2 | 27 | n/a | n/a | n/a | ||
113 | 0 | 0 | 3 | 0 | 0 | n/a | n/a | n/a |
40 | n/a | n/a | n/a | |||||
46 | n/a | n/a | n/a | |||||
49 | 3 | 3 | 7 | 0 | 0 | n/a | n/a | n/a |
106 | 21 | 18% | 18% | 0% | ||||
123 | 42 | 34% | 34% | 10% | ||||
54 | 0 | 40 | 20% | 20% | ||||
115 | 0 | 0 | 18 | 0 | 0 | 46% | 46% | 0% |
119 | 6% | 12% | 0% | |||||
55 | 7 | 3 | 18 | 1 | 0 | n/a | n/a | n/a |
114 | 13 | 13 | 43 | 1 | 0 | 38% | 33% | 0% |
122 | 3 | 6 | 0 | 0 | 18% | 45% | 0% | |
105 | 1 | 1 | 11 | 0 | 0 | 40% | 30% | 0% |
121 | 40% | 40% | 0% | |||||
118 | 0 | 2 | 0 | 0 | 33% | 33% | 0% | |
116 | 0 | 0 | 6 | 0 | 0 | 20% | 20% | 0% |
59 | ||||||||
117 | 0 | 0 | 14 | 0 | 0 | 60% | 73% | 100% |
120 | 0 | 4 | 31% | 38% | 17% | |||
63 | 27 | 0 | 0 | 14% | n/a | n/a | ||
64 | 85 | 40% | n/a | n/a | ||||
65 |
Notes
- ↑ Based on the List of edit-a-thons available on the English Wikipedia.
- ↑ For edit-a-thon reports, 13 program leaders (87% of the 15 who provided direct reports) reported and the number of selected priority goals ranged from 7 to 18 with an average of 12 (Mean=12, Standard deviation=3).
- ↑ Averages reported refer to the median response.
- ↑ The mean was $332.97 and the standard deviation was $240.
- ↑ Mean=16.3, Standard deviation= 47
- ↑ Mean=28.4, Standard deviation=44
- ↑ Mean=40.1, Standard deviation= 68
- ↑ Mean= 5.9, Standard deviation= 4
- ↑ Mean= 20, Standard deviation= 2
- ↑ Note: Although "content production" is a direct product of the program event itself, and technically a program output rather than an outcome, most of the program leaders who participated in the logic modeling session felt this direct product was the target outcome for their programming. To honor this community perspective, we include it as an outcome along with quality improvement and retention of "active" editors.
- ↑ We transformed the WikiMetrics metric "bytes added" to character count in order to improve understandability. In most European languages, one byte equals one character.
- ↑ 65% of program leaders reported total number of bytes added during their Edit-a-thon event, which ranged from 991 to 157,586 bytes with an average of 23,992.5 bytes (Mean=29,862, Standard deviation=32,576).
- ↑ Mean = 3.0, Standard deviation = 2
- ↑ Mean = 0.9, Standard deviation = 1
- ↑ 67% of reports also included the number of photos added, which ranged from 0 to 85 with an average of 3 (Mean=11, Standard deviation=22).
- ↑ Comparing the dollars invested to the pages of text added (by bytes), the cost per page of text added ranged from $7.08 to $153.44 with an average of $17.15 (Mean=$41.96, Standard deviation=$63).
- ↑ (24% of all reviewed, 42% of those directly reported)
- ↑ The hours per page of text added (by bytes) ranged from 0.25 to 4.5 hours with an average of 1.11 hours (Mean=1.52, Standard deviation=1).
- ↑ For the 15 edit-a-thons that had not yet reached the 6-month follow-up time, there were 173 existing and 110 new editors not yet eligible for 6-month retention follow-up.
- ↑ At 6-month follow-up, retention rates for new users ranged from 0% to 100% with an average of 0% (Mean=7%, Standard deviation=24%).
- ↑ a b To the article namespace.
- ↑ a b Estimated salary per hour.
- ↑ On the English Wikipedia, a well developed how-to document exists: Wikipedia:Edit-a-thon.
- ↑ Percentages out of 26 who provided direct report
- ↑ Percentages out of 26 who provided direct report
- ↑ Percentages out of 26 who provided direct report
- ↑ Percentages out of 26 who provided direct report
- ↑ Edit-a-thon retention listed refers to both new and existing users; see the program-specific reporting for details on retention of new vs. existing contributors. The reporting percentage is out of the 37 edit-a-thons that were ready to report 3-month retention.
- ↑ Edit-a-thon retention listed refers to both new and existing users; see the program-specific reporting for details on retention of new vs. existing contributors. The reporting percentage is out of the 29 edit-a-thons that were ready to report 6-month retention.
- ↑ Percentages out of 26 who provided direct report
- ↑ Percentages out of 26 who provided direct report
- ↑ Percentages out of 26 who provided direct report
- ↑ Percentages out of 26 who provided direct report