The value – and dangers – of volunteer editors

Is volunteer editing a way to quickly and accurately validate data?

Our culture is undergoing a rapid "wikification," where nearly every form of content imaginable can now be edited and reconfigured by members of a community. Supported by technological innovations that make remote editing easier to track and implement, legions of volunteer editors have emerged to help authenticate, structure and moderate the vast quantity of content available to the public. Indeed, Wikipedia benefits from the volunteer services of over 80,000 editors who comb through the more than 38 million articles hosted on the site, verifying data and flagging errors. These editors are unpaid, largely anonymous and not required to have any kind of formal training. 

Why would people donate countless hours to manage content to little or no acclaim? And a deeper question: Are these volunteer editors a boon for content managers or a danger?

"These editors are unpaid, anonymous and not required to have any kind of formal training."

Motivations and psychology 
From an economic perspective, the decision to employ volunteer editors is a no-brainer. If an organization can obtain free editing services, it can redirect resources to other capital-intensive projects. There is even a qualitative argument to be made based on the "crowdsource" aspect of community editing – that, by opening up editing privileges to the community at large, data can be quickly verified and validated on a grand scale without having to wrestle with outside interests and intellectual gatekeepers.

To understand the value of volunteer editing, it helps to start by examining the motivations of a person who engages in it. To see what factors motivate community editors working within Wikipedia, the Behavioral Science and Policy Association recently conducted an experiment. BSPA randomly assigned certain editors within the German-language Wikipedia community an "Edelweiss with Star" badge, which could be displayed prominently on a user's profile or kept hidden. Editors who received the badge exhibited higher retention rates – over 20 percent higher after one month and 14 percent higher after two months.

While this would lead some to assume that public recognition could boost editor retention, the experiment found that only about 6 percent of the badge recipients opted to display the badge publicly – implying that recognition within the community may not be a strong driver of retention. This led study author Jana Gallus, a postdoctoral fellow in the Behavioral Insights Group at Harvard, to speculate that a feeling of belonging to the community, rather than public acclaim, may be what drives people to volunteer in such high numbers.

Edits attract edits
Then there are the dynamics of the community/volunteer editing process itself. Stephan Seiler, an economist and associate professor of marketing at Stanford Graduate School of Business, and Aleksi Aaltonen, an assistant professor of information systems at Warwick Business School, studied the editing patterns of 1,310 articles over eight years. Articles that were community edited, they found, tended to be edited frequently, attracting new edits and editors like magnets. They dubbed this the "cumulative growth effect": once a prepopulated article attracts the attention of editors, content editing snowballs, with each edit begetting more.

"Simply putting [an article up] up and hoping that people will contribute won't work," Seiler told Stanford Business. "But any action that increases content can trigger further contributions."

"Inaccuracies can become increasingly hard to track and invalidate."

The dangers of volunteer editing
It's not clear that the "cumulative growth effect" is necessarily a good thing. One of the much-lamented aspects of Wikipedia is that it can be edited – and re-edited – at the whim of almost any registered user. This has led to an unknown number of hoaxes, frauds and vandalized pages – inaccuracies that can become increasingly hard to track and invalidate if they are used as sourcing for journalism or academia.

This is the essential danger of volunteer editing: without requiring specific qualifications – and without the ability to meaningfully penalize or incentivize edit quality – inaccuracies are likely. In its most benign form, a simple error or misunderstanding can taint the validity of a piece of content; at worst, data can be intentionally and maliciously obfuscated. This has kept many content management organizations from adopting the full-scale community editing capabilities pioneered by Wikipedia.
