Can Wikipedia Defeat the ‘Sockpuppets’?

It’s not just Fox News, guys. Not by a long shot.

Wikipedia is in the midst of a crackdown on ethically dubious practices by “sockpuppets” paid to write and/or edit slanted entries. As of last night, the site’s administrators have shut down more than 250 “suspicious” accounts following a Wikimedia Foundation press release lamenting that they “may have been paid to write articles on Wikipedia promoting organizations or products,” in violation of numerous site policies that prohibit conflicts of interest.

We get it: repeated studies have shown that inaccuracies in Wikipedia entries can significantly damage corporate and personal reputations, and the same studies indicate that the process of correcting such errors is far too inefficient for such an important source of public information.

At the same time, we see this development as bad news for both Wikipedia and the PR industry at large: it could not play more perfectly into the perception that we’re all paid liars.

The inspiration for this fight came at least in part from articles in Vice and The Daily Dot, both of which named an organization called “Wiki-PR” as one of the primary sockpuppet offenders. The company claims to have a “staff of 45 Wikipedia editors and admins that can help businesses and individuals create Wikipedia pages”, and it also maintains a website as well as a Twitter/LinkedIn/Facebook presence. The fact that the company’s Twitter feed is private and that its admins chose to like their own “interns wanted” post says something about its fear of the sunlight. Here’s its own description of services:

Our team of 25+ writers are ready to develop your Wikipedia page with thorough research, rich content, and a professional tone that readers will trust.

This sounds good, but the details are murkier:

Former Wiki-PR clients told the Daily Dot that they paid between $500 and $1,000 to the company for creation of a Wikipedia page, and $50 a month for monitoring any changes made to the page and resurrection of any material deleted during subsequent edits.

In other words, we’ll create the page you want and do everything we can to make sure it stays that way. It should go without saying that this practice seriously undermines the credibility of both the organization and the very forum it’s promoting. In an email, Wiki-PR’s CEO defended his company’s practices, writing that they simply “counsel our clients on how to adhere to Wikipedia’s rules” and that their services differ from those of most PR firms which “don’t know the rules as well because they do PR work, broadly, and try to promote.”

But how does paying individuals with admin privileges to create and edit pages not create a conflict of interest?

Yesterday on our sister site Social Times, B.J. Mendelson called the fight against Wiki-PR and similar organizations “a losing battle” because Google’s reliance on Wikipedia compounds its influence and “Wiki-PR was just one of numerous, numerous entities out there doing this”. The fact that Wiki-PR managed to breach the notoriously tricky admin wall is the biggest part of what makes its service so valuable. As a spokesperson for admitted client Priceline told Vice: “We are using them to help us get all of our brands a presence because I don’t have the resources internally to otherwise manage.”

So how can we end this sockpuppet brigade?

It’s going to be tough. Any entities that paid to have their pages scrubbed or slanted should be named and shamed by Wikipedia’s administrators, and PR ethics organizations should more aggressively call out the responsible parties. But written statements of principle, sadly, won’t do much good on their own.

At the very least, it feels like time for Wikipedia to redefine and improve the “red line” that forbids individuals with conflicts of interest from creating and editing pages. That standard is essential to Wikipedia’s credibility, but in its current form it fails to achieve its stated goals: it doesn’t prevent sockpuppetry, and it makes the process of correcting truly damaging factual errors far more difficult than it needs to be.