Reader comments foster loyalty and engagement by keeping users on a site longer. But comment streams are vulnerable to abuse, and some news organizations are eliminating them completely.
The Washington Post recently wrote about news sites cracking down on over-the-top comments, which made me think of an article that Slate ran last year called “Why We Post Nothing—Nothing—About Our Kid Online. You Should Do the Same for Your Kids.” Amy Webb, the post’s author, received hundreds of comments from parents who disagreed with her thesis and advice.
Some people found it ironic that Webb would spend hours “crawling through Google,” trying to optimize her child’s digital identity in order to protect her from bullies, potential employers, institutions of higher education, the government, corporations and facial recognition technology.
Webb wrote, “On the day of her birth, our daughter already had accounts at Facebook, Twitter, Instagram and even Github. When we think she’s mature enough (an important distinction from her being technically old enough), we’ll hand her an envelope with her master password inside.”
Many people took exception to Webb’s premise, pointing to flaws in her arguments. Other commenters went a step further, posting videos Webb herself had made public in which her daughter’s first and last names and image were also present.
Shortly thereafter, Webb posted a follow-up called “Congratulations, You Found a Photo of My Daughter Online,” and all of the comments on the original post were removed. Naturally, the community objected. There had been no outright abuse, only counterarguments and posts pointing to what many people considered hypocrisy.
According to Webb’s follow-up piece, “A few thousand Slate readers posted comments both in support of and against [her] central thesis.” If that is true, there was clearly no need to remove them.
Since most readers took exception to more or less the same themes in Webb’s first piece, she could have amended the post with an update. Having one’s thesis criticized does not make the criticism inappropriate content, warrant its removal or necessitate a follow-up.
What’s more, one commenter monitored the thread and responded heatedly to every disagreement in Webb’s defense, racking up more than 300 replies in under 72 hours, until someone finally started a poll asking whether the commenter was “a) Amy’s husband, b) Amy’s bestie, c) Amy the author or d) Amy’s Internet stalker.”
Certainly, Webb couldn’t have enjoyed the unexpected backlash, but the Internet is not known for its diplomacy. Being part of the digital space means accepting criticism, even when it comes in forms we’d prefer to do without. Most community members invested in the comments section call out others who go too far. Too often, a site’s online administrators apply rules arbitrarily.
It is unclear whether Webb requested that Slate remove the comments or if the publication simply wanted to remove all traces of discord beneath an article they published. Either way, Slate should’ve known better. It may have rightly chosen to delete comments with links to images of Webb and/or her daughter (even though Webb had made them public). But evidently, it was more convenient for both parties to erase all traces of community discourse, discourse held in trust and in the public domain.
Webb’s teachable moment? “This has been a strong reminder that if we’re going to participate in the online data game, we need to continually monitor its ever-changing rules and to shift our strategies accordingly.”
In Webb’s case, the rules really haven’t changed much. If you’re going to participate in the online content game, you’d better develop a thick skin. You cannot simply delete content that you find disagreeable.
The original column ran again this year and has thus far garnered 51 comments, all of which remain.
The point made by the Post is that to “tamp down the ugliness, news organizations have experimented with a variety of tactics.” Several publishers have found that using Facebook registration has limited abuse.
But all pubs should establish hard-and-fast rules that aren’t applied arbitrarily. Many communities of commenters are heavily invested in the discourse in which they participate. Gawker and sites like it often look to their commenters when considering new staff writers.
The comments on the Slate piece were arguably not “over-the-top.” They did not qualify for removal, according to the examples given in the Post article:
The worst comments tended to come from people who saw a [Chicago] Sun-Times crime article linked on the conservative Drudge Report web site and flooded the paper’s site to offer their perspective, said Craig Newman, the Sun-Times’ managing editor. “The comments were scaring [readers] off,” he said. “People didn’t want to read the articles or dip into the comments because it was so vile.”
Popular Science turned off its comments in September after product promoters and trolls — people who post deliberately inflammatory comments — “made constructive discussion impossible,” according to Digiday, a digital news site that reported on the magazine’s decision.
It is up to each publication to determine what it considers inappropriate and to be clear about it. In the Slate example, its community of readers and commenters merited, at the very least, an explanation. Of course, managing the comments section is easier said than done. But vigilance is a necessary responsibility. According to the Post:
Few news organizations can match the comments “curation” resources of The New York Times, which devotes 14 people, including seven full-time staffers, to screen comments on Times articles. The moderators read every comment submitted and approve or reject them based on criteria developed over the past seven years, said Sasha Koren, deputy editor of interactive news. Unlike many news sites, which open comments on dozens of articles each day, the Times limits comments to an average of 18 articles a day.
The idea, Koren said, is to “minimize incivility and elevate comments that include commentary and personal observations of some substance… We’re fortunate to have a large number of articulate readers who regularly share their views, their expertise and their experiences with us and with others.”
A similar system imposed by The Huffington Post in December proved controversial: A news article announcing the change from anonymous posting to Facebook-verified posts was met with nearly 6,000 comments — many of them taking exception in unpleasant terms. “Some people felt we were limiting their right to free speech,” said Tim McDonald, HuffPost’s director of community. The trade-off, he says, was a “significant decrease” in trolls and spam and an increase in more “civil conversations.”
Comments sections are often full of information that the original author may have missed or overlooked. They add context, make us laugh and offer thought-provoking insights that lead to necessary corrections, new angles or follow-up pieces. Comments further democracy by minimizing the one-to-many traditional media model, and they epitomize the conversational nature of today’s digital space. Engagement is simply an added benefit, not an end in itself.
Of course there are exceptions: trolling, attention-seeking and incivility are an unavoidable, albeit unfortunate, reality. Moderation takes time, attention and resources, but it is certainly better than the alternative: silencing dissent and healthy discourse.
Perhaps news organizations could put the money they’ve saved by cutting back their news staffs and reporters toward moderating civil discourse. Rules should be clear and applied consistently. Eliminating the comments section, or applying the rules arbitrarily, is a dangerous enterprise: an easy way out and a throwback to tired and broken media traditions that compromise the very nature of democracy.