How Social Media Use Interferes With Justice

While Australians are reeling from the recent deaths of an 8-year-old girl and a 12-year-old boy, virtual lynch mobs are forming on Facebook and other social media sites, calling for the death of one of the accused killers. Unfortunately, if justice is to take its course, such social media actions might hinder criminal cases or result in mistrials.

Australian media is reporting that memorial “sites” (meaning Facebook Pages) set up in tribute to the slain children are being defaced with pornography and hatred. One of these pages was set up by Australian Symone Powell and called for the death of the accused killer, 19-year-old Allyn John Slater, charged in the murder of 8-year-old Trinity Bates. Powell abandoned the page after threats were aimed at her. Slater, who apparently was a Facebook friend of Bates’ parents, currently has no trial date.

The justice system in most countries requires that selected jury members be unbiased. However, the presumption of innocence — that an accused is “innocent until proven guilty” — is put at risk by social media outcry. While posting online about grief has become natural, if social media also becomes a vehicle for misinformation, or simply for the popular sentiment that Slater is guilty, it may be difficult or impossible to choose an unbiased jury in Australia, or anywhere in the world. The net result: criminals could go free due to mistrials, and justice would not be served. If that happens, could a virtual lynch mob turn into a real one?

Yet another Facebook memorial page was defaced, this one in tribute to a 12-year-old Australian boy, Elliott Fletcher, who was stabbed at school. The defacement prompted Queensland, Australia, Premier Anna Bligh to write a letter (PDF, 2 pgs) to Facebook CEO Mark Zuckerberg asking what can be done to stop the defacing of memorial Fan pages.

Some people feel that Facebook does not always act quickly in these cases. Can Facebook be blamed for not responding quickly, given its large user base (400+ million) compared to its small staff size (1,000)? The Facebook site does have measures in place for members to report unacceptable behavior. (Suggestion: If you believe in the concept of a fair trial, then before you add your voice to any social media collective when someone’s reputation is at stake, find out all the facts from as many sources as possible. If you see a Facebook page or other social media content that might be violating laws, even in other countries, report the page.)

Should Facebook or other social media sites be responsible for the illegal actions of others? Should they automatically suspend and “handle” pages that contain certain keywords that might indicate illegal discussions or activities? What do you think? Is this censorship? Should social media sites be held accountable if they don’t quickly take down pages or discussions that violate laws in one or more countries? If they were to form a permanent list of “bad” words to auto-flag, who would have the authority to decide what words go into this list? Should it be social media company employees, representatives of UN member countries, a collective of justice officers from various countries, or some international task force?

These questions highlight the difficulty of implementing measures to prevent social media activities that could potentially hinder justice in criminal cases. Unfortunately, this is a subject that will rear its ugly head over and over.