Calls for Facebook to be broken up, due to the influence and power it has amassed, have come from many quarters.
As part of her campaign for president, for instance, Senator Elizabeth Warren has promised to break up big tech companies including Facebook, Amazon and Google, contending that their size has reduced competition and innovation. But while such calls from politicians might elicit a sense of ennui, an unexpected voice has struck a similar tone in the form of Facebook co-founder Chris Hughes.
In his New York Times editorial, Hughes highlights the unsettling amount of control that Facebook CEO Mark Zuckerberg wields over the content that surfaces on Facebook’s platform. With consumers increasingly relying on social media as a source of news, Facebook is in a position to influence our very culture based on the content that is published to and disseminated across the platform.
This is only the latest in a string of recent PR problems for the social networking behemoth. Facebook recently banned individuals who peddle violence and hate, such as Alex Jones, who has claimed that the Sandy Hook shooting was a hoax. This move was met with criticism from President Trump, who asserted an anti-conservative bias, a charge that had already been levied against the social media platform.
Given such criticism from politicians, it is understandable that the company may tread lightly when it comes to censoring content. At the same time, Facebook is facing mounting criticism for its insufficient efforts to limit extremist content.
In the face of these recent challenges, where do we go from here? The influence that Facebook wields isn’t just a function of its acquisitions of Instagram and WhatsApp; it is also due to its approach to user data. Users made the decision to hand over such data in exchange for the service being provided, and there currently isn’t an alternative that makes disconnecting from Facebook a viable option for many. In addition to “paying” for Facebook by providing it with our data, we are also paying for its services by allowing our discourse, and the information to which we are exposed, to be dictated by the company’s algorithms. Despite having information just clicks away, our reliance on Facebook actually results in exposure to less content than we might otherwise come across. Such non-financial costs must be recognized by regulators in weighing whether or not Facebook needs to be broken up.
Regardless of this decision, the potential for social media platforms to spread misinformation, whether political or financial, warrants regulation of these companies. Even Zuckerberg agrees that additional regulation is needed. In the U.K., regulators have proposed holding digital platforms responsible for content posted on their platforms that causes harm, such as terrorist groups spreading propaganda and gangs inciting violence. Such regulations are needed to limit the extent to which social media platforms are weaponized to shape public opinion through the spread of information. Guidelines must also be developed to aid in the policing of content on social media platforms. Failing to do so may inadvertently result in algorithmic bias and ad hoc decisions about which types of content warrant removal from platforms.
Beyond regulating content, safeguards also need to be enacted to protect consumer data. Such regulations must speak to the way in which that data is secured and how it is used. While the delivery of content based on user interests may appeal to users, they might be leery of their data being collected by third-party companies. Its trove of consumer data is part of what makes Facebook such a formidable entity in the digital marketing space, and this would not be corrected by Hughes’ and Senator Warren’s proposals to separate Facebook, Instagram and WhatsApp.
One potential solution may lie in separating the advertising (and data) assets from the communications platforms. Doing so may reduce the extent to which Facebook can influence our discourse while still enabling the communications platform to connect users around the world.