Facebook Isn’t Just an Industry Disruptor—It’s Destroying People and Societies

The platform prioritized growth at all costs rather than a set of values


The latest Facebook struggles are clear proof that past a certain scale, tech’s “Move fast and break things” motto is not actually breaking things—it’s breaking people, and it needs to be stopped.

This comes on the heels of a New York Times op-ed by Facebook co-founder Chris Hughes advocating strict regulation of the network's activities, and news of auto-generated videos celebrating extremist imagery. If scaling cannot be done responsibly, it has to be reduced, whatever the cost to the platform, because the platform bears ultimate responsibility for the consequences of its growth.

Companies we like to call "disruptors" pride themselves on moving fast and breaking things, but in the process, they're also breaking people and society. Blitzscaling has to stop. Whether it's Airbnb, Uber or Facebook, these companies have shown us around the world that scaling at all costs actually means scaling at society's expense.


In the case of Facebook, scaling at all costs has meant automation. Human moderation alone is cost prohibitive, and Facebook has clearly shown that it is unable (or unwilling) to invest in enough people, let alone equip them to be effective and protected from the worst content submitted to the platform. It is also clear that Facebook prefers machines. In its quest to be a universal community, Facebook is reluctant to base its automation on anything but blind data. Instead of committing to a set of values and norms and giving more weight to voices that uphold them, Facebook's algorithms simply amplify the loudest, most aggressive voices, however extreme their views may be.

Automation is fast, but automated content left unchecked and unvetted is irresponsible at this immature stage of the technology's development. Automated content is not bound by any social norm, and as a result, it is feeding into and accelerating the destruction of the fabric of society, which rests on an inherent trust that democratically agreed upon codes of conduct will be respected. When a giant like Facebook ignores these norms on such a massive scale, that foundational trust breaks down.

Facebook has been moving and growing fast, and one could now argue that it is the last man standing, with very few worthy competitors. Who does this benefit? Who is paying the price of this all-encompassing growth? By scaling at all costs, companies like Uber, Airbnb and Facebook have achieved the scale of nation-states at society's expense, without any political structure or system to allow for legal or democratic checks. This fosters a form of cultural genocide, as the most extreme views overpower the more rational ones (anti-vaxxers, dangerous diets, terrorism, etc.). It could ultimately lead to societal terrorism, holding the majority in fear of the continuous, amplified promotion of destructive notions.

If the scale of the job is too big, Facebook needs to reduce the scale. One can hope this is the beginning of a new era. There are more and more responsible consumers, and audiences are clearly showing openness to valuing responsibility over size. We have responsible food, clothing and beauty; the time has come for responsible social media platforms.

Facebook needs to make some major changes to enter an age of responsible platforms. Most importantly, Facebook has to put its hands in its pockets and dig deep to fix what it has created. It won't be cheap, but it's necessary for the future of the organization. In his opinion piece, Hughes says he is disappointed in himself "and the early Facebook team for not thinking more about how the News Feed algorithm could change our culture, influence elections and empower nationalist leaders." The fact that they didn't spend enough time thinking about the guardrails doesn't absolve them of the consequences. They need to fix it at all costs.

The question is: How can this staggering monster be fixed? The answer is through leadership. Hughes says he worries that "Mark has surrounded himself with a team that reinforces his beliefs instead of challenging them." This is important. Good leaders, especially founders, absolutely need to be questioned and challenged. That's how they grow. Zuckerberg needs to be challenged by people he trusts enough to truly listen to, so he can absorb the message: He needs to act big and act fast.