Why I Removed My Children's Images From Instagram and Facebook

Who knows what these data-hungry platforms will do with the photos down the line

My wife and I recently removed all images of our children from Instagram. Like most people these days, I don’t trust Facebook, Instagram’s parent company, very much.

This fact isn’t so remarkable in and of itself, but it raises the question of why. Of course, there’s the oft-cited Cambridge Analytica scandal, but throughout history, brands have weathered scandals that touched their users and managed to emerge relatively unscathed.

So why do I, like so many other people, have such a deeply ingrained trust deficit with Facebook and, more broadly, big tech? And is the cause something more serious that other brands should be watching and planning for?

This isn’t a product issue per se. On the surface, Facebook is a great consumer product. It offers a host of services, largely free, that connect us with our nearest and dearest, keeping us in contact in a way that would have been unimaginable before it existed. Sounds great, right? Yet people don’t like Facebook. Indeed, the company has a serious trust issue. A 2018 Trust Index of U.S. adults by Jebbit found that Facebook had the lowest consumer trust score (3.1) of any surveyed brand. How a company that offers such a great, valuable product could come to be disliked and distrusted so strongly speaks to the changing nature of trust in the data-driven internet era.

There are two issues at play here. The first is consumers’ lack of understanding of just how much data is being collected about them and how deeply it is mined to synthesize incredibly personal insight. The lesson Cambridge Analytica should have taught us is not simply that elections can be manipulated, but that we can be susceptible to subtle, targeted suggestion while remaining unaware that it’s happening. This is covert mass manipulation.


The second is a lack of understanding of how this data may be used in years to come. The information we expose about ourselves or our children may not seem sensitive today, but allowing any company to accumulate a record of your child’s behavior or facial characteristics from birth to early adulthood creates a treasure trove of data that, in decades to come, will be mined, analyzed and exploited in ways even its engineers have not yet imagined. This is the risk. You’re placing your data (and your faith) in a future state of technology, driven by process automation, machine learning and artificial intelligence, that no one yet quite has a grasp on.

Here’s a thought experiment, none of it beyond current technology. Suppose you have a public Instagram feed with photos of your children posted over several years. As a young adult, your child applies for health insurance. In this future, the systems in the insurer’s actuarial armory have already scraped those childhood photos, flagged an excessive amount of time spent in bright sunlight and, using skin-pattern analysis, noted blemishes that may be early indicators of skin cancer. Your child is denied coverage without so much as a human review.

This sounds frightening and sci-fi-like, but many of these technologies exist today with varying degrees of accuracy. Our images are regularly scraped, indexed and searched by automated systems, and any number of algorithms can be run against them. And that’s just the data you can see, to say nothing of the vast quantities of data you create, perhaps without realizing it: behavioral traits, interests and physical location, all of which can be used to triangulate a detailed picture of your personality, habits, disposition and socioeconomic status.

As an individual, a parent or a company, ask yourself: how are you managing the data you create?

Historically, trust as a pillar of a brand’s equity was thought of as something earned over time, perhaps aided by emotional cues of history and heritage, or by the intergenerational inheritance of a brand (think of your parents’ preferred toothpaste, for example).

Trust was also built into the messaging. A brand you could trust was sincere, thoughtful and almost earnest, and the experience you had with that brand would reflect this. Now brands collect, process and synthesize hundreds of data sources simultaneously to derive competitive advantage across the entire spectrum of their business activities, from new product innovation to marketing optimization and improved customer experiences.

With that valuable data and insight comes a whole new set of trust issues, from a legal and regulatory perspective but, perhaps most importantly, from the perspective of customer relationship management. In a future that is entirely data-driven, already fickle consumers will strongly prefer brands they can trust with their personal information.

There have been countless data breaches affecting users of Facebook, Experian, British Airways and Marriott, to name just a few. More worrying still are the visible failures, or outright absence, of controls within large brands’ systems to safeguard their customers’ data. As we race toward a world in which data represents the greatest competitive advantage, sometimes referred to as the “new oil,” it seems reasonable that companies should be working overtime to earn consumers’ trust in how they handle such a precious resource.

Instead, corporations have tended to behave irresponsibly, shooting first and asking questions later. Consequently, increasingly savvy governments are rapidly imposing regulations (GDPR in Europe, CCPA in California, LGPD in Brazil) with the aim of protecting their constituents.

Regulations will, of course, force gradual change, as they have in banking and healthcare. But there is a greater opportunity here for brands seeking competitive advantage. In the coming years, actively courting consumer trust through responsible data management will come to be seen as a key differentiator.

In an age of distrust in globalized, data-hungry corporations, managing the personal information you collect ethically, and with the care and integrity your users expect, gives you the moral high ground and a unique brand position. It’s an opportunity to show that you do more than grab data: to prove you can be trusted as a guardian of that information while continuing to benefit from the business value it brings.

While the lawyers, actuaries and public relations firms draft onerous response playbooks to be broken open in the inevitable emergency, brands should take a step back and consider the value of simply doing the right thing with the information users choose to share. Make no mistake: as those users become more privacy-aware, they’ll reward the brands that can demonstrate they take privacy just as seriously.

But until brands act responsibly and transparently with the data they hold, I, along with the growing number of concerned consumers like me, will tread cautiously when it comes to how much of our nearest and dearest we share on platforms like Instagram and Facebook.