When Dolly the sheep became the first mammal to be successfully cloned, it sparked a huge discussion. While many marveled at the incredible breakthrough, the scientific feat raised a very important question: just because we can, does it mean we should?
Perhaps one could say the same for the use of big data.
Advances in big data technology are certainly marvelous, but they have reached the point of privacy invasion. Overt surveillance from CCTV, body cameras and drones, along with advances in facial recognition, pries into our lives to an unacceptable degree. Just last year in the U.K., the government’s CCTV watchdog warned that the privacy of the public was “at risk of being invaded on a mass scale.”
And what about our behavior online? You might not care if you go on Amazon to buy a toaster and the site suggests a few food processors you might like, based on your buying history. But what if you went to another site and bought something more personal, like Viagra? Would you want to be spammed with potentially embarrassing emails and ads about erectile dysfunction? Not likely.
“Brands are losing trust when it comes to data,” said Klaus Heuser, founder of Sooloox. “Customers are looking for more transparent models. They know data is being harvested, and in many ways, they’re OK with it where it fulfills a specific need. They just want control over it.”
As recent events have illustrated, with a German court finding Facebook guilty of abusing personal data, wrongful usage can cause massive problems and create general unrest among internet users.
Yet regulating the big data industry is not as simple as it may seem. The industry employs millions of people. Just think of the data scientists themselves: ten years ago the job title didn’t even exist, and today it’s one of the most sought-after roles. Plus, big data enables businesses to sell more products, which also keeps more people in jobs.
So how should businesses be using big data in an ethical way that respects everyone’s privacy, while still maintaining optimum sales? Here are three ideas:
1. Shifting company mindset
Corporate responsibility should be instilled in a company’s culture. By making trustworthiness one of your company’s core principles, it will be easier to make tough calls about how customer data is used.
“We think of this responsibility as the building blocks of trust between the customer and the business,” said Patel. “If there is a breach of this trust, all the dollars that marketers spend acquiring customers dissipate. In addition to the ethical arguments for respecting privacy, it just makes good business sense to do so.”
2. Utilizing privacy technology
The solution to problems caused by technology is often other technology. Some experts suggest that blockchain could solve privacy problems, since the technology is essentially a decentralized ledger with no single point of ownership. Companies would no longer be able to buy and sell consumer data, since they would cease to own it.
“The ad-supported content model is not working,” said Heuser. “While industry folks are pointing fingers at Facebook for a breach of trust, brands and marketers need to recognize their own responsibility to use data with discretion. Facebook is only one part of a broken ecosystem that blockchain is uniquely poised to fix.”
Blockchain technology itself may or may not be the solution, but many can agree that a better system needs to be created to keep everyone honest.
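To make the ledger idea concrete, here is a minimal, hedged sketch of the core property a blockchain offers: records are hash-linked, so silently editing history becomes detectable. This is a toy illustration only, not any vendor’s product, and the consent-record fields are invented for the example:

```python
import hashlib
import json

def block_hash(block):
    """Deterministically hash a block's contents."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append_block(chain, record):
    """Append a record, linking it to the previous block's hash."""
    prev = chain[-1]["hash"] if chain else "0" * 64
    body = {"record": record, "prev_hash": prev}
    block = dict(body, hash=block_hash(body))
    chain.append(block)
    return chain

def is_valid(chain):
    """Verify no block has been altered after the fact."""
    for i, block in enumerate(chain):
        expected_prev = chain[i - 1]["hash"] if i > 0 else "0" * 64
        if block["prev_hash"] != expected_prev:
            return False
        body = {"record": block["record"], "prev_hash": block["prev_hash"]}
        if block["hash"] != block_hash(body):
            return False
    return True

# Hypothetical consent records, invented for illustration:
chain = []
append_block(chain, {"user": "alice", "consent": "ads_ok"})
append_block(chain, {"user": "alice", "consent": "revoked"})
assert is_valid(chain)

# Silently rewriting history breaks the hash links:
chain[0]["record"]["consent"] = "everything_ok"
assert not is_valid(chain)
```

Real blockchains add decentralization and consensus on top of this; the point of the sketch is simply that a hash-linked record trail keeps everyone honest, because tampering leaves evidence.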
3. Being transparent
Most people use free services on the internet without realizing that they are giving away their personal data in the process. Perhaps companies feel that since they provide a free service, it’s only fair that they monetize the data in return. That seems logical, as long as the user is aware. One solution, then, could be better communication: understanding what data customers are OK with giving and what they are not.
Because brands are clamoring for targeted information, companies like IBM, with its Personality Insights service, can create detailed profiles on us, tracking our online habits and revealing aspects of our personalities. Those on the inside may claim this delivers a better customer experience, serving us targeted information and advertisements based on our interests. But imagine this level of detail being used against us: insurance companies, future employers, credit checkers … the list could be long.
With great power comes great responsibility, as Uncle Ben from Spider-Man would say. It’s fundamental that brands and regulators ensure that consumers know how their data is used and can opt in or out.
At the end of the day, it’s not big data technology that is bad – it’s how we use it. And because the technology is relatively new and still evolving, effective regulations have yet to be put in place. It’s important for businesses to maintain an ongoing dialogue about this issue and stay vigilant to keep a healthy relationship with their customers.