3 Tips for Thriving in the New Data Landscape

More privacy regulation is coming, and you need to be prepared

If 2019 was the year of the techlash, then 2020 should be the year of atonement.

With more regulation on the horizon and both consumers and platforms restricting access to data, the industry needs to make its case for data-driven advertising. 2020 is the year to rebuild consumer trust, demonstrate the value of data-driven innovation and get the house in order for the regulatory changes to come.

To understand how to move forward, it’s helpful to take a look back.

For years, I heard companies refer to online advertising and the growing ad tech ecosystem as the wild west—a land of few rules and no sheriff. While there were always things a company shouldn’t do, there was little a company couldn’t do (legally) as long as its privacy policy disclosed the practice. While the healthcare, automotive and financial sectors labored under very prescriptive rules and requirements, data-driven advertising marched on with few restrictions.

Well, the sun is quickly setting on the days of free-wheeling data use. GDPR, the 2016 U.S. presidential election, the birth of so-called fake news and the numerous other privacy scandals and data breaches that occurred over the past few years have kept privacy top of mind for both consumers and regulators.

Beyond regulation, the platforms that most companies depend on for advertising have started to change their positions as well, restricting the third-party cookies that have become a critical element of a data-driven ad strategy. With the death of the third-party cookie imminent, the industry is at an inflection point. Access to third-party data is going to become much more restricted, and companies will need to move toward strategies that are less dependent on cookies and other companies to reach their audiences.

This environment raises the question: where do we go from here? Here are three privacy-forward steps companies can take to position themselves for the future:

1. Embrace differential privacy and other privacy-safe methods of data analysis

If consumers have lost trust in publishers’ and brands’ ability to protect their data, then differential privacy promises to be a pivotal tool for regaining that trust. By introducing statistical noise at the point of collection, differential privacy allows companies to collect and use information without exposing the data of any one consumer. The more noise that’s introduced, the lower the mathematical probability that any one person can be identified from a data set.

Think of someone trying to draw insights from a photograph (e.g. the time period, location, nature of the event). The surrounding details of the photo are more important for this exercise than the identities of the individuals pictured. To create a privacy-safe version of the image, you might blur the faces so that the individuals aren’t identifiable, but the details of the background are still visible.

This is how differential privacy works. The best version of differential privacy for ad tech is where the image is blurred to the point that the individual isn’t recognizable, but not so blurry that you can’t glean any information from it. Unlike merely de-linking data from its identifiable information, which poses the risk of re-linking by bad actors, differential privacy makes it statistically improbable that the individuals included in a data set could be re-identified.
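The mechanics can be sketched in a few lines of code. The example below uses the Laplace mechanism, one common way to achieve differential privacy, applied to a simple counting query; the function name, the data, and the epsilon values are illustrative assumptions, not a reference to any particular product or library:

```python
import math
import random

def dp_count(records, predicate, epsilon: float) -> float:
    """Return a differentially private count of records matching `predicate`.

    A counting query has sensitivity 1 (adding or removing one person changes
    the true count by at most 1), so Laplace noise with scale 1/epsilon
    provides epsilon-differential privacy for this query.
    """
    true_count = sum(1 for r in records if predicate(r))
    # Sample Laplace(0, 1/epsilon) noise via the inverse-CDF method.
    u = random.random() - 0.5
    noise = -(1.0 / epsilon) * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))
    return true_count + noise

# Hypothetical data set: ages of 100 site visitors.
ages = [18 + (i % 50) for i in range(100)]

# A small epsilon means more noise (a blurrier "photograph") and more privacy;
# a large epsilon means less noise and a more accurate answer.
blurry_answer = dp_count(ages, lambda a: a >= 30, epsilon=0.1)
sharp_answer = dp_count(ages, lambda a: a >= 30, epsilon=10.0)
```

The epsilon parameter is the knob the blurred-photograph analogy describes: the analyst still learns roughly how many visitors are over 30, but no single person’s presence in the data set can be confidently inferred from the noisy answer.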

Admittedly, differential privacy may have a more significant benefit for measurement, attribution and analytics, and be less helpful when it comes to ad delivery (for now). In advance of its plans to phase out support for third-party cookies, Google has offered a set of open-source differential privacy libraries to help developers. According to Google, it’s exploring the use of differential privacy to deliver ads to large groups of similar people without letting individually identifying data leave the browser. Companies that learn to build differential privacy into their data collection practices may help avoid privacy and data leakage issues that create brand damage and regulatory risk.

2. Drive transparency and privacy-forward practices

Transparency is more than posting a privacy policy. It involves clear messaging about privacy practices that helps the consumer understand how their information is used and what choices they have. We should expect the standard privacy policy to be phased out as more innovative formats, such as the privacy nutrition label, become more prevalent.

Internally, companies that consider consumer privacy in their product design will be in the best position to make their case for consumer trust. While some may argue for consent as the de facto standard, putting the burden on consumers to manage their data is unreasonable and unfair. Instead, companies should adopt internal practices designed to protect consumer data, whether or not the consumer has exercised his or her privacy rights.

In addition to being good practice, this may become a legal requirement. Several state privacy bills would, if passed, require companies to perform internal risk assessments and to limit their data collection and use to only what is needed to provide their services.

New York’s proposed Privacy Act would go even further by requiring companies to act as data fiduciaries, meaning they would be legally barred from using data in a way that benefits the company to the detriment of its users.

3. Use personalization to communicate value

Companies that want to persuade consumers not to opt out or have their data deleted will need to help them understand the value they receive in exchange for that data. Companies that can provide hyper-personalized offers and experiences, not just ads, will offer a stronger value proposition. Recommendations, customized services and loyalty-based pricing are all personalization techniques that can convince a consumer to keep their data on your platform. When consumers see the value they get from sharing their data and can trust the company they are sharing it with, they will be much less likely to take that data away.

The industry is at an inflection point, navigating changing consumer sentiment, new regulations and shifts in platform policies. However, we should look at this moment as an opportunity for innovation. While this period may present a challenge, I think we’ll get to the other side a little bruised, but much better and ready for the future.

 —

Jessica Lee is an attorney who focuses on the privacy and intellectual property issues that arise when launching, marketing and monetizing digital products and content. Agencies, ad tech companies and publishers look to Jessica to provide practical, business-focused advice on navigating the privacy landscape in the U.S., E.U. and Latin America. Jessica advises on advanced advertising strategies and helps clients monetize consumer insights derived from the use of emerging media platforms.