One year into her tenure as chairwoman of the Federal Trade Commission, Edith Ramirez is putting the agency front and center as the nation’s leading enforcer on privacy and data security. Through cases like Snapchat, which lied to its users about its privacy practices, and Trendnet, which failed to secure users’ private video feeds, the FTC is setting precedent for how the marketing industry should balance the need to collect consumer data with the need to protect consumer privacy.
It never occurred to Ramirez that her law degree would take her to Washington, D.C., let alone to running the FTC, at a time when technology has made privacy and data security the defining issue for business and consumers. Her career path was set the minute she met fellow Harvard Law School student Barack Obama. She says, “We knew he had an interest in politics, and I knew he would do important things.”
Befitting her role as the nation’s privacy arbiter, Ramirez is a no-nonsense individual, not prone to jokes or pleasantries. It’s all business all the time for the San Clemente, Calif., native. Her speeches are precise, carefully worded pronouncements, all assets for the leader of an agency being transformed by privacy and data security issues.
Adweek: Some have called the FTC the “Federal Technology Commission.” How do you react to that?
Ramirez: Technology plays an ever more important role in consumers’ daily lives and in new and existing business models. There will likely be tremendous benefits, but this new period we are entering—the era of big data and the Internet of Things—will also raise significant privacy challenges, and it will be essential that consumers retain confidence that businesses are safeguarding their data and using it responsibly. As mobile technology proliferates and connected devices become more commonplace, one of my priorities has been to address the privacy and consumer protection issues associated with these technologies at an early stage. This was clear in our recent case against Apple concerning in-app purchases in mobile apps for children, and it’s reflected in our many enforcement actions and policy reports addressing privacy issues.
How can you reassure businesses that your forward-looking approach won’t stifle innovation and growth?
That’s the fear. My answer is simply, I don’t believe that. We believe you can have innovation, but you can also safeguard consumer privacy. They aren’t at odds. They complement each other. Consumers care very deeply about privacy. In order for [these new technologies] to be successful, consumers must trust them and feel their information is handled in a responsible way. It’s incumbent on businesses to take that into account. In my mind, the way to address this is that businesses should be thinking about privacy and security from the outset of the process of designing new products and services.
Self-regulation is an important complement to the work we do, so long as the standards are sufficiently robust and protective of consumer privacy. I haven’t been satisfied to date that what advertisers have done is enough when it comes to this particular issue.
Where does advertising self-regulation fall short?
The amount of behavioral data collected and how it is used are hidden from the very consumers who are tracked and analyzed. That is not a long-term recipe for success, as reflected in the numerous surveys showing consumer unease with tracking. I am pleased that the advertising industry, via the Digital Advertising Alliance, has responded to the FTC’s call for greater consumer control over online tracking, and that the DAA icon is displayed in a large number of online ads. At the same time, I have concerns about whether it is sufficiently prominent and recognizable to consumers, and whether its purpose is clear.
What more should the advertising community do?
Consumers should be able to opt out more easily. A lot of tailored ads are beneficial to consumers, and we don’t want to undermine that. But consumers should have control over what information is being collected. You have data brokers that aren’t consumer-facing. They know more about us than we know about them.
What can we expect from the FTC’s upcoming data broker report?
The aim of the report is to shed light on the practices of data brokers because it is an industry that has been operating in the dark. It’s important for everyone to see what is happening.
Do you agree with the White House report that big data has the potential to discriminate against some groups?
We gave input for the report. We have to think hard about potential adverse consequences of all these devices and business models. Discrimination by algorithm is an issue I’ve talked about for some time now. I’m not sure we’ve explored what all might happen. Once we’re better informed, we will be better positioned to recommend best practices or legislative solutions. We’ve scheduled a workshop on this for September.
Does the FTC still advocate a broader Do Not Track mechanism?
I continue to support the deployment of a Do Not Track system that satisfies the criteria the commission called for several years ago: one that is universal, persistent, easy to find and use, comprehensive, and neutral as to the type of tracking technology used, and that opts consumers out of the collection of data, not just the display of targeted ads.
You’ve advocated for comprehensive legislation to give the FTC more authority over consumer privacy and data security. But since Congress seems slow to adopt legislation, what can the FTC do?
Until a comprehensive privacy law is enacted, we will continue to vigorously enforce the FTC Act’s prohibition on deceptive or unfair commercial privacy practices. We have authority in data security, and we’ve brought cases. The agency will also continue to engage in the policy sphere by studying the privacy ramifications of new forms of technology and business models and articulating best practices for the use of consumer data.