Data Users Need to Reclaim Their Humanity Among All the Algorithms

Perhaps it’s time to update how we talk about digital labor

Editor’s note: Industry consultant Shelly Palmer is taking his popular newsletter and turning it into an Adweek article once per week in an ongoing column titled “Think About This.”

An invitation to walk in a Labor Day Parade got me thinking about labor law as an alternative path to meaningful data privacy regulation.

A popular meme explains the exchange of our data for free online services: “When something online is free, you are not the customer; you are the product.” This idea is generally adapted to fit the regulatory argument du jour. Another popular framing says that if the services we enjoy are not free, then we are paying for them with a new form of currency: our data.

There is some validity to each of these ideas, but both assume that users inevitably exist to serve the commercial interests of the data elite. I reject this notion. At the moment, most users are humans, and in America, being human comes with some inalienable rights.

We are a workforce

We are neither digital products nor possessors of a fungible currency that represents the marginal value of our data. We are underpaid digital workers whose labor (behaviors) generates raw data that is used in the manufacture of digital products. These digital products, such as interactive advertisements, generate hundreds of billions of dollars of revenue for the organizations we work for.

We need new language to describe what is actually happening in the transition from the Information Age to the Age of Machine Intelligence.

Our employers

Our de facto employers include Google and its family of products like YouTube, Waze and Gmail; Facebook and its family of products like Instagram and WhatsApp; Twitter; and every other social network, search service or digital product (website or app) we are offered free access to.

The current approach to regulation

This past week, Google agreed to pay the Federal Trade Commission and the New York Attorney General a record $170 million to settle allegations that YouTube violated the Children’s Online Privacy Protection Act (COPPA) by collecting personal information from viewers of child-directed channels without first notifying parents and getting their consent.

Some may see this fine as a victory. It is anything but. If Google had obtained permission from these children’s parents, it would not have been fined, but it would still have acquired all of the same data. Google was fined for a protocol error. Nothing in the settlement does anything to protect the children, the parents or any of us from how Google will use the data, how the data is classified (or misclassified) or what deals are made with it.

In ordering Google to pay $170 million for its alleged violation, regulators viewed the issue of data privacy through last century’s lens. It is as if the regulators lacked the language and understanding required to draft a law that would protect us from the abuse or misuse of data, so they could only regulate its collection.

A different approach to regulation

If we want meaningful transparency regarding the use of our personal data, which I assert is the fruit of our labor, maybe we should be thinking differently about how to use the laws of the land. We have only vague definitions of data privacy, and an appropriate, fair national digital authentication schema is years of regulatory hurdles away.

But if we can make the case that we are employees of the data elite organizations that use our data, we can collectively bargain for the working conditions and wages we think we deserve. To do this, one human resources lawyer suggests that we ask the data elite organizations what they would have to pay researchers, pollsters and other gatherers of data if we refused to provide our services to them. This would set a value on our labor.
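To make the replacement-cost idea concrete, here is a toy sketch in Python. Every figure in it is an assumed placeholder, not a real market rate or user count; it only shows how an agreed-upon replacement cost could translate into a per-user annual value for data labor.

    # Toy replacement-cost valuation of data labor.
    # Every figure below is a hypothetical placeholder, not a real market rate.

    HIRED_GATHERER_HOURLY_RATE = 25.00     # assumed cost of a paid researcher or pollster, in $/hour
    HOURS_REPLACED_PER_USER_PER_YEAR = 40  # assumed hours of paid gathering one user's yearly behavior replaces
    ACTIVE_USERS = 200_000_000             # assumed user base of a single data elite organization

    def replacement_cost_per_user(hourly_rate: float, hours_replaced: float) -> float:
        """Estimate what an organization would pay hired gatherers for one user's annual data."""
        return hourly_rate * hours_replaced

    per_user_value = replacement_cost_per_user(HIRED_GATHERER_HOURLY_RATE, HOURS_REPLACED_PER_USER_PER_YEAR)
    total_value = per_user_value * ACTIVE_USERS

    print(f"Implied annual value of one user's data labor: ${per_user_value:,.2f}")
    print(f"Implied annual value across the whole user base: ${total_value:,.2f}")

The numbers are invented; the mechanism is the argument. Agree on a replacement cost, and a union has a defensible opening position for a wage negotiation.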

The American Federation of Users and Data Generators

Imagine forming a union: the American Federation of Users and Data Generators. With enough members, the union could go to Google, Facebook and the other data elite organizations that use the data generated by the members’ labor (online behaviors) to collectively bargain for total transparency with regard to first-, second- and third-party data usage; rules around data provenance and ownership; and other rights.

This might be the very best way for ordinary Americans who are not part of the data elite to gain control of their data destiny.

There is a legal mechanism in place to do this. In 1935, Congress enacted the National Labor Relations Act (NLRA) to protect the rights of employees and employers, to encourage collective bargaining and to curtail certain private-sector labor and management practices that can harm the general welfare of workers, businesses and the U.S. economy.

Legally speaking, this is a heavy lift. But it bears repeating: we need new language to describe what is actually happening in the transition from the Information Age to the Age of Machine Intelligence.

Intelligence and consciousness are fully decoupled, and algorithms make decisions for other algorithms, which make decisions for other algorithms, which make decisions about what we see, where we go and how we get there. Then still other algorithms use long-accumulated, poorly calculated proxy data to make decisions that influence decisions made by yet other algorithms that we can’t even comprehend but that directly impact our lives.

We are going to look back and wish that, sometime this year, we had put a stake in the ground and declared: “We are humans, and we choose humanity.”