Editor’s note: Industry consultant Shelly Palmer is taking his popular newsletter and turning it into an Adweek article once per week in an ongoing column titled “Think About This.”
We have entered the age of personalized politics, and it is very important for us to understand what that means.
When you see a candidate on stage or on TV, they speak in broad generalizations. Today, politicians are basically crowd-sourced AI algorithms, fed by polling data, played by an actor. This becomes even more evident when you get an email from a candidate or visit their site. When you digitally interface with a candidate, you are interacting with that candidate’s customized persona created specifically for you by a complex set of algorithms that very few people fully understand. This means we are all test subjects in an unprecedented sociopolitical experiment that, quite frankly, scares the hell out of me.
AI empowers each candidate to present themselves as if they were speaking to us one-on-one. That has always been possible in small groups, at political rallies or in specially crafted messages. But no politician in history has had the ability to speak with every individual voter one-on-one. Human politicians still can’t, but their AI-generated political avatars can. And frighteningly, these avatars know more about our real hopes and dreams than any human candidate ever could. Need proof? You already know how this works.
Predictive analytics are now so good that most people believe their devices are spying on them. But that’s not what’s happening. The algorithms predict what you will care about by analyzing what you search, click and hover over—AI-generated political avatars are simply advanced digital marketing tools.
Data you don’t even know you generate
Whenever you interact with an app (Facebook, Twitter, Instagram, Google) or website or any other online data aggregator (Nest, Alexa, Waze, your smartphone), you are creating two sets of data.
The first set of data is the data required to enable the technology you are using to work. This might include the location of your device, if you’re using Waze or your smartphone, or the current temperature of your home, if you’re using a Nest thermostat, or what you are interested in at the moment, if you are using social media.
But you also create a second set of data. Sometimes referred to as “surplus data,” this data is not specifically required to achieve your immediate objective. For example, your location when you tap a like button, the time of day you are usually in your home when you adjust your thermostat or the kinds of images that get your attention when you stop scrolling on a social network.
Surplus data is collected with the explicit purpose of improving the engineering of bespoke online environments and messaging that you will find irresistible. Said differently, these are the data used by algorithms to feed your social media addiction.
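To make the two data sets concrete, here is a minimal Python sketch of what a single “like” tap might record. Every field name and the schema itself are invented for illustration; no real platform exposes its logging this way.

```python
# Hypothetical sketch of the two data sets created by one "like" tap.
# The schema and field names are illustrative, not any real platform's API.

def record_like(post_id, device):
    # Set 1: data required for the feature to work at all.
    functional = {"post_id": post_id, "action": "like"}
    # Set 2: "surplus" data -- not needed to register the like,
    # but useful for profiling and message targeting.
    surplus = {
        "location": device["location"],
        "local_time": device["local_time"],
        # how long you stopped scrolling before tapping
        "dwell_seconds": device["seconds_on_post"],
    }
    return functional, surplus

functional, surplus = record_like(
    "post-123",
    {"location": (40.71, -74.01), "local_time": "21:34", "seconds_on_post": 8.2},
)
print(functional)  # the like itself
print(surplus)     # everything else the tap revealed
```

The point of the sketch is that the second dictionary is never required to make the like appear on screen; it exists only to feed the profiling described above.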
Where does XYZ candidate stand on universal healthcare, reproductive rights, tax reform, the environment, gun violence prevention? Ask 100 million people, and you’ll get 100 million answers informed by customized messaging because the algorithm will deliver the message it believes matters most to you exactly the way you want to hear it. This is truly new.
Here’s how it’s done:
Personalizing your politics
The goal of all targeted messaging (in fact, the goal of all advertising ever created) is simple: put the right message in front of the right person in the right place at the right time. This is not new, and it is not news.
But AI has changed the game. The amount of (big) data available about each and every one of us is simply staggering. We willingly hand that data to data-rich organizations such as Google, Facebook and Twitter every time we use their services, and with it, any candidate can appear to be a champion for the causes or issues you are most passionate about.
Convincing a single-issue voter that a candidate believes what that voter believes is truly child’s play. But life is not that simple. Most voters are not single-issue. Most voters are nuanced and passionate about lots of issues.
Pandering at scale
To customize messaging for multi-issue voters, behavioral data is fed into algorithms designed to score those behaviors and then predict what attributes should be crafted into the customized persona of the particular candidate. You can call it “pandering at scale.”
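As a rough illustration of that pipeline, here is a hypothetical Python sketch: observed behaviors are scored against issues, and the top-scoring issues are assembled into a tailored pitch. The behaviors, weights and message snippets are all invented for this example; real campaign systems are far larger and proprietary.

```python
# Illustrative "pandering at scale": score behaviors against issues,
# then build a persona emphasizing the voter's top issues.
# All weights, issues, and snippets are invented for illustration.

BEHAVIOR_ISSUE_WEIGHTS = {
    "liked_climate_post": {"environment": 2.0},
    "searched_tax_calculator": {"tax_reform": 1.5},
    "shared_healthcare_article": {"healthcare": 2.5},
}

MESSAGE_SNIPPETS = {
    "environment": "fighting for a cleaner planet",
    "tax_reform": "putting money back in your pocket",
    "healthcare": "making sure your family is covered",
}

def score_issues(behaviors):
    # Accumulate a weight per issue from each observed behavior.
    scores = {}
    for behavior in behaviors:
        for issue, weight in BEHAVIOR_ISSUE_WEIGHTS.get(behavior, {}).items():
            scores[issue] = scores.get(issue, 0.0) + weight
    return scores

def personalized_pitch(behaviors, top_n=2):
    # Rank issues by score and stitch the top ones into a custom message.
    scores = score_issues(behaviors)
    top = sorted(scores, key=scores.get, reverse=True)[:top_n]
    return "A candidate who is " + " and ".join(MESSAGE_SNIPPETS[i] for i in top) + "."

print(personalized_pitch([
    "shared_healthcare_article", "liked_climate_post",
    "liked_climate_post", "searched_tax_calculator",
]))
```

Two voters with different behavior histories get two different candidates from the same code, which is the whole trick.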
While this technology is table stakes in best-practice digital advertising, dynamic apps and websites, it is relatively new for politicians. They may be late to the game, but they are now applying the schooling they received in 2012 from the Obama campaign and in 2016 from the Trump mar-tech machine, and we’re about to get an up-close and personal view of the unintended consequences of the lessons learned.
The 2020 elections are going to be politically, technically and physically hacked, and there is almost nothing any of us can do about it. But they are also going to be socially hacked. Personalized politics is the number one way that politicians, PACs, criminal hackers, casual hackers and nation-states are going to socially hack the 2020 elections.
What’s to be done about it?
We all love to be blanketed in the comfort of the information we want to hear. But for the 2020 elections, we’re going to have to get out of our comfort zones. If you feel strongly about an issue, go visit some politician’s website in incognito mode. No cookie? No personalization. Don’t rely on inbound messaging; formulate questions and go seek the answers.
Some people believe that breaking up big tech will somehow magically solve the societal issues we are facing due to the insane use and misuse of big data. It certainly will not have an impact on AI-generated political avatars, customized campaigning or nation-states interfering in the 2020 elections through hyper-targeted messaging.
We are going to have to figure out some thoughtful, future-focused way to frame the conversation around data regulation. And while I do not think our current elected officials are the right people for the job, we—not algorithms—get to make that choice every Election Day.