Editor’s note: Industry consultant Shelly Palmer is taking his popular newsletter and turning it into an Adweek article once per week in an ongoing column titled “Think About This.”
We have entered the age of personalized politics, and it is very important for us to understand what that means.
When you see a candidate on stage or on TV, they speak in broad generalizations. Today, politicians are basically crowd-sourced AI algorithms, fed by polling data, played by an actor. This becomes even more evident when you get an email from a candidate or visit their site. When you digitally interface with a candidate, you are interacting with that candidate’s customized persona created specifically for you by a complex set of algorithms that very few people fully understand. This means we are all test subjects in an unprecedented sociopolitical experiment that, quite frankly, scares the hell out of me.
AI is empowering each candidate to present themselves as if that candidate were speaking to us one-on-one. This has always been possible in small groups, at political rallies or even in specially crafted messages. But no politician in history has had the ability to speak to every individual voter one-on-one. Human politicians still can’t, but their AI-generated political avatars can. And frighteningly, these AI-generated political avatars know more about our real hopes and dreams than any human candidate ever could. Need proof? You already know how this works.
Predictive analytics are now so good that most people believe their devices are spying on them. But that’s not what’s happening. The algorithms predict what you will care about by analyzing what you search, click and hover over. AI-generated political avatars are simply advanced digital marketing tools built on top of those predictions.
Data you don’t even know you generate
Whenever you interact with an app (Facebook, Twitter, Instagram, Google) or website or any other online data aggregator (Nest, Alexa, Waze, your smartphone), you are creating two sets of data.
The first set of data is the data required to enable the technology you are using to work. This might include the location of your device, if you’re using Waze or your smartphone, or the current temperature of your home, if you’re using a Nest thermostat, or what you are interested in at the moment, if you are using social media.
But you also create a second set of data. Sometimes referred to as “surplus data,” this data is not specifically required to achieve your immediate objective. For example, your location when you tap a like button, the time of day you are usually in your home when you adjust your thermostat or the kinds of images that get your attention when you stop scrolling on a social network.
Surplus data is collected with the explicit purpose of improving the engineering of bespoke online environments and messaging that you will find irresistible. Said differently, these are the data used by algorithms to feed your social media addiction.
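To make the distinction concrete, here is a toy sketch of the two sets of data produced by a single “like” tap. The field names and the split between primary and surplus are illustrative assumptions, not any platform’s actual schema:

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class LikeEvent:
    # Primary data: the minimum needed for the feature itself to work.
    post_id: str
    user_id: str
    # Surplus data: not required to register the like, but valuable
    # for modeling the user (hypothetical fields for illustration).
    timestamp: datetime = field(default_factory=datetime.utcnow)
    location: tuple = (0.0, 0.0)   # device coordinates at tap time
    dwell_seconds: float = 0.0     # how long the post held attention

def split_data(event: LikeEvent):
    """Separate the data required for the action from the surplus."""
    primary = {"post_id": event.post_id, "user_id": event.user_id}
    surplus = {
        "timestamp": event.timestamp.isoformat(),
        "location": event.location,
        "dwell_seconds": event.dwell_seconds,
    }
    return primary, surplus
```

Only the `primary` dictionary is needed to record the like; everything in `surplus` exists purely to sharpen the model of you.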
Where does XYZ candidate stand on universal healthcare, reproductive rights, tax reform, the environment, gun violence prevention? Let 100 million people ask, and they’ll get 100 million different answers shaped by customized messaging, because the algorithm will deliver the message it believes matters most to you exactly the way you want to hear it. This is truly new.
Here’s how it’s done:
Personalizing your politics
The goal of all targeted messaging (in fact, the goal of all advertising ever created) is simple: put the right message in front of the right person in the right place at the right time. This is not new, and it is not news.
But AI has changed the game. The amount of (big) data available about each and every one of us is simply staggering. We willingly hand it to data-rich organizations such as Google, Facebook and Twitter every time we use their services, and with it, any candidate can appear to be a champion for the causes or issues you are most passionate about.
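The targeting logic behind “right message, right person” can be reduced to a very small sketch: score each issue for a voter, then serve the pre-written message variant for the top-scoring issue. The profiles, scores and message text below are entirely hypothetical:

```python
# Hypothetical issue-affinity scores inferred from one voter's surplus data.
voter_profile = {"healthcare": 0.82, "tax_reform": 0.35, "environment": 0.64}

# One candidate, several pre-written message variants, one per issue.
messages = {
    "healthcare": "I will fight for universal coverage.",
    "tax_reform": "I will simplify the tax code.",
    "environment": "I will protect our climate.",
}

def personalized_message(profile, variants):
    """Return the variant for the issue the model scores highest."""
    top_issue = max(profile, key=profile.get)
    return variants[top_issue]

print(personalized_message(voter_profile, messages))
# prints: I will fight for universal coverage.
```

Swap in a different profile and the same candidate leads with a different promise, which is exactly why 100 million voters can each see a different champion.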