Today, I’m kicking off a series of deep dives into that massive, proprietary conversational dataset, starting with unique insights into an audience segment very much in the news: Trump supporters. Over the course of the 2020 election cycle, via deployments for five Democratic presidential campaigns, seven Democratic Senate campaigns (including Mark Kelly and John Hickenlooper), and other progressive clients, our Conversational AI platform engaged with more than 500,000 Americans who told us they were planning to vote to re-elect Trump. And somewhat surprisingly, a strong majority of them went on to voluntarily answer some very personal questions, including how they identify politically and whether they had voted for Obama. Much of what we learned from these AI-driven conversations may surprise you.
How did we “conversationally engage” more than a half-million Trump supporters?
Because we were working on behalf of Democratic clients, we didn’t set out to “virtually canvass” Trump supporters. Our clients measured us by how well we could leverage Messenger and Conversational AI to engage with and hear from as many voters as possible. The goal was to motivate supporters and potential supporters to sign up for emails and texts, to donate, to volunteer, and to vote! As a result, in our very first deployment for a then-leading presidential primary contender, we launched without any conversational flow for Trump supporters — other than to thank them for engaging and wish them a good day.
What surprised us was that once we thanked them, Trump supporters still wanted to talk. “Oh, that’s it?” they replied. It quickly became clear that the Trump supporters who engaged with our platform wanted to be heard, so we added a Trump supporter survey to see what we could learn from them, and to discover whether they could be won over.
Before sharing the results, I know many of you are skeptical that conversations between human voters and software would reveal actual views. But we saw a pattern of answers that varied little over time or between clients. Judge for yourself, but we are convinced that the vast majority of these answers provide genuine insights into the 70+ million Americans who voted for Donald Trump in 2020.
First up, who are they?
We’ll get to more personal responses in a moment, but every conversation started with a Trump supporter electing to engage with our platform via Messenger, typically in reply to our smart Comment Responder after they had commented on a Facebook post from our client. When a person engages, our platform automatically gets a few pieces of info via Facebook’s API to help us personalize the conversation: things like name, gender, and time zone.
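The exact fields vary by API version, but the personalization step can be sketched in Python. This is a minimal illustration, assuming a profile dict shaped like Facebook’s historical user-profile response (`first_name`, plus `timezone` as a UTC offset in hours); the function name and greeting logic are my own, not the platform’s:

```python
from datetime import datetime, timedelta, timezone

def personalized_greeting(profile: dict) -> str:
    """Build a first-name greeting using the local hour derived from the timezone offset."""
    first_name = profile.get("first_name", "there")
    # Facebook's API historically exposed timezone as a UTC offset in hours.
    offset = profile.get("timezone", 0)
    local_hour = datetime.now(timezone.utc).astimezone(
        timezone(timedelta(hours=offset))
    ).hour
    part = "morning" if local_hour < 12 else "afternoon" if local_hour < 18 else "evening"
    return f"Good {part}, {first_name}!"

print(personalized_greeting({"first_name": "Michael", "timezone": -5}))
```

Even this small amount of context is enough to open every conversation on a first-name basis, at the right time of day for the voter.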
Said another way, we got to engage on a first-name basis with more than a half-million Trump supporters!
What are the most common names that we encountered? The top five are Michael, John, Robert, David, and James. Sensing a pattern? It reads like a list of top names for baby boys in the early 1960s! Across all of our clients, we observed a gender gap among Trump supporters that is much more pronounced than anything seen in polling, with male Trump supporters outnumbering female ones nearly two-to-one.
But it’s too soon to tell whether that reveals problems with polling in 2020, or whether men tend to engage more politically on Facebook. My hunch is the latter, and that our dataset skews more male than actual Trump turnout, but that makes the following revelations even more surprising.
Let’s assume that what follows is the slightly more masculine version of Trump support.
Our Trump “Loyalty Test”
Once a voter self-identified as supporting Trump, we would invite them into deeper engagement, with some variant of, “Got it. Up for answering a few questions? We’re interested in understanding the views of all voters.” More than two-thirds agreed, even though they were engaging with software on the page of a Democratic candidate or progressive organization.
Our first question was always a softball: “At this stage, how committed are you to voting for Donald Trump in November?” Trump supporters are famous for their commitment, but we were still surprised that nine out of 10 answered “Absolutely certain” and nearly 8 percent more answered “Quite likely” or “Likely.” Only a tiny fraction equivocated by answering “It depends” or demurred with “Rather not say.” (Those two samples were so small that they didn’t rate a label on the pie chart below.)
What did surprise us was that “Absolutely certain” was not strong enough to express the commitment level of many Trump supporters. We offer clickable answers for multiple reasons: first, they eliminate the friction of typing, and, second, they allow for consistency by minimizing interpretation of free-form responses. That said, our NLP (natural language processing) engine is always on, and we soon found the need to train it to accept many phrases as equivalent to “Absolutely certain,” including a wide variety of variations on “100%,” such as “150%,” “200%,” and so on. More than 20 Trump supporters typed in “100000000000%”! (That’s 100 billion percent. Some serious loyalty.)
While these answers are not particularly revelatory, they strengthen the case for confidence in the truthfulness of the answers that follow.
To GOP or not to GOP
The next question in the survey is a lot more personal, and it’s the kind of question that, when asked by a person, might typically draw a none-of-your-business response: “And how do you identify politically?” We offered clickable choices of Republican, Independent, Democratic, No affiliation, and Other. To our surprise, “Republican” accounted for less than 50% of the total!
Also, it’s surprising to see 7.5% of Trump supporters currently identifying as Democratic.
Together, these suggest that while Trump may have “taken over” the Republican Party, less than half of his supporters consider themselves to be inside the GOP tent.
The Obama question
The next question is even more personal, moving from party affiliation to specific voting history. Knowing it might be harder to get an answer, we went with the most informal wording: “Just curious — did you vote for Obama?” We offered three clickable answers: I did not, One time, and Both times. And while the most commonly typed response was “hell no,” we were surprised that over 30% of Trump supporters revealed they had voted for Obama at least once — and that more than 10% of them had voted for him twice!
The ZIP code question
Once we saw how well the survey was working, we decided to add a question that could greatly enhance the value of the Trump supporter dataset — ZIP code. But with no good reason to ask and no benefit to offer, the wording of the question would matter. We went with, “Thanks! Last question for now: What part of this great country are you proud to call home? (Please share your zip code.)” While the top three responses were “no,” “nope,” and “90210,” more than half responded with a legitimate ZIP code, and spot checking showed a high correlation with the locations displayed on the voters’ public profiles.
And one more thing. In recent months, we extended the survey not with another question, but with an opportunity for Trump supporters to really be heard, in their own words, offering: “If there’s anything else you’d like us to know about why you plan to vote to re-elect President Trump, feel free to share.”
Here’s a phrase cloud created from more than 50,000 responses in September and October.
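We haven’t described the tooling behind the cloud, but counting phrase frequencies across tens of thousands of free-text responses — the input to any phrase cloud — is straightforward to sketch in Python (the stopword list here is abbreviated for illustration):

```python
import re
from collections import Counter

# Abbreviated stopword list; a real one would be much longer.
STOPWORDS = {"the", "a", "to", "of", "and", "is", "he", "for", "i"}

def phrase_counts(responses: list[str], n: int = 2) -> Counter:
    """Count word n-grams across free-text responses, skipping stopword-only phrases."""
    counts: Counter = Counter()
    for resp in responses:
        words = re.findall(r"[a-z']+", resp.lower())
        for i in range(len(words) - n + 1):
            gram = words[i : i + n]
            if all(w in STOPWORDS for w in gram):
                continue  # drop phrases made entirely of stopwords
            counts[" ".join(gram)] += 1
    return counts
```

The most frequent phrases then get rendered at sizes proportional to their counts, which is all a phrase cloud is.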
Where do we go from here?
I think it’s clear that what we pioneered in this election cycle is, at a minimum, a game-changer for canvassing. With a bit more thought, these techniques could be adapted to deliver far higher-fidelity polling than continued reliance on the ever-less-relevant channel of landline phones. Expect to see a lot more AI-driven voter engagement in future elections, and more insights to come from our unique U.S. political conversational dataset.
(The author is the CMO of Amplify.ai, a global leader in conversational AI.)