WASHINGTON—Taylor Swift did not endorse Donald Trump. Nor did Lady Gaga or Morgan Freeman. And Bruce Springsteen was not photographed in a “Keep America Trumpless” shirt. Fake celebrity endorsements and snubs are roiling the US presidential race.
Dozens of bogus testimonials attributed to American actors, singers and athletes concerning Republican nominee Trump and his Democratic rival Kamala Harris have proliferated on social media ahead of the November election, researchers say, many of them enabled by artificial intelligence (AI) image generators.
The fake endorsements and brushoffs, which come as platforms such as the Elon Musk-owned X knock down many of the guardrails against misinformation, have prompted concern over their potential to manipulate voters as the race to the White House heats up.
Last month, Trump shared doctored images showing Swift throwing her support behind his campaign, apparently seeking to tap into the pop singer's megastar power to sway voters.
The photos—including some that Hany Farid, a digital forensics expert at the University of California, Berkeley, said bore the hallmarks of AI-generated images—suggested the pop star and her fans, popularly known as Swifties, backed Trump’s campaign.
What made Trump’s mash-up on Truth Social “particularly devious” was its combination of real and fake imagery, Farid told AFP.
Last week, Swift endorsed Harris and her running mate Tim Walz, calling the current vice president a “steady-handed, gifted leader.”
The singer said the manipulated images of her had motivated her to speak up, as they "conjured up my fears around AI and the dangers of spreading misinformation."
Following her announcement, Trump fired off a missive on Truth Social, saying: "I HATE TAYLOR SWIFT!"
‘Confusion and chaos’
A database from the News Literacy Project (NLP), a nonprofit that recently launched a misinformation dashboard to raise awareness about election falsehoods, has so far listed 70 social media posts peddling fake "VIP" endorsements and snubs.
“In these polarizing times, fake celebrity endorsements can grab voters’ attention, influence their outlooks, confirm personal biases, and sow confusion and chaos,” said Peter Adams, senior vice president for research at NLP.
NLP’s list, which appears to be growing by the day, includes viral posts that have garnered millions of views.
Among them are posts sharing a manipulated picture of Lady Gaga with a “Trump 2024” sign, implying that she endorsed the former president, AFP’s fact checkers reported.
Other posts falsely asserted that Oscar winner Morgan Freeman, who has been critical of the Republican, said a second Trump presidency would be "good for the country."
Digitally altered photos of Springsteen wearing a “Keep America Trumpless” shirt and actor Ryan Reynolds sporting a “Kamala removes nasty orange stains” shirt also swirled on social media sites.
"The platforms have enabled it," Adams said. "As they pull back from moderation and hesitate to take down election-related misinformation, they have become a major avenue for trolls, opportunists and propagandists to reach a mass audience."
Simple prompt
In particular, X has emerged as a hotbed of political disinformation after the platform scaled back content moderation policies and reinstated accounts of known purveyors of falsehoods, researchers say.
Musk, who has endorsed Trump and has over 198 million followers on X, has been repeatedly accused of spreading election falsehoods.
US officials responsible for overseeing elections have also urged Musk to fix X’s AI chatbot Grok after it shared misinformation.
Lucas Hansen, co-founder of nonprofit CivAI, demonstrated to AFP the ease with which Grok can generate a fake photo of Swift fans supporting Trump using a simple prompt: “Image of an outside rally of woman wearing ‘Swifties for Trump’ T-shirts.”
As the technology develops, it will be "harder and harder to identify the fakes," said Jess Terry, an intelligence analyst at Blackbird.AI.
“There’s certainly the risk that older generations or other communities less familiar with developing AI-based technology might believe what they see,” he said.