Swift probably won’t like Trump using her reputation to falsely claim she endorsed him, but rules on AI in political campaigns are murky.
This article was originally published by The 19th.
On Sunday, former President Donald Trump shared multiple fake images of mostly young, white, blond women clutching iced coffees and wearing “Swifties for Trump” T-shirts. He included an AI-generated image of Taylor Swift dressed in Uncle Sam regalia, imploring the American people to vote for Trump.
One of the images is clearly marked “satire” and was posted by a popular conservative influencer on X the day before. Another is actually a real picture of Jenna Piwowarczyk, a freshman at Liberty University who wore a handmade “Swifties for Trump” T-shirt to his rally in Racine, Wis., on June 18.
Swift had not endorsed Trump, but he declared “I accept!” in his post, implying that perhaps she had. The message couldn’t be further from the truth: the pop star made her support for the Biden-Harris campaign clear in 2020 and tweeted at Trump, “We will vote you out in November.” Trump likely would not face legal repercussions under campaign rules, though he could for using Swift’s likeness. The danger, policy experts say, is less about whether people will genuinely mistake the images for real ones (the Taylor Swift post is obviously an illustration) than about the overwhelming quantity of disinformation and how quickly it can spread on social media.
The largest AI-related threats to the election come not from star-spangled images but from misinformation about election procedures. Several secretaries of state sent a letter to Elon Musk asking him to correct misinformation about ballot deadlines spouted by Grok, the chatbot on his platform X. Although a little over 800,000 users had access to the chatbot, the officials said the false information reached millions of people on the platform.
Only some states have their own guidelines on the use of artificial intelligence in campaigning, and clear directives from the federal government might not be coming. Just last week, in a Wall Street Journal editorial, Federal Election Commission Chair Sean Cooksey proposed dropping any potential rulemaking on the use of artificial intelligence in political ads. Cooksey argued that the agency has neither the congressional authority nor the technical expertise to regulate how political campaigns use artificial intelligence.
Since nonconsensual AI-generated sexually explicit images of Swift went viral in January, there has been a renewed push to provide federal recourse for victims of computer-generated image-based sexual abuse, often called deepfakes. The White House has been pushing for solutions as abuse has escalated in schools.
The bipartisan DEFIANCE Act would allow victims of computer-generated image-based sexual abuse to sue the creators of the images for damages. It passed the Senate in July and now awaits action in the House.
Sens. Ted Cruz, a Texas Republican, and Amy Klobuchar, a Minnesota Democrat, in June introduced the TAKE IT DOWN Act, which would also require platforms to take down image-based sexual abuse within 48 hours. The bill is currently being evaluated by the Senate Committee on Commerce, Science and Transportation.
States have been trying to tackle the issue for years, most commonly in the context of nonconsensual intimate image sharing, often inaccurately called “revenge porn.”
Artificial intelligence regulation, with an eye toward equity and preventing violence against women, was included in the official Democratic National Committee platform released Sunday ahead of the convention in Chicago.
The platform also promises to ban AI-generated voice impressions. Earlier this year, thousands of New Hampshirites received a robocall with a voice purporting to be that of President Joe Biden discouraging them from voting in the state’s primary the next day. The call’s creator is facing criminal charges and a large fine.
While laws on using generative artificial intelligence in political campaigns remain somewhat piecemeal, Swift could take action over Trump’s use of her likeness. Tennessee, where Swift is based, recently passed a bill protecting artists’ property rights in their names, likenesses and voices, as reported by 404 Media. Swift could also pursue action under defamation laws.
Swift became more political in 2016 after years of staying on the sidelines. “I need to be on the right side of history,” she said in her 2020 documentary “Miss Americana.” She endorsed the Biden-Harris campaign with a note on X saying she was cheering for Kamala Harris in the vice presidential debate. Also in 2020, she posted a real picture of herself holding a plate of cookies emblazoned with “Biden-Harris.” Swift has not yet endorsed a candidate in the 2024 presidential race.
To check your voter registration status or to get more information about registering to vote, text 19thnews to 26797.