Former President Donald Trump has intensified his presence on social networks, making images generated by artificial intelligence (AI) a key tool in his campaign strategy. This practice, although novel, raises concerns about the integrity of information and trust in the content circulating on digital platforms.
AI image generators, like those used in the former president’s campaign, operate through neural networks known as generative models. These networks are trained on large amounts of visual data and learn to create new images that can appear surprisingly real, even though they are completely fabricated.
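As an illustration of how accessible this technology has become (and not a reference to the specific tools used in the campaign), a publicly available generative model such as Stable Diffusion can produce a photorealistic but entirely synthetic image from a short text prompt in a few lines of Python. The model name and prompt below are placeholders chosen for the example.

```python
# Illustrative sketch only: producing a synthetic image with a publicly
# available text-to-image model via the Hugging Face diffusers library.
# The model ID and prompt are placeholders, not the tools used in any campaign.
import torch
from diffusers import StableDiffusionPipeline

# Load a pretrained generative model (downloads several gigabytes on first run).
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16 if torch.cuda.is_available() else torch.float32,
)
pipe = pipe.to("cuda" if torch.cuda.is_available() else "cpu")

# A one-line text description is enough to generate a convincing but
# completely fabricated scene.
image = pipe("a crowd waving flags at a political rally, news photo").images[0]
image.save("synthetic_rally.png")
```

The point of the sketch is that no special expertise is required: the same pipeline that researchers use for legitimate image synthesis can be run by anyone with a consumer GPU, which is part of why experts worry about the volume of fabricated visual content.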
One of the most recent examples was the publication of an image of Taylor Swift dressed as Uncle Sam, supposedly endorsing the Republican leader. Although Swift never publicly endorsed the former president’s campaign, the image was widely shared, generating confusion among her followers. In a post on his social network Truth Social, the real estate mogul even added the comment “I accept!”, suggesting that he had the artist’s support.
Likewise, prior to the start of the Democratic National Convention in Chicago, the former president posted an image showing a figure resembling Vice President Kamala Harris at what appeared to be a communist rally, accompanied by a red banner bearing a communist symbol.
Another notable case was the publication of a fabricated video in which the conservative politician was seen dancing with billionaire Elon Musk, one of his most fervent supporters. Although the video was obviously fake, its spread on social media attracted considerable attention, reflecting AI’s ability to create viral content regardless of its veracity.
The use of these AI-generated images is intended not only to score political points, but also to create a parallel reality that can overshadow the news and misinform voters. Experts in the field warn of the dangers of this practice, stressing that the rise in AI-generated content could erode trust in information circulating on social media.