Artificial intelligence is a powerful technology with the potential to be misused at the cost of other people’s livelihoods; even celebrity Taylor Swift isn’t exempt from this form of exploitation.
As technology has progressed over the past decade, artificial intelligence (AI) has made enormous strides toward becoming part of everyday life. This progress has proven to both benefit and harm society. AI creates potential for advancements in daily life, healthcare services, and the state of our environment. Despite these advantages, however, AI also presents disadvantages that can easily be abused. Common harms include blackmail and deepfakes, which can damage victims’ mental health and reputations. Many people have fallen victim to AI-generated explicit images despite their falsity, and no one is safe; none other than famous pop star Taylor Swift has become a victim of this.
Swift chose to speak up about the circulating images, using her platform to condemn these harmful fakes. One of the images, posted on the social media site X, attracted over 45 million views. “These fake AI generated images are abusive, offensive, exploitative, and done without Taylor’s consent and/or knowledge,” a source close to the 34-year-old singer told the Daily Mail. The issue is real and is happening to many people on a smaller scale. Since AI is so accessible, almost anyone with a device can create deepfakes and extort victims for money or power.
This isn’t the first time Swift has used her platform to speak out against sexual exploitation; she has long been known to stand up for herself and for issues she feels strongly about. In 2017, according to the New York Times, she sued a DJ for sexual assault and battery for $1 after he groped her at a pre-show meet-and-greet in 2013. Before that, the DJ had sued her over his firing; Swift, however, felt an obligation to use her platform to speak about her situation and to help others feel comfortable speaking up about their own struggles. “I just wanted to say I’m sorry to anyone who ever wasn’t believed because I don’t know what turn my life would have taken if somebody didn’t believe me when I said something had happened to me,” Swift said on the one-year anniversary of her sexual assault trial verdict. Using her platform to call out the abusive use of deepfake images is nothing new for her. Because her audience is made up mostly of young women, her advocacy is important; she aims to encourage more open conversation about sexual assault and how wrong it is.
AI is becoming more and more accessible to the public, allowing almost anyone to create deepfakes. More often than not, perpetrators will generate AI images of someone they know, tell their victim they possess sexually explicit images of them, and blackmail them for money or power. This cycle of abuse and harassment is dangerous for everyone, and unfortunately there are few practical defenses against AI-generated deepfakes. Going forward, protecting people from blackmail and exploitation must be a top priority, but only time will tell what the future of AI will bring.