A TikTok user published a video reporting that he had been the target of an attempted fraud using WhatsApp audio messages created with artificial intelligence.
In the clip, the young man shows audio messages on his cell phone supposedly sent by his mother, but he points out that she is in the same room with him, proving the whole thing is a scam.
In the first audio, a woman’s voice that the young man identifies as his mother’s says: “Son, I just got into big trouble. I’m dialing and dialing and you’re not answering me. I’m sending you messages because I need you, please answer me.”
The young man then explains that the audios were most likely made with an artificial intelligence application and plays the second one:
“Son, I need you to send me three thousand pesos, don’t worry, I’ll explain to you when I get home…”
The young man decided to play along with the alleged extortionists and sent an audio saying he had already deposited the money into an account known to both him and his mother. However, the scammers replied, still in his mother’s supposed voice, that she would send him a friend’s account number instead, because she did not have that card with her at the moment.
A few minutes later, the young man received more messages in which “his mother” insistently asked whether he had made the deposit and requested proof of it, supposedly to show her friend.
Finally, the young man warned his followers to be wary of this type of message, which uses artificial intelligence to commit fraud and other crimes.
Video: TikTok user @karolortiz777 (original sound – Karol salinas)
Alarm bells went off
The video has caused concern on social networks and among users of the WhatsApp messaging app, given the risk that voice cloning poses now that voice messages are an increasingly common way to hold conversations.
Many have looked for ways to disable Meta AI in their WhatsApp accounts, fearing that the cloning of their voice or their accounts could make them victims of fraud.
Other users were skeptical of the young man’s video, pointing out that he had already saved the number sending the messages as his mother’s contact, and that the voice sounded too realistic to be AI-generated audio.
For now, the theft of WhatsApp accounts combined with deepfake audio could explain this alleged new method of committing fraud remotely.
Interview between Time.news Editor and AI Fraud Expert
Time.news Editor (TNE): Welcome to Time.news! Today we’re delving into a troubling new development in the realm of online fraud—a case involving an individual who became a victim of an attempted scam using AI-generated audio. I’m here with Dr. Emily Carter, an expert in digital security and emerging technologies. Thank you for joining us, Dr. Carter!
Dr. Emily Carter (EC): Thank you for having me! It’s an important topic, and I’m glad we’re discussing it.
TNE: Let’s jump right in. A young man posted on TikTok about receiving audio messages from someone posing as his mother, requesting money. What do you make of this incident?
EC: It’s a chilling example of how advanced technology can be misused. The rise of AI-generated audio means that it’s now possible for fraudsters to create convincing impersonations of individuals, making it even more challenging to discern authenticity in communication.
TNE: Right! In this case, the young man was able to determine that the person behind the messages was not his mother, as she was physically present with him. But what if he hadn’t had that context?
EC: Exactly, that’s the real danger. If someone isn’t aware of their loved one’s whereabouts, they might easily fall victim to such scams. These AI-based manipulations can mimic voices with startling accuracy, increasing the likelihood of a successful scam. It’s crucial for people to verify any significant or unusual requests, especially for money.
TNE: The young man reportedly played along and even sent a fake confirmation of a deposit back to the scammers. What does this tell us about how victims might inadvertently engage with their attackers?
EC: It highlights how scammers exploit emotional connections. The victim, wanting to help the person they believe to be a loved one in distress, falls into the trap of compliance. In situations like this, scammers manipulate emotions to provoke quick reactions without critical thinking. That’s why educating the public on the signs of such scams is vital.
TNE: When it comes to AI technology, how can individuals protect themselves from such schemes?
EC: Awareness is the first step. People should be educated about the existence of these types of scams and learn to recognize the red flags, such as unexpected requests for money or urgent situations that don’t fit the usual pattern. Additionally, implementing secure communication channels—like directly calling the person involved—can help verify suspicious requests.
TNE: That makes sense. Would you say there’s a growing trend in these types of scams as AI technology becomes more accessible?
EC: Absolutely. We are seeing a rise in scams that utilize AI, and I expect the trend to continue as the technology progresses. It’s crucial for social media platforms and tech companies to also take responsibility by developing better tools to detect and prevent such misuse.
TNE: What about law enforcement? Are they keeping pace with these evolving tactics?
EC: That’s a challenging aspect. Law enforcement agencies are increasingly aware of these scams, but technology is always one step ahead. Continuous training and resource allocation are necessary to help them adapt to these digital challenges.
TNE: Dr. Carter, thank you for sharing your insights on this pressing issue. As AI technology continues to evolve, it’s imperative that we stay informed and vigilant to protect ourselves and our loved ones.
EC: Thank you! I appreciate the opportunity to discuss this important topic and remind everyone to stay cautious in their digital communications.
TNE: That’s it for today’s discussion. Remember to always verify information and protect yourself from potential scams. Stay safe online!