I thought it was my son calling for help

Artificial Intelligence (AI)

The new speech synthesis programs are free and remarkably good at reproducing a person's voice, tone, and timbre from just a few seconds of audio. Now criminals are using them to extort money.

The tactics are always the same: a scammer plays the role of a friend, relative, or lover in difficulty and asks the victim for money. The problem is that artificial intelligence can now mimic anyone's voice, timbre, and tone after listening to just a few seconds of audio, making the scams extremely believable.

That is how Benjamin Perkin's parents came to pay $21,000 to an alleged lawyer. He had called to say their son was in trouble: he had killed an American diplomat in a car accident, as the Washington Post explained. A strange story, yes, but what convinced them was Benjamin's voice pleading for help on the other end of the phone. So they hastily drained their funds to cover the legal fees the lawyer dictated over the phone. It was just a scam, and they figured it out when their son called a few hours later to ask how they were doing. But it was too late: "The money's gone. There's no insurance. There's no getting it back." The Perkins are just the latest victims of a much wider phenomenon. Many phones have rung, and the victims all told police the same thing: "We thought it was our loved ones asking us for help. It was their voice."

Telephone scams with artificial intelligence

According to Federal Trade Commission data, in the United States during 2022 there were over 36,000 reports of people scammed by someone posing as a friend or family member. The deception works best on the phone: 5,100 of those cases involved fraud via phone call, through which criminals managed to extort 11 million dollars. Now, with artificial intelligence capable of faithfully replicating a person's voice from a few seconds of sample audio, deceiving victims becomes even easier. Thanks to cheap, easily accessible online tools, someone's voice can be reconstructed so faithfully that not even relatives can recognize the fake. Then there is fear, the old trick that disorients victims, lowers their guard, and pushes them to act on impulse. A request for help is the perfect ploy to trigger anguish and silence any doubts.

As often happens, new technologies catch everyone off guard, and indeed experts explained to the Washington Post that federal regulators, law enforcement agencies, and the courts are not equipped to curb this type of scam. It is difficult to trace the scammers' calls and funds, there are few court records, and it is not even clear whether the companies that make these tools bear any responsibility. Will Maxson, an assistant director in the FTC's Marketing Practices Division, explained that tracking voice scammers can be "particularly difficult" because they could be using a phone based anywhere in the world, making it nearly impossible even to identify which agency has jurisdiction over a particular case.

The new speech synthesis programs

"It's terrifying," said Hany Farid, professor of digital forensics at the University of California, Berkeley. "It's kind of a perfect storm with all the ingredients needed to create chaos." AI can reconstruct timbre, pitch, voice, and cadence. "AI voice generation software analyzes what makes a person's voice unique, including age, gender and accent, and searches a vast database of voices to find similar ones and predict patterns," Farid said. "Two years ago, even a year ago, you needed a lot of audio to clone a person's voice. Now… if you have a Facebook page… or if you've recorded a TikTok and your voice is in there for 30 seconds, people can clone your voice."

Several AI text-to-speech tools are already on the market, among them ElevenLabs, founded in 2022, which has both a free and a paid version (ranging from 5 to 330 dollars per month). The company has already drawn criticism. In January, a user on 4chan posted a clip of Emma Watson reading a passage from Mein Kampf, then the voice of Ben Shapiro making racist comments about Alexandria Ocasio-Cortez, and Rick Sanchez from the animated series Rick & Morty saying: "I'm going to beat my wife Morty. I will beat my fucking wife Morty. I'm going to beat the shit out of you, Morty." Justin Roiland, who voices Sanchez, recently appeared in court for a preliminary hearing on felony domestic violence charges. The clips range from harmless to violent, transphobic, homophobic, and racist; even anime and video game characters ended up in the fake blender. ElevenLabs replied on Twitter that it was exploring additional security measures, "these include requesting information, payment, or full ID identification."
