
    AI becoming a handy tool for US fraudsters

    By BELINDA ROBINSON in New York | China Daily Global | Updated: 2023-07-28 07:02

    Technology being employed to clone people's voice for ransom, govt warns

People in the United States are being warned to stay vigilant against a growing number of scams in which artificial intelligence mimics a person's voice during a phone call to a relative or friend, who is then asked to send ransom money.

The Federal Trade Commission, or FTC, issued a consumer warning alert this year after an increase in reports from people who had been asked to send money after receiving a frantic phone call from someone they believed was a loved one, but whose voice had in fact been cloned using AI.

    Jennifer DeStefano from Scottsdale, Arizona, experienced the crime firsthand. She told a US Senate judiciary hearing last month that she got a call from an unlisted number in April, and when she picked up, she could hear her daughter, Briana, crying.

"Mom! I messed up," her daughter said, sobbing, on the call.

    DeStefano asked her daughter, "OK, what happened?"

    She then heard a man's voice on the phone telling her daughter to "lay down and put your head back".

    He then told the worried mother: "Listen here, I have your daughter. You tell anyone, you call the cops, I am going to pump her stomach so full of drugs."

    DeStefano was at her other daughter Aubrey's dance rehearsal when she picked up the phone. She put the phone on mute and asked nearby parents to call 911.

    The scammer first asked her to send $1 million, but when she said she did not have access to that much money, he asked for $50,000 in cash and arranged a meet-up spot.

    The terrified mother said the man on the phone told her that "if I didn't have all the money, then we were both going to be dead".

    However, she contacted her husband and daughter and found out Briana was safe, and it was a hoax.

    Cybercrimes on rise

    Last year, frauds and scams rose 30 percent compared with the previous year, the FTC said. Cybercrimes are also increasing with losses of $10.2 billion last year, the FBI said.

    Scammers use AI to mimic a person's voice by obtaining "a short audio clip of your family member's voice from content posted online and a voice-cloning program", the consumer protection watchdog said. When they call, they will sound just like the person's loved one.

In another scam, a Canadian couple was duped out of C$21,000 ($15,940) after listening to an AI voice that they thought was their son, The Washington Post reported in March.

According to a recent poll by McAfee, an antivirus software company in San Jose, California, at least 77 percent of AI scam victims have sent money to fraudsters.

    Of those who reported losing money, 36 percent said they had lost between $500 and $3,000, while 7 percent got taken for anywhere between $5,000 and $15,000, McAfee said.

About 45 percent of the 7,000 people polled from nine countries — Australia, Brazil, France, Germany, India, Japan, Mexico, the United Kingdom and the US — said they would reply and send money to a friend or loved one who had asked for financial help via a voicemail or note.

    Forty-eight percent said they would respond quickly if they heard that a friend was in a car accident or had trouble with their vehicle.

    Although phone scams are nothing new worldwide, in this AI version, fraudsters are getting the money sent to them in a variety of ways, including wire transfers, gift cards and cryptocurrency.

Consumers are being encouraged to contact the person they believe is calling, to check that they are OK, before ever sending cash.

    FTC Chair Lina Khan warned House lawmakers in April that fraud and scams were being "turbocharged" by AI and were of "serious concern".

    Avi Greengart, president and lead analyst at Techsponential, a technology analysis and market research company in the US, told China Daily: "I think that it is hard for us to estimate exactly how pervasive (AI) is likely to be because this is still relatively new technology. Laws should regulate AI."

    The software to clone voices is becoming cheaper and more widely available, experts say.

    AI speech software ElevenLabs allows users to convert text into voice-overs meant for social media and videos, but many users have already shown how it can be misused to mimic the voices of celebrities, such as actress Emma Watson, podcast host Joe Rogan and columnist and author Ben Shapiro.

    Other videos mimicking the voices of US President Joe Biden and former president Donald Trump have also appeared on platforms such as Instagram.
