About a year ago, an elderly woman surnamed Ding living in Huangshi, Hubei province, received a phone call she believed was from her grandson urgently requesting financial help. What she didn't know at the time was that the call was the beginning of a sophisticated scam.
When Ding answered the landline, the voice on the other end sounded exactly like her grandson. Tearfully, the caller explained that he had accidentally injured a classmate at school and needed 40,000 yuan ($5,500) to settle the matter privately. He told her the injured classmate's father would come to her home to collect the money.
Trusting the voice, Ding quickly gathered the cash and handed it to a man surnamed Wu, who arrived at her doorstep. To dispel any doubts, Wu even made a call to Ding's "grandson" in front of her, allowing her to hear his voice.
That day, Ding was swindled out of 20,000 yuan. The voice she heard on the phone was a clone of her grandson's voice generated by cutting-edge artificial intelligence.
Ding was not the only victim of the AI scam.
Two other elderly residents in the city fell for Wu's deception. It was only after speaking with their family members that they realized they had been duped, by which time a total of 60,000 yuan had been swindled.
After the victims reported the scam to local police, Wu was tracked down and arrested. In November 2025, the Huangshigang District People's Court found him guilty of fraud, sentencing him to two years and one month in prison and imposing a fine of 15,000 yuan.
The court also warned of the sophisticated use of AI to mimic voices to exploit vulnerable seniors who prioritize family and are unfamiliar with the technology.
The case also caught the attention of the Supreme People's Court, China's top court.
"Over the past year, scams leveraging AI for voice cloning and deepfake technology have become alarmingly frequent, specifically targeting the elderly and posing new challenges for the justice system and society," said Zhou Jiahai, head of the court's Research Office.
Doesn't sound right
At the end of last year, a Beijing resident surnamed Zhang was also the target of an AI scam similar to Ding's.
"The voice on the landline phone was my grandson's," Zhang recalled. "He said that the previous night he had gotten into an argument during dinner, and in the scuffle, the other person hit their head on the corner of a table and was injured. He was taken to the hospital and needed 80,000 yuan for treatment."
Concerned by her "grandson's" urgent tone, Zhang assured him, "Grandma will get the money ready for you."
At around 2 pm, Zhang handed over 80,000 yuan outside her apartment to the contact person the impostor had said would collect the money.
The next day, Zhang received another call from the swindler, who claimed that the injured person's condition was more serious than initially thought and that an additional 30,000 yuan was needed.
It was then that Zhang felt something was amiss. After talking with other family members and calling her real grandson's mobile phone, she realized she had been scammed and reported the deception to the police.
Gao Shan, from the Beijing Public Security Bureau, said there has been a surge in recent months in cases of scammers using AI to impersonate grandchildren and defraud elderly people, adding that police had detained eight suspects in Zhang's case.
The fraudsters typically obtain personal information through illegal channels and call the landlines of elderly residents, as the phones usually don't show the caller's number, Gao said.
"The scammers use AI voice synthesis technology to mimic the voices of the victims' grandchildren and create a sense of urgency by usually claiming that money is needed for compensation due to a fight, bail due to detention from a dispute, or medical expenses, exploiting the elderly's concern for their family members," he explained.
Growing concern
Xu Hao, a lawyer from Beijing Jingsh Law Firm, said the technical expertise required for AI voice cloning and face-swapping is not difficult to acquire, adding that many software options are available online.
"Fraudsters only need to capture a few seconds of voice information from short video platforms and input it into the software to generate a voice that closely resembles that of the video user, which they then use to scam the user's family," he said.
The simplicity and low cost of this process are among the reasons for the recent surge in AI-based scams, Xu said.
He noted that the targets of such scams are often elderly people who live alone, as they have less frequent contact with their children and grandchildren and are less familiar with their daily work and lives. Additionally, they may not be well-versed in using smartphones and understanding AI, making them more susceptible to being deceived.
Another lawyer from the firm, Zhao Li, said that scammers also exploit the emotional vulnerabilities of many elderly people, particularly their strong care for and unconditional support of their children and grandchildren.
"Seniors generally have some savings and would rather scrimp and save than see their children and grandchildren suffer," he said. "So, when they hear cries and pleas for help from 'children' and 'grandchildren' on the phone, they immediately soften their hearts and fall for the scams."
The two lawyers said that while AI makes people's lives more convenient, it has also made scammers' tactics more sophisticated. Although the cost of committing such fraud is low and legal redress can be pursued, they said, solving and combating these crimes remains a challenge.
Wang Bin, chief judge of the top court's Third Criminal Division, pointed out that using AI to replicate the facial features and voices of victims' family members, and carrying out scams through highly realistic videos and voice calls, has a higher success rate than traditional scamming methods. Wang added that this poses greater difficulties for law enforcement and prevention.
Preventive measures
In response to the challenge, Chinese courts have focused more on efficiently handling AI-related fraud cases in recent years, Wang said. This has been done by increasing punishment for organizers and perpetrators of the scams, as well as repeat offenders.
"Individuals involved in the development and sale of AI face-swapping and voice-mimicking tools, the sale of personal information, and the transfer of illicit funds are also firmly penalized," he said.
"Our aim is to achieve comprehensive, full-chain crackdowns to cut off the technical assistance that facilitates telecom fraud," he said.
He also highlighted the importance of preventing AI-related scams from the start, through strengthening supervision of online platforms and providing stronger legal education for senior citizens.
Zhao agreed with the approach. He said operators of social media platforms — such as those that provide short-video apps and podcasts — should enhance measures to protect users' personal data. "For example, they can employ technical methods to prevent anyone other than the users from downloading videos," he added.
Xu suggested that mobile operators, payment service providers, and banks establish a cooling-off period for transfers made by elderly users aged 65 or 70 and above.
"This period would allow time for manually verifying the recipient's information to prevent financial losses," he explained.
He also called on police officers and social workers to pay special attention to elderly people who live alone, providing them with better care and relevant education, and improving understanding of their emotional needs.
Both the judges from the top court and the lawyers reiterated the need to further raise seniors' legal awareness to prevent AI-related fraud. Knowledge about fraud and anti-fraud tips should also be shared in communities and rural areas, and with families.
Zhou said that the top court will also strengthen research on AI issues and disclose notable cases to refine judicial policies, guide the healthy development of the industry, and effectively protect people's property.