Young Man Scammed Out of 300,000 Yuan by AI Face-Swapped 'Cousin' Who Claimed to Work in Government
Recently, the Criminal Investigation Brigade of the Shanghe County Public Security Bureau in Jinan, Shandong Province cracked a fraud case involving AI face-swapping technology. In this case, a young man was deceived by someone posing as his 'cousin' and ultimately lost 300,000 yuan.
According to reports, the incident occurred in May this year, when a young man surnamed Huang received a private message while browsing short videos. The sender claimed to be Huang's cousin. After a few pleasantries, the scammer gained Huang's trust and asked him to add a QQ account. The 'cousin' said that because he held a government position it was inconvenient to act under his own identity, and he asked Huang to help transfer money to a friend who urgently needed funds. To show sincerity, the scammer asked for Huang's bank account number, saying the friend would first transfer money to Huang, who would then forward it. Although Huang received a 'transfer receipt', he hesitated to send anything when he saw that no actual funds had arrived. The scammer then used an AI face-swapped video call to pose convincingly as the cousin; reassured, Huang transferred the 300,000 yuan.
Police investigations revealed an online fraud scheme built on AI technology: the criminal gang operated from overseas while laundering money domestically. Using technical investigative methods, Shanghe police traveled twice to Dongguan, Guangdong, and arrested seven suspects. The case has now entered the court trial stage.
During the investigation, police consulted technical experts, who noted that real-time video face-swapping is technically demanding and difficult even for smaller companies to implement. Even so, such scams can be prevented. Captain Zhang Zhenhua stressed that, for the general public, protecting personal privacy information is crucial. When an acquaintance or relative asks for money, verify the request with a phone call. During a video call, ask the other party to make specific gestures or partially cover their face; real-time face-swapping software tends to break down under such movement and occlusion, exposing the fake.
One major form of AI fraud uses technical means to synthesize face-swapped or voice-cloned video and audio, deployed mainly in urgent online scam scenarios. To curb such cases, the relevant authorities have issued management regulations requiring deep-synthesis service providers to audit users' input data and synthesis results. In addition, operators of facial recognition technology should conduct annual security assessments of their equipment and take effective measures to protect it from attack.
Police further emphasize that protecting personal information is the key to preventing such fraud at its source. Key precautions include: not casually answering calls from unknown numbers, avoiding excessive exposure of personal information on social accounts, being cautious about providing facial photos, and staying vigilant when using niche software. Finally, the public should consciously cultivate anti-fraud awareness and, on receiving a suspicious call from a supposed acquaintance, verify any financial request through a separate channel.