
“A daughter on vacation cries ‘Mom, save me’ while detained… The truth behind the kidnapping video”

The National Police Agency’s National Investigation Headquarters has issued an alert about a recent telephone financial scam targeting foreign nationals that used deepfake technology. The perpetrators created a fake video with a synthesized image of a child to convince her parents that she had been kidnapped and to demand money for her release.

According to the investigation headquarters, in October foreign parents received a video of their daughter, who was traveling in Korea, apparently being held captive in a room and crying for help. The perpetrator threatened the parents: “We have kidnapped your daughter. Send the ransom if you want to save her.” The parents immediately contacted their consulate, which reported the case to the Korean police, and the police quickly confirmed that the child was safe.

The police explained that although no money was lost in this case, it demonstrated how artificial intelligence (AI) technology can be misused for criminal purposes: the daughter’s appearance in the video was confirmed to be a fake created with deepfake technology.

Deepfake refers to AI technology that creates synthetic personas that appear to be real individuals. It goes beyond simply swapping a face in a video to replicating a real person’s expressions and movements.

In addition to deepfakes, the police noted that AI technologies such as deepvoice can be exploited for phishing crimes. Deepvoice is an AI technique for cloning a specific person’s voice, allowing criminals to imitate a child’s voice, call the parents, claim to have been kidnapped, and demand money. The approach exploits the parent-child relationship, pressuring parents to send money urgently before they can properly assess the situation.

Deepfake and deepvoice models must be trained on data from real individuals. Videos, photos, and voice recordings shared on social networks or other publicly accessible platforms can therefore become source material for criminal organizations.

Because distinguishing real from fake content with the naked eye becomes harder as deepfake technology advances, the police advised against posting such content publicly on social network services (SNS). They also recommended immediately reporting any suspicious calls, especially those alleging a kidnapping, to the police.

There were 174 reported cases of kidnapping-related telephone financial fraud through September this year. When a kidnapping is reported, the police prioritize initial responses such as locating the supposed victim, so reporting not only ensures the safety of those at risk but also helps prevent financial losses. If the perpetrator stays on the line, making it difficult to call the police, individuals should ask someone nearby to report on their behalf, or discreetly text 112 during the call, so that the child’s safety can be confirmed.

An Chan-su, head of the police’s Narcotics and Organized Crime Department, said that to prevent phishing crimes exploiting AI technology, the police will produce awareness materials and distribute them domestically as well as through overseas representatives and Korean associations to help keep the public safe.
