
Redazione RHC: 7 December 2025 08:45
Criminals are increasingly using photos and videos from public sources and passing them off as evidence of an alleged kidnapping. The FBI warns that criminals are altering the images they find to make them look as real as possible, depicting a specific person being held against their will, and then sending them to relatives along with threats.
Often, these are completely fabricated stories, while the supposed victim sleeps peacefully at home, unaware of what's happening. But investigators are also noticing a more disturbing trend: scammers are monitoring missing-persons posters and using the photos of the missing to pressure their families.
Essentially, these schemes replicate old telephone scams in which seniors were approached with a story about a relative who was supposedly in trouble. The scammers then demanded money for "treatment" or "release," hoping to shock and dismay the recipient. Last year, the FBI received 357 complaints about such incidents, with total losses amounting to $2.7 million. The new version of this scheme works similarly, but is supplemented by generated "evidence" that, at first glance, appears convincing. The person is typically depicted as frightened, exhausted, or held in unfamiliar conditions—enough to create a sense of real threat in the recipient.
Impersonation is made possible by the fact that virtually everyone has numerous public photos online. Social media platforms allow attackers to quickly identify a potential victim's circle of friends and family. AI-based tools can alter facial expressions, backgrounds, or image details, and sometimes even create completely synthetic images. However, as experts point out, such materials often contain errors on closer inspection: characteristic features disappear, proportions change, or distortions appear.
To prevent recipients from calmly verifying the authenticity of an image, criminals often use self-destructing messages. The image disappears after a few seconds, leaving little time to compare it with real photos or consult someone the recipient trusts. This race against time is an essential part of the plan.
Meanwhile, cybersecurity analysts admit that they sometimes encounter fakes so convincing they almost pass for the real thing. At the same time, underground resources sell tools like WormGPT, which help attackers write phishing messages, create manipulation scripts, and automate attacks.
To protect yourself and your loved ones, the FBI recommends avoiding sharing personal information while traveling and agreeing on a code word known only to family members. If you receive threats, try contacting the person mentioned in the messages—this often immediately reveals the scam.
Similar deception methods have long plagued the corporate world. Companies are increasingly encountering fake candidates seeking remote IT jobs. The U.S. Department of Justice reported that one such network netted its participants at least $88 million over six years.
In most of these cases, the trail leads to North Korea: people using fake identities get hired by companies, work as developers, and funnel the proceeds back home. Now they are aided not only by fake documents, but also by generative tools that create resumes, script interviews, and alter their appearance on video calls. As a result, the employer isn't actually communicating with the person they see on the screen. But that's not so scary, is it?