The FBI is warning the public about a surge in virtual kidnapping scams that use altered photos and videos to convince victims their loved ones have been abducted.
In a new public service announcement, the FBI’s Internet Crime Complaint Center says fraudsters are fabricating proof-of-life media to pressure families into paying ransom demands.
The FBI says the scheme begins with a sudden message claiming a family member has been taken.
“Criminal actors typically will contact their victims through text message, claiming they have kidnapped their loved one and demand a ransom be paid for their release.”
The scammers then escalate the fear by threatening harm if payment is not made immediately.
“Oftentimes, the criminal actor will express significant claims of violence towards the loved one if the ransom is not paid immediately.”
To make the threat appear real, the actors send manipulated images or clips of the supposed victim. The FBI notes that scammers are aware the doctored media often contains flaws, so they limit how long victims have to scrutinize it.
“The criminal actor will then send what appears to be a genuine photo or video of the victim’s loved one, which, upon close inspection, often reveals inaccuracies when compared to confirmed photos of the loved one. Examples of these inaccuracies include missing tattoos or scars and inaccurate body proportions. Criminal actors will sometimes purposefully send these photos using timed message features to limit the amount of time victims have to analyze the images.”
The FBI says families should be cautious when posting missing person information online and urges travelers to avoid sharing personal details with strangers. It also recommends establishing a private code word with loved ones, contacting the supposed victim before paying any demand and preserving any proof-of-life images.
The warning comes amid a broader rise in deepfake social engineering scams, in which fraudsters use generative AI tools to defraud victims. The alert is also timely: a recent study found that 85% of Americans say scams such as deepfake videos and voice cloning have become harder to detect because of AI.