CapitalAI Daily

    Scammers Drain $662,094 From Widow, Leave Her Homeless Using Jason Momoa AI Deepfakes: Report

By Henry Kanapi · November 30, 2025

    A British widow lost her life savings and her home after fraudsters used AI deepfakes of actor Jason Momoa to convince her they were building a future together.

The victim began following a fan page for the Aquaman star after the death of her husband of 50 years and soon received what she believed was a personal message from the actor, LADbible reports.

The woman, who wishes to remain anonymous, says she could not believe the actor had reached out to her, but AI deepfake videos of Momoa convinced her she was forming a real relationship with the celebrity.

The losses escalated quickly as the scammers pressured her to fund a supposed £500,000 ($662,094) dream home in Hawaii, claiming the actor’s money was locked up in film projects and that he needed her support to complete the property.

To help the supposedly cash-strapped Aquaman star, the victim sold her home and sent the proceeds to the scammers, who cut all contact once the money arrived.

Says Cambridgeshire Police,

    “This might sound far-fetched, but it’s a true story, and it left a vulnerable woman without a home.”

    The widow remains without a home as police investigate the impersonation scheme.

    A similar scam in Newcastle saw a grandmother lose £80,000 ($106,011) to a fake Momoa who asked for money for flights, presents and even a marriage certificate.

    Says the victim,

    “I was gullible and paid it.”

Deepfakes are highly realistic synthetic videos, audio recordings and images produced with generative AI, often used by scammers to impersonate real people or to spread false information.

    Disclaimer: Opinions expressed at CapitalAI Daily are not investment advice. Investors should do their own due diligence before making any decisions involving securities, cryptocurrencies, or digital assets. Your transfers and trades are at your own risk, and any losses you may incur are your responsibility. CapitalAI Daily does not recommend the buying or selling of any assets, nor is CapitalAI Daily an investment advisor. See our Editorial Standards and Terms of Use.

    AI Deepfake AI Scams Jason Momoa News Romance Scams

    © 2025 CapitalAI Daily. All Rights Reserved.
