A new report from blockchain analytics firm Elliptic warns that artificial intelligence is reshaping the landscape of financial crime, with scammers deploying AI deepfakes of politicians and celebrities to promote fraudulent investment schemes.
The findings, published in Elliptic’s Typologies Report, reveal that criminals are now using generative AI to produce highly convincing fake content that combines cloned voices, synthetic faces, and fabricated endorsements to trick victims into sending crypto funds.
Elliptic says these scams are part of a broader trend where AI tools originally built for legitimate business and marketing are now being turned into engines of deception. The firm’s researchers describe a rapid rise in AI-generated videos, fake IDs, and synthetic communications being used to bypass controls across the crypto ecosystem.
“As with other technological innovations, criminals have been among the early adopters of AI,” the report states. “Just like legitimate entrepreneurs and innovators, criminals use AI to increase profits and scale their operations.”
One of the most striking cases documented by Elliptic involved scammers circulating deepfake videos of Donald Trump during the 2024 US election campaign. In the clips, Trump appeared to solicit crypto donations for his election fund, an entirely fabricated pitch that diverted tens of thousands of dollars in cryptoassets before investigators exposed the scheme.
“Before the scam was exposed as fake, numerous victims sent cryptoassets worth more than $24,000 to the scammers — whose subsequent money laundering activity Elliptic followed on the blockchain.”
The report warns that similar techniques are now being used to build fake investment websites and trading platforms, often promoted through social media ads that feature AI-generated endorsements from politicians and celebrities. Victims, believing the offers to be legitimate, transfer funds to wallets controlled by fraudsters, losing their savings in minutes.
Last week, Interpol warned that scammers are learning to weaponize artificial intelligence faster than law enforcement can keep pace. ChatGPT creator OpenAI also said scammers are increasingly using its tools to supercharge old financial scams by mass-producing fake investment pitches and social media personas promising “zero-risk” profits.
Disclaimer: Opinions expressed at CapitalAI Daily are not investment advice. Investors should do their own due diligence before making any decisions involving securities, cryptocurrencies, or digital assets. Your transfers and trades are at your own risk, and any losses you may incur are your responsibility. CapitalAI Daily does not recommend the buying or selling of any assets, nor is CapitalAI Daily an investment advisor. See our Editorial Standards and Terms of Use.