A new report from TRM Labs finds that crypto scammers extracted tens of billions of dollars in digital assets last year as fraud networks rapidly adopted artificial intelligence tools to scale deception, impersonation and psychological manipulation.
In its 2026 Crypto Crime Report, the blockchain intelligence firm says losses to crypto scams last year totaled $35 billion, only slightly below the $38 billion recorded in 2024.
Investment-related schemes continue to dominate losses, accounting for 62% of all fraud inflows observed in 2025. Within that category, pig butchering scams, along with pyramid and Ponzi schemes, remain among the most prevalent and damaging forms of fraud, continuing a multi-year trend.

TRM Labs says the scale of fraud attacks changed dramatically in 2025, with AI-enabled scam activity surging by roughly 500% over the past year as fraud networks integrated large language models, image generation tools, voice cloning and deepfake video into their operations.
“Large language models (LLMs) enable scams to cross language and cultural contexts with less friction, while AI-generated images, voice cloning, and deepfake videos reduce the cost of creating convincing personas. These capabilities are expanding impersonation-style scams across messaging platforms, recruitment campaigns, and investment fraud — and they increase the likelihood that victims can be deceived even when aware of scam warnings.”
According to TRM Labs, AI tools are fueling impersonation scams, fake job recruitment sites and giveaway campaigns. The firm also highlights the proliferation of deepfakes that impersonate public figures such as Elon Musk to dupe victims into handing over their money.
TRM Labs adds that scammers are posing as lawyers, tax authorities or finance experts to manipulate victims into sending funds. Fraudsters are also using AI to generate images, text and video to create professional-looking investment fraud campaigns.
“Scam operators now routinely employ generative tools to create professional-looking branding assets for websites and social media, including logos, images, and in some cases videos featuring deepfake avatars. This reduces setup costs and makes it easier to rapidly rebrand, recycle infrastructure, and launch new scam iterations at scale.”