JPMorgan sounds the alarm on a surge of AI-powered scams, warning that fraudsters are now using convincing synthetic voices, deepfakes and personalized messages to target ordinary Americans.
In a new update, the largest bank in the US says it is seeing a shift in criminal tactics as artificial intelligence transforms old social-engineering schemes into harder-to-spot attacks.
The bank cites FBI data showing fraud losses of $16 billion, a 33% jump from 2023.
JPMorgan says criminals are using AI to impersonate bank executives, government officials and family members, matching emotional tone and urgency to push victims into quick decisions.
“AI can create realistic audio or video clips (i.e., deepfakes and synthetic media) that impersonate executives or celebrities, for example, tricking victims into transferring money or sharing sensitive information. Convincing imitators can falsely urge targets to resolve an issue quickly, often demanding immediate payment or information to avoid further consequences.”
JPMorgan also says extortion scams are becoming harder to distinguish from legitimate emergencies because synthetic voices can now replicate personal details, inflection and speech patterns.
“AI can craft convincing phishing emails or texts that can mimic legitimate communications with trusted contacts, making it harder for recipients to spot fakes; it can also leverage synthetic media to mimic the voices of people you know for defrauding purposes.”
The bank adds that AI is allowing fraudsters to scale social-engineering scams at a rapid pace.
“AI-generated profile pictures and videos can be used to create fake social media accounts or dating profiles, and it can also be used to analyze publicly available data to personalize these phishing attempts, increasing the likelihood of success. What’s more, the convenience of messaging platforms enables fraudsters to blast these types of communications to hundreds of targets instantly.”
To help protect its customers, JPMorgan says it has deployed transaction monitoring, behavioral biometrics and predictive prevention tools that flag unusual transactions and potential vulnerabilities before they can be exploited.
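JPMorgan does not disclose how these systems work, but the general idea behind transaction monitoring can be illustrated with a minimal sketch: compare each new payment against a customer's recent history and flag sharp deviations for human review or step-up verification. The function name, threshold and sample amounts below are purely illustrative assumptions, not anything from the bank.

```python
# Illustrative sketch only: a toy rule of thumb for flagging "unusual" transactions
# by comparing each new amount against a customer's recent history. Real bank
# systems are far more sophisticated and proprietary.
from statistics import mean, stdev

def is_unusual(amount: float, recent_amounts: list[float], z_threshold: float = 3.0) -> bool:
    """Flag a transaction whose amount deviates sharply from recent behavior."""
    if len(recent_amounts) < 5:        # too little history to judge reliably
        return False
    mu = mean(recent_amounts)
    sigma = stdev(recent_amounts)
    if sigma == 0:                     # identical past amounts; flag any change
        return amount != mu
    return abs(amount - mu) / sigma > z_threshold

# Example: a $9,500 wire stands out against a history of small everyday payments.
history = [42.10, 18.75, 63.00, 25.40, 51.20, 30.05, 47.80]
print(is_unusual(9_500.00, history))   # True -> hold for review / extra verification
```

In practice, banks combine many more signals (device fingerprints, typing cadence, login location, payee history), which is what terms like behavioral biometrics and predictive prevention refer to.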

