
    AI Becomes Core Tool in Scams As Thieves Exploit Tech To Coerce Victims, Warns ITRC

By CapitalAI Daily Team · September 2, 2025 · 2 Mins Read

    Artificial intelligence is now central to how criminals commit identity theft, according to a new report from the Identity Theft Resource Center (ITRC).

    The nonprofit, which tracks data breaches and fraud trends in the United States, says advances in AI are changing the scale and precision of attacks, leaving victims exposed to greater losses.

    In its 2025 Trends in Identity Report, ITRC says criminals no longer need to cast wide nets to ensnare victims. Instead, AI enables them to target fewer people more effectively and extract larger sums from each case.

    “Criminals are using technology like artificial intelligence (AI) to target victims more precisely, so they don’t need to attack as many people, but those they do attack lose more money.”

    The nonprofit also warns that AI is changing the methods used to trick individuals into handing over sensitive details.

    “Artificial Intelligence (AI) technology makes it easier for thieves to coerce unsuspecting victims into giving away their identity credentials…

    Tactics used to lure victims into a scam include using AI to spoof legitimate websites, posting ads on search engines with fake customer service numbers for well-known businesses or sending legitimate-looking emails that pretend to be from a large company. They also send text messages that seem to come from legitimate sources. AI tools allow scammers to operate on a much larger scale and target more victims efficiently.”

    ITRC notes that criminals are leveraging AI to maximize what they extract from victims.

    “And the thieves don’t just ask for money. They will work to get as many personal identifiers as possible to take over accounts, establish new ones or sell the information to make money.” 

    Earlier this year, Federal Trade Commission Chair Andrew Ferguson told lawmakers that scam and impersonation calls have been “supercharged by AI” and described them as “a truly terrifying experience.” He emphasized that combating AI scams requires “a whole-of-government approach.”

Tags: AI, artificial intelligence, fraud, identity theft, scams
