A new Stanford study is sounding an alarm for American users as researchers reveal that the biggest AI developers in the United States are quietly feeding personal chat conversations back into their models.
The report comes from the Stanford Institute for Human-Centered AI, where scholars examined 28 privacy documents linked to six frontier developers.
The review focused on the privacy policies governing Amazon Nova, Anthropic Claude, Google Gemini, Meta AI, Microsoft Copilot and OpenAI ChatGPT. Stanford researchers evaluated the policies using a framework based on the California Consumer Privacy Act to determine what data is collected, how long it is retained and whether users can meaningfully opt out of training.
The researchers found that all six companies use chat inputs by default to train or improve their models. In some cases, the information can be retained indefinitely. Lead author Jennifer King says users should worry about how their personal conversations with chatbots are being used by the AI giants.
“Absolutely yes. If you share sensitive information in a dialogue with ChatGPT, Gemini, or other frontier models, it may be collected and used for training, even if it is in a separate file that you uploaded during the conversation.”
The study also highlights how personal health details, biometric information and lifestyle indicators typed into chat windows can be used to generate inferences that follow users across a company’s ecosystem.
King adds:
“You start seeing ads for medications, and it’s easy to see how this information could end up in the hands of an insurance company. The effects cascade over time.”
The Stanford team warns that Americans are operating inside a fragmented privacy landscape with no comprehensive federal protection. They note inconsistent rules for children's data, blurred boundaries across multi-product platforms and weak disclosures about how long conversations are stored.
“We have hundreds of millions of people interacting with AI chatbots, which are collecting personal data for training, and almost no research has been conducted to examine the privacy practices for these emerging tools.”
Disclaimer: Opinions expressed at CapitalAI Daily are not investment advice. Investors should do their own due diligence before making any decisions involving securities, cryptocurrencies, or digital assets. Your transfers and trades are at your own risk, and any losses you may incur are your responsibility. CapitalAI Daily does not recommend the buying or selling of any assets, nor is CapitalAI Daily an investment advisor. See our Editorial Standards and Terms of Use.