White House AI and Crypto Czar David Sacks warns that the future of AI may look less like The Terminator and more like 1984.
In a new interview on the a16z podcast, Sacks says large-scale AI systems could become vehicles for censorship, distortion and political manipulation if they fall into the wrong hands.
Sacks notes that mainstream fears about killer robots miss the more immediate threat: information control through algorithmic distortion and surveillance.
“I almost feel like the term woke AI is insufficient to explain what’s going on, because it somehow trivializes it. I mean, what we’re really talking about is Orwellian AI. We’re talking about AI that lies to you, that distorts an answer, that rewrites history in real time to serve a current political agenda of the people who are in power. I mean, it’s very Orwellian.”
Sacks highlights that generative AI models could become central to how people consume information, with systemic bias baked in from the start.
“So to me, this is the biggest risk of AI, actually. It was not described by James Cameron. It was described by George Orwell. In my view, it’s not the Terminator. It’s 1984… It’ll be used by the people in power to control the information we receive, that it’ll contain ideological bias, that essentially it’ll censor us.”
He adds that AI surveillance risks are just as severe.
“AI is going to know everything about you. It’s going to be your personal assistant. It’s the perfect tool for the government to monitor and control you. And to me that is, by far, the biggest risk of AI, and that’s the thing that we should be working towards preventing.”