Early Groq backer and billionaire investor Chamath Palihapitiya says that his firm’s deal with Nvidia could dramatically lower costs and expand global adoption.
In a new episode of the All-In Podcast, Palihapitiya lays out why Groq focused on a very specific bottleneck in AI systems and how that decision could reshape the economics of AI infrastructure.
According to Palihapitiya, who bought nearly 33% of Groq for $62.3 million, the company recognized early on that AI architectures were inefficient because they were built for compute, not memory.
“So when you send a prompt to AI, what happens is that the model processes it. This is called the reading phase or pre-fill; it reads your entire prompt all at once. And then it does a bunch of math, calculates all these relationships between all the words, and it stores them in temporary memory. The problem is that this is really compute-bound. So it requires a brute force, and Nvidia GPUs crush you, and their architecture is designed for massive parallel processing…
So the model starts to generate a response, one token at a time, and then to pick the next token, to pick the next word, it has to look back at everything it has said already so that it doesn’t hallucinate. The problem is that this is incredibly memory bandwidth-constrained.”
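The two phases Palihapitiya describes can be sketched in a few lines. The following is a toy illustration (not Groq's or Nvidia's actual code, and the dimensions are arbitrary): a "prefill" pass that processes the whole prompt at once and stores keys and values in temporary memory (the KV cache), followed by a token-by-token "decode" loop that rereads the entire cache at every step, which is why decode is bandwidth-constrained rather than compute-constrained.

```python
# Toy sketch of prefill vs. decode in transformer inference.
# Illustrative only: single attention head, random weights.
import numpy as np

d = 64          # head dimension (arbitrary)
prompt_len = 8  # tokens in the prompt (arbitrary)

rng = np.random.default_rng(0)
W_q, W_k, W_v = (rng.standard_normal((d, d)) for _ in range(3))

def attend(q, K, V):
    # Scaled dot-product attention for one query vector.
    scores = q @ K.T / np.sqrt(d)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return weights @ V

# --- Prefill: read the entire prompt at once (compute-bound) ---
prompt = rng.standard_normal((prompt_len, d))
K_cache = prompt @ W_k  # relationships stored in "temporary memory"
V_cache = prompt @ W_v  # (the KV cache)

# --- Decode: one token at a time (memory-bandwidth-bound) ---
x = prompt[-1]
for _ in range(4):
    q = x @ W_q
    # Each new token looks back at everything generated so far,
    # streaming the whole cache from memory every single step.
    x = attend(q, K_cache, V_cache)
    K_cache = np.vstack([K_cache, x @ W_k])  # cache grows per token
    V_cache = np.vstack([V_cache, x @ W_v])
```

The compute per decode step is small, but the cache (and, in a real model, the weights) must be re-read from memory for every token, so memory speed, not raw FLOPs, sets the pace.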
Palihapitiya says Groq built an SRAM-based chip that emphasizes memory speed rather than brute-force computation.
“Imagine like a building and from getting from point A to point B, you’ve got to take an elevator, go up to the 10th floor, take an elevator, go back down to the ground floor, then walk across. That would never make any sense. You’d want to just walk across the hallway…
And we used a lot of what’s called SRAM… so that we could do this decode thing as well or better than everybody else.”
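A back-of-the-envelope calculation shows why decode rewards memory speed over raw compute, which is the structural bet Palihapitiya is describing. All numbers below are hypothetical round figures, not vendor specifications:

```python
# Illustrative arithmetic: why decode is memory-bound.
# Generating one token requires streaming roughly all model weights
# from memory while doing only ~2 FLOPs per weight.
params = 70e9            # hypothetical 70B-parameter model
bytes_per_param = 2      # 16-bit weights
flops_per_token = 2 * params
bytes_per_token = bytes_per_param * params

# FLOPs performed per byte moved -- far below what it takes to keep
# a modern accelerator's math units busy, so bandwidth dominates.
arithmetic_intensity = flops_per_token / bytes_per_token  # = 1.0

# Upper bound on single-stream decode speed from bandwidth alone,
# assuming a hypothetical 3 TB/s of memory bandwidth.
bandwidth = 3e12  # bytes per second
max_tokens_per_sec = bandwidth / bytes_per_token  # ~21 tokens/s
```

Under these toy assumptions, every doubling of memory bandwidth doubles decode throughput while the compute units sit largely idle, which is the logic behind trading capacity for the speed of on-chip SRAM.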
Palihapitiya says the result is a structural cost advantage that could reshape the AI ecosystem and drive mass adoption as more apps are built.
“And so now when you put these two things together, I just think it’s going to create a huge acceleration in the ability for this entire infrastructure layer to get much cheaper and much more valuable, which I suspect then it’ll have a lot more developer pool, you’ll get a lot more applications being built, billions and billions of more people using it.”
Last week, reports emerged that Nvidia inked a $20 billion non-exclusive licensing agreement with Groq for its inference technology. Investor Gavin Baker explained how the deal strengthened Nvidia’s position in the AI race.
Disclaimer: Opinions expressed at CapitalAI Daily are not investment advice. Investors should do their own due diligence before making any decisions involving securities, cryptocurrencies, or digital assets. Your transfers and trades are at your own risk, and any losses you may incur are your responsibility. CapitalAI Daily does not recommend the buying or selling of any assets, nor is CapitalAI Daily an investment advisor. See our Editorial Standards and Terms of Use.

