This article discusses Groq, a new approach to computer hardware that’s revolutionizing the way AI is applied to real-world problems.
Before we talk about Groq, we’ll break down what AI fundamentally is and explore some of the key pieces of computer hardware used to run AI models: namely, CPUs, GPUs, and TPUs. We’ll start in 1976 with the Z80 CPU, then build up our understanding to modern systems by exploring some of the critical evolutions in computer hardware along the way.
Armed with an understanding of some of the fundamental concepts and tradeoffs in computer hardware, we’ll use that understanding to explore what Groq is, how it’s revolutionizing the way AI computation is done, and why that matters.
Naturally, there’s a lot to cover between early CPUs and a cutting-edge, billion-dollar AI startup, so this is a pretty long article. Buckle up; it’ll be worth it.
Who is this useful for? Anyone interested in artificial intelligence, and the realities of what it takes to run AI models.
How advanced is this post? This post contains cutting-edge ideas from a cutting-edge AI startup, and explains them assuming no prior knowledge. It’s relevant to readers of all levels.
Prerequisites: None, but there is a curated list of resources at the end of the article for related reading.
Disclaimer 1: This article isn’t about Elon Musk’s chat model “Grok”. Groq and Grok are completely unrelated, aside from the fact that both names derive from the same word, coined in the same book.
Disclaimer 2: At the time of writing, I am not affiliated with Groq in any way. All opinions are my own and are unsponsored. Also, thank you to the…