Savvy in Sixty Seconds #1: GPU
Jan 30, 2025
In case you haven’t already been furiously googling, DeepSeek is a Chinese AI startup that develops open-source LLMs.
This week, they shook up the AI world (understatement of the year) by running powerful AI models with far fewer Nvidia GPUs, and far less Microsoft-scale nuclear data center capacity, than we previously thought possible.
Let’s get savvy to GPUs:
CPU (The Generalist)
- What: The “brain” of a computer, built for general tasks
- Strength: Great at sequential processing (one thing at a time)
- Weakness: Struggles with process-intensive AI workloads
GPU (The Specialist)
- What: A powerhouse chip built for parallel processing (many tasks at once)
- Strength: Critical for AI training, ML workloads, and graphics rendering
- Weakness: Expensive, power-hungry, and supply-constrained
Once just for gamers, GPUs now rule AI.
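To make the sequential-versus-parallel distinction concrete, here’s a toy sketch in Python (an illustration only, not real GPU code): the first function works CPU-style, one element at a time, while the second splits the work across workers that run concurrently, which is the spirit of how a GPU applies the same operation to thousands of values at once.

```python
from concurrent.futures import ThreadPoolExecutor

def scale_sequential(values, factor):
    # CPU-style: one multiplication at a time, in order
    return [v * factor for v in values]

def scale_parallel(values, factor, workers=4):
    # GPU-style in spirit: the same operation applied across many
    # workers at once (real GPUs use thousands of tiny cores)
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(lambda v: v * factor, values))

print(scale_sequential([1, 2, 3], 10))  # [10, 20, 30]
print(scale_parallel([1, 2, 3], 10))    # same answer, computed concurrently
```

Same math, same answer; the difference is how many multiplications happen at the same moment, and that difference is exactly why AI training wants GPUs.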
Just two years ago, Nvidia hit a trillion-dollar valuation because its GPUs power the server infrastructure behind most of the major AI models we know today.
Now? Startups like DeepSeek are figuring out how to do more with less.
Will we still need Nvidia’s GPUs?
Absolutely—if you’ve taken my course, you know this.
And, if you haven’t, here’s why:
- Hardware is the set of physical components that make up a computer.
- Software is the coded instructions that get hardware to perform tasks.
- Electrical signals are the physical manifestation of software on hardware at the lowest level.
Computers need both hardware and software to actually provide value.
AI models are a type of software designed to process data and make predictions.
No AI can happen without hardware.
So don’t worry - Jensen will be fine.
Life is certainly not over for Nvidia.
We’ll always need GPUs…but just how many is the question.
Ctrl + T
Tech In the Wild
Why everyone is freaking out about DeepSeek (The Verge)
What to know about DeepSeek (NY Times)
Jensen Huang's Keynote at CES 2025 (YouTube)