Chip Builder GPT
FOR TOMORROW: YOU, ME, AND CHATGPT®
Dr Jawad Nasrullah | Palo Alto Electron
February 07, 2024
Agenda
• Chips to enhance AI
• AI to improve chip design
• ChatGPT® and chip design
Chip Design Uses Coding
Manufacturing design kits and chip design collaterals.

[Diagram: an engineering prompt drives a chip-design fine-tuned LLM, which works with EDA tools through the architecture → implementation → verify/sign-off flow; the LLM is trained on chip design corpora (style, IP, kits) and retrieves from the design database.]

Chiplet modularity helps simplify this pipeline.
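The prompt → fine-tuned LLM → EDA-tools flow above can be sketched as a minimal retrieval-augmented prompting loop. Everything below is a toy stand-in under stated assumptions: the design-database snippets, the word-overlap scoring, and the function names are hypothetical, not any vendor's flow or API.

```python
import re

# Toy "design database" (hypothetical snippets standing in for kits/IP/style docs).
DESIGN_DB = {
    "io_ring": "IO ring uses 1.8V GPIO cells from the kit library.",
    "clock_tree": "Clock tree target skew: 50ps; use kit CTS buffers.",
    "power_grid": "Power grid: M7/M8 straps, Vdd/Vss pitch per kit rules.",
}

def _tokens(text: str) -> set:
    """Lowercased alphanumeric word set, used for crude relevance scoring."""
    return set(re.findall(r"[a-z0-9]+", text.lower()))

def retrieve(prompt: str, db: dict, k: int = 2) -> list:
    """Return the k snippets sharing the most words with the prompt."""
    words = _tokens(prompt)
    scored = sorted(db.values(), key=lambda snip: -len(words & _tokens(snip)))
    return scored[:k]

def build_llm_prompt(engineering_prompt: str) -> str:
    """Assemble what the fine-tuned LLM would see: retrieved design-database
    context followed by the engineer's request."""
    context = "\n".join(retrieve(engineering_prompt, DESIGN_DB))
    return f"Design context:\n{context}\n\nTask: {engineering_prompt}"

prompt = build_llm_prompt("Route the power grid straps for the Vdd rails")
```

In a real pipeline the retrieval step would query an embedding index over the design database, and the LLM's output would be handed to EDA tools for implementation and sign-off; the keyword scoring here only illustrates the data flow.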
GPT-4 Training Cost Estimates
MI300 OAM (~78mm, 170mm labeled dimensions):
• 304 GPUs
• 192 GB
• 750W
× ~2000-3000 such modules needed for a month to train GPT-4
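A back-of-envelope check of what "~2000-3000 modules for a month" implies in energy terms. The full-rated-power utilization, 30-day month, and $0.10/kWh price are assumptions for illustration, not figures from the slide.

```python
# Energy for "~2000-3000 MI300 OAM modules for one month".
# Assumptions (not from the slide): modules run at their full 750W
# rating, 30-day month, $0.10/kWh electricity.

modules = 2500          # midpoint of the ~2000-3000 estimate
power_w = 750           # W per MI300 OAM (from the slide)
hours = 30 * 24         # one month

energy_kwh = modules * power_w * hours / 1000
cost_usd = energy_kwh * 0.10

print(f"{energy_kwh:.2e} kWh, ~${cost_usd:,.0f} electricity")
# → 1.35e+06 kWh, ~$135,000 electricity
```

Electricity is of course a small slice of total training cost (hardware, cooling, and infrastructure dominate); the point of the sketch is the scale of the deployment, not a full cost model.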
[Diagram: MI300 package cross-section — GPU/CPU dies over IODs, flanked by HBM stacks, on a passive Si interposer over the substrate.]
Scale-out roadmap:
• Now — MI300 OAM (~78mm × 102mm, 170mm): 304 GPUs, 192 GB, 750W
• +3 Years (~110mm × 150mm, 170mm): 456 GPUs, 1 TB, 1.5kW
• +10 Years (170mm): >1000 GPUs, 4 TB, 3kW
HBM and beyond
Still “there is plenty of room at the bottom”
HBM System Trends
[Chart: log-scale trends, 1 to 100,000, over 2020-2040, for:]
• DRAM capacity (#stacks × stack capacity)
• Bandwidth/stack (#wires × symbol rate)
• DRAM power budget (system design)

[Diagram: HBM stack beside GPU on a silicon interposer; 1024-wire data bus, ~50um and 745um labeled dimensions; "Now" vs "+10 years" views.]

Now → +10 years:
• Cu-Cu bonding, new DRAM devices, large interposers
• Substrate/interposer technology improvement
• New physical/logical layer circuitry
• DTCO, circuit design
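The bandwidth/stack = #wires × symbol rate relation above can be checked numerically. The per-pin rates below are representative HBM-generation figures used as assumptions for illustration; the 1024-wire bus width is from the slide.

```python
# Bandwidth per HBM stack = (#wires on the data bus) x (symbol rate per wire).
# Per-pin rates are representative generation figures (assumption).

def stack_bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    """Return per-stack bandwidth in GB/s (divide by 8: bits -> bytes)."""
    return bus_bits * gbps_per_pin / 8

hbm2e = stack_bandwidth_gbs(1024, 3.6)   # ~460.8 GB/s per stack
hbm3 = stack_bandwidth_gbs(1024, 6.4)    # ~819.2 GB/s per stack
```

The formula makes the two scaling levers explicit: widen the bus (more wires, needing finer interposer pitch) or raise the symbol rate (needing new physical-layer circuitry) — exactly the two improvement paths the slide lists.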
Challenges and Opportunities of AI HW
Power efficiency and scale out.
Manufacturing:
• Beyond CMOS (multi-gate)
• Vdd scaling (target 200mV)
• True 3D transistor stacking
• Fine pitch backend
Design automation: chip design kits and system design kits.
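Why the 200mV Vdd target matters: dynamic switching energy scales roughly as C·Vdd², so lowering the supply buys a quadratic saving. The 0.7V baseline below is an illustrative assumption (a typical modern-node supply), not a figure from the slide.

```python
# Dynamic switching energy per transition: E ~ C * Vdd^2
# (activity factor and frequency held fixed).
# Baseline 0.7V is an illustrative assumption; 0.2V is the slide's target.

def dynamic_energy_ratio(vdd_new: float, vdd_old: float) -> float:
    """Energy ratio when only Vdd changes (same switched capacitance C)."""
    return (vdd_new / vdd_old) ** 2

saving = 1 / dynamic_energy_ratio(0.2, 0.7)
# 0.7V -> 0.2V cuts switching energy ~12x, since (0.7/0.2)^2 = 12.25
```

The catch, and the reason this is a manufacturing challenge rather than a free win, is that near-threshold operation at 200mV costs frequency and raises variability, which is why the slide pairs it with beyond-CMOS devices and 3D stacking.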