=== Assistant: Short answer: yes, largely because companies like Google, Amazon, and Microsoft already built massive infrastructure ov… ===

Short answer: yes, largely because companies like Google, Amazon, and Microsoft already built massive infrastructure over decades, while OpenAI is scaling very fast all at once. But there are a few important nuances.

==== That number (when it appears in reports) usually refers to capacity reservations or long-term supply agreements, not OpenAI alone physically buying every wafer off the market. ====

Big players like:

* Google
* Amazon (AWS)
* Microsoft
* Meta

are also buying enormous amounts of memory, just less visibly, and often through:

* captive supply chains
* internal chips
* long-standing contracts

They don't need headline-grabbing deals because they locked this supply in years ago.

==== Google, Amazon, and Microsoft started building hyperscale data centers 15–20 years ago. ====

They already have:

* thousands of data centers worldwide
* pre-negotiated DRAM and HBM supply contracts
* in-house silicon (TPUs, Trainium, Inferentia)
* mature cooling, power, and networking stacks

So when AI demand exploded, they:

* expanded existing pipelines
* did not panic-scale from scratch

OpenAI, by contrast:

* went from research lab → global AI platform in ~3 years
* had to scale instantly, not gradually

That forces very aggressive forward buying.

==== Another key reason: architecture choices. ====

===== Google: uses TPUs =====

* TPUs are designed with tighter memory–compute integration
* They often need less HBM per unit of compute

===== OpenAI: heavy reliance on H100 / H200 / Blackwell GPUs =====

* These demand huge amounts of HBM
* HBM is the real bottleneck, not compute

So OpenAI's path is HBM-hungry, while Google's is more memory-efficient.

==== Companies like Google: ====

* diversify suppliers
* spread purchases across regions
* avoid public attention to prevent price spikes

OpenAI:

* needs guaranteed supply
* doesn't have decades-old leverage
* must signal seriousness to partners and governments

So its deals become visible and dramatic. Think of it like:

* Google = owns oil fields
* OpenAI = buying oil futures aggressively

==== OpenAI is willing to: ====

* over-reserve capacity
* lock into massive long-term contracts
* bet on AI demand continuing to explode

Google/Amazon:

* optimize for utilization
* avoid stranded capacity
* are already monetizing AI internally

Different business incentives.

==== You're basically right, with refinements: ====

* ✅ Yes, Google and others already built infrastructure over decades
* ✅ They are buying wafers, just not suddenly or publicly
* ✅ OpenAI is compressing 20 years of infrastructure growth into ~5 years
* ✅ GPU-based AI requires far more HBM, forcing aggressive wafer deals

That's why it looks like OpenAI is buying everything, when in reality it's the newcomer scaling at maximum speed, not the incumbents standing still.

If you want, I can also explain why HBM is the real choke point, not GPUs themselves.
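As a rough back-of-envelope sketch of the "HBM-hungry" point above: the per-accelerator capacity and fleet size below are illustrative assumptions (roughly H100-class numbers and a hypothetical 100,000-GPU fleet), not figures taken from this conversation.

<syntaxhighlight lang="python">
# Back-of-envelope HBM demand for a large GPU fleet.
# Both inputs are illustrative assumptions, not sourced figures.

HBM_PER_GPU_GB = 80        # roughly H100-class on-package HBM capacity (assumed)
GPU_COUNT = 100_000        # hypothetical fleet size (assumed)

total_hbm_pb = HBM_PER_GPU_GB * GPU_COUNT / 1_000_000  # GB -> PB
print(f"{GPU_COUNT:,} GPUs x {HBM_PER_GPU_GB} GB HBM each ~ {total_hbm_pb:.0f} PB of HBM")
# -> 100,000 GPUs x 80 GB HBM each ~ 8 PB of HBM
</syntaxhighlight>

Newer parts with larger on-package memory only raise the per-GPU figure, which is why, as noted above, memory supply rather than compute is what pushes a fast-scaling GPU buyer toward aggressive forward contracts.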