Amazon Web Services (AWS) and OpenAI have entered into a multi-year strategic partnership valued at $38 billion, enabling OpenAI to run and scale its artificial intelligence (AI) workloads on AWS’s advanced cloud infrastructure.
The agreement, announced jointly by the companies, marks one of the largest infrastructure collaborations in the AI industry to date.
Under the seven-year deal, OpenAI will use hundreds of thousands of NVIDIA GPUs hosted on AWS, with the ability to expand to tens of millions of CPUs to support large-scale agentic workloads. Deployment is targeted for completion before the end of 2026, with further expansion expected into 2027 and beyond.
AWS, which operates clusters exceeding 500,000 chips, will provide the secure and high-performance infrastructure necessary to power OpenAI’s frontier models. The companies said the partnership combines AWS’s leadership in cloud computing with OpenAI’s pioneering work in generative AI to deliver faster, more reliable performance for millions of users worldwide.
“Scaling frontier AI requires massive, reliable compute,” said Sam Altman, Co-founder and CEO of OpenAI. “Our partnership with AWS strengthens the broad compute ecosystem that will power this next era and bring advanced AI to everyone.”
The infrastructure designed by AWS for OpenAI features a sophisticated architecture that clusters NVIDIA GB200 and GB300 GPUs via Amazon EC2 UltraServers. This setup provides low-latency interconnectivity and optimized processing performance for diverse AI workloads, from ChatGPT inference to next-generation model training.
“As OpenAI continues to push the boundaries of what’s possible, AWS’s best-in-class infrastructure will serve as a backbone for their AI ambitions,” said Matt Garman, CEO of AWS. “The breadth and immediate availability of optimized compute demonstrate why AWS is uniquely positioned to support OpenAI’s vast AI workloads.”
The deal builds on the companies’ ongoing collaboration to make cutting-edge AI models accessible to businesses worldwide. Earlier this year, OpenAI’s open-weight foundation models became available on Amazon Bedrock, allowing millions of AWS customers to integrate OpenAI’s technologies into their applications.
OpenAI has since become one of the most widely adopted model providers on Amazon Bedrock, with clients including Bystreet, Comscore, Peloton, Thomson Reuters, Triomics, and Verana Health using its models for agentic workflows, coding, scientific analysis, and mathematical problem-solving.
The partnership underscores the growing demand for computing power in the AI industry as developers race to build more capable and intelligent models at an unprecedented scale.