Google Unveils Ironwood AI Chip to Boost Inference Performance
- tech360.tv
- Apr 10
- 1 min read
Google has launched its seventh-generation artificial intelligence chip, Ironwood, aimed at accelerating the performance of AI applications.

Unveiled at a cloud conference, the Ironwood chip is designed specifically for inference computing, the process of running AI models to generate responses, such as those used in chatbots like OpenAI’s ChatGPT.

The chip can operate in clusters of up to 9,216 units and integrates features from previous chip designs, offering increased memory and improved efficiency for serving AI applications.
Amin Vahdat, a Google vice president, said Ironwood delivers twice the performance per unit of energy of the Trillium chip introduced last year.

Google’s tensor processing units (TPUs), including Ironwood, are available only to the company’s engineers or through its cloud services.
With Ironwood, Google consolidates its previously split TPU designs, one for model training and one for inference, into a single design, focusing on optimising cost and performance for running AI applications.
Google did not reveal which manufacturer is producing the Ironwood chip.
The company continues to build and deploy its Gemini AI models using its proprietary chips.
- Google launched its seventh-generation AI chip, Ironwood
- Ironwood is designed for inference computing and AI application performance
- The chip can run in clusters of up to 9,216 units
Source: REUTERS