
Google's record-breaking AI supercomputer, boasting a staggering 1.77 petabytes of shared memory, reportedly smashes previous records for shared memory in multi-chip systems.

Google's Ironwood TPU (Tensor Processing Unit) is now operational within Google Cloud's data center facilities.



Google has unveiled its latest innovation in AI compute technology, the Ironwood chip, at the Google Cloud Next 25 conference in April 2025. Developed in collaboration with Broadcom, Ironwood employs a 3-nanometer process, marking a significant advancement in semiconductor technology.

The Ironwood chip is cooled by a cold plate solution, supported by the third generation of Google's liquid cooling infrastructure. This design ensures efficient heat dissipation, enabling the chip to operate at optimal temperatures.

Ironwood includes logic repair functions to improve manufacturing yield, a feature that is crucial for the production of complex chips. The chip's architecture also emphasizes RAS (Reliability, Availability, and Serviceability), ensuring robust performance and minimal downtime.

One of the key innovations in Ironwood is the incorporation of AI techniques within its design. These techniques help optimise the ALU (Arithmetic Logic Unit) circuits and the chip's floor plan, leading to improved efficiency and performance.

Dynamic voltage and frequency scaling in Ironwood further improves efficiency during varied workloads. This feature allows the chip to adjust its power consumption and performance based on the demands of the task at hand, resulting in significant energy savings.
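As an illustration of how such scaling works in principle, here is a minimal Python sketch of a DVFS-style governor; the operating points, thresholds, and power model are hypothetical and do not reflect Ironwood's actual values.

```python
# Minimal DVFS governor sketch (illustrative only; the operating points and
# thresholds below are hypothetical, not Ironwood's actual values).

OPERATING_POINTS = [
    # (frequency_ghz, voltage_v): lower points save power, higher points add performance
    (0.9, 0.65),
    (1.4, 0.75),
    (1.9, 0.85),
]

def pick_operating_point(utilisation: float) -> tuple[float, float]:
    """Map a 0.0-1.0 utilisation estimate to a frequency/voltage pair."""
    if utilisation < 0.3:
        return OPERATING_POINTS[0]   # light load: drop frequency and voltage
    if utilisation < 0.8:
        return OPERATING_POINTS[1]   # moderate load
    return OPERATING_POINTS[2]       # heavy load: full performance

def dynamic_power(freq_ghz: float, voltage_v: float, capacitance: float = 1.0) -> float:
    """Classic switching-power approximation: P ~ C * V^2 * f."""
    return capacitance * voltage_v ** 2 * freq_ghz

if __name__ == "__main__":
    for util in (0.1, 0.5, 0.95):
        f, v = pick_operating_point(util)
        print(f"utilisation={util:.2f} -> {f} GHz @ {v} V, "
              f"relative power={dynamic_power(f, v):.2f}")
```

Because switching power grows roughly with the square of voltage, even modest reductions at light load translate into noticeable energy savings.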

Google claims a twofold improvement in performance per watt with Ironwood compared to its predecessor, Trillium. This improvement is a testament to the chip's efficiency and potential for powering demanding AI workloads.
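To make the metric concrete, the short sketch below computes performance per watt for two configurations; the throughput and power figures are invented for illustration and are not Google's published numbers.

```python
# Illustrative performance-per-watt arithmetic (all figures are made up,
# not Google's published numbers for Trillium or Ironwood).
def perf_per_watt(throughput_tflops: float, power_watts: float) -> float:
    return throughput_tflops / power_watts

baseline = perf_per_watt(throughput_tflops=900, power_watts=300)    # hypothetical predecessor
improved = perf_per_watt(throughput_tflops=1800, power_watts=300)   # 2x claim at equal power

print(f"baseline: {baseline:.1f} TFLOPS/W, improved: {improved:.1f} TFLOPS/W, "
      f"ratio: {improved / baseline:.1f}x")
```

A twofold gain in this metric means either double the throughput at the same power draw or the same throughput at half the power.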

A fourth generation SparseCore has been added to Ironwood to accelerate embeddings and collective operations. This feature supports workloads such as recommendation engines, further expanding the chip's capabilities in AI computing.
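To see the kind of sparse work this involves, here is a minimal NumPy sketch of the embedding lookup a recommendation model performs; the table size and feature IDs are arbitrary, and the code illustrates the operation SparseCore targets rather than its hardware implementation.

```python
import numpy as np

# Minimal embedding-lookup sketch (arbitrary sizes; this shows the kind of
# sparse gather a recommendation model performs, not SparseCore's design).
vocab_size, embed_dim = 10_000, 64
rng = np.random.default_rng(0)
embedding_table = rng.normal(size=(vocab_size, embed_dim)).astype(np.float32)

# A batch of sparse feature IDs, e.g. items each user has interacted with.
item_ids = np.array([[12, 517, 9_004],
                     [3, 3, 8_191]])

# Gather the rows for each ID, then pool them into one vector per example.
gathered = embedding_table[item_ids]   # shape: (batch, ids_per_example, embed_dim)
pooled = gathered.mean(axis=1)         # shape: (batch, embed_dim)

print(pooled.shape)  # (2, 64)
```

Lookups like this are memory-bound and irregular rather than dense matrix maths, which is why a dedicated unit for embeddings and collective operations matters for recommendation workloads.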

Ryan Smith of ServeTheHome commented that the presentation of Ironwood at Hot Chips 2025 was "awesome" and showed Google's continued innovation in AI compute, from chips to interconnects to physical infrastructure.

Nvidia's new Blackwell Ultra GPU series, claimed to be the company's most powerful AI hardware yet, is a formidable competitor to Ironwood. However, Google's focus on AI-optimised chip design and infrastructure could give it an edge in the race for AI supremacy.

Deployment of Ironwood is currently underway at hyperscale in Google Cloud data centers, marking a significant step forward in the company's AI capabilities. As more data centers adopt Ironwood, we can expect to see improvements in the speed, efficiency, and reliability of AI services on Google Cloud.
