Islamabad, Apr 10, 2025: Alphabet on Wednesday introduced Ironwood, its seventh-generation AI chip, designed to boost the performance of AI applications.

This new processor is tailored for the data-intensive work that happens when users interact with applications like OpenAI’s ChatGPT.

Ironwood is built for what the tech world calls “inference” computing: the high-speed calculations that generate responses in a chatbot or other AI-driven applications.
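
To make the term concrete, here is a minimal sketch of inference in JAX, the framework Google pairs with its TPUs: already-trained weights are applied to an input to produce an output. The tiny two-layer network, its weights, and the input below are hypothetical stand-ins, not Google’s or OpenAI’s actual models.

```python
# A minimal sketch of "inference": applying already-trained weights to an
# input to produce a response. The network and its weights are hypothetical.
import jax
import jax.numpy as jnp

def init_params(key, d_in=8, d_hidden=16, d_out=4):
    k1, k2 = jax.random.split(key)
    return {
        "w1": jax.random.normal(k1, (d_in, d_hidden)) * 0.1,
        "w2": jax.random.normal(k2, (d_hidden, d_out)) * 0.1,
    }

@jax.jit  # compile once, then serve many low-latency forward passes
def infer(params, x):
    h = jax.nn.relu(x @ params["w1"])
    return jax.nn.softmax(h @ params["w2"])

params = init_params(jax.random.PRNGKey(0))
x = jnp.ones((1, 8))       # a single incoming "user request"
print(infer(params, x))    # the model's output scores for that request
```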

The search giant’s multi-billion-dollar, nearly decade-long chip project is one of the few serious competitors to Nvidia’s powerful AI chips.

Google’s Tensor Processing Units (TPUs), which are exclusively available to the company’s engineers or through its cloud services, have provided Google with a distinct edge in its internal AI development.

At one point, Google split its TPU lineup into a version designed for building AI models from scratch and a second version tuned toward running AI applications at lower cost.

Ironwood is designed specifically for inference tasks and can operate in groups of up to 9,216 chips, according to Amin Vahdat, a Google vice president.
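
Vahdat’s point about chips working in groups can be pictured with JAX’s sharding API, which splits a batch of work across every accelerator visible to a program. The mesh shape, array sizes, and computation below are illustrative only; the software stack that drives an actual Ironwood pod is not public.

```python
# A minimal sketch of spreading one computation across many chips, the way an
# inference pod spreads requests over its accelerators. Shapes are illustrative.
import jax
import jax.numpy as jnp
from jax.experimental import mesh_utils
from jax.sharding import Mesh, NamedSharding, PartitionSpec

# Build a 1-D mesh over whatever devices this host can see (CPUs, GPUs, or TPUs).
devices = mesh_utils.create_device_mesh((jax.device_count(),))
mesh = Mesh(devices, axis_names=("batch",))

# Shard a batch of requests across the "batch" axis, one slice per chip.
sharding = NamedSharding(mesh, PartitionSpec("batch"))
batch = jnp.arange(jax.device_count() * 8, dtype=jnp.float32).reshape(jax.device_count(), 8)
batch = jax.device_put(batch, sharding)

@jax.jit
def score(x):
    # Each chip processes its own slice; results are gathered transparently.
    return jnp.sum(x * x, axis=-1)

print(score(batch))
```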

The new chip combines features of those previously split designs and adds more memory, making it better suited to serving AI applications.

Vahdat explained, “The importance of inference computing is growing rapidly.” Ironwood delivers twice the performance per unit of energy compared with Google’s Trillium chip, announced last year.
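
A quick back-of-the-envelope sketch shows what a 2x performance-per-energy ratio implies: the same amount of inference work would need half the energy. The absolute numbers below are invented for illustration; Google has only stated the ratio relative to Trillium.

```python
# Illustrative arithmetic only: Google has stated the ratio, not absolute figures.
trillium_perf_per_energy = 1.0                      # normalized baseline
ironwood_perf_per_energy = 2.0 * trillium_perf_per_energy

work = 1_000_000.0                                  # fixed inference workload, arbitrary units
energy_trillium = work / trillium_perf_per_energy
energy_ironwood = work / ironwood_perf_per_energy

print(f"Energy for Trillium: {energy_trillium:,.0f} units")
print(f"Energy for Ironwood: {energy_ironwood:,.0f} units")  # half as much for the same work
```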

The company uses its own chips to build and deploy its Gemini AI models.

Google has not revealed which manufacturer will produce the Ironwood chips. Alphabet’s stock rose 9.7% in regular trading on Wednesday after President Donald Trump announced a reversal on tariffs.
