NG Solution Team

Why is Samsung’s memory the top choice for Google’s AI chips?

Nvidia recently recognized Samsung’s HBM4 chip as the best on the market, and now Broadcom and Google appear to agree. Google, in collaboration with Broadcom, is developing its next Tensor Processing Unit (TPU) for AI workloads, built around HBM4 memory. Broadcom has been evaluating HBM4 chips from Micron, Samsung, and SK Hynix, the leading manufacturers capable of producing them. In a recent assessment, Samsung’s HBM4 chip reached an operating speed of 11 Gbps, the fastest of the three, and it also led in thermal performance, a critical factor for high-bandwidth memory. Broadcom conducted the evaluation in a system-in-package (SiP) environment, in which logic chips and high-bandwidth memory are integrated into a single unit; this is the final testing phase before the HBM is incorporated into AI chips.
