NG Solution Team
Why is Samsung’s memory the top choice for Google’s AI chips?

Nvidia recently recognized Samsung's HBM4 chip as the best on the market, and now Broadcom and Google appear to agree. Google, in collaboration with Broadcom, is developing its next Tensor Processing Unit (TPU) for AI workloads using HBM4 technology. Broadcom has been evaluating HBM4 chips from Micron, Samsung, and SK Hynix, the leading manufacturers capable of producing them. In a recent assessment, Samsung's HBM4 chip achieved an operating speed of 11 Gbps, the fastest of the three, and it also excelled in thermal performance, a critical factor for high-bandwidth memory. Broadcom conducted the evaluation in a system-in-package (SiP) environment, in which the logic chip and high-bandwidth memory are integrated into a single unit; this is the final testing phase before HBM is incorporated into AI chips.
