NG Solution Team

Why is Samsung’s memory the top choice for Google’s AI chips?

Nvidia recently recognized Samsung’s HBM4 chip as the best on the market, and Broadcom and Google now appear to agree. Google, working with Broadcom, is developing its next Tensor Processing Unit (TPU) for AI workloads, built on HBM4 memory. Broadcom has been evaluating HBM4 chips from Micron, Samsung, and SK Hynix, the three leading manufacturers capable of producing them. In a recent assessment, Samsung’s HBM4 reached an operating speed of 11 Gbps, the fastest of the three, and also led in thermal performance, a critical factor for high-bandwidth memory. Broadcom conducted the evaluation in a system-in-package (SiP) environment, in which logic chips and high-bandwidth memory are integrated into a single package — the final testing phase before the HBM is incorporated into AI chips.

