Cryptopolitan
2025-09-12 09:15:25

SK Hynix achieves critical milestone in next-gen HBM4 chips

On Friday, SK Hynix completed development of HBM4, its next-generation memory product for ultra-high-performance AI, and established a mass production system for the high-bandwidth memory chips for customers. The South Korean firm said the chip vertically interconnects multiple DRAM dies, increasing data processing speed compared with conventional DRAM products. The company believes that mass production of the HBM4 chips will help it lead the AI industry.

SK Hynix prepares mass production of HBM4

SK Hynix developed the product in response to the recent dramatic increase in AI demand and data processing, which requires high-bandwidth memory for faster system speeds. The company also believes that memory power efficiency has become a key requirement for customers, as power consumption for data center operation has surged. The semiconductor supplier expects HBM4's increased bandwidth and power efficiency to be the optimal solution for those needs. HBM4 production involves stacking chips vertically to save space and reduce power consumption, which helps process the large volumes of data generated by complex AI applications.

"Completion of HBM4 development will be a new milestone for the industry. By supplying the product that meets customer needs in performance, power efficiency and reliability in a timely manner, the company will fulfill time to market and maintain a competitive position." – Joohwan Cho, Head of HBM Development at SK Hynix.

The South Korean company said the new product has the industry's best data processing speed and power efficiency. According to the company, the chip's bandwidth has doubled from the previous generation by adopting 2,048 I/O terminals, and its power efficiency has improved by more than 40%. SK Hynix also said HBM4 can improve AI service performance by up to 69% when applied. The design aims to solve data bottlenecks and reduce data center power costs.

The firm said HBM4 exceeds the Joint Electron Device Engineering Council (JEDEC) standard operating speed of 8 Gbps, running at more than 10 Gbps. JEDEC is the global standardization body that develops open standards and publications for the microelectronics industry.

SK Hynix also applied its Advanced MR-MUF (Mass Reflow Molded Underfill) process to HBM4. The process stacks the chips and injects a liquid protective material between them, which then hardens to protect the circuits between the dies. The company said the approach has proved more reliable and more efficient for heat dissipation than laying film-type materials between each chip in the stack. SK Hynix believes the advanced MR-MUF technology supports stable HBM mass production by providing good warpage control and reducing pressure on the stacked chips. The firm also adopted the 1bnm process, the fifth generation of its 10-nanometer-class technology, for HBM4, which it said helps minimize production risk.

Justin Kim, head of AI Infra at the company, said SK Hynix plans to grow into a full-stack AI memory provider by supplying memory products with the best quality and the diverse performance required by the AI industry.
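For a sense of scale, the figures cited above imply roughly the following per-stack throughput. This is a back-of-the-envelope sketch, not an SK Hynix figure: it assumes a flat 10 Gbps per pin (the company only says "over 10 Gbps") and uses a 1,024-terminal stack as a stand-in for the prior generation.

```python
# Rough bandwidth figures implied by the article's numbers.
# Assumptions (not from SK Hynix): exactly 10 Gbps per pin for HBM4,
# and 1,024 I/O terminals for the previous generation used as a comparison point.

def stack_bandwidth_gbytes(io_terminals: int, pin_speed_gbps: float) -> float:
    """Per-stack bandwidth in GB/s: pins * per-pin Gbit/s, divided by 8 bits per byte."""
    return io_terminals * pin_speed_gbps / 8

hbm4_jedec = stack_bandwidth_gbytes(2048, 8.0)    # at the JEDEC baseline of 8 Gbps
hbm4_sk    = stack_bandwidth_gbytes(2048, 10.0)   # at the >10 Gbps SK Hynix cites
prev_gen   = stack_bandwidth_gbytes(1024, 10.0)   # 1,024-terminal stack, same pin speed

print(f"HBM4 @ 8 Gbps/pin : {hbm4_jedec:.0f} GB/s per stack")   # ~2048 GB/s
print(f"HBM4 @ 10 Gbps/pin: {hbm4_sk:.0f} GB/s per stack")      # ~2560 GB/s
print(f"1,024-I/O stack   : {prev_gen:.0f} GB/s per stack")     # ~1280 GB/s
```

On those assumptions, doubling the I/O count from 1,024 to 2,048 is what roughly doubles per-stack bandwidth, consistent with the company's claim.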
SK Hynix shares surge

Following the HBM4 announcement, SK Hynix's share price hit a record high on Friday, surging by as much as 6.60% to 327,500 KRW ($235.59). The stock has also gained roughly 17.5% over the past five days and nearly 22% over the past month.

Kim Sunwoo, a senior analyst at Meritz Securities, forecast that the company's HBM market share will remain in the low-60% range in 2026, supported by early HBM4 supply to key customers and the resulting first-mover advantage. SK Hynix supplies the most HBM chips to Nvidia, followed by Samsung Electronics and Micron, which supply smaller volumes.

Choi Joon-yong, the company's head of HBM business planning, projected that the AI memory chip market will grow by 30% a year through 2030. Choi said end-user demand for AI is very strong and that he expects the billions of dollars in AI capital spending by cloud computing companies to be revised upward.
