Cryptopolitan
2025-10-01 19:15:48

Microsoft's CTO confirms long-term goal to replace Nvidia/AMD with in-house silicon

The chips and servers housed in data centers have been essential to building artificial intelligence models and applications. While Nvidia has led this market, major cloud computing companies, including Microsoft, have started creating their own specialized chips.

During a fireside discussion at Italian Tech Week moderated by CNBC, Kevin Scott, Microsoft's chief technology officer, explained the company's approach to AI chips. Currently, Microsoft relies mainly on Nvidia and AMD chips in its data centers. The company's priority has been selecting semiconductors that deliver "the best price performance."

Scott says the company is flexible about chip choices; Nvidia has simply offered the best performance for the price over the years, and Microsoft is willing to consider any supplier to make sure it has enough capacity for demand.

Meanwhile, Microsoft has already started incorporating some of its own chips into its operations. The company introduced the Azure Maia AI Accelerator in 2023, built for AI workloads, along with the Cobalt CPU. Reports indicate the company is developing its next batch of semiconductor products. Just last week, Microsoft revealed new cooling technology that uses "microfluids" to address chip overheating problems.

When questioned whether Microsoft's long-term goal is to use mostly its own chips in company data centers, Scott responded: "Absolutely," noting that the company already uses "lots of Microsoft" silicon currently.

Custom chips are just the beginning

According to Scott, the chip strategy is part of a broader plan to eventually create a complete system for data centers. He explains that it goes beyond just the hardware: the focus is on networking, cooling systems, and having the flexibility to match computing power to whatever tasks are being run.

Microsoft, Google, and Amazon are all creating custom chips, not only to reduce their dependence on Nvidia and AMD, but also to better tailor the hardware to their specific requirements. Major technology companies, including Meta, Amazon, Alphabet, and Microsoft, have pledged over $300 billion in capital expenditures this year, with a large portion directed toward AI investments as they attempt to keep up with surging AI demand.

Severe computing shortages persist

Scott pointed out that computing capacity remains in short supply. He says calling it a massive shortage of computing power doesn't even capture the full scale; since ChatGPT's launch, building capacity quickly enough has been nearly impossible.

Microsoft has been expanding capacity through new data centers, but the CTO cautioned that it still falls short of meeting demand. Scott mentioned that even the company's most aggressive predictions keep falling short. It has rolled out huge amounts of capacity recently, and the expansion will be even bigger in the next few years.
