
Tech

Meta to deploy in-house custom chips this year to power AI drive – memo


Facebook owner Meta Platforms (META.O) plans to deploy into its data centers this year a new version of a custom chip aimed at supporting its artificial intelligence (AI) push, according to an internal company document seen by Reuters on Thursday.

The chip, a second generation of an in-house silicon line Meta announced last year, could help to reduce Meta’s dependence on the Nvidia (NVDA.O) chips that dominate the market and control the spiraling costs associated with running AI workloads as it races to launch AI products.

The world’s biggest social media company has been scrambling to boost its computing capacity for the power-hungry generative AI products it is pushing into its Facebook, Instagram and WhatsApp apps and into hardware devices such as its Ray-Ban smart glasses, spending billions of dollars to amass arsenals of specialized chips and reconfigure data centers to accommodate them.

At the scale at which Meta operates, a successful deployment of its own chip could potentially shave off hundreds of millions of dollars in annual energy costs and billions in chip purchasing costs, according to Dylan Patel, founder of the silicon research group SemiAnalysis.


The chips, infrastructure and energy required to run AI applications have become a giant sinkhole of investment for tech companies, to some degree offsetting gains made in the rush of excitement around the technology.

A Meta spokesperson confirmed the plan to put the updated chip into production in 2024, saying it would work in coordination with the hundreds of thousands of off-the-shelf graphics processing units (GPUs) – the go-to chips for AI – the company was buying.

“We see our internally developed accelerators to be highly complementary to commercially available GPUs in delivering the optimal mix of performance and efficiency on Meta-specific workloads,” the spokesperson said in a statement.

Meta CEO Mark Zuckerberg last month said the company planned to have by the end of the year roughly 350,000 flagship “H100” processors from Nvidia, which produces the most sought-after GPUs used for AI. Combined with other suppliers, Meta would accumulate the equivalent compute capacity of 600,000 H100s in total, he said.

The deployment of its own chip as part of that plan is a positive turn for Meta’s in-house AI silicon project, after a decision by executives in 2022 to pull the plug on the chip’s first iteration.


The company instead opted to buy billions of dollars worth of Nvidia’s GPUs, which have a near monopoly on an AI process called training that involves feeding enormous data sets into models to teach them how to perform tasks.

The new chip, referred to internally as “Artemis,” can, like its predecessor, perform only a process known as inference, in which models are called on to use their algorithms to make ranking judgments and generate responses to user prompts.

Reuters last year reported that Meta is also working on a more ambitious chip that, like GPUs, would be capable of performing both training and inference.

The Menlo Park, California-based company shared details about the first generation of its Meta Training and Inference Accelerator (MTIA) program last year. The announcement portrayed that version of the chip as a learning opportunity.

Despite those early stumbles, an inference chip could be considerably more efficient at crunching Meta’s recommendation models than the energy-thirsty Nvidia processors, according to Patel.


“There is a lot of money and power being spent that could be saved,” he said. 

Tech

Microsoft to invest 2.2bn dollars in cloud and AI services in Malaysia


Microsoft (MSFT.O) said on Thursday it will invest $2.2 billion over the next four years in Malaysia to expand cloud and artificial intelligence (AI) services in the company’s latest push to promote its generative AI technology in Asia.

The investment, the largest in Microsoft’s 32-year history in Malaysia, will include building cloud and AI infrastructure, creating AI-skilling opportunities for 200,000 people, and supporting the country’s developers, the company said.

“We want to make sure we have world class infrastructure right here in the country so that every organisation and start-up can benefit,” Microsoft Chief Executive Satya Nadella said during a visit to Kuala Lumpur.

Microsoft will also work with the Malaysian government to establish a national AI Centre of Excellence and enhance the nation’s cybersecurity capabilities, the company said in a statement.


Prime Minister Anwar Ibrahim, who met Nadella on Thursday, said the investment supported Malaysia’s efforts in developing its AI capabilities.

Microsoft is trying to expand its support for the development of AI globally. Nadella this week announced a $1.7 billion investment in neighbouring Indonesia and said Microsoft would open its first regional data centre in Thailand.


Tech

Nvidia supplier SK Hynix says HBM chips almost sold out for 2025


South Korea’s SK Hynix (000660.KS) said on Thursday that its high-bandwidth memory (HBM) chips used in AI chipsets were sold out for this year and almost sold out for 2025 as businesses aggressively expand artificial intelligence services.

“The HBM market is expected to continue to grow as data and (AI) model sizes increase,” Chief Executive Officer Kwak Noh-Jung told a news conference. “Annual demand growth is expected to be about 60% over the mid to long term.”

SK Hynix, which competes with U.S. rival Micron (MU.O) and domestic behemoth Samsung Electronics (005930.KS) in HBM, was until March the sole supplier of HBM chips to Nvidia, according to analysts, who add that major AI chip purchasers are keen to diversify their suppliers to better maintain operating margins. Nvidia commands some 80% of the AI chip market.

Micron has also said its HBM chips were sold out for 2024 and that the majority of its 2025 supply was already allocated. It plans to provide samples for its 12-layer HBM3E chips to customers in March.


“As AI functions and performance are being upgraded faster than expected, customer demand for ultra-high-performance chips such as the 12-layer chips appears to be increasing faster than for 8-layer HBM3Es,” said Jeff Kim, head of research at KB Securities.

Samsung Electronics (005930.KS), which plans to produce its HBM3E 12-layer chips in the second quarter, said this week that this year’s shipments of HBM chips are expected to increase more than three-fold and that it has completed supply discussions with customers. It did not elaborate further.

Last month, SK Hynix announced a $3.87 billion plan to build an advanced chip packaging plant in the U.S. state of Indiana with an HBM chip line and a 5.3 trillion won ($3.9 billion) investment in a new DRAM chip factory at home with a focus on HBMs.

Kwak said investment in HBM differed from past patterns in the memory chip industry in that capacity is being expanded only after demand has been confirmed.

By 2028, chips made for AI, such as HBM and high-capacity DRAM modules, are expected to account for 61% of all memory volume in terms of value, up from about 5% in 2023, SK Hynix’s head of AI infrastructure Justin Kim said.

Last week, SK Hynix said in a post-earnings conference call that there may be a shortage of regular memory chips for smartphones, personal computers and network servers by the year’s end if demand for tech devices exceeds expectations.


The Nvidia (NVDA.O) supplier and the world’s second-largest memory chipmaker will begin sending samples of its latest HBM chip, called the 12-layer HBM3E, in May and begin mass producing them in the third quarter.


Tech

Qualcomm jumps as AI sparks rebound in Chinese smartphone market


Qualcomm (QCOM.O) shares rose 4% in premarket trading on Thursday after the smartphone-focused chipmaker signaled an AI-fueled rebound in demand, especially in China, after a two-year slump.

Sales to Chinese smartphone makers jumped 40% in the first half of its fiscal year, the company said on Wednesday, as buyers there gravitate toward higher-priced devices that can accommodate AI chatbots.

“Chinese vendors, who traditionally relied more on MediaTek, are going to start leveraging Qualcomm’s high-end chips more as they push hard into the AI agenda,” said IDC analyst Nabila Popal.

“They further represent an upside for Qualcomm because the majority of the recovery is also going to be driven by Chinese OEMs this year, coming from a tough last two years.”


Qualcomm on Wednesday projected third-quarter sales above estimates as it also benefits from its IoT (Internet of Things) and auto segments.

The company, the biggest supplier of smartphone chips, was on course to add more than $8 billion to its market value based on premarket movements. Other semiconductor firms such as Arm and Broadcom (AVGO.O) rose 2.8% and 2.4%, respectively.

According to preliminary data from research firm IDC, in the high-end segment the AI buzz and foldable products allowed Android smartphone vendors to further differentiate themselves from Apple (AAPL.O) and garner increased interest from Chinese consumers in the first quarter of 2024.

“We’re optimistic that numbers can be driven higher, given last year’s muted Android cycle and the likelihood of IoT (internet of things) improvement as inventory normalizes,” analysts at Wolfe Research said.

At least 14 analysts raised their price targets on Qualcomm, according to LSEG data.


Qualcomm’s shares have gained 13.5% this year following a 31.5% rise in 2023.

Shares of Apple, which is set to report earnings after the market closes on Thursday, were up 1.05% in premarket trading.



Copyright © GLOBAL TIMES PAKISTAN