SK Group, the parent of South Korean memory chipmaker SK Hynix Inc., said today that it plans to invest $74.6 billion in its chip business over the next three years, while setting aside an additional $58 billion, mainly for artificial intelligence-related technologies and shareholder returns.
The new investment is apparently in addition to the $90 billion the conglomerate had already set aside to build a new “mega fab complex” currently under construction in Gyeonggi Province.
The $58 billion investment will be used to build out SK Hynix’s AI semiconductor manufacturing capabilities so that it can “improve its competitiveness by focusing on its AI value chain,” Reuters reported. Part of that money will also go toward funding shareholder returns and streamlining the operations of SK Group’s 175 subsidiaries.
The announcement followed a two-day strategy session involving top executives from SK Group and officials from a number of affiliated organizations. One source familiar with the talks told Reuters that SK Group is considering various options, including mergers and divestments of some of its affiliated companies.
A spokesperson for SK Group told Reuters that the changes discussed during the session are part of the conglomerate’s “routine management activity,” aimed at helping the company pivot in response to a “changing business environment.”
In a later statement issued to the press, SK Group Chairman Chey Tae-won said that “pre-emptive and fundamental change is necessary” for the company to remain competitive. “In the U.S., the wind of AI-related change is so strong that there’s nothing to talk about except that,” he added.
The announcement comes after SK Hynix delivered its first loss in 10 years in fiscal 2022, losing almost $3 billion. However, the company has since recovered, thanks in part to an AI boom that has been a boon for memory chipmakers. SK Hynix, along with the U.S. chipmaker Micron Technology Inc., is one of just a handful of companies in the world able to design and build the high-bandwidth dynamic random-access memory chips needed to power advanced AI workloads. Whereas Nvidia Corp.’s graphics processing units provide the processing power for AI systems, those DRAM chips are just as vital, supplying the high-speed memory that such workloads need.