The two firms’ high-profile exhibition reflects the vast demand for memory products in China, the world’s second-largest economy.
There are many ways we might build out the memory capacity and memory bandwidth of compute engines to drive AI, and ...
In an industry first, SK hynix has announced 16-Hi HBM3E memory offering up to 48GB per stack for AI GPUs, unveiled alongside other bleeding-edge NAND and DRAM products, with even larger capacities planned for the future. The company plans to deliver 16-layer HBM3E samples by early ...
SK Group Chairman Chey Tae-won said Monday that Nvidia's chief executive had asked SK hynix to speed up delivery of its newer, more advanced HBM4 chips by six months amid booming demand for AI hardware, and SK hynix is accelerating the launch of its next-generation AI memory chips in response.
At the SK AI Summit 2024, SK hynix CEO Kwak Noh-Jung unveiled the world's first 16-high 48GB HBM3E memory solution, pushing AI memory capabilities to unprecedented levels.