In a fresh and risky move, Nvidia (NVDA) plans to use low-power memory chips found in phones for its next wave of AI servers. The chips are called LPDDR and are set to replace the DDR5 chips used in most ...
Memory's role in handling the avalanche of data expected from future leading-edge applications such as automotive and artificial intelligence has driven product innovations from several companies, the ...
At the 2023 Hot Chips forum, alongside Intel's announcement of its data-center chip product, a recent report from Korea's TheElec noted that Samsung Electronics has announced its research ...
A new flash memory chip claims to offer real-time code execution for safety-critical applications serving domain and zonal controllers in semi-autonomous vehicles while exceeding the capability of ...
The move by Nvidia to use Low-Power Double Data Rate (LPDDR) memory chips, commonly found in smartphones and tablets, instead of the traditional DDR5 chips used in servers, is expected to cause a ...
In context: Low-power double data rate (LPDDR) is a standard for low-power SDRAM memory chips designed for mobile applications and power-constrained devices. Despite consuming less energy than ...
Samsung has developed the industry’s fastest LPDDR5X DRAM, capable of reaching data transfer speeds of up to 10.7 Gbps. Optimized for AI applications, the chip offers improved performance, power ...
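The 10.7 Gbps figure quoted above is a per-pin data rate, so the usable bandwidth depends on how many pins (the bus width) a given configuration exposes. As a rough sketch of that conversion — the 64-bit bus width below is an illustrative assumption, not something stated in the article:

```python
# Back-of-the-envelope memory bandwidth from a per-pin data rate.
# Note: the bus width used in the example is an assumption for
# illustration; real LPDDR5X packages vary in channel count/width.

def bandwidth_gbytes_per_s(per_pin_gbps: float, bus_width_bits: int) -> float:
    """Aggregate bandwidth in GB/s: per-pin rate times pin count,
    divided by 8 bits per byte."""
    return per_pin_gbps * bus_width_bits / 8

# Example: a hypothetical 64-bit-wide LPDDR5X configuration
# at the quoted 10.7 Gbps per pin.
print(bandwidth_gbytes_per_s(10.7, 64))  # 10.7 * 64 / 8 = 85.6 GB/s
```

Doubling the assumed bus width doubles the result, which is why headline per-pin speeds alone do not determine server memory throughput.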
Modern cars are among the most complex electronic systems in the world. As software comes to dominate more of the system, the ...
BEIJING, Nov 19 (Reuters) - Nvidia's (NVDA.O) move to use smartphone-style memory chips in its artificial intelligence servers could cause server-memory prices to double by late 2026, ...
This week, Google unveiled its AlphaChip reinforcement learning method for designing chip layouts. The AlphaChip AI promises to substantially speed up the design of chip floorplans and make them more ...