Nvidia has asked SK hynix to move up its delivery timeline for next-generation HBM4 memory chips by six months, according to ...
Nvidia currently uses SK hynix's HBM3E memory for its AI chips and plans to use HBM4 in its upcoming Rubin R100 AI GPU.
In an industry first, SK hynix has announced its 16-Hi HBM3E memory, offering capacities of 48GB per stack alongside other bleeding-edge NAND/DRAM products.
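For context on the 48GB figure, here is a minimal back-of-the-envelope sketch. It assumes each of the 16 stacked DRAM dies is a 24Gb (3GB) part; that per-die density is an assumption used for illustration, not a detail stated in the report above.

```python
# Minimal sketch of the capacity math behind a 16-Hi HBM3E stack.
# Assumption (not stated above): each DRAM layer is a 24 Gb (3 GB) die,
# which is what yields the 48 GB-per-stack figure.

DIE_DENSITY_GBIT = 24      # capacity of one DRAM die, in gigabits (assumed)
LAYERS = 16                # "16-Hi" means 16 DRAM dies stacked vertically

die_capacity_gb = DIE_DENSITY_GBIT / 8        # 24 Gb -> 3 GB per die
stack_capacity_gb = die_capacity_gb * LAYERS  # 3 GB x 16 layers

print(f"{LAYERS}-Hi stack: {stack_capacity_gb:.0f} GB")  # prints "16-Hi stack: 48 GB"
```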
Citing growing demand for high-capacity, energy-efficient semiconductors, Nvidia CEO Jensen Huang asked SK hynix to ...
He lifted his earnings-per-share estimates for 2024 and 2025, citing likely strong demand for Nvidia's new Blackwell chip. Demand ...
Through Thursday, Nvidia had added $2.2 trillion in market value this year, while the S&P 500's value has climbed ...
High Bandwidth Memory 5 (HBM5) is intended to further increase the number of memory layers. This requires modern stacking technology ...
AMD's new Instinct MI350 series, shifting to TSMC's new 3nm process node, will bring a more level playing field in advanced process nodes, with Nvidia's new Rubin R100 AI GPU also made on ...
But Wall Street doesn't see Nvidia stopping here, with some analysts expecting it to surpass Apple and hit a $5 ...
Nvidia (NVDA) is on track to overtake Apple as the world's most valuable company, topping $3.5 trillion in market capitalization.
And analysts see demand continuing with the launch of Rubin, which will probably arrive at the end of 2025 or the beginning of 2026. Of note, Nvidia announced last year it would be ...