Sandisk and SK hynix push High Bandwidth Flash (HBF) standard via OCP to cut AI inference costs and boost scalability.
The shift from training-focused to inference-focused economics is fundamentally restructuring cloud computing and forcing ...
In line with this, SK hynix and Sandisk are proactively pursuing the standardization and commercialization of HBF solutions, drawing on their design, packaging, and mass-production experience in HBM and NAND.
- Interactive LLMs (chat, copilots, agents) with strict latency targets
- Long-context reasoning (codebases, research, video) with massive KV (key-value) cache footprints
- Ranking and recommendation models ...
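To see why long-context workloads put so much pressure on memory, it helps to size the KV cache directly. The sketch below uses the standard formula (two tensors, K and V, per layer); the model configuration is a hypothetical Llama-style setup chosen for illustration, not a figure from any vendor cited above.

```python
def kv_cache_bytes(layers: int, kv_heads: int, head_dim: int,
                   seq_len: int, batch: int = 1, dtype_bytes: int = 2) -> int:
    """Size of the attention KV cache in bytes.

    The factor of 2 accounts for the separate key and value tensors
    stored per layer; dtype_bytes=2 assumes fp16/bf16 storage.
    """
    return 2 * layers * kv_heads * head_dim * seq_len * batch * dtype_bytes

# Hypothetical config: 32 layers, 8 KV heads (grouped-query attention),
# head_dim 128, fp16, a single 128k-token context.
size = kv_cache_bytes(layers=32, kv_heads=8, head_dim=128, seq_len=131072)
print(f"{size / 2**30:.0f} GiB")  # prints "16 GiB"
```

Even with grouped-query attention shrinking the KV head count, one long-context session here consumes 16 GiB of cache on top of the model weights, which is why per-request memory, not compute, increasingly dictates inference cost.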
Upstart's 5th-gen RDU aims to undercut Nvidia's B200 on speed and cost. AI infrastructure company SambaNova has raised $350 ...
Innodisk, a leading provider of industrial-grade memory solutions, announced the CXL Add-in Card (AIC), a major addition to its CXL product p ...
Micron Technology is poised for explosive growth, driven by surging AI demand and its dominant position in high-bandwidth memory for leading GPUs. MU's HBM products are sold out through 2025, with ...
An analog in-memory compute chip claims to solve the power/performance conundrum facing artificial intelligence (AI) inference applications by enabling energy-efficiency and cost reductions ...
“The rapid growth of LLMs has revolutionized natural language processing and AI analysis, but their increasing size and memory demands present significant challenges. A common solution is to spill ...
Microsoft's new Maia 200 inference accelerator enters this overheated market with a chip that aims to cut the price ...
FriendliAI also offers a unique take on the current memory crisis hitting the industry, especially as inference becomes the dominant AI use case. As recently explored by SDxCentral, 2026 is tipped to ...