
Navigating the Memory Supply Chain Shake-Up
Crazy times in our industry. Micron’s announcement that it will abandon the consumer market and shut down Crucial, its consumer business, worldwide by February 2026 came as a shock to the entire market.
Yes, the supply chain is tight, but this really marks a tipping point in the industry: it’s no longer customers choosing the memory; it’s the memory manufacturers choosing their customers and markets.
TechSpot summarized what is currently happening in the market. If you don’t follow it closely, this is a good read to understand the complex dynamics that have led to the current situation.
In our latest webinar, we took a deeper dive into the DRAM and NAND market outlook and supply dynamics for 2026 and beyond. If you missed it, you can watch the recording here.
The outlook might be a bit gloomy, but as Nikolaos Florous pointed out: Without a plan, there is no support!
Reach out with any questions you have. We do our best to help you navigate the supply chain shake-up.
Enjoy the read, and happy holidays.
Webinar: Memory Market Trends 2026 and Beyond
There’s no denying the truth: The memory market’s current super-cycle is expected to continue at least through 2026, driven by robust demand from AI servers, HBM (High Bandwidth Memory), and the migration to DDR5.
In our last webinar, we took an honest look at the good, the bad, and the ugly in the memory market. We explained the background behind the current supply tightness and why it will persist until late 2026.
Missed the live event? You can watch the recording here.


To Take the Order or Not To Take the Order
The semiconductor memory industry is caught between soaring prices and shrinking supply. This environment presents a genuine ethical dilemma at the intersection of business deals, responsibility, and reputation, as Alistair Jones from Intelligent Memory explains in this article.
He highlights that there is a deeper question about responsibility in a cyclical, capital-intensive industry to which there is no easy answer.
Read all about it here.
New 8Gb DDR4 DRAM from Winbond
Winbond has released a new 8Gb DDR4 DRAM for TVs, servers, networking equipment, industrial PCs, and embedded applications. The device supports data rates of up to 3600 Mbps, surpassing existing DDR4 standards and enabling faster data transfer for high-speed computing applications.
Built on Winbond’s 16nm technology, the smaller die size increases capacity within the same package and helps reduce overall system cost.
Read more here.


NAND with 96% Reduced Power Consumption
Conventional NAND flash stores data by injecting electrons into each cell, and to boost capacity, manufacturers stack more layers of these cells. However, taller stacks require higher voltages to push signals through, driving up read/write power consumption.
By combining oxide semiconductors with ferroelectric polarization control, Samsung was able to sharply reduce the operating voltage of NAND cell strings.
Read more here.
Top Stories 2025
December is a time to look back. We regularly share stories on LinkedIn, and some hit a nerve. Here are the top stories this year:
- Our April Fools' joke turned out to be scarily accurate.
- Is in-memory compute still alive? Or is it really “near memory” compute?
- How will DRAM wafer production develop by family? An insight from one of our webinars.
- You obviously enjoy our memes, or maybe it’s just the topic? Here are your two highlights: DDR4 Supply and What do you Want


Neumonda Octopus DRAM Test Board Elektra Finalist
The Neumonda Octopus Test Board is an Elektra Award 2026 Finalist. As the organizers said: “This year’s finalists represent the very best in innovation, leadership, and excellence across the global electronics industry. Being named a finalist is an outstanding achievement.”
Unlike other testers, Octopus can test multiple technologies, including DDR4, DDR5, LPDDR4, and LPDDR5, while simulating the target application. It is also the only test board that can test at speeds of up to 6.4 gigabits per second, the speeds needed for AI workloads.
Read more here.
Like our stories? Then sign up for our newsletter and get it directly in your inbox.
