January 13, 2020 By Bradley Knapp 6 min read

Persistent memory is a new technology that provides ultra-fast memory, and in this video, I’m going to explain how it works and how it relates to other storage options.

Using the idea of a “storage pyramid,” I’ll explain the relationship between cost and performance when it comes to the different storage methods, including tape, HDD, SSD, PCIe, persistent memory (PMEM), and RAM. I’ll also take a look at how persistent memory works in both Memory Mode and App Direct Mode.

Make sure you subscribe to our YouTube channel to see more lightboarding videos like this one!

Video Transcript

What is persistent memory (PMEM)?

Hey guys, welcome to the channel; my name is Bradley Knapp from IBM Cloud, and I wanted to talk with you a little bit about persistent memory.

Persistent memory is a new technology—it just came onto the market this last spring, so the spring of 2019—and it’s ultra-fast memory.

The storage pyramid

So if we think about our storage pyramid—we’ve got a pyramid over here. And a storage pyramid, I like to draw it out this way because we’ve kind of got two arrows, right? 

As you go up the storage pyramid like this, the cost goes up. 

And as you go down the storage pyramid like this one, your performance goes down.


And, so keeping this kind of storage pyramid in mind—down here at the bottom this is tape. Tape is still around; tape isn’t going anywhere anytime soon.


The next level up from a performance perspective—slightly more expensive but more performant as well—is when you get into our good old fashioned hard disk drives, right—the spinning disks. 


Next level up from that is where you’re gonna get into your SSDs, right? Your different SSD form factors—U.2, M.2, NVMe, all of the different letters.


The next level up in performance (but again, adding cost) is gonna be PCIe drives.


And then the next level up—this is the one that we’re talking about today, this is PMEM (persistent memory).


And then up at the very top of our pyramid, this one right here—that’s RAM. 

The relationship between cost and performance

So, as you go up, the cost goes up—but so does the performance. Why? Well, it’s because the access times go down, the seek time goes down, and the bandwidth goes up.

So, tape takes a long time to get the data to and from the processor; hard disks take less time, and SSDs less still, right? These are limited by a number of different factors—hard disks and SSDs have gotta talk back and forth through a RAID card, going through the PCI bus.

This next level up, a PCIe drive, goes right into the PCIe bus. So this could be an NVMe M.2 drive or one that goes in an actual PCIe slot itself. So again, faster than SSDs, faster than hard drives; same general technology as SSD (it’s still using NAND chips), but it’s getting to that processor faster.

PMEM—if we look over here—PMEM talks back and forth to the processor directly. You don’t have to go through a PCIe bus—you’re going through the memory bus, which, again, lower latency, higher bandwidth—so it’s much, much faster. 

And then at the very top of the pyramid, that’s RAM—that’s your traditional DRAM that is the fastest storage medium.

Memory Mode

And so if we come over here I want to talk a little bit about the two modes that we run in, right? The first mode is Memory Mode. So PMEM can be switched at the BIOS level into either of these modes. 

And so if we consider our processor, right? I’m just going to mark the processor with a P. 

Out of each processor, you get six memory channels—we didn’t draw all of them out here—and in each channel, you’re gonna get a DIMM—a RAM DIMM—and you’re going to get a PMEM DIMM.

And then as you go down, right—so that’s slot 0, and then in slot 1 you get a RAM DIMM again, and you get a PMEM DIMM. That repeats across channels 0, 1, 2, 3, 4, and 5 for each processor, right?

What makes PMEM valuable?

So, in a dual-socket server, you’re gonna end up with 12 sticks of RAM and 12 sticks of PMEM. What makes PMEM valuable? Well, it’s lower cost than RAM, with slightly lower performance than RAM, but it comes in much larger capacities.

So if you think about typical RAM DIMM sizes, right—you’ve got a 16 GB, you’ve got a 32, a 64, a 128, and now you’ve got 256s—but the cost goes up dramatically as you go up in these sizes.

On the PMEM side, you start with a 128 GB DIMM, and then you’ve also got a 256, and you’ve got a 512.

And so, if you’re putting 512s into this server, right, you have six 512s per processor, which is 3 terabytes of storage per processor.

So on a 2-socket server—2-processor server—you’re gonna actually have 6 terabytes of memory because when you’re running in memory mode, the RAM acts as cache and the PMEM acts as your RAM. So you’ve got 2 sockets, 6 terabytes of RAM.
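The capacity math above can be sketched in a few lines, assuming the example configuration from the video (512 GB PMEM DIMMs, one per channel, six channels per processor, two sockets):

```python
# Back-of-the-envelope Memory Mode capacity math, per the example above.
PMEM_DIMM_GB = 512       # largest PMEM DIMM size mentioned
CHANNELS_PER_SOCKET = 6  # memory channels per processor
SOCKETS = 2              # dual-socket server

per_socket_tb = PMEM_DIMM_GB * CHANNELS_PER_SOCKET / 1024  # GB -> TB
total_tb = per_socket_tb * SOCKETS

# In Memory Mode, the RAM DIMMs act as cache, so only the PMEM
# capacity is visible to the operating system as system memory.
print(f"{per_socket_tb:.0f} TB per socket, {total_tb:.0f} TB total")
```

Running this prints `3 TB per socket, 6 TB total`, matching the figures above.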

App Direct Mode

In App Direct Mode, same kind of idea, right? You’ve got your processor, you’ve got your RAM, and then you’ve got those PMEM DIMMs. 

But what makes this different? So in App Direct Mode, rather than the PMEM operating as RAM, it operates as storage—it’s a persistent storage, right?

And so your RAM is just your RAM—that’s what counts as memory—and then you can lay a namespace on top of this PMEM. You can put a filesystem on top of it, but because it’s talking back and forth through the memory bus, it’s ultra-high performance.
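Once a filesystem sits on top of a PMEM namespace, applications typically memory-map files and access them with plain loads and stores. Here’s a minimal sketch of that access pattern using Python’s standard `mmap` module; in production you’d use a DAX-mounted PMEM path (the `/mnt/pmem`-style path is an assumption, so this sketch uses an ordinary temporary file as a stand-in, and real persistent-memory code would usually use PMDK’s libpmem rather than raw `mmap`):

```python
# App Direct-style access pattern: memory-map a file and read/write it
# with byte-addressable operations instead of read()/write() syscalls.
# On a real DAX-mounted PMEM filesystem, these stores go over the
# memory bus; here a temp file stands in for a PMEM-backed file.
import mmap
import os
import tempfile

path = os.path.join(tempfile.mkdtemp(), "pmem-demo")  # stand-in for a PMEM file
size = 4096

# Create and size the backing file, as you would on a PMEM namespace.
with open(path, "wb") as f:
    f.truncate(size)

with open(path, "r+b") as f:
    with mmap.mmap(f.fileno(), size) as m:
        m[0:5] = b"hello"  # byte-addressable write into the mapping
        m.flush()          # analogous to flushing CPU caches for persistence

with open(path, "rb") as f:
    print(f.read(5))  # b'hello'
```

The point of the pattern is that the application touches storage at memory speed, with no block-I/O path in between—which is exactly what App Direct Mode exposes.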

Where is App Direct Mode important? This is your in-memory databases; this is your big data workloads—this is where you’re really looking to take advantage of having an insanely fast connection between your storage and your processor so that you can write back and forth very easily.

So that’s kind of an overview—so you’ve got your in-memory database, like SAP HANA and your big data workloads like Hadoop.

And if you want to learn more about this, go ahead and hit the links in the comments, and we’ll take you through kind of an individual use-case-level description.
