Intel's New Cascade Lake Processors With Optane DC Memory
Saanvi Araav
With Intel's new Cascade Lake processors, customers will need to spend time matching them to their applications and working out how best to deploy Optane DC memory. But the new processors deliver real benefits.
Intel's announcement introduced a raft of new and enhanced products aimed at consolidating its position in the corporate data centre. Top of the list was a tweaked and extended range of 2nd gen Xeon Scalable processors, together with Optane DC memory.
But that announcement was back in April. Since then, have the new parts actually made it into the data centre, and how do they perform?
The processors
Let's start with the processors, and it is important to note that the 2nd gen Intel Xeon Scalable processors are not the big leap some people expected. The new architecture changes very little compared with the 1st gen and uses the same 14nm process. However, Intel has tweaked the clock rates on some SKUs, given others more cores, and all of them support DDR4 RAM. More importantly, they add the technology needed to benefit from Optane DC memory, accelerate AI processing, and handle security at the hardware level.
The servers
Because they fit into the same sockets as the 1st gen processors, it is no surprise that Supermicro offers the 2nd gen CPUs across its entire server range. So the servers here are ones we have seen before: a 1U dual-socket Supermicro Ultra and a 2U Supermicro TwinPro. Both motherboards use the LGA 3647 socket, so they can take either 1st or 2nd gen processors with little or no modification.
The memory
One advantage of Optane DC is that it comes in a standard DIMM form factor, with capacities of up to 512GB per module. Intel says that means up to 36TB of memory per server at a relatively low cost, which sounds great - but there are catches.
First, Optane DC requires a new type of memory controller that only comes with the 2nd gen processors, so you must swap out the processors before plugging in Optane DC. You can fit up to six Optane modules per socket (six 512GB modules give 3TB of Optane capacity per socket), and each socket also requires at least one conventional DRAM DIMM. In Memory Mode, that DRAM is used as a high-speed data cache in front of the Optane modules.
The DRAM cache doesn't count towards the memory capacity, but it allows the Optane modules to deliver near-DRAM performance for applications with predictable data-usage patterns. Better still, this requires no changes to the operating system or software, which benefits applications such as large-scale data analytics.
For applications with less predictable data-usage patterns, however, Optane DC will perform more slowly than DRAM. And, just as with a DRAM-only setup, all data is lost if the server loses power.
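Because Memory Mode is transparent to software, nothing about the application changes - a process simply sees far more ordinary, volatile memory. As a minimal sketch (plain C, with the 384GB figure purely illustrative), a data-analytics buffer larger than any realistic DRAM fit-out is allocated exactly as it always was:

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    int main(void) {
        /* In Memory Mode the OS presents Optane capacity as ordinary
           RAM, so a very large allocation needs no special API.
           384GB is an illustrative size beyond a typical DRAM fit-out. */
        size_t size = 384ULL * 1024 * 1024 * 1024;
        unsigned char *buf = malloc(size);
        if (buf == NULL) {
            perror("malloc");
            return 1;
        }
        /* Touching the pages faults them in; hot pages are served from
           the DRAM cache, cold ones from the Optane modules. */
        memset(buf, 0, size);
        printf("Allocated %zu bytes as ordinary volatile memory\n", size);
        free(buf);
        return 0;
    }

On a Memory Mode system the hardware decides which pages sit in the DRAM cache and which stay on Optane, with no involvement from the application.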
Persistence pays
To get the benefit of Optane DC's persistence and better application support, you need to run it in App Direct mode. This mode allows applications to direct data to either persistent memory or DRAM. It also lets you use Optane DC as block storage, like an SSD, delivering high-speed storage without the usual bottlenecks.
The catch is that you need a hypervisor and/or an operating system that can distinguish between the different memory technologies. Currently, that means VMware vSphere 6.7 or Windows Server 2019. Applications also need to be updated to take advantage of the benefits.
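To give a flavour of what those application updates involve, here is a minimal sketch using libpmem from the open-source Persistent Memory Development Kit (PMDK) - one common route to App Direct mode, though not the only one. The path /mnt/pmem/log is an assumption, standing in for a file on a DAX-mounted filesystem backed by Optane DC:

    #include <stdio.h>
    #include <string.h>
    #include <libpmem.h>

    int main(void) {
        size_t mapped_len;
        int is_pmem;

        /* Map a file on a DAX filesystem backed by Optane DC in App
           Direct mode. /mnt/pmem/log is a hypothetical mount point. */
        char *addr = pmem_map_file("/mnt/pmem/log", 4096,
                                   PMEM_FILE_CREATE, 0666,
                                   &mapped_len, &is_pmem);
        if (addr == NULL) {
            perror("pmem_map_file");
            return 1;
        }

        /* Write through ordinary loads/stores - no block I/O involved. */
        strcpy(addr, "record survives power loss");

        /* Flush CPU caches so the update is durable on the media. */
        if (is_pmem)
            pmem_persist(addr, mapped_len);
        else
            pmem_msync(addr, mapped_len);

        pmem_unmap(addr, mapped_len);
        return 0;
    }

Build with gcc pmlog.c -o pmlog -lpmem. The point to note is that durability comes from pmem_persist flushing the CPU caches, not from the block-storage stack.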
Vendors such as SAP, for example, are already working to leverage Optane DC on their platforms.
Performance numbers
Ahead of the launch, Intel made much of the performance benefits of both the new Xeons and Optane DC memory, and engineers at Boston have since confirmed Intel's claims in their own testing.
Those improvements are well worth having. Still, the real performance monster is the Xeon Platinum 9200 series, which delivers much stronger results than any other 2nd gen Xeon Scalable processor.
Double jeopardy
Marketed as Advanced Performance parts, each 9200-series processor packs two Xeon Scalable 8200 dies into a single package. That doubles the processor cores - from 28 up to 56 on the 9282 SKU - and doubles the associated memory, accessed via 12 DDR4 channels.
AI Boost
Customers will still see big benefits from the other 2nd gen Xeon Scalable processors, especially when deployed with persistent Optane DC memory. For those looking to save money by using Optane DC rather than DRAM, though, that can mean lower performance compared with an all-DRAM configuration. How big the difference is depends on the application - and that is where Intel pulls its trump card, the Deep Learning Boost capability, which aims to speed up AI inferencing applications.
Unlike AI training, inferencing is less demanding: the data volumes can be much smaller, and the work can be done at lower numerical precision. Intel's Deep Learning Boost facilitates this by building on the AVX-512 units included in the Xeon Scalable architecture. With it, the 2nd gen processors can process 32 16-bit or 64 8-bit integers simultaneously using a single instruction, at the hardware level. Combined with support for fused operations, that can have a big impact on inferencing performance.
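To make that concrete, here is a minimal sketch - not Intel's benchmark code - showing how the AVX-512 VNNI instruction behind Deep Learning Boost is reached through compiler intrinsics. _mm512_dpbusd_epi32 multiplies 64 pairs of 8-bit integers and accumulates the products into sixteen 32-bit sums in a single instruction:

    #include <stdio.h>
    #include <immintrin.h>

    int main(void) {
        /* 64 unsigned 8-bit activations and 64 signed 8-bit weights,
           packed into one 512-bit register each. */
        __m512i acts    = _mm512_set1_epi8(3);
        __m512i weights = _mm512_set1_epi8(2);
        __m512i acc     = _mm512_setzero_si512();

        /* VNNI: multiply 64 u8*s8 pairs and accumulate each group of
           four products into one of sixteen 32-bit lanes - the core of
           an int8 inferencing dot product, in a single instruction. */
        acc = _mm512_dpbusd_epi32(acc, acts, weights);

        /* Each 32-bit lane now holds 4 * (3*2) = 24. */
        int out[16];
        _mm512_storeu_si512(out, acc);
        printf("lane 0 = %d\n", out[0]);
        return 0;
    }

Compile with gcc -march=cascadelake; the instruction only executes on VNNI-capable processors such as the 2nd gen Xeon Scalable range.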
Real-world numbers will, of course, differ from these figures, but there is a clear benefit here, achieved without additional GPUs - and that helps to reinforce Intel's data centre proposition.
The bottom line
So Intel's figures need to be treated with some caution, but there are real benefits to be had. They will not come automatically, though: customers will have to spend time and effort matching the new processors to their applications, and working out how best to deploy Optane DC memory.