MemVerge CEO says big memory-based architecture will replace performance-tier storage

“I think the future of infrastructure will be multicloud and it will be memory-centric because of data-centric applications,” Charles Fan said.

This week, big memory software company MemVerge announced the release of its Memory Machine software version 1.2, which lets customers leverage up to 40 cores in 3rd Gen Intel Xeon Scalable processors (code-named Ice Lake) and up to 6TB of memory capacity per socket with Intel Optane persistent memory 200 series.

Charles Fan, CEO of MemVerge, said Memory Machine software version 1.2 is designed to allow application vendors and end-users to take full advantage of Intel’s latest Xeon processor and Optane memory technology.

“We started by providing access to new levels of performance and capacity without requiring changes to applications,” he said in an interview. 

Fan added that there is now an “expanding universe of new memory-centric applications that need a variety of different processors and tiers of memory to scale and operate efficiently.”

According to Fan, the CXL open interconnect standard will soon be the “defining technology” for the future of the big memory industry, serving as the heart of a new ecosystem of processors that share heterogeneous memory.

The release of Memory Machine software version 1.2 coincides with the launch of Intel’s 3rd Gen Xeon Scalable processors, which many in the industry refer to by the code name “Ice Lake.”

“The product we created was called Memory Machine and we originally shipped the first version, version 1.0, in September of 2020, about seven months ago. This latest version has official support for the new Intel Ice Lake platform,” Fan said. 

“Our belief is that data-centric applications are driving the rise of memory-centric infrastructure. Essentially, they’re demanding an infrastructure that allows the application to process data at larger quantities and at higher speed. Only memory-centric infrastructures can allow them to do so, and we are helped by the emergence of persistent memory, which has higher capacity, lower cost and also persistence that allows interesting data services to be developed on top of it.”

Fan explained that memory-centric infrastructures are more cost-effective and can address the bottlenecks that many enterprises face, among other benefits. With features like snapshots, organizations can recover from crashes more seamlessly.

One cloud service provider working with MemVerge uses Memory Machine to increase the memory available for the computing file service it provides to its customers. Another company, working in bioscience on genomic sequencing and analytics, uses MemVerge’s tools essentially like a DVR.

“For their pipelining, they need to go through various stages, so we can increase the speed of the processing by eliminating the storage I/O. But instead of just doing snapshotting, our tools allow them to instantly roll back to any of the previous stages to perform whatever analysis they need, as well as to rerun things to reproduce the results,” he said. 
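Fan didn't describe Memory Machine's programming interface, so as a rough mental model only, here is a minimal Python sketch of the stage-level snapshot-and-rollback pattern he describes. Everything in it (the class, the stage names, the deep-copy "snapshots") is invented for illustration and is not MemVerge's API.

    import copy

    class SnapshottingPipeline:
        """Toy DVR-style pipeline: keep an in-memory snapshot after each
        stage so any earlier stage can be revisited or rerun without
        touching storage. (Hypothetical illustration, not MemVerge's API.)"""

        def __init__(self, stages):
            self.stages = stages      # ordered list of (name, function) pairs
            self.snapshots = {}       # stage name -> state captured after it ran

        def run(self, state):
            for name, fn in self.stages:
                state = fn(state)
                # Deep copy stands in for a snapshot here; persistent-memory
                # snapshots are near-instant and avoid this copying cost.
                self.snapshots[name] = copy.deepcopy(state)
            return state

        def rollback(self, name):
            # Return the state exactly as it was after the named stage.
            return copy.deepcopy(self.snapshots[name])

    # Purely illustrative three-stage "sequencing" pipeline.
    pipeline = SnapshottingPipeline([
        ("align",    lambda reads: [r.upper() for r in reads]),
        ("dedupe",   lambda reads: sorted(set(reads))),
        ("annotate", lambda reads: [(r, len(r)) for r in reads]),
    ])

    final = pipeline.run(["acgt", "ACGT", "ttga"])
    redo_point = pipeline.rollback("dedupe")  # jump back and rerun from here

The point of persistent memory in this picture is that the snapshot step becomes a near-instant in-memory operation rather than an expensive copy, which is what makes rolling back to any earlier stage cheap enough to use routinely.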

“I believe this trend of big memory will have a profound impact on data center architecture and on the entire industry, which I think will be memory-centric because of the data-centric applications. The underlying resources will become more heterogeneous and will be consumed through software abstractions by the application,” he added. 

“Over the next 10 years, a big memory-based architecture will gradually replace the performance-tier storage we’re in today. We are seeing a rise of real-time applications that are demanding real-time processing of data at large quantity. We essentially are delivering the software-defined memory for a multi-cloud future.”
