NVM Express boosts computational storage standards

The NVM Express consortium has announced additions to its standard that will allow datacentre NVMe flash-based storage systems to take advantage of computational storage and tap into local memory resources.

That will come in the form of two new command sets: NVM Express’s Computational Programs and Subsystem Local Memory.

These command sets are designed to enable NVMe devices to process data within the NVM subsystem directly and to access local memory without intermediaries.

The Computational Programs command set puts in place a framework for customers to shift compute loads from the host to the storage device. That shift is central to the idea of computational storage and allows for more rapid data processing closer to where it is stored. According to NVM Express, it will provide “a host-driven, modular approach to computational programs”.
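As a rough sketch of that host-driven model – using made-up class and function names rather than the command set’s actual opcodes or structures – the flow amounts to the host downloading a program to a compute resource on the device, then asking the device to run it against data it already holds so that only the result crosses the bus:

```python
# Illustrative model of host-driven computational storage offload.
# All names here are hypothetical stand-ins, not the NVMe Computational
# Programs command set's real commands or data structures.

class ComputeNamespace:
    """Simulates a device-side compute resource that can hold and run a program."""

    def __init__(self):
        self.program = None

    def load_program(self, program):
        # Host downloads a program (e.g. a filter) to the device.
        self.program = program

    def execute(self, data: bytes) -> bytes:
        # Device runs the program against data held in the NVM subsystem,
        # so only the (usually much smaller) result crosses the transport.
        return self.program(data)


def count_matches(data: bytes) -> bytes:
    # Example offloaded workload: count occurrences of a byte pattern.
    return data.count(b"error").to_bytes(8, "little")


if __name__ == "__main__":
    device = ComputeNamespace()
    device.load_program(count_matches)             # host -> device: program download
    stored_log = b"ok\nerror\nok\nerror\n" * 1000  # data already resident on the device
    result = device.execute(stored_log)            # host -> device: execute request
    print(int.from_bytes(result, "little"))        # only 8 bytes return to the host
```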

The core aim of computational storage, which falls under the umbrella of so-called Near Data Processing (NDP), is to reduce the need to move data – such as for edge compute workloads – and to improve response times, in particular for latency-sensitive applications such as databases and AI processing. Spin-off benefits can include reduced energy usage and network bandwidth consumption, as well as enhanced security.

Meanwhile, the Subsystem Local Memory command set is designed to allow NVMe hardware to access local memory directly. This is often a requirement of computational storage, which can make use of local memory as a source of data and as a target to output the results of processing. NVM Express said it will provide “the ability to access memory in an NVM subsystem via computational programs and via an NVMe transport, accessible through NVMe I/O commands”.
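A similarly loose sketch of the local-memory idea, again with hypothetical names rather than the specification’s real commands: the subsystem exposes a byte-addressable region that computational programs write results into and that the host reads back with offset-and-length, I/O-style commands:

```python
# Illustrative model of subsystem local memory (SLM): a device-side buffer
# that computational programs use as input and output, and that the host
# reads with offset/length commands. Names are hypothetical, not NVMe's.

class LocalMemory:
    """Simulates a byte-addressable memory region inside the NVM subsystem."""

    def __init__(self, size: int):
        self.buf = bytearray(size)

    def write(self, offset: int, data: bytes) -> None:
        # A computational program deposits its output here.
        self.buf[offset:offset + len(data)] = data

    def read(self, offset: int, length: int) -> bytes:
        # The host retrieves results via an I/O-style read command.
        return bytes(self.buf[offset:offset + length])


if __name__ == "__main__":
    slm = LocalMemory(4096)
    slm.write(0, b"result: 2000 matches")  # program output lands in SLM
    print(slm.read(0, 20).decode())        # host pulls just the result back
```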

Computational storage puts processing onto the storage subsystem and aims to offer far greater efficiency, especially where data volumes are growing rapidly – from, for example, the proliferation of sensors and the internet of things (IoT) – or where rapid processing is needed for artificial intelligence (AI) and machine learning (ML) use cases. Other applications include encryption and decryption, data compression and deduplication, and storage management.

Only a handful of suppliers so far offer computational storage hardware, although a larger number are part of the Storage Networking Industry Association’s (SNIA) working group. According to SNIA, “computational storage solutions typically target applications where the demand to process ever-growing storage workloads is outpacing traditional compute server architectures.”

“NVM Express Computational Storage is part of our efforts to help enterprises and hyperscale datacentres meet the ever-evolving demands of the storage industry,” said Bill Martin, NVMe Computational Storage task group co-chair and board member.

“Computational Storage is a standardised approach that enables an open, interoperable ecosystem. By offloading compute to the device, we anticipate that these industries will experience reduced total cost of ownership and overall performance boosts.” 

Despite the much-vaunted benefits of computational storage, significant challenges remain. Only a small number of suppliers are working on computational storage, and their approaches vary significantly, which means products are not yet interchangeable. Meanwhile, existing applications may need to be refactored to integrate with computational storage systems.
