Samsung says its CXL "Memory-semantic" SSDs are up to 20x faster in random performance


Earlier today at the Flash Memory Summit 2022 (FMS2022) event, Samsung unveiled its new CXL-based "Memory-semantic" SSDs. While SSDs with a DRAM cache are not new, these Memory-semantic SSDs will, as the name suggests, utilize the super-fast CXL (Compute Express Link) interface.

As such, Samsung promises a 20x, or 1900%, improvement in random read speed and latency. This is exciting since random performance is generally the Achilles' heel of most SSDs, especially DRAM-less SATA-based ones.
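As a sanity check on those figures, a 20x speedup is the same as a 1900% improvement, since the baseline itself counts as the first 100%. A minimal sketch of that conversion (the helper name is ours, not Samsung's):

```python
def speedup_to_percent_improvement(multiplier: float) -> float:
    """Convert a speedup multiplier (e.g. 20x) to a percentage improvement."""
    return (multiplier - 1) * 100

print(speedup_to_percent_improvement(20))  # 1900.0
```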

In the announcement press release for FMS2022, Samsung says:

Samsung announced its ‘Memory-semantic SSD’ that combines the benefits of storage and DRAM memory. Leveraging Compute Express Link (CXL) interconnect technology and a built-in DRAM cache, Memory-semantic SSDs can achieve up to a 20x improvement in both random read speed and latency when used in AI and ML applications. Optimized to read and write small-sized data chunks at dramatically faster speeds, Samsung’s Memory-semantic SSDs will be ideal for the growing number of AI and ML workloads that require very fast processing of smaller data sets.

Here is an image of a Memory-semantic SSD:

CXL based Memory semantic SSD from Samsung

For now, at least, these SSDs and the CXL interface in general are meant only for high-performance computing (HPC), which includes large-scale processing of AI and ML data, among other workloads. However, it will be interesting to see if the standard comes to consumers in the future and what it could do in tandem with Microsoft's new DirectStorage API.

Source: Samsung
