Intel today released a version of its SSD-based Cache Acceleration Software (CAS) for Linux servers, which it said can offer up to 18 times the performance for read-intensive applications, such as online transaction processing (OLTP) systems.
Additionally, Intel's CAS now supports caching to NAND flash (solid-state drives and PCIe cards) in storage arrays. And it supports VMware vMotion, which allows virtual machines to migrate between systems while keeping hot data in cache, regardless of which host they land on.
"The advantage is not just to provide better performance, but ensure no matter what happens, that performance remains consistent," said Andrew Flint, Intel's CAS product manager.
Intel acquired its CAS technology in September through its buyout of Canadian startup Nevex. Nevex sold the software as CacheWorks, but Intel quickly rebranded it as CAS.
The use of cache acceleration and management software for NAND flash memory is a hot market. More than a dozen vendors are shipping products, and acquisitions are on the rise. Earlier last year, SanDisk acquired FlashSoft for its flash cache acceleration and management software. That was followed by Samsung's buyout of Nvelo for its Dataplex SSD caching software. Caching software of this kind identifies data that is being read heavily and copies it to NAND flash, in the form of SSDs, to boost performance.
Intel announced CAS support for Windows systems in December. The latest release, for Linux, also allows admins to select the applications that will benefit from the higher-performance SSDs, or to let the CAS software automatically redistribute I/O-intensive data to the flash memory.
Intel said its CAS product can target hot data on back-end storage, such as a SAN, for both Windows and Linux machines and allow virtual machine migration while maintaining high I/O performance with flash cache.
"We took the problem of the I/O bottleneck from the side of accelerating applications," Flint said. "We have technology to direct performance to applications. Because we do that, we find most of our sales are to the DBAs and the app admins at companies.
"You don't have to rearchitect or configure your applications in any way, shape or form. You don't have to do anything on backend storage," Flint continued. "Neither end even knows the caching is happening. It sits in the middle, automatically identifies hot, active data, places a copy on high-speed media and the applications by extension go faster."
Flint said that on standard databases, the caching software can triple performance. On OLTP applications, which are more read-intensive, performance can jump 18-fold, he said.
While application servers already perform a certain amount of caching in volatile DRAM, the amount of caching is limited by the 4GB to 8GB of memory typically on board a server. Intel's CAS software takes advantage of higher-capacity SSDs, which can hold as much as a terabyte, to improve performance for far more data.
Performance improvement varies from system to system, depending on the ratio of back-end data to active data on the server and on whether the workload is read- or write-intensive; CAS is a read-acceleration cache, so write-heavy workloads see less benefit.
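As a rough illustration of that dependence, the average I/O latency with a read cache can be modeled as a weighted mix of flash and back-end latency. This is a back-of-the-envelope sketch with assumed latency figures, not Intel's benchmarking methodology:

```python
def read_cache_speedup(hit_rate, read_fraction,
                       backend_latency_us=5000.0, ssd_latency_us=100.0):
    """Estimate the overall I/O speedup from a read-acceleration cache.

    hit_rate:      fraction of reads served from flash (roughly, how much of
                   the active data fits on the SSD)
    read_fraction: share of I/Os that are reads
    The latency figures are illustrative assumptions, not measured values.
    """
    # Reads split between fast cache hits and slow back-end misses.
    read_latency = hit_rate * ssd_latency_us + (1 - hit_rate) * backend_latency_us
    # Writes are not accelerated by a read cache; they still pay back-end latency.
    avg_latency = read_fraction * read_latency + (1 - read_fraction) * backend_latency_us
    return backend_latency_us / avg_latency

# A read-heavy, cache-friendly workload sees a large gain; a write-heavy or
# cache-unfriendly one sees much less.
print(read_cache_speedup(hit_rate=0.98, read_fraction=0.95))  # roughly 11x
print(read_cache_speedup(hit_rate=0.50, read_fraction=0.50))  # roughly 1.3x
```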