With an update to its DB2 database on the way, IBM is pitching its SmartCloud infrastructure as a service (IaaS) for data reporting and analysis.
"We're the only player in the marketplace that has [a cloud service] for data-in-motion -- being able to analyze data in real time," said Bob Picciano, IBM's general manager for Information Management.
In the second half of this year, the IBM SmartCloud IaaS will begin using version 10.5 of IBM's DB2 database, which should be generally available by early June. One new set of technologies that will come with this release, collectively called BLU Acceleration, can speed data analysis by 25 times or more, IBM claimed.
IBM also announced that SmartCloud can now run copies of SAP's HANA in-memory database, initially for test and development jobs.
BLU (a code name that stood for Big data, Lightning fast, Ultra easy) is a set of technologies that optimize the speed of DB2 queries for in-memory databases. It offers columnar processing, in which only the columns a query needs are read, speeding performance. It skips over unneeded data, such as duplicate entries. It can spread a single query across multiple processor cores, using parallel vector processing techniques. It also includes a compression technology that minimizes the space a data set occupies while keeping it readily accessible for quick analysis.
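To make the columnar and data-skipping ideas concrete, here is a minimal conceptual sketch in Python. It is not IBM's code, and the table, block size and values are made up for illustration; it simply shows how storing data by column and keeping per-block min/max metadata lets a query read only the columns and blocks that could possibly match.

```python
# Conceptual sketch (not IBM's implementation) of two BLU-style ideas:
# columnar storage and data skipping via per-block min/max metadata.
import numpy as np

BLOCK = 1024  # rows per block; arbitrary choice for the sketch

# A tiny "table" stored column by column instead of row by row.
sales_amount = np.random.randint(0, 10_000, size=100_000)
sales_region = np.random.randint(0, 50, size=100_000)

# Per-block synopsis: the min and max of each block of the amount column.
blocks = [sales_amount[i:i + BLOCK] for i in range(0, len(sales_amount), BLOCK)]
synopsis = [(b.min(), b.max()) for b in blocks]

def total_large_sales(threshold):
    """Sum amounts above a threshold, touching only the column and blocks needed."""
    total = 0
    for block, (lo, hi) in zip(blocks, synopsis):
        if hi < threshold:
            continue  # data skipping: no row in this block can match
        total += block[block > threshold].sum()  # vectorized scan of one column
    return total

print(total_large_sales(9_500))
```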
A customer can continue to use its preferred business intelligence (BI) software and simply redirect the SQL and OLAP (online analytical processing) queries from that software to the IBM service. Each DB2 instance on SmartCloud can run on up to 16 processor cores, which collectively could manage a terabyte or more of memory.
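In practice, redirecting queries mostly means pointing the existing tools at a different connection. The snippet below is a hypothetical illustration using the ibm_db Python driver for DB2; the hostname, credentials and table name are placeholders rather than details IBM disclosed.

```python
# Hypothetical example of sending a BI-style SQL query to a hosted DB2 instance
# with the ibm_db driver. Endpoint, credentials and schema are placeholders.
import ibm_db

conn_str = (
    "DATABASE=BLUDB;"
    "HOSTNAME=example-db2.smartcloud.example.com;"  # placeholder endpoint
    "PORT=50000;"
    "PROTOCOL=TCPIP;"
    "UID=db2user;"
    "PWD=secret;"
)

conn = ibm_db.connect(conn_str, "", "")

# The same aggregate query a BI tool might generate, simply pointed at the
# cloud-hosted database instead of an on-premises one.
stmt = ibm_db.exec_immediate(
    conn,
    "SELECT region, SUM(amount) AS total FROM sales GROUP BY region"
)

row = ibm_db.fetch_assoc(stmt)
while row:
    print(row["REGION"], row["TOTAL"])
    row = ibm_db.fetch_assoc(stmt)

ibm_db.close(conn)
```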
"You don't need a lot of cores to be able to take advantage of this kind of parallelism, because we're so efficient in how we're using the processors and memory," Picciano said.
With this technology, a customer could set up a system that analyzes data as it comes off the wire, Picciano said. Typically, organizations store their data in a data warehouse or, if it is collected in sufficient quantities, in a Hadoop cluster, and then use various BI tools to extract patterns and other intelligence from the data set. IBM proposes keeping the data in memory and analyzing it on the fly, using the same BI tools. The in-memory approach can cut query times by a factor of a thousand, Picciano said.
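The difference between the two approaches can be sketched in a few lines. The following toy example, with a simulated event feed, shows the "analyze as it arrives" pattern: aggregates are kept in memory and updated per event, so a query against them reflects the latest data without waiting for a batch load into a warehouse or Hadoop cluster.

```python
# Rough sketch of in-memory, on-the-fly aggregation; the event source is simulated.
import random
from collections import defaultdict

def event_stream(n=10_000):
    """Simulated feed of (store_id, sale_amount) events 'coming off the wire'."""
    for _ in range(n):
        yield random.randint(1, 800), random.uniform(1.0, 500.0)

running_totals = defaultdict(float)

for store_id, amount in event_stream():
    running_totals[store_id] += amount  # updated in memory, no batch load step

# A query against this in-memory state sees current results immediately.
top5 = sorted(running_totals.items(), key=lambda kv: kv[1], reverse=True)[:5]
print(top5)
```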
"Queries that used to run in seven minutes on a well-known data warehouse systems can run in 8 milliseconds on our systems," Picciano claimed.
IBM joins a number of companies pitching cloud computing as an ideal platform for large-scale data analysis. Google, for instance, offers its BigQuery service for crunching data online. VMware, along with parent company EMC, recently spun out a cloud service subsidiary, called Pivotal, that will focus on providing data analysis services.
IBM is also offering to run copies of SAP HANA on its SmartCloud, where it can be used for data analysis. The company did not specify whether it will provide the HANA licenses for the service or customers will need to procure their own. One early customer of the service is the department store Globus, which tracks sales for about 800,000 items on a SmartCloud instance of HANA.
Picciano did not disclose pricing information for the SmartCloud database service, stating that IBM works with each customer on a case-by-case basis.
Other cloud providers also made announcements this week. Google has cut the prices of its Google Cloud Datastore, a NoSQL data storage service, by up to 25 percent. Also, Microsoft announced that it is expanding its Azure cloud service to China, making it the first major cloud provider to do so, the company claimed.