Intel, which has increasingly been customizing server chips for individual customers, is now tuning its chips for big data workloads.
Software is becoming an important building block in chip design, and customization will help applications gather, manage and analyze data much faster, said Ron Kasabian, general manager of big data solutions at Intel.
Through hardware and software improvements, the company is trying to figure out how its chips can perform better in areas such as predictive analytics, cloud data collection and specialized task processing. The company has already released its own distribution of Hadoop, an open-source framework for storing and processing large data sets across clusters of servers, and chip improvements are now on tap.
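For context, Hadoop jobs are typically expressed as map and reduce steps over distributed data. The following is a minimal, generic sketch of a Hadoop Streaming-style word count in Python; it illustrates the kind of batch workload Hadoop runs and is not specific to Intel's distribution (the file name and invocation are assumptions).

```python
#!/usr/bin/env python3
"""Generic Hadoop Streaming word count -- a sketch of the kind of batch
workload Hadoop runs; nothing here is specific to Intel's distribution."""
import sys

def mapper():
    # Emit "word<TAB>1" for every whitespace-separated token on stdin.
    for line in sys.stdin:
        for word in line.split():
            print(f"{word}\t1")

def reducer():
    # Hadoop sorts mapper output by key, so counts for a word arrive together.
    current, count = None, 0
    for line in sys.stdin:
        word, _, n = line.rstrip("\n").partition("\t")
        if word != current:
            if current is not None:
                print(f"{current}\t{count}")
            current, count = word, 0
        count += int(n)
    if current is not None:
        print(f"{current}\t{count}")

if __name__ == "__main__":
    # Run as: wordcount.py map  |  wordcount.py reduce
    mapper() if sys.argv[1:] == ["map"] else reducer()
```

On a cluster, this pair would be launched through the hadoop-streaming jar's -mapper and -reducer options; locally, `cat input.txt | python3 wordcount.py map | sort | python3 wordcount.py reduce` approximates the same flow.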
Kasabian said Intel is starting with the software. "It takes a while to get silicon to market," he said. "We understand where we can optimize for silicon, and there are certain things to [improve] for performance and optimization."
The company is taking lessons from those software implementations and looking to enhance the silicon to close any gaps the software exposes, Kasabian said, adding that the chip-design process takes about two years.
Server makers have been customizing servers specifically to carry out big data workloads, and improvements at the chip and instruction-set level could speed up task execution.
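How would software take advantage of such instruction-set improvements? One common approach on Linux is to check the CPU feature flags the kernel exposes and dispatch to an optimized code path. The sketch below is a generic illustration that assumes a Linux /proc/cpuinfo; it is not an Intel tool or API.

```python
#!/usr/bin/env python3
"""Illustrative sketch: detect instruction-set extensions on Linux by
reading /proc/cpuinfo. Assumes Linux; not an Intel tool or API."""

def cpu_flags(path="/proc/cpuinfo"):
    # The kernel lists supported extensions on the "flags" line.
    with open(path) as f:
        for line in f:
            if line.startswith("flags"):
                return set(line.split(":", 1)[1].split())
    return set()

if __name__ == "__main__":
    flags = cpu_flags()
    # Extensions a data-heavy or crypto-heavy workload might dispatch on.
    for ext in ("sse4_2", "avx", "avx2", "aes"):
        print(f"{ext}: {'yes' if ext in flags else 'no'}")
```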
The plan includes developing accelerators or cores for particular big data workloads. For example, Intel is working with Chinese company Bocom on the Smart City project, which tries to solve counterfeit license plate problems in China by recognizing plates along with car makes and models. The project involves sending images through server gateways, and Intel is looking to fill software gaps by enhancing the silicon; one improvement could be adding accelerators to decode video, Kasabian said.
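As an illustration only, a gateway of the kind described would decode frames from a camera stream and hand them to a recognizer. The sketch below uses OpenCV for the decode step, which is what dedicated silicon could offload; `recognize_plate` is a hypothetical placeholder, not Bocom's or Intel's actual pipeline, and the sample file name is assumed.

```python
#!/usr/bin/env python3
"""Illustrative gateway loop: decode video frames and pass them to a plate
recognizer. OpenCV handles decoding; recognize_plate is a hypothetical
stand-in for a real recognition model."""
import cv2  # pip install opencv-python

def recognize_plate(frame):
    # Hypothetical placeholder: a real system would run detection and OCR
    # here and return the plate string plus the car's make and model.
    return None

def process_stream(source):
    cap = cv2.VideoCapture(source)  # file path, device index, or RTSP URL
    try:
        while True:
            ok, frame = cap.read()  # the decode step an accelerator could offload
            if not ok:
                break
            result = recognize_plate(frame)
            if result is not None:
                print("plate seen:", result)
    finally:
        cap.release()

if __name__ == "__main__":
    process_stream("traffic.mp4")  # assumed sample file
```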
Intel has a big software organization, and the appointment earlier this year of Renee James, formerly head of the software unit, as the company's president was a sign of the chip maker's intent to dig deeper into software. The company does not want to become a packaged software distributor; it wants to enable software to work better on Intel architecture hardware. Intel has long backed open-source software and has hundreds of coders contributing to the development of Linux.
Different industries implement big data differently, Kasabian said; a big data problem in genomics, for example, could differ from one in telecommunications.
Intel is also entering the Internet of things space, an emerging field in which networked devices with embedded processors and sensors serve as data-gathering instruments. The company has assets such as McAfee's software and hardware platform and Wind River's real-time operating system, which pair with its embedded chips to collect data securely and process it quickly.
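To make the data-gathering role concrete, the sketch below simulates an embedded device that samples a sensor, batches readings and emits them as JSON. Every name in it is an assumption for illustration; none of it is an Intel, McAfee or Wind River API.

```python
#!/usr/bin/env python3
"""Illustrative embedded data-gathering loop: sample a (simulated) sensor,
batch readings, and emit them as JSON. All names are assumptions, not an
Intel, McAfee, or Wind River API."""
import json, random, time

def read_sensor():
    # Stand-in for a real driver call; returns a temperature-like value.
    return round(20.0 + random.uniform(-2.0, 2.0), 2)

def gather(batch_size=5, interval_s=1.0):
    batch = []
    while True:
        batch.append({"ts": time.time(), "temp_c": read_sensor()})
        if len(batch) >= batch_size:
            # A real gateway would send this upstream over TLS; we just print.
            print(json.dumps(batch))
            batch = []
        time.sleep(interval_s)

if __name__ == "__main__":
    gather()
```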
Beyond the silicon, Intel is focusing on providing the right software tools for data centers. Hadoop was the starting point, and Intel is now looking closely at analytics, Kasabian said.
Attaching Intel's name to Hadoop will "kind of ease the mind of folks in enterprises," Kasabian said, adding that it will make the platform easier to implement in data centers.
A lot of research on stream processing and graph analytics is also taking place at Intel Labs as the company designs chips and tweaks software.
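To make "graph analytics" concrete: it means computing over linked records rather than flat tables. A minimal sketch, assuming a toy four-node graph, is the plain-Python PageRank iteration below; it is unrelated to any actual Intel Labs work.

```python
#!/usr/bin/env python3
"""Minimal PageRank over a toy graph -- a plain-Python sketch of what
'graph analytics' means, unrelated to any Intel Labs code."""

def pagerank(graph, damping=0.85, iterations=20):
    n = len(graph)
    ranks = {node: 1.0 / n for node in graph}
    for _ in range(iterations):
        new = {node: (1.0 - damping) / n for node in graph}
        for node, out_links in graph.items():
            if out_links:
                # Each node passes a damped share of its rank to its targets.
                share = damping * ranks[node] / len(out_links)
                for target in out_links:
                    new[target] += share
            else:
                # Dangling node: distribute its rank evenly across the graph.
                for target in new:
                    new[target] += damping * ranks[node] / n
        ranks = new
    return ranks

if __name__ == "__main__":
    toy = {"a": ["b", "c"], "b": ["c"], "c": ["a"], "d": ["c"]}
    for node, r in sorted(pagerank(toy).items()):
        print(node, round(r, 4))
```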
"We're looking at all the big industry categories," Kasabian said.