
In-memory computing: Speed boost 2012

Stefan Hammond | Feb. 27, 2012
While the buzz in 2012 will continue to revolve around mobile devices and cloud computing, enterprises plan to leverage raw speed.

No, it won't be Usain Bolt at the London Olympics. As tech gear comes ever closer to the physical limits of electronic speed, hitherto-unseen bottlenecks are surfacing. Storage vendors have been selling systems using SSDs (solid-state drives, essentially large-capacity flash memory units with higher throughput speeds) at critical data junctures for a couple of years now.

As SSD prices continue to drop, expect this trend to continue.

Speed is found across the tech spectrum nowadays. Processors ramp up their clock speeds, and multi-core designs amplify dataflow. It all leads to more data: terms like "Big Data" are now commonplace.

In-memory computing
The selective use of flash memory can improve the overall performance of many enterprise-tech deployments. "We will see huge use of flash memory in consumer devices, entertainment devices, equipment and other embedded IT systems," said Gartner VP David Cearley, listing the technology as one of Gartner's 10 key IT trends for 2012.

"In addition, flash offers a new layer of the memory hierarchy in servers and client computers that has key advantages--space, heat, performance and ruggedness among them," said Cearley. "Unlike RAM, the main memory in servers and PCs, flash memory is persistent even when power is removed--in that way, it looks more like disk drives where we place information that must survive power-downs and reboots, yet it has much of the speed of memory, far faster than a disk drive."

Cearley added that software is critical to unlock these advantages. "As lower-cost--and lower-quality--flash is used in the datacenter, software that can optimize the use of flash and minimize the endurance cycles becomes critical," he said. "Users and IT providers should look at in-memory computing as a long-term technology trend that could have a disruptive impact comparable to that of cloud computing."

"In-memory as a technology has come into focus over the last two to three years. However, there is nothing new about it," said Surya Mukherjee, lead analyst for Ovum's Information Management team. "Any memory faster than a disk drive with moving parts will speed the process of data storage, retrieval, and analysis," he said.

"As RAM and SSDs become cheaper, it's now possible for organizations to load an entire database onto fast memory and benefit from low-latency transactions," said Mukherjee. "Popularity of in-memory analytics is closely linked to advances in hardware technology, such as 64-bit computing, multi-core processor, and improvements in processor speed. Technical advancements in these areas help vendors optimize the use of memory and speed up data processing performance over previous incarnations of the technology."
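Mukherjee's point can be illustrated with a toy example (not any particular vendor's product): Python's built-in sqlite3 module can hold an entire database in RAM, so every query is answered without touching a disk. The table and figures below are invented for illustration.

```python
import sqlite3

# ":memory:" creates a purely in-memory database -- all pages live in RAM,
# so queries never incur disk-seek latency.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("APAC", 120.0), ("EMEA", 95.5), ("APAC", 30.0)],
)

# The whole dataset is resident in memory, so this aggregation is served
# at memory speed rather than disk speed.
total = conn.execute(
    "SELECT SUM(amount) FROM sales WHERE region = 'APAC'"
).fetchone()[0]
print(total)  # 150.0
```

Production in-memory databases add durability mechanisms (logging, snapshots) on top of this basic idea, since RAM contents vanish on power loss.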

Mukherjee explained that "software applications using the memory should be optimized to deliver desired features within the application-tier and work with the data in-memory. Also the compression used within in-memory databases and applications should be optimized to perform read/write without the need for significant decompression of data."

Which vendors are racing to bring their in-memory products to market?

In pole-position
In December 2010, German firm SAP launched HANA: software built around an in-memory computing engine that holds data in RAM instead of reading it from disks or flash storage, providing a performance boost. SAP intends HANA boxes to be attached to its own ERP systems, sucking in and analyzing transactional data in real time. However, HANA's 'agnostic' data access functionality means any information source can be used.

In an October 2011 research note, Gartner's Massimo Pezzini and Daniel Sholler wrote: "SAP has started a new generation of application infrastructure and architectures effort focused on cloud and in-memory. The new vision will force the competition to respond; but addressing its potentially disruptive impact on SAP's most-conservative customers' requirements will be a key challenge."

The Gartner analysts described SAP's strategy as "bold" and "very ambitious," and also wrote: "The HANA Architecture is a work in progress, and will undergo several significant changes before it is completed. This potentially exposes SAP users to challenges for migrating to, and integrating with, different technology generations."

Vendor scrap
"Although still rough around the edges and incomplete, SAP's strategy is likely to shake up the application infrastructure market and put extraordinary competitive pressure on the megavendors, including IBM, Microsoft and Oracle, as well as on application infrastructure specialists, such as Red Hat-JBoss, Tibco Software, Software AG and VMware," wrote the Gartner analysts.

In various public statements, SAP has made it clear that the long-term goal is for HANA to replace other databases--especially rival Oracle's offering--that are now running its applications, including the flagship Business Suite.

Constellation Research analyst Ray Wang agrees that SAP's strong play targeting the database market will hurt Oracle, but added the latter should not worry just yet.

"If you're really serious about undercutting Oracle, you make a database. You cut into Oracle's market on database so that you actually stop funding there," Wang said.

"But Oracle's got so big because of all the acquisitions, you can't just cut them off on database, although that's one big piece. They were vulnerable at one point in time. Teradata pretty much was in that same boat. So this is just natural competition between the tech vendors."

Oracle sets sail
Oracle CEO Larry Ellison hasn't always been a supporter of in-memory technology. According to CIO Magazine, at an event in early 2010, Ellison said: "This is nonsense. There is no in-memory technology anywhere near ready to take the place of a relational database. It's a complete fantasy on [SAP's] part."

That was then, this is now. Oracle intends to ship its own in-memory-powered appliance, Exalytics, later this year. Both Exalytics and HANA incorporate in-memory databases, providing a performance boost over systems that read and write data from disks. Oracle's Engineered System price list, which was updated in early January, lists a price for its Exalytics In-Memory Machine X2-4. The machine consists of a single server with 1TB of RAM and four Intel Xeon E7-4800 processors, each with 10 cores, according to an Oracle whitepaper. Exalytics machines can also be clustered together.

Exalytics also "supports the broad portfolio of Oracle BI applications right out of the box," and customers who have existing applications built with Oracle BI Enterprise Edition and Essbase can migrate them to Exalytics without changes, according to the whitepaper. But the requisite support and software aren't priced in.

Oracle's second-quarter results, which showed a 14 percent drop in hardware systems product revenue, may have accelerated the release schedule for both Exalytics and the Big Data appliance. Oracle is under pressure from investors to boost the hardware business and its leadership is no doubt looking to give sales teams as many products as possible to push as its fiscal year draws to a close.

Meanwhile, Oracle shares many customers with SAP and some of those may be evaluating HANA. While SAP plans to position HANA over time for transactional as well as analytic workloads, the rivalry between the two companies' products is difficult to deny.

"While in-memory isn't really a new technology, vendors are rolling out products that incorporate in-memory technology," said Sharon Tan, research manager, IDC Asia/Pacific, Information Management and Analytics Domain Research Group. Tan cited "SAP (HANA as an in-memory platform), and SAS (SAS High-Performance Analytics)" among the in-memory offerings.

"Most software vendors in the business analytics and database market [will] incorporate in-memory technology somewhere in their product range," said Tan. "IDC expects that in-memory technology will enable applications that were previously not feasible and thus increase the userbase throughout lines of business."

The IDC analyst said that the higher price of flash memory would initially counterbalance user growth. "IDC expects that smaller vendors--with software that...incorporates niche analytic capabilities will be seen as acquisition targets as large vendors seek to beef up their portfolio with complementary and proven [firms]," she said.

Tan said that SAP has announced plans to expand its reach in retail industries by integrating products from SAF Simulation, Analysis and Forecasting AG, a vendor it acquired in January. "SAF's software provides automated ordering and forecasting software that helps enable more accurate demand prediction and automate the ordering process for mainly the retail industry," she said.

IDG staff contributed to this story.
