
In-memory technologies move databases to real time

Joab Jackson | March 25, 2014
Last week, application-performance monitoring service provider New Relic launched an offering that allows customers to mine its operational data for business intelligence.

The long-held assumption that operational data must live on disk is being challenged, however. Online transactional databases, in particular, are moving into main memory.

"If your data does not fit in main memory now, wait a year or two and it probably will," said Michael Stonebraker, a pioneer in database development who is chief technology officer at VoltDB, the company overseeing his latest database of the same name. "An increasing fraction of the general database market will be main-memory deployable over time."

Stonebraker's opinion is echoed by others in the industry.

"If you have a bit of a performance pain, and you have the budget to pay for a market-leading, general-purpose relational database management system anyway, then it makes sense to manage the data in RAM," said Monash Research head analyst Curt Monash.

Stonebraker concedes the approach won't work for every database. Still, at today's memory prices, keeping a 100GB or even a 1TB database in main memory is not prohibitively expensive, depending on the level of responsiveness required and the amount of money an organization is willing to spend.

"It is still too early to call it commodity, but in-memory is becoming commodity in a way," said Nicola Morini Bianzino, managing director for the SAP Platform Solutions Group at Accenture.

Bianzino said he has seen a shift in the questions he gets from clients and potential clients about in-memory over the past six months. "The questions are shifting from 'What is it?' to 'How do I do it?'" Bianzino said.

"The message has gone to the market and has been assimilated by clients," Bianzino said. "This doesn't mean they will move everything tomorrow, but they are taking it for granted that they will have to move in that direction."

With SQL Server 2014, Microsoft's approach to in-memory is to bundle it into its standard relational database platform. "We're not talking about an expensive add-on, or buying new hardware or a high-end appliance," Kelly said.

The SQL Server in-memory component can be used for online transaction processing, business intelligence, and data warehousing.

The interesting thing about in-memory, Kelly said, is not just that it can speed up existing database operations, but that it can enable entirely new lines of business.

As an example, Kelly pointed to online auto parts reseller Edgenet.

Using a beta version of SQL Server 2014, "Edgenet has been able to transform its business to respond much faster to competitive threats, by enabling dynamic pricing on their website," Kelly said. The company can change prices of goods for customers in a given market, based on what the latest prices are from regional competitors, which may run spot sales on certain items.
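The article doesn't describe Edgenet's actual implementation, but the pricing rule Kelly outlines can be sketched roughly as follows. This is an illustrative example only; the function name and the margin and undercut parameters are hypothetical, not Edgenet's:

```python
# Illustrative sketch of a dynamic-pricing rule of the kind Kelly describes:
# undercut the cheapest regional competitor without selling below margin.
# All names and parameters here are hypothetical.

def dynamic_price(cost, list_price, competitor_prices,
                  min_margin=0.10, undercut=0.01):
    """Return a price that undercuts the cheapest competitor by `undercut`
    (as a fraction), but never drops below cost plus `min_margin`, and
    never exceeds the list price."""
    floor = cost * (1 + min_margin)        # lowest acceptable price
    if not competitor_prices:              # no competitor data: keep list price
        return round(list_price, 2)
    target = min(competitor_prices) * (1 - undercut)
    return round(max(floor, min(list_price, target)), 2)
```

Running a rule like this across every affected item while shoppers are browsing is exactly where the contention Kelly mentions arises in a disk-based system: the update transactions compete with customer reads. Holding the working set in memory shortens those transactions.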

Dynamic pricing can be done with a standard relational database, but the practice can lead to contention: frequent price updates may slow responses to end users, so shoppers may not see the latest price immediately, Kelly said.


