It’s no secret that SAP and Oracle don’t like each other, nor is it a secret that SAP would love to stop being a major sales channel for Oracle’s database, which is still the number 1 database in the SAP customer base. It’s also no secret that many SAP customers would love to stop paying a premium price for a database that is functionally underused by the SAP product line.
But now there’s a new reason that SAP would like to get rid of the traditional relationship between its software and the top end of the database market: performance. If SAP could make the database side of its applications perform an order of magnitude faster without Oracle, then replacing Oracle would be a good business decision, regardless of the schadenfreude (pleasure gained from the misfortune of others) that would greet the decline of Oracle’s database business in Walldorf and Palo Alto.
The “Oracle Killer”
Well, get ready, because if SAP’s plans pan out, there’s going to be a lot of schadenfreude to go around. The “Oracle killer,” as it’s been dubbed, is on its way, and that means that SAP customers will someday have a choice as to whether they want what my old database administrator (DBA) boss — a throwback to a much less politically correct era — used to call a “big, hairy-chested database,” or something a little leaner and a whole lot meaner.
This isn’t the first time SAP has tried to break the relational database management system (RDBMS) paradigm that has defined its back-office relationship with Oracle. SAP bought the rights to the Adabas D relational database from Software AG in 1997 and christened it SAP DB in an attempt to get its users out from under their Oracle database license. Despite favorable reviews, the strategy failed to gain momentum, and eventually SAP DB was turned out to the open-source pasture, where — rebranded as MaxDB — it became one of the principal offerings in MySQL’s open-source database portfolio.
Part of the reason SAP DB didn’t work was precisely the “me too” functionality that the database provided vis-à-vis Oracle. It was a replacement, but not necessarily one that did anything positive other than lower license costs. On the face of it, that should have been enough to bolster SAP DB sales, but it wasn’t. SAP clearly needed to do more than just provide a lower-cost alternative to Oracle.
The testing ground for SAP’s new database strategy can be found in a product the company released last year: the BI Accelerator, or BIA. Among its many attributes, BIA is a hardware appliance that runs SAP Analytics incredibly quickly. The potential “Oracle killer” part of BIA comes from its in-memory database functionality, which processes database queries in RAM with no need to access disk-based storage — or even have a relational database at all — which means no costly database license, no massive data-storage farm, no expensive DBAs, and so on.
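The core idea — answering queries entirely from RAM, with no disk-based storage behind them — is easy to illustrate in miniature. The sketch below uses Python’s standard-library SQLite driver with a `":memory:"` connection; this is a toy stand-in for the concept, not SAP’s BIA engine, and the table and data are invented for illustration.

```python
import sqlite3

# ":memory:" creates a database that lives entirely in RAM --
# no disk I/O, no storage farm to administer. (Illustrative only;
# BIA's actual in-memory engine is SAP proprietary technology.)
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("EMEA", 120.0), ("EMEA", 80.0), ("APJ", 50.0)],
)

# An analytic-style aggregate query is answered straight from RAM.
rows = conn.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # [('APJ', 50.0), ('EMEA', 200.0)]
```

When the process exits, the data vanishes with it — which is exactly why an appliance like BIA pairs the in-memory engine with a persistent source system to reload from.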
What’s interesting about BIA as a precursor to an eventual Oracle-free SAP environment is the fact that SAP’s proving ground is the data warehouse, home of the largest datasets known to humanity and a black hole for IT budgets and end-user expectations over the years. Putting BIA to work tackling the combined misery of high cost and slow response on the business analytics side of SAP is a great way to prepare for doing something similar on the transactional side.
The idea of in-memory query management at SAP is hardly new. Back in the late 1990s, SAP unveiled LiveCache, an in-memory processor for what was then called the Business Warehouse. LiveCache was a little ahead of its time for lots of reasons, starting with the fact that CPU and memory costs were still relatively high for what SAP had in mind. In the end, LiveCache failed to live up to expectations. While it still survives as the in-memory query engine for SAP’s Advanced Planner and Optimizer (APO) supply-chain product, for the most part LiveCache has been quietly relegated to the nether regions of the SAP developer network and the SAP Web site.
Before LiveCache was consigned to the depths, it had worked itself into SAP benchmarking history, giving an indication of the response times that are possible using an in-memory database engine. Based on SAP’s own benchmarking standard — the SAP Standard Applications Benchmark — SAP’s hardware partners have had a glorious time leapfrogging each other in recent years to see which could achieve the best response times with LiveCache.
Using APO as the standard application, with LiveCache doing much of the heavy lifting, the Standard Applications Benchmark calculates how many different combinations of “10,000 different products from 10 different distribution centers bought by 2,000 different customers for a two-years (sic) history on a weekly basis” can be processed per hour. In other words, the benchmark shows how many combinations and how quickly APO and LiveCache can sift through the different options available to a planner trying to understand his or her product-planning options.
When SAP first started certifying the benchmarks in 2001, a benchmark of 53,199 combinations per hour was set by IBM, using an eight-processor system with 32GB of RAM. By the time the benchmarking war had settled down two short years later, Unisys was on top with a benchmark of 586,319 combinations per hour, using a 16-processor machine with 64GB of memory. It’s amazing how a little competitive spirit and some tuning can promote technological progress.
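A quick back-of-the-envelope calculation on those published figures shows how much of the gain came from tuning rather than raw hardware (the numbers are the benchmark results quoted above; the per-processor comparison is my own arithmetic, not part of the certified benchmark):

```python
# Certified results quoted above (combinations per hour).
ibm_2001 = 53_199      # 8 processors, 32 GB RAM
unisys_2003 = 586_319  # 16 processors, 64 GB RAM

# Overall throughput improvement over two years.
speedup = unisys_2003 / ibm_2001
print(f"overall gain: {speedup:.1f}x")  # ~11.0x

# The hardware merely doubled (8 -> 16 CPUs, 32 -> 64 GB),
# so per-processor throughput still improved several-fold.
per_cpu_gain = (unisys_2003 / 16) / (ibm_2001 / 8)
print(f"per-processor gain: {per_cpu_gain:.1f}x")  # ~5.5x
```

In other words, roughly half an order of magnitude of the eleven-fold gain came from software and configuration tuning alone.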
Of course, these benchmarks are only a placeholder for the kind of functionality possible in an in-memory transactional world. As they were based on older software and hardware environments than we have available today, we can safely assume that when the “Oracle killer” arrives, it will do an even better job at an even lower relative cost. “When” is still the operative question. Shai Agassi, president of the Product and Technology Group and a member of the Executive Board at SAP AG, spoke publicly for the first time about an in-memory database for transactional systems only last September, and at the time was circumspect about an eventual release date. But, considering the fallout that same week from Oracle’s successful first quarter 2007 revenue announcement and Larry Ellison’s claims that SAP was “rethinking” its strategy in light of Oracle’s results, you can be sure that SAP executives are eager to unveil such a powerful strategic weapon as soon as possible.
The Die Is Cast
Assuming “when” — not “if” — is the main issue, an eventual “Oracle killer” would have its greatest initial impact on new implementations, where SAP could come in and offer a significantly faster transaction environment at a significantly lower cost. But it will also be interesting to see if a “rip and replace” justification can be made for existing implementations: A lot will depend on the cost of migrating to the new environment, which will not necessarily be done by tweaking a couple of dials.
Regardless, the die is cast, and SAP’s Oracle database customers are now on notice: Someday soon, you will have the opportunity to choose between the status quo and a new and radically different database approach. It may prove to be one of the most important — and cost-effective — choices a CIO will ever make.
Joshua Greenbaum is a market research analyst and consultant specializing in the intersection of enterprise applications and e-business. Greenbaum has more than 15 years of experience in the industry as a computer programmer, systems analyst, author, and consultant. Before starting his own firm, Enterprise Applications Consulting (www.eaconsult.com), he was the founding director of the Packaged Software Strategies Service for Hurwitz Group.