


Take the Guesswork Out of Your Next Hardware Purchase: Depend on Experienced, Impartial SAP Standard Application Benchmarks to Find the Technology that Best Fits Your Business Needs

by Dr. Ulrich Marquard | SAPinsider

April 1, 2008

How can you be certain that the hardware you’re buying will scale to meet your needs? And will it perform reliably and predictably? To help answer these questions, SAP offers SAP Standard Application Benchmarks, designed to efficiently test and prove the scalability of SAP and partner solutions.


Companies of all sizes and in all industries expect their SAP solutions to function reliably and predictably, even (or rather, especially) during phases of extremely high load. The type and scale of their business needs are as diverse as the companies themselves:

  • Banks may need to process transactions in more than two million checking accounts

  • Some large companies may need to calculate payroll overnight for 500,000 employees

  • Pharmaceutical companies may be legally obliged to retain data for several years, swelling their databases beyond 10,000 gigabytes

And it's not only large customers that expect good performance; small companies with only a few users expect SAP standard software to perform at its best, even if it is not stretched to its limits.



To test and prove the scalability of SAP solutions and the hardware they run on, SAP and its hardware and technology partners have collaborated for over 15 years and have developed a suite of SAP Standard Application Benchmarks. Both SAP and its partners use these benchmarks to prove that their software and hardware components scale to fit their customers' business needs. This assures customers — be they small businesses or large, multinational enterprises — that the tested hardware and software configurations can handle their required load.

A Deep Dive into SAP Standard Application Benchmarks and Their Benefits for Customers and Partners

A benchmark, in the broadest of definitions, is a point of reference for any type of measurement. In the IT industry, benchmarks are standardized load tests that yield key performance figures, which enable a quantitative comparison of the hardware and software being tested. SAP Standard Application Benchmarks, then, specifically measure and report the performance behavior of individual SAP system components (such as database systems or application software) on a particular piece of underlying hardware.

What differentiates SAP Standard Application Benchmarks is the fact that they combine the measurement of system performance with a business application that is productively used in customer implementations. By executing actual business processes, they render application-specific and business-relevant performance indicators — such as the number of users that can work simultaneously in the system, navigation steps or user interactions per hour, or business throughput figures, such as fully processed order line items per hour.

These benchmarking results prove that the tested software and hardware components are scalable — that they can be expanded or reduced (in terms of size, volume, or number of concurrent users, for example) while still functioning reliably and predictably. In this way, the benchmarks help both SAP and its partners demonstrate the scalability and manageability of even extraordinarily large installations. Imagine our excitement when — against the dire predictions of naysayers — we reached the 100,000 SAP Sales and Distribution (SD) Benchmark user hurdle in 2004 (more to come on the SAP SD Benchmark later)!

And hardware and technology partners have not stopped there in their efforts and commitment to optimize their technologies for SAP Business Suite applications: With the type of configurations available today, it would take only one hour's worth of sales order processing for an online retail company to turn over $170 million, assuming an average sales price of $10 per sales order item.
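As a back-of-the-envelope check, the retail example above can be verified with a few lines of arithmetic. This is a hypothetical sketch: the revenue and price figures come from the paragraph above, and the SAPS conversion uses the definition given in the Key Term box later in this article (2,000 fully processed order line items per hour = 100 SAPS).

```python
# Back-of-the-envelope check of the retail example above (hypothetical
# sketch; figures come from the article, not from a certified result).
REVENUE_PER_HOUR = 170_000_000  # USD turned over in one hour of order processing
PRICE_PER_ITEM = 10             # assumed average USD per sales order line item

# Implied throughput in fully processed order line items per hour
line_items_per_hour = REVENUE_PER_HOUR // PRICE_PER_ITEM

# Expressed in SAPS, using the definition 2,000 line items/hour = 100 SAPS
saps = line_items_per_hour // 2_000 * 100

print(line_items_per_hour)  # 17000000
print(saps)                 # 850000
```

In other words, the configuration in this example would have to sustain 17 million order line items per hour, or 850,000 SAPS.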

Customers benefit from these strong correlations between benchmarks and live customer implementations. By analyzing published benchmarking results, customers can anticipate how a particular hardware and software configuration may behave under high load. Since benchmarks are also the basis for predicting hardware resource requirements — a process called hardware sizing — they help customers define a system configuration suitable for their particular business environment.1


The SAP Benchmark Council Celebrates Its 15th Anniversary

Agility, Stability, and Continuity – A Rare Combination in IT

2008 marks an important year for SAP Standard Application Benchmarks; it's the 15th anniversary of the first meeting of what evolved to become the SAP Benchmark Council — the body that governs the definition, development, and certification process of the SAP Standard Application Benchmark suite. Throughout this committee's history, representatives from SAP and its hardware and technology partners have successfully collaborated to help shape the SAP benchmarks and make them what they are today: some of the most credible and influential application benchmarks in the IT industry.

Started in March 1993 as a relatively informal meeting of representatives from SAP and its hardware and technology partners, the SAP Benchmark Council was officially inaugurated in 1995 as the body governing the entire SAP benchmarking process. The council meets once per month and is responsible for:

  • Monitoring all activities around benchmarking

  • Defining benchmark rules and processes, and ensuring the strict adherence to them

  • Controlling the content and publication of the benchmarks

  • Ensuring a level playing field by having SAP certify the benchmarking efforts on behalf of the SAP Benchmark Council

Some of the biggest challenges the council faces are the speed at which it must react to technological change and the need to harmonize the potentially diverging opinions of its members in the process. Don't forget that this is a multi-vendor assembly and its focus topic — system performance — is usually the field in which these companies differentiate themselves from one another in a very competitive market.

Over the years, the practice of setting up spin-off work groups has proved to be an efficient way to investigate new topics, discuss the challenges they present to benchmarking, and arrive at solutions that both benefit customers and work for all parties involved. These think tanks, staffed with experts from interested partners, meet on an as-needed basis to work out proposals on how to integrate a particular requirement into the benchmarking process. The most recent topics topping the agenda of the SAP Benchmark Council and its expert workgroups, for example, were multi-core computing and virtualization — both of which have significantly influenced how benchmarks are run and evaluated.

Uncovering the Unmatched Quality of SAP Standard Application Benchmarks

Since the early 1990s, SAP had been running in-house load tests in cooperation with its hardware and technology partner competence centers. The goal? To gain insight into performance behavior across applications, releases, and hardware platforms. SAP soon realized that to ensure reliable and irreproachable results and enhance the quality of the comparisons, these load tests had to be standardized.

The Benefits of Standardization

SAP worked with its hardware and technology partners to develop the concept of SAP Standard Application Benchmarks, along with a standard process for these benchmarks. This ensured that the benchmarks were portable across multiple platforms and operating systems, and that they generated reproducible and publishable results with integrity beyond doubt. SAP and its hardware and technology partners:

  • Prescribed clear definitions and rigorous criteria for permissible configuration and tuning activities

  • Defined a set of benchmark tools and a standardized environment that enabled the benchmarks to be implemented on different platforms

  • Established the SAP Benchmark Council, an independent governing body with members from SAP and its hardware and technology partners, to oversee the certification of existing benchmarks and to promote the development of new ones for emerging technologies (see sidebar)

The Evolution of Benchmarks Mirrors IT Progress

The first application benchmark to be formally submitted, certified, and published within this new context was the SD Benchmark in 1995, followed closely by the Financial Accounting (FI) Benchmark the same year. Since then, a host of application benchmarks have joined the ranks to cover the full range of SAP Business Suite, Industry Suite, and SAP NetWeaver platform solutions (see Figure 1).2


Figure 1
The evolution and usage patterns of the different benchmark types mirror the evolving interests of the IT community

Under the guidance of the SAP Benchmark Council, application benchmarks are defined and developed in response to — or in anticipation of — concrete customer requirements. Or, they're simply triggered by an industry-wide need to understand and measure emerging high-volume business processes. A glance at the evolution of SAP Standard Application Benchmarks over the years is like reading a chronology of the driving market forces, business challenges, and hot topics over time.3


A glance at the evolution of SAP Standard Application Benchmarks is like reading a chronology of the driving market forces, business challenges, and hot topics over more than a decade.

Benchmarks as Quality Indicators

The stability of benchmark scripts enables comparative evaluation across different platform versions, software releases, or architectures. SAP also uses these scripts for quality assurance within its own development community. For example, the SAP Enterprise Portal Benchmark, defined in 2006 to answer the growing need for benchmarking results for the Application Server Java, helped SAP understand the behavior of different virtual machines, as well as the effects of Java garbage collection in multi-user environments.4

Partners also use the benchmarks in their QA efforts. For example, one major hardware provider runs regression tests with SAP Standard Application Benchmarks for its new servers. The number of benchmarks that partners are certifying each year is rising steadily (see Figure 2); to date, partners have invested substantial time, effort, and money to submit over 680 benchmarks, subject them to the scrupulous investigation of the SAP Benchmark Council and its members, and have them certified and published. I expect it won't be too long before the 1,000th benchmark certificate is published — and I can only guess at its user numbers and throughput figures!


Figure 2
The number of benchmarks that technology partners have certified each year, and how many of them are SD Benchmarks

The Benchmarking Flagship — The SAP SD Standard Application Benchmark

Since its inception in 1995, the SAP SD Standard Application Benchmark has evolved into one of the most popular, influential, and credible online transaction processing (OLTP) benchmarks in the industry.

This benchmark covers a sell-from-stock scenario, which includes the creation of a customer order with five line items and the corresponding delivery with subsequent goods movement and invoicing (see Figure 3). This process is repeated by as many simulated SD Benchmark users running in parallel as needed to stretch the system to its limits — usually approaching nearly 100% utilization of the database server at a prescribed response time of less than two seconds.


Figure 3
The benchmarking process for the SD Standard Application Benchmark

In large productive systems, the database CPU utilization is the determining factor for throughput and response time, whereas all other performance-relevant components — such as application server CPU time, network bandwidth, main memory, and hard disk — can be configured and extended as needed to meet the required performance needs.

But customers are not only interested in the number of parallel users or response times. Another important performance metric is throughput per unit of time. SAP Standard Application Benchmarks define throughput numbers in business application terms — such as "fully processed order line items" in the SD Benchmark, or "navigation steps per hour" in the BI Data Mart Benchmark — and then map them to the most prominent hardware components including CPU, memory, and disk size. The results are presented in SAP Application Performance Standard (SAPS), a hardware-independent unit of measurement (see Key Term box).


Key Term: SAPS

Throughput numbers and system response times determine the performance of a system configuration in the SAP environment. And the hardware-independent unit of measurement SAP Application Performance Standard (SAPS) describes the throughput result of an SAP system configuration. Derived from the SAP Sales and Distribution (SD) Benchmark, the unit is defined as follows: 2,000 fully processed order line items per hour = 100 SAPS. Technically speaking, this equals 6,000 dialog steps (screen changes), 2,000 postings per hour in the SD Benchmark, or 2,400 SAP transactions.
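The SAPS definition above is simple enough to encode directly. The following is a minimal sketch (not an official SAP tool) of the conversion between SD Benchmark throughput and SAPS:

```python
# Minimal sketch of the SAPS conversion defined in the Key Term box:
# 2,000 fully processed order line items per hour = 100 SAPS.
# (Illustrative helper functions, not an official SAP sizing tool.)

def items_per_hour_to_saps(order_line_items_per_hour: float) -> float:
    """Convert SD Benchmark throughput (order line items/hour) to SAPS."""
    return order_line_items_per_hour / 2_000 * 100

def saps_to_items_per_hour(saps: float) -> float:
    """Invert the conversion: SAPS back to order line items/hour."""
    return saps / 100 * 2_000

# A configuration rated at 30,000 SAPS processes
# 600,000 fully processed order line items per hour:
print(saps_to_items_per_hour(30_000))  # 600000.0
```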

Over the years, the SD Benchmark has remained virtually unchanged: The same business process used to measure a system back in 1993 is used to measure the performance of a system setup today. Therefore, it's possible to gauge just how rapidly technological development has progressed — and how customers' businesses have been able to benefit from such enormous performance gains. For example, benchmark statistics indicate that SAPS figures have increased 50-fold between 1996 and 2006 (from roughly 600 to 30,000), a development roughly mirroring performance improvements according to Moore's Law.5
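To see how closely this tracks Moore's Law, one can compute the doubling time implied by the figures above. This is my own arithmetic, not from the article, using the commonly quoted 18-month doubling period as the point of comparison:

```python
# Rough consistency check (my own arithmetic): a 50-fold SAPS increase over
# the 10 years from 1996 to 2006 implies a doubling time of
# 10 years * ln(2) / ln(50), which works out to roughly 21 months --
# in the same ballpark as Moore's Law's 18-month doubling period.
import math

growth_factor = 30_000 / 600     # 50-fold increase in SAPS
years = 2006 - 1996
doubling_time_months = years * 12 * math.log(2) / math.log(growth_factor)

print(round(doubling_time_months, 1))  # 21.3
```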



It's the foresight, dedication, and cooperation of the members of the SAP Benchmark Council that has kept the SD Benchmark and all other benchmarks in the suite so untainted and reliable (refer back to sidebar). Christian Kowarschick from Fujitsu Siemens Computers, a member of the SAP Benchmark Council since its founding, says, "Straight from the start, the meetings of the Benchmark Council have been fascinating to me because its members — even though they represent companies usually engaged in tough competition — pursue a common goal: to strengthen the importance of the SAP benchmarks as industry-standard benchmarks. Often, opinions on how this goal is to be reached diverge significantly, leading to ardent discussions. Even after 15 years, these discussions have not lost any of their enthusiasm, and I am convinced that it will continue that way in the years to come."

Effectiveness and Customer Benefit — The SAP Benchmark Council's Fundamental Objectives

The principles behind the SAP Benchmark Council are as simple as they are effective: All members have the same rights and responsibilities. The independence of the body is guaranteed by the fact that no partner can push through a personal agenda unilaterally; there is an understanding — and a self-imposed restraint — that's fundamental to keeping this institution flexible and agile, while at the same time ensuring continuity and reliability for customers, partners, and SAP.


How You'll Know that Published Benchmark Data Claims Are Accurate and Up to Date

The Publication Workgroup Ensures Reliability

One of the major benefits for hardware and technology partners that participate in the benchmarking process is that they can publicize the results their systems have achieved. But how can customers ensure that published benchmarking results are accurate and valid? This is where the council's self-governance practices come into play again.

The SAP Benchmark Council established a Publication Workgroup in 2001. This workgroup defined a set of fair and competitive practices and a common terminology for the publication of information related to SAP Standard Application Benchmarks. Consider a technology partner that comes across another partner's publication — a newsletter or press release, for instance — that cites benchmarking results in a way that violates the publication guidelines. True to the council's self-governance principles, the partners will first attempt to resolve this violation between themselves. Only if the matter cannot be settled satisfactorily will the involved parties bring the issue to the workgroup for discussion. If the publication is deemed a valid breach of agreement, the workgroup will list the breach in the Publication Policy and Violations section of the benchmarking Web page.

While SAP ultimately certifies and publishes all submitted benchmarks, fair play is guaranteed by the fact that each partner can obtain a benchmark submission from another partner for review. This right to disclosure — along with each partner's readiness to divulge this potentially sensitive data to other partners — discourages illicit tuning or tampering with the benchmark results. This helps to maintain the high level of credibility and visibility that the SAP Standard Application Benchmarks have attained in the industry, and all partners involved have a vested interest in keeping it that way.

Apart from valid, objective, and reliable hardware performance figures, the most important and practical outcome for customers is that all configurations certified through SAP Standard Application Benchmarks are either available on the market already or will become so within six months of certification.

The SAP Benchmark Council: A Legend in the Making

SAP Standard Application Benchmarks have enjoyed a long history of helping IT companies celebrate new performance records and helping customers configure and size SAP business solutions for their productive systems.



This year, it's time to celebrate the people behind the benchmarks: the members of the SAP Benchmark Council who have collaborated for 15 successful years to make it a steadfast institution in a world of rapid technological change. The SAP Benchmark Council has been able to strike a pivotal balance between continuity and progress, and — if the past is any indication — it will continue to do so going forward. I greatly anticipate a future rich with thriving benchmarking results that will provide continually high value to our partners and customers for years to come.

For more details about SAP Standard Application Benchmarks, please visit the SAP benchmarking Web pages.


Additional Resources

"Targeted Methods and Tools for Right-Sizing Your Hardware Landscape" by Susanne Janssen (SAP Insider, January-March 2006)

"Demystifying Java-Based Load Tests and Their Results" by Xiaoqing Cheng (SAP Insider, October-December 2006)

"Putting the Database on a Diet: The 'Oracle Killer' Positions SAP as a Database Pioneer" by Joshua Greenbaum (SAP NetWeaver Magazine, Winter 2007)

Sizing SAP Systems by Susanne Janssen and Ulrich Marquard (SAP PRESS)


Dr. Ulrich Marquard is Senior Vice President of the Performance, Data Management, and Scalability group at SAP AG. He joined SAP in 1990 and does extensive research in the fields of system architecture, system analysis, scalability, and performance. He is one of the founding members of the SAP Benchmark Council.

1 SAP provides a well-defined sizing methodology and an online sizing tool, SAP's Quick Sizer, to assist customers in the sizing process. For more information, see Susanne Janssen's article "Targeted Methods and Tools for Right-Sizing Your Hardware Landscape" in the January-March 2006 issue of SAP Insider.

2 Please note that Figure 1 is not a complete listing of SAP Standard Application Benchmarks; a complete list is published online.

3 A good example here is the development of the SD Parallel Benchmark in response to the first parallel database servers. Consider also the development of the Transaction Banking Benchmark in close cooperation with technology partners in 2005, a time when the banking industry solution came into focus.

4 For more information on the experience gained in this benchmarking study, see the SAP Insider Performance & Data Management Corner columns "Demystifying Java-Based Load Tests and Their Results" and "Taking Out the Trash: Avoid Performance Bottlenecks from Java Garbage Collection."

5 Moore's Law states that computing power doubles every 18 months. See an interesting discussion of the law's formulation on Wikipedia.



