The benefits of virtualization — including resource consolidation and reduced operational costs — have become increasingly clear, leading many IT departments to make virtualization a priority. Unfortunately, I find that many virtualization projects lack proper preparation, which leads to delayed implementations and slower ROI. By adhering to some key considerations, however, customers can quickly reap the benefits of virtualization and avoid potential pitfalls.
The first step to a successful virtualization implementation is to determine which implementation approach is right for your company. Currently, there are two fundamentally different implementation models: the proprietary approach and the multi-vendor approach. Let’s look at each in more detail.
The Proprietary Approach: Let Your Vendor Do the Heavy Lifting
Taking a proprietary approach to virtualization means having a single vendor provide the entire IT infrastructure that makes up a virtualization implementation, including:
- A 64-bit server with built-in partitioning and virtualization features
- An operating system that has been developed to run on this vendor’s specific hardware (and that usually won’t run on other platforms)
- A virtualization layer that fits seamlessly between the hardware and the operating system
- Additional application components, like a database or Java Software Development Kit
The vendor is usually responsible for delivering the preconfigured system and for initially setting it up at the customer site. In addition, the vendor provides comprehensive administration guidelines describing the whole stack, as well as consulting and implementation services to ensure that the SAP system and the hardware fit well together and will meet the customer’s needs.
There are several advantages to using the proprietary approach (see Figure 1). It’s a convenient way to adopt a virtualized IT infrastructure, and by establishing a single point of contact, it makes issues easier to assess and address; working with a single vendor also prevents unnecessary finger pointing if something goes wrong. In addition, when compared to the multi-vendor approach, the proprietary approach can be the superior choice, particularly when:
- Your company does not have in-house SAP expertise. Certain vendors provide all of the SAP implementation and business know-how you need for your virtualization efforts.
- Your implementation requires significant specialization. If, for example, your implementation project requires you to set up a cluster file system or high availability features, you’ll often find that only a few experienced proprietary vendors can implement such a solution.
- You are in an industry that legally requires particular certifications. Certain laws or government regulations might demand a specific kind of certification. For example, US Army IT implementations must meet a specified Evaluation Assurance Level (EAL) under the Common Criteria. Because only certain vendors hold this certification, and because the certification must cover the entire infrastructure rather than just one part of it, the proprietary approach is a must.
However, the proprietary approach does have one major drawback — it can be quite expensive. The multi-vendor approach tends to be more cost-effective.
Figure 1: An at-a-glance view of the pros and cons of the two virtualization approaches
The Multi-Vendor Approach: A Cost-Effective Method
The multi-vendor approach, as the name implies, involves several vendors that each supply a different piece of the virtualization puzzle. Different vendors supply the OS, the hardware, and the software, thereby preventing vendor lock-in, and companies often use their own in-house SAP experts to implement the solution rather than relying on outside consultants.
As I mentioned earlier, the biggest argument in favor of the multi-vendor approach, also known as the commodity approach, is that it is much cheaper than the proprietary approach (refer again to Figure 1). When coupled with virtualization’s own promises of cost savings, this approach can be very attractive. However, it’s important to remember that you can only achieve these cost savings if all components of the stack work together flawlessly — and such flawless integration is not as assured as it might be with the proprietary approach. Extra vigilance from the customer is needed.
Ensure the Success of Your Multi-Vendor Implementation with TEAM Principles
With the proprietary approach, the success of your implementation rests in the hands of the vendor you select. The multi-vendor approach, on the other hand, requires more work from you, the customer.
Ensuring that the multiple elements of a virtualization implementation work together is critical, but it’s not always easy. Many customers think that the answer to this conundrum lies in the SAP certification process, but they fail to realize the complexity involved in these certifications (see sidebar). Certifications don’t guarantee that certain components will work together without additional work from the IT team. That’s why choosing the partners who will each provide a component of your virtualization implementation is only the first step.
Whoever leads a virtualization project should take into account the TEAM principles, focusing on Testing, Education, Awareness, and Monitoring.
- Testing: Every customer site runs its SAP solution differently. Just count how many Z_* reports your company runs — the more Z_* reports you have, the more customization your virtualization solution needs. Even certified partner solutions cannot account for every nuance of how your virtualization solution will work within your environment. That’s why, when introducing virtualization, any performance-critical jobs or application components that will run on the implementation should be thoroughly tested.
- Education: Virtualization introduces two new views into the solution stack. From a hardware perspective, the virtualization technology acts like software; from an operating system perspective, it acts like hardware. Because the virtualization layer possesses characteristics of both hardware and software, the interaction among the operating system, virtualization technology, and hardware is deeply intertwined. IT operations teams must have a good understanding of low-level software and hardware principles, such as process scheduling for non-uniform memory access (NUMA), to make the best possible decisions for daily operations and configuration. I recommend that companies provide education on topics like CPU scheduling, CPU pinning, cache coherency, memory mapping, resource over-subscription, PCI bus pass-through, and I/O scheduling technologies, to mention just a few.
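To make one of these education topics concrete, CPU pinning can be exercised directly from Python on Linux through the standard library's `os.sched_getaffinity` and `os.sched_setaffinity` calls. This is a minimal, Linux-only sketch for training purposes, not an SAP-specific tool:

```python
import os

# Inspect the set of logical CPUs this process may currently run on.
allowed = os.sched_getaffinity(0)  # 0 = the calling process
print(f"Allowed CPUs before pinning: {sorted(allowed)}")

# Pin the process to a single CPU (here: the lowest-numbered allowed CPU).
# On a NUMA system, pinning work close to the memory it uses avoids
# remote-memory access penalties.
target = {min(allowed)}
os.sched_setaffinity(0, target)
print(f"Allowed CPUs after pinning:  {sorted(os.sched_getaffinity(0))}")

# Restore the original affinity mask so the sketch has no side effects.
os.sched_setaffinity(0, allowed)
```

The same experiment can be run from the command line with `taskset`; either way, it gives administrators a hands-on feel for how affinity constraints shape scheduling decisions in a virtualized stack.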
- Awareness: On top of educating the IT team, companies implementing virtualization solutions must also improve the awareness of their business users. Business users traditionally think of their hardware resources as static entities; with virtualization, this is no longer the case. For example, a virtual CPU rated at 2,000 MHz could be backed by a physical CPU share of only 1,000 MHz or less. Because these virtual resources are shared, business users who utilize them as they did physical resources might overload the system and experience performance problems. Improving awareness of these limitations may encourage business users to share their resources more readily, and IT will have to field fewer service tickets about “mysterious” performance issues. I’d also recommend improving business awareness before the virtualization implementation begins: your business users should understand how the benefits of the implementation (cost savings, in this case) can outweigh the negatives, such as reduced performance during peak times.
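The over-subscription effect described above can be illustrated with a little arithmetic. The sketch below is a deliberately simplified model (it assumes an even fair share and ignores hypervisor scheduling details) for estimating the average physical MHz actually backing each virtual CPU on a host:

```python
def effective_mhz_per_vcpu(physical_cores: int, core_mhz: float,
                           total_vcpus: int, vcpu_rated_mhz: float) -> float:
    """Estimate the average physical MHz backing one vCPU.

    Simplified model: total physical capacity is divided evenly across
    all vCPUs on the host, capped at the vCPU's rated speed.
    """
    fair_share = physical_cores * core_mhz / total_vcpus
    return min(fair_share, vcpu_rated_mhz)

# A host with 8 cores at 2,000 MHz, over-subscribed to 32 vCPUs:
# each vCPU rated at 2,000 MHz is backed by only 500 MHz on average.
print(effective_mhz_per_vcpu(8, 2000.0, 32, 2000.0))  # -> 500.0
```

Numbers like these make the awareness message tangible: a vCPU's rated speed is an upper bound, not a guarantee, once the host is over-subscribed.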
- Monitoring: To earn an SAP certification, virtualization vendors must have virtual server monitoring capabilities within their products. These capabilities can be used for performance analysis — to distinguish between bottlenecks in the hardware versus those in the virtualization layer, for example — or for finding the reason for operational errors, such as system crashes. The performance analysis is then used for SAP EarlyWatch Alerts, which in turn can trigger improvements or suggestions from SAP on how to mitigate the problem. The information around any operational errors is sent to the SAP Development Support organization, which will then use that information to determine if there is an error in the application or the virtualization layer. In addition, in-house IT staff can use the monitoring capabilities to enrich customer performance reports and gain faster access to relevant system information.
These TEAM principles may seem like obvious best practices, but they are too often overlooked; companies tend to underestimate what needs to happen to make their virtualization implementation a success. By addressing these principles, you can improve the quality of your implementation and its operations.
Even if you opt for the proprietary approach, the TEAM principles may still apply. Talk to your vendor of choice and ensure that they’re well versed in these best practices before your implementation project begins.
Virtualization Is Here to Stay — Are You Prepared?
Virtualization implementations will only become more commonplace in the future, as companies seek to take advantage of the cost savings and resource consolidation that the technology promises. However, I strongly encourage companies not to charge blindly toward virtualization. Customers should carefully evaluate the two approaches discussed in this article to find the one that best fits their company’s needs, and they should work to adequately prepare their team to tackle their virtualization implementation in a way that will ensure smooth operations.
Visit www.sap.com/linux or www.sdn.sap.com/irj/sdn/windows for more information.
Hannes Kuehnemund (email@example.com) is the Chief Linux Performance and Virtualization Advisor in the SAP LinuxLab. He also hosts the SAP Virtualization Certification Workshops to generate basic configuration recommendations for SAP customers.
1 Between 1998 and 2008, the 2-Tier SAP Sales and Distribution (SD) Standard Application Benchmarks (available at www.sap.com/benchmark) show a huge increase in the performance of commodity hardware. Certificates used for comparison are 1998015 and 2008030.
1998015: The 2-tier SAP Sales and Distribution (SD) Standard Application Benchmark ran on IBM Netfinity 7000 M10, 4 processors SMP, Windows NT EE SP3, Oracle 8.0.4, and SAP R/3 3.1 H, and achieved 229 SAP SD benchmark users.
2008030: The 2-tier SAP Sales and Distribution (SD) Standard Application Benchmark ran on IBM System x3950 M2, 16 processors / 64 cores / 64 threads, Windows Server 2003 Datacenter Edition, IBM DB2 9.5, and SAP ECC 6.0 (2005), and achieved 10,600 SAP SD benchmark users.