
Master Data Management Supplement

by SAP and Partners: Silver Creek Systems, BearingPoint, ecenta AG, BackOffice Associates & CranSoft, Winshuttle, Optimal Solutions Integration, and Oniqua

August 11, 2009

SAP NetWeaver Magazine - Volume 3, Issue 2
Inside

The following solution providers are highlighted in this supplement:

BackOffice Associates & CranSoft, Inc.
Data Governance: 5 Questions for Sustainable Data-Quality Success
Silver Creek Systems
5 Steps to Eliminating the Product Data Bottleneck in MDM
Winshuttle
Efficiently Improving the Quality of Master Data in Your SAP System
BearingPoint, Inc.
Three Dimensions of a Powerful Master Data Management Methodology
Optimal Solutions Integration
Managing Master Data to Maximize the Value of Core Enterprise Systems
ecenta AG
SAP NetWeaver MDM and Enterprise SOA in SAP NetWeaver 2004s
Oniqua
Material Masters May Be Boring, but What an Investment!

Roland van Breukelen
EMEA NEWS Solution Principal - MDM
SAP

Improve Efficiency and Data Governance with SAP NetWeaver MDM

SAP is a process company; we provide business applications with best-practice business processes. One factor that influences the efficiency of a process is the quality of the master data the process uses. According to our customers, the big driver for SAP NetWeaver Master Data Management is to improve efficiency and make sure each process runs as quickly and predictably as possible. SAP NetWeaver MDM helps processes run more effectively.

One example of what customers are starting to address is what to do when the CEO of the company asks, “Exactly how many customers do I have at any point in time?” Marketing will give the CEO one number, finance another, and customer support another because there is no common business rule that defines “customer.” The CEO needs to be able to find a “single version of the truth.” Data-related issues affect the quality and speed of a process. The next step in making business more effective and efficient is to address these types of data issues.

Everyone has a different understanding of master data. If you talk to a CEO, an IT person, and a line-of-business (LOB) person, each will have a different definition of what master data is, even when they are discussing the same piece of master data. Reconciling those views is a governance process. For example, sales and marketing may want to know who the key contacts are, what their positions are, when their birthdays are so they can send the contact a card, and so forth. Finance doesn’t care about birthdays; finance just wants to know when the payments are coming and what the invoices are. The challenge within organizations is to ensure a consistent definition of global master data across the company. Local “attributes” can then be layered on top of that global definition.

Defining these kinds of processes is challenging and demands strong project-management and data-governance people who have the support of senior management. Ultimately, the business needs to build processes around the data to take advantage of it.

The Governance Challenge

The European Union has new regulations coming out constantly; I know the United States does, too. For example, the EU automotive regulations around the Parts Information Exchange Standard require each automotive manufacturer that sells cars and trucks into Europe to supply technical information to independent operators. These companies now have to supply information about most of their parts to the whole of Europe.

The Waste Electrical & Electronic Equipment directive aims to minimize the impact of electrical and electronic goods on the environment, by increasing re-use and recycling and reducing the amount of WEEE going to landfills. It seeks to achieve this by making producers responsible for financing the collection, treatment, and recovery of waste electrical equipment, and by obliging distributors to allow consumers to return their waste equipment free of charge. So, when a mobile phone, TV, or coffee machine manufacturer or distributor puts its product together and prints all the instructions, it has to label which parts are recyclable and which aren’t to comply with the directive. This data isn’t always held in an ERP system, but someone has to be responsible for compliance with WEEE.

Different geographical areas have different influences — some governmental, some environmental — not to mention the whole Sarbanes-Oxley and reporting accuracy areas. There are many different forms of compliance and regulation, and the challenge is that they change, and a business needs to meet them all. That’s why SAP NetWeaver MDM is so important; that’s why SAP is stressing it so much.



Martin Boyd
VP of Marketing
Silver Creek Systems

5 Steps to Eliminating the Product Data Bottleneck in MDM

MDM deployments face many obstacles, but for product-focused operations such as e-commerce, retail, distribution, and manufacturing, one issue towers above the others: the numerous problems associated with product data.

Product data is inherently complex. It is rich in attributes, but without broad standards it is also very unpredictable and tends to “break” with every hand-off between systems – often requiring manual intervention or custom coding. This slows down what should be a real-time process and creates numerous bottlenecks that mar the smooth flow of standardized information. For many systems, escalating volumes and tightening response-time requirements make this a losing battle.

Can MDM help? Certainly! MDM (and Product Information Management, or “PIM”) systems provide a solid base for storing and synchronizing master reference data across multiple systems. This helps intersystem communications and avoids duplication of effort.

The real question is, how do you assimilate many disparate information sources into a “single source of truth”? For certain types of information, MDM systems have native capabilities that are sufficient, but they can be severely tested when dealing with complex and unpredictable product information.

Take the 5 Steps to Eliminating Product Data Bottlenecks

1. Benchmark your data.
Before starting an MDM project, assess your data for structure, completeness, and validity. Most often legacy structures do not match MDM requirements, and your data may require complex transformation and mapping.

2. Find your most manual process and automate it.
Even before implementing MDM, consider how to automate your most manual process. Not only will this eliminate a pain point, but it will also simplify any eventual MDM implementation and be highly instructive on the capabilities you will need going forward.

3. Automate all information assimilation.
“Hand-crafting” of new data is a major bottleneck in product-oriented MDM. New records must be identified, restructured, and merged quickly and accurately to ensure the integrity of all downstream processes.

4. Automate governance and exception management.
All incoming data feeds should have a set of quality metrics for item completeness and validity compared to MDM data standards. This allows feedback to the information providers for ongoing improvement and prevents poor quality data from getting into your system in the first place.

5. Optimize outputs for downstream use.
The MDM system will hold a single version of the truth, but downstream systems such as Web sites, catalogs, and business intelligence applications may need it in different forms, so automated transformation must be put in place.
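The benchmarking in step 1 and the quality metrics in step 4 can be sketched as a small completeness-and-validity report. This is an illustrative sketch only; the field names and validation rules below are assumptions, not part of any particular MDM product.

```python
# Illustrative completeness/validity benchmark for product records.
# Field names and validators are assumptions for the example.

def benchmark(records, validators):
    """Return per-field completeness and validity fractions."""
    metrics = {}
    total = len(records)
    for field, is_valid in validators.items():
        # completeness: fraction of records where the field is populated
        present = [r[field] for r in records if r.get(field) not in (None, "")]
        # validity: fraction of populated values passing the field's rule
        valid = [v for v in present if is_valid(v)]
        metrics[field] = {
            "completeness": len(present) / total,
            "validity": len(valid) / len(present) if present else 0.0,
        }
    return metrics

products = [
    {"sku": "A-100", "weight_kg": 2.5},
    {"sku": "A-101", "weight_kg": -1.0},   # invalid weight
    {"sku": "", "weight_kg": 0.8},         # missing SKU
]
rules = {
    "sku": lambda v: v.startswith("A-"),
    "weight_kg": lambda v: v > 0,
}
report = benchmark(products, rules)
```

A report like this, run against each incoming feed, yields the per-field numbers that can be fed back to information providers for ongoing improvement.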

These practical steps will ensure the smoothest possible implementation and ongoing operation of any MDM system. Some MDM systems can handle all these requirements for some data types, but most of them struggle with product data. For product data, seek out a specialist who can demonstrate an ability to automate the understanding, integration, and transformation of your most complex and unpredictable data.

For a white paper on MDM and product data quality, go to www.silvercreeksystems.com/MDM.




Marc Zimmer
Manager
Technology Solutions
SAP MDM Practice
BearingPoint, Inc.

Rajesh Nagarajan
Senior Manager
Technology Solutions
SAP MDM Practice
BearingPoint, Inc.

Three Dimensions of a Powerful Master Data Management Methodology

By focusing on processes, people, and technology, you can realize the most value from your SAP NetWeaver investment.

IT leaders are all too familiar with the pain points of managing a company’s master data: cumbersome system maintenance, countless ad hoc requests, and integrating new acquisitions, to name a few. Even more challenging, the master data may not support key business processes. BearingPoint’s master data management (MDM) methodology can help address these problem areas by striking a crucial balance of data-governance processes, people, and technology, tailored to your needs, so that you capture more value from your SAP NetWeaver investment.

Establish Your Data Governance

Before you can implement any MDM solution, a rigorous data-governance process must be in place. Without it, data clean-up efforts will be wasted and, over the long term, data integrity will suffer. For example, if the enterprise lacks clear guidelines on how to enter a customer’s name into the system, then no matter how elaborate your clean-up efforts, you end up with countless variations of a name as simple as AT&T. The same is true for product data. A single product entered multiple times within your ERP system produces duplicate records, which lead to supply-chain inefficiencies, incomplete or flawed reporting, and more. As these examples illustrate, consistent data reduces IT and operational costs, builds enterprise-wide confidence in the data, and leads to accurate reporting. An MDM solution such as SAP NetWeaver MDM can help you design a governance process that delivers these benefits.
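The AT&T example above can be made concrete with a small normalization rule, sketched here under the assumption that the governance team has agreed on one canonical entry form. The specific rules are illustrative, not a standard.

```python
# Sketch: collapse common spelling variations of a customer name to one
# canonical form. The normalization steps are illustrative assumptions.
import re

def normalize_name(name):
    """Reduce a customer name to an agreed canonical form."""
    n = name.strip().upper()
    n = re.sub(r"[.,]", "", n)        # drop punctuation
    n = re.sub(r"\s+", " ", n)        # collapse whitespace
    n = n.replace(" AND ", " & ")     # unify conjunctions
    n = re.sub(r"\s*&\s*", "&", n)    # tighten around ampersand
    return n

variants = ["AT&T", "at&t ", "A.T.&T.", "AT and T"]
canonical = {normalize_name(v) for v in variants}
```

Without a rule like this applied at entry time, each of the four variants would persist as a separate “customer.”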

Tap the Right People

Your MDM program will succeed only if strong people – from the executive sponsor to the IT organization to the front-line employee – are in place. By establishing a master data team that is accountable for master data quality and for creating and overseeing governance guidelines and procedures, you create ownership of the process and of the long-term outcomes. Within this team, data stewards are assigned as owners of specific data objects. Often this team will lead change-management efforts to build awareness of the project across the organization and help prepare employees for long-term success.

Apply Sound Technology

With solid data-governance processes and a governance team in place, you are ready to deploy an MDM solution. The critical point when selecting and implementing your technology platform is that it must support the processes your governance team has established. Can it support your workflows? Can it integrate data from disparate systems and harmonize it? Today’s leading tools can help implement an MDM structure and governance process, but it’s your existing SAP investment that can provide the foundation for seamless integration, a lower total cost of ownership (TCO), and the leading functionality you need to transform your MDM methodology.

BearingPoint’s master data management approach strikes the right balance between people, process, and technology. A successful MDM implementation must address all these dimensions.

Take the First Step

Governance – people – technology: We believe that a successful, sustainable MDM implementation must include these three dimensions. As you begin your MDM initiative – or assess your solution alternatives – look closely at how your organization aligns with this model. It can provide a powerful framework that brings out the best in your organization and in SAP NetWeaver.

For more information, see www.bearingpoint.com/mdm.




Nicolas Fink
Senior Consultant
SAP NetWeaver®
ecenta AG

Danny Thien
Vice President Solutions
SAP NetWeaver®
ecenta AG

SAP NetWeaver MDM and Enterprise SOA in SAP NetWeaver 2004s

Master Data Is an Issue

Management of master data has been a challenge for enterprises and institutions for many years. Typically, departments, geographical units, or divisions deploy multiple heterogeneous applications and spreadsheets over time. In addition, newly acquired companies tend to retain their legacy IT systems. This all leads to master data that resides in silos and evolves independently — sometimes even in conflict — among systems in the same landscape.

Decisions and processes based on unsynchronized master data carry a greater risk of failure. Such inaccuracies make it more difficult to realize the full potential of a business area and usually result in inefficiencies and errors. For example, field organizations that do not share a single view of customer data might contact a prospect repeatedly, annoying the customer and jeopardizing the business opportunity.

Similarly, it is not unusual for a globally active corporation to have vendor records stored in several different purchasing systems. Sourcing departments then face the uphill task of generating accurate global spending reports in order to negotiate better terms with their suppliers. With consolidated vendor master data, improved analytics can be attained easily, generating cost savings.

A corresponding dilemma occurs when product and material master data records are duplicated across various source systems with different identifiers. Parts with similar specifications are not identified, which then increases the costs of research and development (R&D), manufacturing, and maintenance. A company’s productivity can only be optimized when its master data is handled efficiently.

In other words, having a single version of the truth about master data is crucial for a business’s survival. The pitfalls are obvious: Discrepancies in financial accounts affect the figures in the period-end closing. Sales suffer because field organizations don’t have a complete picture of the customer and thus miss cross-selling opportunities. Even fundamental operations such as the delivery of goods can fail if partners don’t have synchronized and integrated customer data. According to a 2003 A.T. Kearney study, the damage caused by this lack of synchronization and integration has been estimated at US $40 billion annually for retail and consumer products alone.

The 3 Core IT Scenarios

The SAP NetWeaver components and services provide a very powerful platform to address master data management. The following IT scenario variants serve as the foundation for MDM business scenarios:

  1. Content consolidation: Vendors, customers, products, materials, and other business objects are stored in a heterogeneous system landscape. The source systems are then connected to SAP NetWeaver® Master Data Management (SAP NetWeaver MDM), and the master data undergoes de-duplication and is assigned a global ID with the applicable key mappings. The consolidated records can then be loaded into SAP NetWeaver Business Intelligence (SAP NetWeaver BI) to enable accurate, company-wide reporting.

  2. Data synchronization and harmonization: Besides contributing to consistent global reporting, the content consolidation scenario prepares the master data for distribution to connected client systems. By supplying high-quality data, the harmonization scenario aims to streamline business operations and ensure cross-systems data consistency.

  3. Central maintenance of unified data: Corporations often appoint central agencies to be responsible for master data. However, the tools available for administering master data are limited, leaving these groups with tremendous workloads for handling, for example, master data creation and update requests. Data specialists have to query all source systems to see whether a requested record already exists and, if it doesn’t, manually create it in each system. This process is cumbersome and error-prone, and it incurs long lead times. In the central maintenance scenario, the data is entered only once, processed with SAP NetWeaver MDM as the hub for enterprise information, and then distributed to the connected master data subscribers.
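The content-consolidation scenario (de-duplication, global IDs, key mappings back to each source system) can be sketched as follows. The system names, local IDs, and single-field match key are illustrative assumptions; a real consolidation would match on several normalized attributes.

```python
# Sketch of content consolidation: group duplicate records from several
# source systems, assign each group a global ID, and keep a key mapping
# from each source system's local ID to the global record.
from itertools import count

def consolidate(source_records, match_key):
    """Return global records with key mappings per source system."""
    global_ids = count(1)
    consolidated = {}
    for system, local_id, record in source_records:
        key = record[match_key].strip().lower()   # naive match key
        if key not in consolidated:
            consolidated[key] = {"global_id": next(global_ids),
                                 "record": record,
                                 "key_mapping": {}}
        consolidated[key]["key_mapping"][system] = local_id
    return list(consolidated.values())

sources = [
    ("ERP_EU", "V-17", {"name": "Acme GmbH"}),
    ("ERP_US", "8842", {"name": "ACME GMBH "}),   # duplicate of the above
    ("CRM",    "C-03", {"name": "Beta Ltd"}),
]
records = consolidate(sources, "name")
```

The key mapping is what lets consolidated records feed company-wide reporting while each client system keeps its local identifiers.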

SAP NetWeaver MDM is a key enabler for the effective control and use of master data, especially when incorporated in SAP enterprise service-oriented architecture (SOA) scenarios. It alleviates data-administration issues and allows businesses to focus on value-adding activities instead of dealing with data problems.

How ecenta Can Support You

As an SAP Services Partner, ecenta provides niche consulting worldwide in SAP NetWeaver MDM, SAP NetWeaver BI, SAP NetWeaver Process Integration (SAP NetWeaver PI, known as SAP NetWeaver Exchange Infrastructure before the release of SAP NetWeaver 2004s), and the enterprise SOA functionality of SAP. Our portfolio of projects spans all three core scenarios mentioned above.

As part of a content-consolidation scenario, we have connected more than 100 source systems to SAP NetWeaver MDM. Using SAP NetWeaver PI, we constructed interfaces to SAP source systems (SAP R/3, mySAP ERP 2004/2005, mySAP CRM [Customer Relationship Management]), as well as non-SAP systems such as Oracle applications and other legacy systems.

Besides integrating SAP NetWeaver MDM into the system landscape, we also implemented normalization, cleansing, scoring, and data-merging processes. With the MDM Java application-programming interface (API), we customized an automatic record-merging algorithm based on the clients’ requirements. In our solutions, the consolidated data furnished the local ID for local reporting and the global ID for global reporting via SAP NetWeaver BI analytics.
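A record-merging algorithm of the kind described above can be sketched generically as a weighted similarity score with an auto-merge threshold. This is not the SAP NetWeaver MDM Java API itself; the fields, weights, and threshold are assumptions for illustration.

```python
# Sketch of a scoring step for automatic record merging: compare two
# records field by field, weight the similarities, and auto-merge above
# a threshold. Weights and threshold are illustrative assumptions.
from difflib import SequenceMatcher

def match_score(a, b, weights):
    """Weighted similarity (0..1) between two records over shared fields."""
    score = 0.0
    for field, weight in weights.items():
        sim = SequenceMatcher(None,
                              a.get(field, "").lower(),
                              b.get(field, "").lower()).ratio()
        score += weight * sim
    return score / sum(weights.values())

weights = {"name": 0.6, "city": 0.4}
a = {"name": "Acme GmbH", "city": "Berlin"}
b = {"name": "ACME GmbH", "city": "Berlin"}
auto_merge = match_score(a, b, weights) > 0.85
```

Pairs below the threshold would typically be routed to a data specialist for interactive review rather than merged automatically.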

Furthermore, we have built data synchronization and harmonization scenarios by combining the best in industry functionality of SAP NetWeaver MDM, SAP NetWeaver PI, and SAP’s enterprise SOA platform — in particular, the development environment included SAP Composite Applications Framework, or SAP CAF, with Adobe Interactive Forms, Web Dynpro, and Guided Procedures in SAP NetWeaver 2004s. These scenarios also support delta update functionality, where changes to records in the source systems are periodically extracted and imported into SAP NetWeaver MDM. Data specialists can then decide whether to ignore these changes or to harmonize them with other client systems.

In another instance, we developed Adobe Interactive Forms for offline master data requests and Web Dynpro forms for online submission. Guided Procedures within the SAP NetWeaver 2004s platform receive the entered data and trigger the necessary validation workflows. Upon approval, SAP NetWeaver pushes the information into SAP NetWeaver MDM and creates a temporary record with a new global ID. Next, SAP NetWeaver starts a normalization and de-duplication workflow to compare all temporary records against existing records; then it generates matching proposals, if applicable.

The data specialists check the results in SAP NetWeaver MDM and interactively resolve any matching cases. If SAP NetWeaver doesn’t merge some temporary records, it classifies them as new records. Then, SAP NetWeaver PI syndicates and retrieves these records. Based on the assigned ports, available services in mySAP ERP, mySAP CRM, and other applications create the new records in their respective source systems.

Based on project experiences, we have optimized our operations with a global three-tier organization that shares expertise:

  • Tier 1: SAP NetWeaver MDM and enterprise SOA consultants, implementing project solutions for the customers

  • Tier 2: ecenta development centers in India and China, taking over offshore-able activities, while ensuring that the overall cost stays under control

  • Tier 3: Solution architects who have a close link to SAP solution management and standard development and are responsible for aligning customer solutions to the SAP standard – both current and future

This guarantees smooth upgrades to future releases and a low maintenance cost.

Building successful master data management scenarios is a complex task. SAP NetWeaver MDM simplifies it by using open standards that support both SAP and non-SAP systems, enabling quick realization of benefits, which results in a higher return on investment (ROI) and a lower total cost of ownership (TCO). At ecenta, we understand the importance of managing master data effectively. By leveraging SAP technologies, we deliver the functionality that helps enterprises conquer these challenges.

For more information, please send an email to ecenta at mdm@ecenta.com.




Danielle McQuaid
Director of Data Governance Applications
CranSoft, Inc.*

Tom Kennedy
Founder
BackOffice Associates
& CranSoft, Inc.*
*CranSoft, Inc. was formerly known as CranBerry Technologies, Inc.

Data Governance: 5 Questions for Sustainable Data-Quality Success

1. Why is data governance important?

With the corporate trend of consolidating legacy systems into a single enterprise system, data is becoming increasingly important.

Corporations produce more and more data daily with little or no quality control. It would be unthinkable for any corporation to bypass quality control on the production floor; indeed, Total Quality Management (TQM), Six Sigma, and other quality-management models are products of the drive over the past decade to increase revenue and improve efficiency. In response to the ongoing creation of so much new data, companies have employed a myriad of models to manage the quality of their data.

“Companies that are consolidating to a single instance of enterprise resource planning (ERP) usually start thinking about master data management (MDM) when they realize they aren’t reaping the expected benefits because of data-quality problems,” says Bill Swanton of AMR Research. In this revolutionary world of enterprise automation, business-critical processes are driven by a company’s most valuable asset – the data. Many businesses that are running ERP systems are not paying enough attention to the quality of their information. Unfortunately, the value of the data is often discovered only when the value of the company is negatively affected.

As businesses become more aware of the impact data quality has on their bottom-line performance, this awareness is breeding more proactive initiatives for MDM strategies. Companies considering data governance need to understand the four data-governance models and carefully decide what they must accomplish to achieve a sustainable and successful data-governance program.

2. Who should be responsible for data quality?

It’s surprising how many organizations are confused about responsibility for data quality. As a standard exercise, we often ask project teams and leadership: “Who owns the data’s quality before, during, and after the SAP implementation?” The answers, often hesitant, consistently show that different groups within the same organization see it differently. Some say IT; others say the business.

The answer should be perfectly clear: “The business owns responsibility for data quality.” This is not to say that IT has no stake in it – merely that the business must understand and remain accountable for the quality of its own data.

Generating consensus around this business-critical understanding is essential before embarking on any data-quality initiative, whether pre- or post-implementation. We have found that SAP data-quality success is contingent upon this understanding. Those companies that embrace this view to drive their planning, organization, tool selection, and implementation processes are the most successful.

3. What are the four data-governance models, and where does my company rank?

As increased data-quality awareness has spread, many companies are defining departments or teams to take charge and responsibility for data quality throughout their enterprise. During this much needed development in corporate priorities and structure, we have seen various models under which data-governance strategies are defined and implemented. The first step is to evaluate the current and intended data-governance models.

  • Model 1 – No data governance: This is the classic model for data governance that we like to refer to as the “Wild Wild West.” Every user is trusted to enter perfectly accurate data on time while minding corporate standard operating procedures and regulatory-compliance statutes. The reality is that despite rigorous training, many users are casual users, and standard operating procedures are most often not followed. Given the lack of control and accountability, this is clearly the least efficient and most risky model.

  • Model 2 – Center of Excellence: A common model in many different industries is the Center of Excellence (COE). In this model a central group is charged with the responsibility of creating and verifying all data requests before posting them to the SAP system. The intention is to have a central core entering an agreed-upon “single version of the truth”; however, in many cases this model can result in costly downstream effects. Common problems are bottlenecks resulting in slow and often costly data-entry times. Also, in some cases the team members of the COE are overburdened with monotonous, repetitive tasks that underutilize their knowledge and company experience. Furthermore, this is not a scalable model from a cost or resources perspective.

  • Model 3 – Passive data governance: Users enter data into the SAP system, and then a toolset or reporting mechanism iteratively identifies data-related errors in the system. Errors are automatically reported back to their authors for correction, and quality metrics are delivered to management. This model enables a valuable, measurable process that we will discuss in the next question.

  • Model 4 – Active data governance: All data required to support the configured SAP business processes is collected and validated automatically, prior to posting, through a collaborative environment. Data is deemed business-ready before entry into the SAP system, eliminating business-process interruptions caused by omissions, inconsistency, bad content, duplicates, misuse of SAP, or lack of standards. This model is discussed in more depth below.
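The difference between the passive and active models can be sketched with one set of validation rules applied at two different points: after posting (scan and report back) versus before posting (gate). The rules and field names below are illustrative assumptions, not any vendor's actual product.

```python
# Sketch: the same validation rules used passively (scan existing records,
# report errors to their authors) and actively (block a record before it
# is posted). Rules and fields are illustrative assumptions.

RULES = {
    "material_number": lambda r: bool(r.get("material_number")),
    "unit_of_measure": lambda r: r.get("unit_of_measure") in {"EA", "KG", "L"},
}

def find_errors(record):
    """Names of rules the record fails."""
    return [name for name, check in RULES.items() if not check(record)]

def passive_scan(records):
    """Passive governance: report errors in records already in the system."""
    report = {}
    for r in records:
        errors = find_errors(r)
        if errors:
            report[r["id"]] = errors
    return report

def active_gate(record):
    """Active governance: refuse a record that is not business-ready."""
    errors = find_errors(record)
    if errors:
        raise ValueError(f"not business-ready: {errors}")
    return record

records_in_system = [
    {"id": 1, "material_number": "M-100", "unit_of_measure": "EA"},
    {"id": 2, "material_number": "", "unit_of_measure": "BOX"},
]
error_report = passive_scan(records_in_system)
```

In the passive model, `error_report` drives the feedback loop to data authors; in the active model, `active_gate` acts as the firewall so the bad record never reaches the system at all.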

The objective of data governance is to reduce data-driven business-process interruptions by increasing data readiness. Increased data readiness is accomplished by shortening data-error-resolution times through greater automation with advanced data-governance technologies. The image above identifies how the four data-governance models compare to one another based on these principles.

4. Why is passive data governance important, and how do I get there?

Based on our interaction with many Fortune 500 companies, it is no secret that the COE model may improve the quality of data while increasing the time required to collect, validate, and enter data into SAP manually. This model proves difficult to scale with a growing SAP footprint.

Passive data governance introduces automation and creates accountability and ownership at the user level, making it the best first step away from the Wild Wild West or COE models. The first step in implementing the passive model is acquiring a preconfigured toolset. Because each SAP system has its own particular data requirements, this toolset should be built specifically for the data challenges of your SAP environment.

The toolset should include out-of-the-box functionality for workflow enablement, quality-metrics reporting, and duplicate detection. For global organizations, the tools should be multilingual. Most important, in addition to preconfigured content and functionality, the toolset should be easily configurable by business people, not IT. Enabling the business to control the data is imperative for effectively encapsulating your specific business-process requirements, so the tools must be business-user-friendly.

Once implementation begins, a business-process repository is constructed based on your then-current knowledge of the data requirements. Over time the configuration of this repository should be capable of iteratively reporting on all business-critical master and transactional data.

The passive model’s automation of data governance implements control while alleviating any bottlenecks associated with SAP data entry. This is a great step forward; however, it does not solve the entire data-governance conundrum. After all, bad data is still getting into the system.

In some cases the passive model is sufficient as a standalone solution. Companies needing only passive data governance are comfortable accepting some level of data errors in their system at any given time. Having identified this fact, we offer our own passive data-governance solution, DataDialysis®, built specifically for SAP.

For companies demanding the most sophisticated data-governance solutions and those operating in strictly regulated industries, such as pharmaceuticals, the passive solution alone is not sufficient. Active data governance is necessary for these companies to control and validate data prior to entry into SAP. Recognizing that both the passive and active models are business-critical, the best passive solution should serve as a roadmap for implementing an active model.

5. Why is active data governance important, and how do I get there?

The mission of data governance is to enhance bottom-line performance by eliminating business-process interruptions related to incomplete, missing, or erroneous data while fully complying with general business and industry-specific regulations. The best way to accomplish this is to restrict any data that is not business-ready from ever reaching the SAP system. An Active Data Governance Model™ achieves this by implementing an automated system to manage the data-collection and data-validation process.

Passive data governance automates remediation of existing data errors; however, erroneous and incomplete data still gets into SAP, causing unforeseeable business interruptions. Fully aware that the passive model alone was not a holistic, proactive solution, we dedicated our internal development efforts to active data-governance solutions. Our team has developed a suite of collaborative applications built specifically for SAP that manages the data-entry and data-change processes through a validated, collaborative workflow environment. Our applications become a firewall for SAP data, ensuring that only business-ready data reaches SAP through an automated and transparent process.

The data-governance applications are deliberately created for the business user, and the technology skill level is based primarily on SQL statements. So far, we have developed and released applications designed specifically for materials, customers, and vendors. Known as cMat™, cCust™, and cVend™, these applications are live and running for several of our Fortune 500 clients.

Generally speaking, implementation of these applications takes anywhere from six to nine months and hinges heavily on the scope of business processes and customization a client requires. A large portion of the effort required to kick off a data-governance initiative must be accomplished internally to build the team organization and structure. It has been our experience that an effective data-governance structure should include the following groups:

  • Executive sponsorship: Corporate buy-in, prioritization, and budgets

  • Governance group leadership: Best-practices implementation coordination, data-governance rules, and quality-assurance management

  • IT group: Application development, integration, and security management

  • Business group: Subject matter expertise for data stewardship and business-rules determination

After the organizational structure is determined and operational collaboration and planning begin, the initial considerations are business processes, data requirements, integration, infrastructure, and security. Throughout this collaboration between business and IT, the users’ responsibilities should become increasingly apparent. The ultimate goal is to reach consensus, with sign-offs on:

  1. How the data-creation process should operate

  2. Which individuals are responsible for the various data elements and validations along the way

Once live, our customers are at the forefront of risk mitigation, experiencing zero business-process interruptions caused by data errors in SAP.

Essential for Success

Implementing a data-governance strategy is essential for sustainable SAP success. If your organization has not yet put data quality on its critical path to SAP success, we strongly recommend you reconsider. In every case we have seen, the risks of operating without any governance greatly outweigh the costs of implementing a holistic data-governance solution. If you would like to learn more, please go to our Web site at www.boaweb.com or contact us directly at info@boaweb.com.




Vikram Chalana
President
Winshuttle, Inc.

Efficiently Improving the Quality of Master Data in Your SAP System

Introduction

Poor data quality can have disastrous consequences for any organization: delays in the supply chain, wasted time and money, and poor decision-making based on bad data. According to a recent PricewaterhouseCoopers survey of large companies, only 34% of the respondents claimed to be very confident of their data quality. More than 80% of the survey respondents agreed that their company should view data quality more strategically.

The quality of data in an ERP system may be poor for many reasons, such as errors occurring during initial manual data entry, duplicated records, delays in keeping the data up to date, and a lack of synchronization with external systems.

Fortunately, there are some simple best practices that companies can employ, in conjunction with software applications such as TxShuttle from Winshuttle, Powered by SAP NetWeaver, that have made a tremendous impact on master data quality for hundreds of SAP customers worldwide. Three of these best practices are described below.

3 Effective Ways to Improve Data Quality

1. Automate data entry without programming, improving data quality at the source.
There are many ways in which an enterprise can automate the process of entering both master and transactional data. This is particularly true when the data being entered already exists in other digital formats, such as Microsoft Excel spreadsheets, or PDF and Microsoft Word files. This automation can have a significant positive impact on data quality, as well as offer cost- and time-savings by eliminating manual data entry. Easy-to-use applications, such as TxShuttle, are designed specifically to automate the process of uploading and downloading data between SAP and Microsoft Excel and Access without burdening IT resources.

2. Make mass data changes as soon as needed.
Companies can maintain high-quality data by making mass changes to the data in their SAP systems as soon as the need arises. Such mass changes can be made quickly and easily by first downloading the data into Excel spreadsheets, modifying it as necessary, and uploading it back to the SAP system. Winshuttle applications are designed to help business users update their SAP data efficiently with this method.
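The download-modify-upload pattern can be sketched in plain Python, using CSV data as a stand-in for the spreadsheet. The field names and values below are assumptions for illustration; in practice, the download and upload steps would go through a tool such as TxShuttle rather than raw files.

```python
import csv
import io

# Sketch of the download-modify-upload pattern for mass data changes.
# Step 1: "download" - here simulated as CSV text already extracted from SAP.
downloaded = io.StringIO(
    "material,plant,purchasing_group\n"
    "100-100,1000,001\n"
    "100-200,1000,001\n"
)
rows = list(csv.DictReader(downloaded))

# Step 2: mass change - move every material in plant 1000 to purchasing group 002.
for row in rows:
    if row["plant"] == "1000":
        row["purchasing_group"] = "002"

# Step 3: write the modified data back out, ready for upload to SAP.
out = io.StringIO()
writer = csv.DictWriter(out, fieldnames=["material", "plant", "purchasing_group"])
writer.writeheader()
writer.writerows(rows)
print(out.getvalue())
```

The same three-step shape applies whether the intermediary is CSV, Excel, or Access: extract, transform in bulk, reload.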

3. Synchronize SAP data with external systems frequently.
Data stored in external systems should be synchronized with SAP data frequently, and in real time where possible. SAP’s enterprise service-oriented architecture (enterprise SOA) facilitates real-time synchronization. For frequent but non-real-time synchronization between external systems and your SAP system, Excel or Access databases can serve as the intermediary. Winshuttle applications can be scheduled to load such Excel or Access data into your SAP system automatically.

Conclusion

Poor data quality wastes time and money and adversely impacts your company’s bottom line. Maintaining high-quality data in your SAP system is essential to business success. The simple methods shown above, used in conjunction with SAP NetWeaver and Winshuttle software, can make a significant improvement in your SAP data quality.

Next Steps

Try Winshuttle applications on your own SAP system. Easily download and install these applications on your desktop by requesting a free trial at our Web site, www.winshuttle.com.

 




Sam Sliman
President
Optimal Solutions Integration, Inc.

Managing Master Data to Maximize the Value of Core Enterprise Systems

Large SAP customers are very familiar with the challenges in managing master data – customers, vendors, materials. What about the advantages of doing it well? You can “master” master data and reap substantial rewards from a master data strategy that features the proper tools, a proven methodology, and an appropriate governance structure.

Many SAP customers have invested not only in mySAP ERP, but also in mySAP Human Capital Management (mySAP HCM), SAP NetWeaver Business Intelligence (SAP NetWeaver BI), mySAP Supply Chain Management (mySAP SCM), mySAP Customer Relationship Management (mySAP CRM), and other systems that produce and consume master data. These disparate systems often store their own reference data about core business entities. With no one system of record, it’s easy to see how data inconsistencies can result.

SAP NetWeaver Master Data Management (SAP NetWeaver MDM) enables companies to define, change, and manage master data, ensuring consistent definitions and managing the distribution of master data throughout the IT landscape. Working across heterogeneous systems, SAP NetWeaver MDM leverages existing IT investments while delivering centralized data management capabilities.

Benefits of Master Data

  • Benefits for human resources: Consolidation and harmonization of employee data provides a coherent picture of employees in a large, international corporation where mobility is a strategic need. SAP NetWeaver MDM enables an organization to maximize employee resources and respond quickly to market changes.

  • Benefits for business intelligence: As managers measure overall performance, they often find many versions of the truth. Finance has one set of revenue numbers, sales has another. By helping to create a single version of the truth, SAP NetWeaver MDM enables accurate, consistent, real-time business performance reporting.

  • Benefits for supply chain management: By providing centralized, consistent views of materials and other master data, SAP NetWeaver MDM improves supply chain visibility, reduces inventories, helps plan better deliveries, and identifies spending trends. It also improves the ease with which trading partners can exchange data throughout the supply chain.

  • Benefits for customer relationship management: By creating a single, accurate, timely, and holistic view of the customer across multiple channels and business lines, SAP NetWeaver MDM helps sales, marketing, and service teams better anticipate customer needs, provide targeted offers, and improve customer service.

Ensure MDM Success

Ultimately, master data management is best thought of as a process enabled by SAP NetWeaver MDM: an ongoing, specialized discipline requiring a dedicated team focused on standards, data stewardship, consensus building, and change management, the cornerstones of a successful MDM program. Visit Optimal Solutions at www.optimalsol.com for a free white paper on how to deploy SAP NetWeaver MDM to maximize the value of mySAP ERP, SAP NetWeaver BI, mySAP SCM, and mySAP CRM systems.




David Watkins
Business Consultant
Oniqua

Material Masters May Be Boring, but What an Investment!

The term “master data” applies to information that seldom changes but is fundamental to managing and optimizing an organization’s effectiveness. The material catalog (or “material master”) is in this category.

The material catalog crosses domains, assisting in accurate material selection throughout the demand and supply chains: for example, maintenance spares, manufacturing materials, warehouse stock, purchased items, and manufactured products. A good material catalog unambiguously describes materials. Incorrect selection can cause excessive stock, unnecessary replenishment, returns, expediting, and rework. Communication is vital to ensure that production materials and spares are available when required.

Catalog Test

The cataloging process is best illustrated by example. Look at the table below and take the test.

| # | Before - Unstructured Catalog | After - Standardized Catalog (source line in parentheses) |
|---|---|---|
| 1 | 3/4" BOLT, 6IN LONG UNC GRADE 5 W/NUT AND WASHER | BOLT, MACHINE; 3/4" UNC X 5-1/2" HEX HEAD GDE 5 W/NUT AND WASHER (9) |
| 2 | 3/4" UNC X 6" HEX HEAD UNC BOLT | BOLT, MACHINE; 3/4" UNC X 6" HEX HEAD GDE 5 W/NUT AND WASHER (6) |
| 3 | 3/4" UNC X 6" ZINC PLATED HEX HEAD UNC SCREW | BOLT, MACHINE; 3/4" UNC X 6" HEX HEAD GDE 5 W/NUT AND WASHER (1) |
| 4 | 3/4" X 6" HEX HEAD UNC GRADE 5 BOLT | BOLT, MACHINE; 3/4" UNC X 6" HEX HEAD GDE 5 W/NUT AND WASHER (2) |
| 5 | 3/4" X 6" HEX HEAD UNC GRADE 5 SCEW ZINC FINISH | BOLT, MACHINE; 3/4" UNC X 6" HEX HEAD GDE 5 W/NUT AND WASHER (4) |
| 6 | 6"IN LONG BOLT GRADE 5 3/4"UNC | BOLT, MACHINE; 3/4" UNC X 6" HEX HEAD GDE 5 W/NUT AND WASHER (10) |
| 7 | GRADE 5 BOLT BLACK .375" X 6.0" UNF C/W NUT AND WASHER | BOLT, MACHINE; 3/4" UNC X 6" HEX HEAD GDE 5 W/NUT AND WASHER (12) |
| 8 | GRADE 5 SCREW BLACK .375" X 6.0" UNF C/W NUT AND WASHER | BOLT, MACHINE; 3/8" UNC X 6" HEX HEAD GDE 5 W/NUT AND WASHER BLACK (13) |
| 9 | HEX BOLT 3/4 X 5-1/2IN UNC GRADE5 | BOLT, MACHINE; 3/8" UNC X 6" HEX HEAD GDE 5 W/NUT AND WASHER BLACK (14) |
| 10 | HEX HEAD BOLT UNC 3/4IN X 6" LG WITH NUT AND WASHER GDE 5 | BOLT, MACHINE; 3/8" UNC X 6" HEX HEAD GDE 5 W/NUT AND WASHER BLACK (7) |
| 11 | HEX SCREW 3/4 X 5-1/2IN UNC GRADE 5 | BOLT, MACHINE; 3/8" UNC X 6" HEX HEAD GDE 5 W/NUT AND WASHER BLACK (8) |
| 12 | HEXAGON HEAD BOLT 3/4"UNC X 6" LONG TO GRADE 5 W/NUT AND WASHER | SCREW, MACHINE; 3/4" UNC X 5-1/2" HEX HEAD GDE 5 (11) |
| 13 | UNC BOLT 3/8IN X 6 IN C/NUT AND WASHER BLACK GRADE 5 | SCREW, MACHINE; 3/4" UNC X 6" HEX HEAD GDE 5 ZINC PLATED (3) |
| 14 | UNC SCREW 3/8IN X 6 IN C/NUT AND WASHER BLACK GRADE 5 | SCREW, MACHINE; 3/4" UNC X 6" HEX HEAD GDE 5 ZINC PLATED (5) |

How many are duplicates? There is no rocket science to solving this; the technique, called “data standardization,” follows five simple steps:

  1. Sort alphabetically.

  2. Place the noun first and re-sort.

  3. Format data, resequence qualifiers, and re-sort.

  4. Apply encoding standards.

  5. Conduct full research, examine actual item, and use standard names and abbreviations.
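The first three steps can be sketched in a few lines of Python. This is an illustrative sketch only; the noun list is a hypothetical stand-in for an agreed encoding standard (step 4), and real cataloging still requires research and standard abbreviations (step 5).

```python
# Sketch of noun-first standardization: moving the item noun to the front
# so that equivalent descriptions sort together and duplicates surface.

NOUNS = {"BOLT", "SCREW"}  # hypothetical noun list for this example

def noun_first(description):
    """Put the item noun first, then re-sort makes duplicates adjacent."""
    words = description.upper().split()
    nouns = [w for w in words if w in NOUNS]
    rest = [w for w in words if w not in NOUNS]
    if nouns:
        return nouns[0] + ", " + " ".join(rest)
    return description.upper()

raw = [
    '3/4" UNC X 6" HEX HEAD BOLT',
    'BOLT 3/4" UNC X 6" HEX HEAD',
]

# Steps 1-3: standardize each description, then sort.
standardized = sorted(noun_first(d) for d in raw)
print(standardized)
```

After standardization, both descriptions collapse to the same string, which is exactly how the duplicates in the table above become visible.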

If your organization lacks the inclination to follow the standards, the business case may change your mind.

The Value

Consider the nine duplicates. If each is purchased four times per year, then purchasing will drop from 56 to 20 orders per year. With an order cost of $30, procurement savings alone are $1,080 per annum. With duplications frequently running at 5% to 20%, the opportunities are there.
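The arithmetic behind these figures can be checked directly. A minimal sketch, using only the numbers stated above:

```python
# Verifying the procurement-savings arithmetic from the bolt example.
items = 14           # catalog lines in the test table
duplicates = 9       # redundant lines eliminated by standardization
orders_per_item = 4  # purchases per item per year
order_cost = 30      # dollars per purchase order

orders_before = items * orders_per_item                # 56 orders per year
orders_after = (items - duplicates) * orders_per_item  # 20 orders per year
savings = (orders_before - orders_after) * order_cost  # $1,080 per annum
print(orders_before, orders_after, savings)
```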

A good quality catalog facilitates strategic purchasing, improved contract coverage, higher service levels, reduced inventory holdings, and reduced material-handling.

Business Case

For a $5 million inventory comprising 5,000 items, with 12% duplication and each item replenished four times per year, procurement savings are $120,000 per annum. Removing excess stock delivers a further $48,000 per annum (at 8% interest). Contrast these savings, an annual yield of $168,000 (3% of inventory value), with cleansing costs of about $30,000 (0.6% of inventory value). Why delay? How many projects deliver a return of 16 times and are cash-positive in the first year?

A more detailed version of this paper is available at www.oniqua.com entitled “Value of Cataloging.”
