An enterprise system should improve an organization’s effectiveness and decision-making by allowing employees to use information more efficiently. Yet using analytical tools to support that decision-making has taken a backseat in corporate IT for more than 40 years. Today, according to market research firm Gartner, Inc., less than 20 percent of knowledge workers use analytics.
Agile organizations (those that can deal effectively with complexity, uncertainty, and risk) need to embed analytical insight into operational processes to close the loop between analysis and action. Although the lag time between actions taken, results tracked, and conclusions drawn is getting shorter, the existing analytical tools, methods, and approaches are inhibiting the last step — connecting analysis to operations in a continuous cycle.
Many analytical tools today, built on client-based architectures, limit and simplify the data models and the scale of data you can investigate. Even in server-based business intelligence (BI), many tools have architectures that scale poorly and simply cannot make use of the volume and richness of the available data. Claiming that a product is scalable because it can handle dozens of servers, while limiting each server to 10 or 12 users and performing no real load balancing or work distribution across servers, only obscures the problem.
IT groups are forced to limit access to data warehouses by concealing detailed data, implementing query governors, or limiting live queries to certain times of the day. As a result, despite huge data warehouses that house the freshest and most detailed data available, the models used in BI are typically summarized, simplified, and denatured to fit into the limited computing capabilities of many BI tools. These limitations are not the only factors that restrict the reach of analytics today but they are the ones that can be dealt with most easily by technology.
Next-generation analytics promise to finally take advantage of the service-oriented architecture (SOA) concept. They will change the way organizations use information, break down the practices that separate knowledge workers from useful tools and information, and shift all knowledge workers away from passive receipt of information to levels of self-service and investigation. These are all challenging problems to solve (see “Tipping Points for the New Analytics,” below).
Tipping Points for the New Analytics
The following are some of the factors that work together to tip the scales in favor of a new generation of analytics:
- Moore’s law: Intel cofounder Gordon E. Moore famously predicted that the transistor count of integrated circuits would double every 24 months. Despite this increase in processing power, analytical schemes today are still managing from scarcity. The continued rapid increase in capacity and decrease in relative cost of hardware and bandwidth are finally allowing faster, broader data analysis.
- SOA: Open standards, such as the XML specification set out by the World Wide Web Consortium (W3C) or Java, are agreed upon by committee and accepted by the community. They foster an environment where analytics can cooperate with operational systems and processes without disrupting or interfering with smooth operations.
- Maturity: Most organizations have some experience with analytics and are open to using them more often.
- Business-process commoditization: Visionaries such as Thomas H. Davenport, coauthor of Working Knowledge: How Organizations Manage What They Know (HBS Press, 1997) and author of Thinking for a Living: How to Get Better Performances and Results from Knowledge Workers (HBS Press, 2005), are getting the word out that a lot of the work in process optimization has been done.
- On-demand: Opening business up through the Internet creates an entirely new timetable for conducting business: on-demand. There is no software to install; you use software hosted somewhere else and pay by the user, by the month, or on some similar basis. You don’t have to apply upgrades, because everyone is upgraded at once, and there are no big upfront license fees or long contracts.
- Hegemony: The dominance of a few vendors and the influence of analysts can act as negative forces slowing down innovation, but once enough momentum is gained, the dominant vendors tip the scale abruptly to the positive side of innovation. Think of a roller coaster: It’s slow going until you get to the first crest; then, it really gets moving.
Meeting these challenges requires an understanding of how SOA is changing the way analytics are used, and it requires a new set of rules to govern how analytics are used within the organization.
Analytics: Yesterday, Today, and Tomorrow
Before e-business and the Internet, before business-process reengineering (BPR), organizations functioned in a timeframe that seems inordinately long by today’s standards. Few decisions needed to be made immediately. No one was in danger of losing his or her market to a competitor without a lengthy fight, and the time allowed to review results and decide on tactics was measured in weeks, not hours or minutes.
The current models for analytics and BI were formed in this environment. Yesterday’s data warehouses were managed largely by batch processes, and BI software was focused on individuals instead of on work communities and whole enterprises. Today, the luxury of latency is gone. “On-demand” isn’t just a clever TV commercial anymore; it’s how business is conducted.
BI is still a departmental or even an individual affair in most companies. But the move to SOA will force IT managers to look at analytics as enterprise assets. Since analytics will become an integral part of many operational systems, the current situation, with many different BI tools in place and skills too thinly spread to be useful, will become unwieldy. Standardizing on the analytical services best suited to composite applications (both operational and analytical), instead of on each department’s selection, will elevate the visibility of analytics. That doesn’t mean every organization needs to apply advanced analytics to its business processes, but each company needs to have faith in the output of its analytical processes. A partial list of the analytical functionality needed in the next generation of SOA-enabled analytical tools is in the table below. Many of these features exist today, but need to change to be applicable.
Analytical Functionality Needed in Next-Generation Tools

| Functionality | Description |
| --- | --- |
| Online analytical processing (OLAP) | Separation of navigation and presentation from the engine; OLAP-like manipulations, such as drill-down, need to be available as services without human intervention. |
| Statistics | Common statistical engine across all applications; this boosts understanding. |
| Alerts | Event-based service watches and alerts, configured for many processes; this feature should be generalized and not linked to a particular data source or schema. |
| Predictive modeling | Instead of a variety of specialized tools implemented by disjointed groups, predictive modeling tools implemented as shared, reusable services for those qualified to use them, with parameterized models derived from their discoveries implemented as services and embedded in many operations. |
| Optimization | A generic optimization engine with different add-on optimization engines, with domain intelligence, applied to the right problems. |
| Decision management | Combination of event manager, predictive modeling, optimization, rules engines, and other decision tools to take over the task of low-risk decisions and inform those responsible for higher-risk ones. |
| Dashboards that learn what you want and how you want to see it | Dashboard software, abstracted from most of its current functionality, focused on learning the patterns and needs of its clients. |
| Ontologies in place of proprietary rules engines | Semantic technology in open standards, such as the Resource Description Framework (RDF), using conceptual search techniques to draw inferences without explicit rules. |
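Making OLAP-style manipulations such as drill-down callable as services without human intervention is easier to picture with a sketch. The following is a hypothetical, minimal illustration in Python; the cell layout, data, and function names are all invented, not taken from any product:

```python
from collections import defaultdict

# Toy cube: keys are (region, city, product) tuples, values are a measure.
CELLS = {
    ("East", "Boston", "widgets"): 120,
    ("East", "Albany", "widgets"): 80,
    ("West", "Denver", "widgets"): 150,
    ("West", "Denver", "gadgets"): 60,
}

def rollup(cells, levels):
    """Aggregate the measure up to the first `levels` dimensions."""
    totals = defaultdict(int)
    for key, value in cells.items():
        totals[key[:levels]] += value
    return dict(totals)

# A calling process, not a person at a screen, can "drill" simply by
# requesting more levels of detail with a different parameter:
by_region = rollup(CELLS, 1)       # totals per region
by_region_city = rollup(CELLS, 2)  # totals per region and city
```

Because the manipulation is just a parameterized call, any composite application can invoke it, which is the point of separating navigation and presentation from the engine.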
One method for organizing analytical capabilities is to create a team of specialists who perform most of the advanced analytics on a centralized basis and have the confidence of the firm’s senior executives. Their work, finding the underlying causes, relationships, and predictors, can be reduced to data models with a set of parameters that can be run repeatedly. FICO credit scores are a good example: decision-management software vendor Fair Isaac Corporation develops and continuously tunes a model that can be rendered as a series of variables to produce a credit score.
This premise, most recently advanced by Tom Davenport in Competing on Analytics: The New Science of Winning (HBS Press, 2007), historically has had some drawbacks, such as concentrating analytical tasks to a degree that hinders development of skills in the rest of the organization; those drawbacks, however, were most likely caused by separating the data modelers from the model users. Nevertheless, Davenport and others writing on this subject highlight the need for these capabilities and focus on executive acceptance of analytics, which is crucial.
Making Sense of the Data
Metadata management is positioned as the solution for understanding the data, and it is. However, existing BI tools provide incomplete and proprietary metadata. The diagram below provides a simplified and idealized chart of the information flow for analytics. This chart shows the location of the different sources of metadata. One BI tool may have as many as five different metadata structures, but putting them into a single metadata repository without altering their form or function won’t solve the problem.
|This diagram of fractured metadata shows a simplified, idealized view of the information flow for analytics in current BI architecture. There are many different sources of metadata; one BI tool may actually have as many as five different metadata structures.
Ideally, metadata should allow you to use different tools in different locations — logical or physical — so they can exchange information about their semantics and operation. The lack of uniform metadata across applications makes it difficult to collaborate and inhibits standardization. BI metadata comes in three basic forms:
- Production: Describes data movement and translation from one source to another, sets up ownership, and logs updates
- Catalog: Allows the definition of tables and attributes
- Presentation: Enables additional data and calculation definitions, report layouts, preferences, and roles in the reporting and analysis tool
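As a rough sketch of how distinct these three forms are, they might be modeled as plain records. The field names below are invented for illustration; real metadata repositories carry far more detail:

```python
from dataclasses import dataclass, field

@dataclass
class ProductionMetadata:
    """Data movement and translation: source, target, ownership, update log."""
    source: str
    target: str
    owner: str
    last_updated: str

@dataclass
class CatalogMetadata:
    """Definitions of tables and their attributes."""
    table: str
    attributes: list = field(default_factory=list)

@dataclass
class PresentationMetadata:
    """Report-level calculations, layouts, preferences, and roles."""
    report: str
    calculations: dict = field(default_factory=dict)
    roles: list = field(default_factory=list)
```

Even in this toy form, nothing ties the three records together: the catalog entry for a table says nothing about the reports built on it or the feeds that load it, which is precisely the fragmentation the article describes.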
A lot of work has gone into defining metadata standards and attempting to open up metadata. Competitive pressures among vendors impede the effort, but to a lesser degree than technology itself, which stands in the way of a useful metadata solution. Relational databases and relational modeling techniques may be inadequate to capture the richness and nuance of the data, models, and conditions in even the simplest business. Extended relational modeling, such as the Unified Modeling Language (UML), offers marginal improvement over relational approaches, but all of these schemes effectively leave metadata in a passive state.
Semantic technology holds the promise of breaking through the metadata ice jam. Semantic technology is a broad field, but its rapidly growing commercialized part is the ontology. An ontology captures the semantic meaning of things and creates a structure from which a machine can draw inferences. In a conventional metadata query, by contrast, all of the knowledge needed to frame a question must be supplied by the person who composes the query.
In an ontology, the captured relationships are capable of revealing, through a process of introspection and inference, more information than was consciously placed into it. In addition, ontologies are constructed in languages such as RDF or the Web Ontology Language (OWL), which are based on XML; all of these are open standards managed by the W3C, which also oversees the Web services standards.
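A toy illustration of inference revealing more than was explicitly stated: plain (subject, predicate, object) triples in the spirit of RDF, with invented facts, closed under a transitive `is_a` relationship. None of these class names come from a real ontology:

```python
# Three explicitly stated facts.
TRIPLES = {
    ("FICOScore", "is_a", "CreditScore"),
    ("CreditScore", "is_a", "RiskMeasure"),
    ("RiskMeasure", "is_a", "Metric"),
}

def infer_is_a(triples):
    """Close `is_a` under transitivity: if A is_a B and B is_a C, infer A is_a C."""
    facts = set(triples)
    changed = True
    while changed:
        changed = False
        for (a, p1, b) in list(facts):
            for (b2, p2, c) in list(facts):
                if p1 == "is_a" and p2 == "is_a" and b == b2:
                    new = (a, "is_a", c)
                    if new not in facts:
                        facts.add(new)
                        changed = True
    return facts

facts = infer_is_a(TRIPLES)
# The machine now "knows" a FICO score is a metric, although no one
# ever entered that fact directly.
```

This is the essential difference from a passive metadata catalog: the structure itself yields answers that were never keyed in.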
Semantics play a role in orchestrating SOA in various ways. Today, discovering services through Universal Description, Discovery, and Integration (UDDI) and the Web Services Description Language (WSDL) is rare, but semantic extensions to these standards make it possible to find a service with a conceptual search; for example, “Find a seasonal smoothing function that is used for consumer products.” The massive amount of data in data warehouses no longer needs to be limited to a single definition; semantics can determine which elements depend on which others and which are predictors. Many people can construct semantics simultaneously, and the semantic engines can handle incorrect entries. Ontologies that are built up by many people over time are called “folksonomies.”
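A hypothetical sketch of such a conceptual search: services carry semantic annotations, and a query matches on concepts, including broader concepts, rather than on exact names. Every registry entry, concept name, and relationship below is invented for illustration:

```python
# Invented "broader than" concept hierarchy (a crude stand-in for an ontology).
BROADER = {
    "holt_winters": "seasonal_smoothing",
    "seasonal_smoothing": "forecasting",
    "cpg": "consumer_products",
}

# Invented service registry with semantic annotations.
REGISTRY = [
    {"name": "hw_forecast_svc", "concepts": {"holt_winters", "cpg"}},
    {"name": "ledger_svc", "concepts": {"accounting"}},
]

def expand(concept):
    """Return a concept together with all of its broader concepts."""
    seen = {concept}
    while concept in BROADER:
        concept = BROADER[concept]
        seen.add(concept)
    return seen

def find_services(required):
    """Match services whose expanded annotations cover every required concept."""
    hits = []
    for svc in REGISTRY:
        covered = set()
        for c in svc["concepts"]:
            covered |= expand(c)
        if required <= covered:
            hits.append(svc["name"])
    return hits

# "Find a seasonal smoothing function that is used for consumer products":
# matches hw_forecast_svc even though it is annotated only with the
# narrower terms "holt_winters" and "cpg".
matches = find_services({"seasonal_smoothing", "consumer_products"})
```

The keyword approach of UDDI alone would miss this service; the conceptual expansion is what the semantic extensions add.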
Semantic technology promises to supercharge analytics by educating the users. Soon, metadata built with semantic technology will glue all the disparate services together, with meanings replacing definitions and patterns. Current metadata approaches are largely based on data; with semantics, metadata can marry data, process, application, use, and security — virtually everything. Furthermore, it can exchange this information freely because it’s based on open standards that conform perfectly to an SOA approach.
Analytics Manifesto: New Rules
As analytics emerge from their various niches and become more central to operational and strategic roles, a clear set of rules for using them is crucial. The following list contains examples of rules you can use to govern analytics:
The use of analytics at all levels of skill and function must be pervasive in organizations. It is simply not enough to supply data; it has to be connected to the work that people do in a relevant and timely way.
Even complicated analytics don’t have to be difficult and shouldn’t be the preserve of a cadre of statistical experts. Models can be guided and visual, and provide best practices.
Analytics cannot just inform; they must be active, stitched into the fabric of work. The distinction between transactional and analytical work is fading. Analysis engenders action, and action is preceded by a decision.
Embedded analytics need to be composed of standard services with consistent function and metadata across the enterprise. If analysis and action are to occur in composite functions, they must have cooperative metadata to operate simultaneously.
Analysis in isolation is not effective; it must close the loop with business action. An out-of-stock-in-three-hours calculation should fire an order without a handoff. It’s no longer necessary to split the processes across systems.
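A minimal sketch of that closed loop, with invented stock figures, threshold, and order hook; a real implementation would call a procurement service rather than a passed-in function:

```python
def hours_until_stockout(on_hand, sales_per_hour):
    """Project how long current inventory will last at the current sales rate."""
    if sales_per_hour <= 0:
        return float("inf")
    return on_hand / sales_per_hour

def check_and_reorder(on_hand, sales_per_hour, reorder_qty, place_order):
    """Fire a replenishment order, with no human handoff, when the item
    is projected to sell out within three hours."""
    if hours_until_stockout(on_hand, sales_per_hour) <= 3:
        return place_order(reorder_qty)
    return None

orders = []
# 45 units at 20/hour is 2.25 hours of stock, so the order fires.
check_and_reorder(on_hand=45, sales_per_hour=20, reorder_qty=200,
                  place_order=lambda qty: orders.append(qty) or qty)
```

The analysis (the stock-out projection) and the action (the order) live in one flow; no report is generated for someone to read and act on later.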
Commodity business processes such as basic accounting, inventory, HR, or warehousing should be outsourced, increasing the need for reliable measurements. Outsourced processes have to have standard measurements to track whether they are being managed properly.
Requirements for real-time implementations grow as the availability of reliable real-time tools increases. Enterprise application integration (EAI) and other messaging queues provided a rich source of real-time transactional data for business activity monitoring (BAM) that wasn’t available before. New approaches to on-demand and software as a service will create more opportunities.
The effects of Moore’s law gradually move thinking about analytics away from “managing from scarcity” and toward “capitalizing on potential.” Designs in the past always started with the limits of available hardware resources, which have been largely relaxed.
Metadata must rise above proprietary standards. Abstraction layers must be the preferred path for data access. Analytical tools still present a data-centric view, requiring knowledge of relational technology or other proprietary database structures. A standard, non-proprietary abstraction for analytics, a form of metadata, is needed so that analysts can move from tool to tool or location to location and still be proficient in their work.
Eventually, analytical tools will discover how people work, not vice versa. In the meantime, they require customization and configuration to become relevant and understandable to a wide audience. Guides, interactive visualization, collaboration, search and retrieval, and a host of other helpful features beyond pure number-crunching are needed so that analytics can become a useful part of the normal workflow for a wide group of people.
In an SOA world, BI tools that bundle query, interface, output, metadata, and calculations must gradually unbundle their services. Applications will become more specific to processes than general to the BI function and, hence, will need pieces of the BI stack as services.
BI is difficult due to a lack of understanding of the data, models, and tools. Analytics need to be relevant to the work that people do and make their work easier — not more difficult. BI was conceived on the idea that people want to discover things on their own, but only a small proportion do. BI needs to adjust to a more embedded, operational focus for a broader cross-section of the population.
Manipulating complex data models and walking through reasoning, layers of detail, and concepts do not have to be difficult tasks. Quality SOA-based analytical tools should adjust to the individual user’s skill level, and more skilled users can create analyses for others to share. Finally, a semantics-oriented abstraction layer will record all aspects of the analytical workplace, including data, relationships, models, assumptions, and report objects; it will be similar to, but more robust than, today’s metadata. The sharing of knowledge, the ability to use encoded knowledge in new and creative ways, and collaborative analytical environments will become the “killer apps” of the SOA world.
|Neil Raden is the founder of Hired Brains, Inc., which provides consulting, market research, product marketing, and advisory services to the business intelligence, data warehousing, and semantic technology industries worldwide. Raden is an active consultant, author, and speaker. You can reach him at email@example.com.