RiskTech Forum

SAS: “Real MDM” and the quest for long-term data quality improvement

Posted: 1 April 2016  |  Author: Dylan Jones  |  Source: SAS


I'm frequently asked: "What causes poor data quality?" There are, of course, many culprits.

But there is one reason that is common to all organizations – poor data architecture.

The (inherited) data architecture problem

Most organizations have some degree of duplication in their data assets, and it always stems from badly conceived system architecture. I don't envy CIOs and IT leaders, who must constantly transform the historical IT landscape they inherited from their predecessors.

At one company, I discovered 15 systems independently storing facilities management data. The organization was, in effect, managing 15 variations of the same physical asset. Speaking to the head of IT architecture confirmed what everyone could see: "No one designed this, we were just given it."

The problem is that we have constructed our information systems around an assembly line approach. Each business unit creates its own view of the world, complete with the functions and data it requires. This silo mentality means that data flows from function to function and from system to system, and because of that interconnectivity, data defects can quickly spread across the organization like a virus.

Another problem is that these core master data assets quickly fall out of sync as discrepancies accumulate between systems. As a result, users struggle to complete business functions efficiently because they are constantly grappling with defects.
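To make the synchronization problem concrete, here's a minimal sketch, purely illustrative and not tied to any particular product (the systems and fields are invented for the example): two applications each keep their own copy of the same customer record, an update lands in only one of them, and the copies silently drift apart.

```python
# Purely illustrative: two systems each keep their own copy of the same
# customer record. An update applied to one copy never reaches the other,
# so the "master" data silently drifts apart.

crm_system = {"cust-001": {"name": "Acme Ltd", "address": "1 High Street"}}
billing_system = {"cust-001": {"name": "Acme Ltd", "address": "1 High Street"}}

# The customer moves; only the CRM team hears about it.
crm_system["cust-001"]["address"] = "42 New Road"


def records_in_sync(a: dict, b: dict, key: str) -> bool:
    """Return True if both systems hold identical data for the given key."""
    return a.get(key) == b.get(key)


# Downstream, billing is still working from the stale copy.
print(records_in_sync(crm_system, billing_system, "cust-001"))  # False
```

Multiply that by dozens of systems and thousands of records and you have the defect rate most organizations live with every day.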

Tom Redman summed it up perfectly in a recent interview when he said:

The way our organizations are set up today is wrong for data. Companies are built around a division of labor, with an assembly line mentality. In the Industrial Age, this was remarkably effective. But now, these silos prevent data sharing.

So why hasn't the "data architecture problem" been solved?

There are many reasons why these issues persist.

In short – poor enterprise strategy leads to poor design, which leads to poor quality data.

What should your IT and data leaders be doing differently?

The obvious starting point is to have a central strategy for your master entities, the fundamental building blocks of your business. You need to build an enterprise plan for master data.

As Larry English commented in a past TDAN interview:

Real master data management tells us we must design databases around the fundamental resources, such as customer (party), product, financials, facilities, equipment, etc. These must be defined in singular enterprise-strength information models about each discrete resource.

The benefit of applying real master data management is fairly obvious: you're significantly reducing the cost and complexity of managing the same data across hundreds of locations in the business.

Larry continues:

Building redundant and disparate databases is like paying an invoice multiple times, with each and every invoice failing to solve the “enterprise” problems. I have never found a CFO who condoned paying a single invoice multiple times. Why does IT insist that redundant databases are a best practice?

With data mastered centrally and federated out to the wider organization, you can start reducing the complexity of your information chains. Less master data moving around the organization means fewer handover points where translation and synchronization issues inevitably creep in, and therefore a lower defect rate. The long-term strategy should be to deploy new applications that access and maintain master data directly while delivering their individual business functions.
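As a rough illustration of that pattern, here's a minimal sketch (the class names, entities and fields are my own assumptions for the example, not a SAS API): a single master store owns the golden record for each entity, and consuming applications read through it instead of maintaining their own copies.

```python
# Illustrative-only sketch of "master centrally, federate outward": one store
# owns the golden record; consuming applications read through it rather than
# keeping local copies. All names and fields are invented for the example.

class MasterDataStore:
    def __init__(self):
        self._records: dict[str, dict] = {}

    def upsert(self, master_id: str, attributes: dict) -> None:
        """Create or update the single golden record for an entity."""
        self._records.setdefault(master_id, {}).update(attributes)

    def get(self, master_id: str) -> dict:
        """Every consumer reads the same record, so there is no local copy to drift."""
        return dict(self._records[master_id])


class BillingApp:
    """A consuming application that federates reads to the master store."""

    def __init__(self, mdm: MasterDataStore):
        self._mdm = mdm

    def invoice_address(self, master_id: str) -> str:
        return self._mdm.get(master_id)["address"]


mdm = MasterDataStore()
mdm.upsert("cust-001", {"name": "Acme Ltd", "address": "1 High Street"})
billing = BillingApp(mdm)

mdm.upsert("cust-001", {"address": "42 New Road"})  # one update, in one place
print(billing.invoice_address("cust-001"))          # 42 New Road
```

The detail doesn't matter; the point is that an update lands in one place and every consumer sees it, which is the opposite of the drift in the earlier sketch.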

Your goal should be to have no overlap between master data sets, combined with accurate, timely information across the company.

What else is happening to help eliminate the Industrial Age mentality toward data?

I believe we are at a turning point in the data sector, and data leaders now have to make some critical decisions. They can no longer simply maintain the status quo of a legacy data strategy. If they don't innovate and change, they will be overtaken by younger, leaner businesses driven by customer-centric models and far lower operating costs.

There are several obvious drivers helping to change the situation.

What steps should you take next?

To be fair, this is long-term strategy stuff. It's not the usual tactical data quality improvement I talk about here on the Roundtable.

For a lot of firms, this will require a total re-think of the way they do business and manage their data. It's also the reason why new startups – for example, those in the banking sector – are able to compete aggressively with entrenched mega-firms. They are building out an IT and data landscape that is far more closely modeled on their business. As a result, they're able to build with quality and master data management in mind right from the outset. Here are some obvious starting points:

1. Identify your master data sets (see the sketch after this list).

2. Understand which business functions are driven by these master data sets.

3. Identify where tactical MDM can be replaced with real MDM.

4. Specify policies for new system development.
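To give the first step a little more shape, here's a small, hedged sketch (the systems, fields and the naive matching rule are all assumptions for the example): it scans sample records from several systems and groups those that appear to describe the same real-world entity, which is a crude first pass at spotting where master data sets overlap.

```python
# Hypothetical starting point for step 1: take sample records from several
# systems and group those that look like the same real-world entity.
# The systems, fields and the crude matching key are invented for illustration.
from collections import defaultdict

samples = [
    {"system": "CRM",        "name": "Acme Ltd",     "postcode": "AB1 2CD"},
    {"system": "Billing",    "name": "ACME Limited", "postcode": "AB1 2CD"},
    {"system": "Facilities", "name": "Acme Ltd.",    "postcode": "AB1 2CD"},
    {"system": "CRM",        "name": "Globex Corp",  "postcode": "ZZ9 8YX"},
]


def match_key(record: dict) -> str:
    """Crude blocking key: normalised name prefix plus postcode."""
    name = record["name"].lower().replace(".", "").replace("limited", "ltd")
    return f"{name.split()[0]}|{record['postcode']}"


groups: dict[str, list[str]] = defaultdict(list)
for record in samples:
    groups[match_key(record)].append(record["system"])

for key, systems in groups.items():
    if len(systems) > 1:
        print(f"Candidate master entity {key!r} is held in: {systems}")
# -> the Acme record is duplicated across CRM, Billing and Facilities
```

In practice you would use proper data profiling and record-matching tooling rather than a naive key, but even a rough pass like this makes the overlap visible and gives you a shortlist of entities to target with real MDM.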