“We have learned to live in a world of mistakes and defective products as if they were necessary to life.” W. Edwards Deming (1900-1993)
Deming’s name is associated with quality, and many, including Akio Morita, consider him to be the “patron saint” of Japanese quality control. Today, the importance and value of quality control throughout the manufacturing process are obvious. This is especially true in repetitive manufacturing, where an out-of-tolerance condition on a production run of 10,000 or 100,000 products can be extremely costly. If data is the new raw material…
Is your organization applying quality control concepts to your data and information?
Let’s assume that an organization’s data is the raw material in the manufacturing of actionable information. For this analogy, let me define the basic elements. By ‘data’, I am referring to all types – master data, transaction data, and organizational data – as well as their corresponding metadata. By ‘actionable information’, I am referring to information (data that has been put into context by its use in a process) that needs no further action and can be utilized in its native or received state. For example, if I ask you to process an ‘order’ without the context of a process or master/organizational data, you need to know whether I’m talking about a ‘sales’ order, a ‘purchase’ order, or a ‘production’ order. And if I simply say “process it,” you need to know what I mean by that – for a sales order, do you want me to create it, deliver it, or invoice it? This leads us to a fundamental understanding: I can have clean ‘data’ and still not have actionable information. At the same time, clean data is essential for actionable information.
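As a loose sketch of this distinction (the record and field names here are invented for illustration, not taken from any particular system), the same raw ‘order’ data only becomes actionable once the process context is attached:

```python
# Hypothetical illustration: a raw 'order' record is just data.
# It becomes actionable only once process context (order type + action)
# is attached to it.

raw_order = {"id": "4711", "partner": "ACME Corp", "item": "Widget", "qty": 100}

def make_actionable(order: dict, order_type: str, action: str) -> dict:
    """Attach the context needed for someone to act on the data."""
    return {**order, "order_type": order_type, "action": action}

# Without context, "process order 4711" is ambiguous.
# With context, the recipient knows exactly what to do.
actionable = make_actionable(raw_order, order_type="sales", action="create")
print(actionable["order_type"], actionable["action"])  # sales create
```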
Let’s look at the following example – master data selection during the ‘Create Sales Order’ process:
In this example, the raw material of data is pulled into context by the ‘Create Sales Order’ process, with a sales representative selecting and confirming the customer’s sales order requirements. The result is a sales order transaction which can be acted upon by the enterprise. Some of the actions the enterprise can take regarding this sales order include: viewing the profitability (defined in the business metadata) of the sales order within company xyz for the wholesale distribution channel; managing to the customer’s delivery requirements – date, location, packaging; adjusting the production requirements needed to fulfill the order; collecting on the payment terms; updating the impact of the customer’s purchase on the volume discounts agreed to in the purchasing agreement – the list goes on. Each recipient of the actionable information is counting on accurate, consistent data for communicating internally and externally.
It’s easy to see how an out-of-tolerance condition in our data quickly propagates bad information throughout the enterprise and beyond the four walls to our customers, suppliers, and third-party processors.
Organizations today fall into four basic categories regarding data quality control. In the diagram below, starting at the lowest level, are organizations that have no formal data quality control processes or governance in place. They simply rely on their technology applications to capture and store the data. The capture and store process may include some basic formatting requirements.
The middle level includes organizations that invest in basic data cleansing activities, either as a one-time event or periodically. The process is usually based on a set of rules and guidelines, with resources following the letter of the rules but not necessarily in tune with the spirit of the organization’s objectives.
The top-level organizations see data as a strategic asset and have invested in a comprehensive data quality control process and governance structure. This includes assigning employees with defined roles, responsibilities and incentives, defining processes, and enabling them with technology.
Let’s take a closer look at each category and the implications associated with the quality of data produced.
No Formal Data Quality Processes or Governance
Organizations that rely on their business applications, either ERP systems or point solutions, to capture and format the data at entry without further cleansing or governance make up this first category. It is like a manufacturing company doing an initial quality check on raw materials at the point of entry only. These organizations have no formal data quality process or governance in place and are subject to the following business implications:
- Low operational efficiency
- Duplicate entries can occur at the point of entry
- Minor differences can be introduced into the system
- Wrong selections of existing data that look similar but are in fact different
- Extra effort to process business transactions – data mistakes are corrected when the error is recognized, often much later in the process rather than at the source, resulting in duplicated effort on the same piece of data
- Costly customer/supplier implications – customers and suppliers may be subject to incorrect deliveries, billing, or invoicing. The problem compounds when people stop trusting the system and employees spend extra time “double checking” the data.
- Flawed business decisions – these can be small, such as minor stock overages or outages. Left uncorrected, however, they can lead to the loss of key customers and suppliers who lose faith in the company’s ability to deliver what they need.
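To make the duplicate problem above concrete, here is a minimal sketch (with invented names and normalization rules) of how minor formatting differences at the point of entry produce records that look distinct to the system but refer to the same customer:

```python
import re

def normalize_name(name: str) -> str:
    """Collapse the 'minor differences' that create hidden duplicates:
    case, punctuation, extra whitespace, and common legal suffixes."""
    n = name.lower()
    n = re.sub(r"[.,]", "", n)               # drop punctuation
    n = re.sub(r"\s+", " ", n).strip()       # collapse whitespace
    n = re.sub(r"\b(inc|corp|ltd|llc)\b", "", n).strip()  # legal suffixes
    return n

# Three entries typed by different people; the first two are the same company.
entries = ["ACME Corp.", "Acme  Corp", "Widgets Ltd"]
seen = {}
for e in entries:
    key = normalize_name(e)
    if key in seen:
        print(f"Likely duplicate: '{e}' matches existing '{seen[key]}'")
    else:
        seen[key] = e
```

Without even this much normalization at the point of entry, both spellings of ACME would be stored as separate master records.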
For this type of organization, taking the next step is essential for improving operational efficiency and effectiveness. Below are five initial steps for improving data quality and control:
- Define basic data standards and guidelines
- Integrate data standards into your data creation and maintenance process
- Cleanse existing data to meet the defined standards
- Assign ownership to each master data domain
- Establish a schedule for periodic quality checks (in lieu of continuous checking by technology)
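As a sketch of what basic data standards and a periodic quality check might look like in practice (the field names and rules below are invented for illustration, not a prescription):

```python
import re

# Illustrative standards for a customer master record (assumptions, not a spec)
STANDARDS = {
    "customer_id": lambda v: bool(re.fullmatch(r"C\d{6}", v or "")),
    "country":     lambda v: v in {"US", "DE", "JP"},  # ISO codes in use
    "email":       lambda v: bool(re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", v or "")),
}

def check_record(record: dict) -> list:
    """Return the fields that violate the defined standards."""
    return [field for field, rule in STANDARDS.items() if not rule(record.get(field))]

# A periodic quality check over the master data set
records = [
    {"customer_id": "C001234", "country": "US",  "email": "buyer@example.com"},
    {"customer_id": "1234",    "country": "USA", "email": "buyer@example"},
]
for r in records:
    violations = check_record(r)
    if violations:
        print(f"{r['customer_id']}: out of tolerance on {violations}")
```

The point is not the specific rules but that the standards are written down, executable, and checked on a schedule rather than assumed.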
Basic Data Quality
In this category, it’s worth distinguishing between data quality and data cleansing. Organizations that implement a software application, e.g., ERP, typically conduct a data cleansing effort before turning on the new software. Data cleansing efforts are either one-time or, at best, periodic events. Data quality, by contrast, is a continuous process that, when done correctly, removes the need for data cleansing.
Organizations in this category have most likely performed an initial data cleansing event and have equipped resources with a set of rules or guidelines for entering their master data into the day-to-day operational system. Despite the initial investment in data cleansing to improve data quality, the organization has not provided the go-forward resources with sufficient education and training to understand ‘why’ and ‘how’ the data supports the business processes and decisions. Therefore, when confronted with multiple choices on how to enter data, the decision that best aligns with the business objectives is not always taken. For example, when entering a location for a customer, there may be several options, and not choosing the right one could impact logistics and finance, and even create regulatory issues. In addition to the business impacts, this leads to frustration for the employee responsible for setting up the master data, who lacks the tools and knowledge needed to do the job.
In the manufacturing quality & control analogy this would be similar to a person setting up a machine/robot without knowing how the machine is to be used or without understanding the necessary tolerance objectives to be achieved by the production run.
If your organization continues to struggle with inconsistent data and questions about the right selection when creating or maintaining master data, then it probably falls into this category. Such organizations also typically go through multiple data cleansing efforts.
Below are the recommended steps for improving the data quality and control in this category:
- Educate and train your resources on why and how their choices impact and contribute to the organization’s goals and objectives. If they understand how the data is used across the enterprise and why it is important, they can make informed decisions, which are generally more accurate.
- Leverage data quality and control technology to evaluate common data anomalies and automatically correct them, positioning the team to move toward an exception-based management process
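A minimal sketch of exception-based management (rules and fields invented for illustration): anomalies with an unambiguous fix are corrected automatically, and everything else is routed to a person.

```python
def auto_correct(record: dict):
    """Apply unambiguous fixes automatically; flag the rest as exceptions."""
    fixed = dict(record)
    exceptions = []

    # Unambiguous: trim/collapse whitespace and normalize casing
    fixed["name"] = " ".join(fixed["name"].split()).title()

    # Unambiguous: map a known legacy country spelling to the standard code
    country_map = {"USA": "US", "GERMANY": "DE"}
    fixed["country"] = country_map.get(fixed["country"].upper(), fixed["country"])

    # Ambiguous: multiple plausible locations cannot be auto-resolved
    if len(fixed.get("locations", [])) > 1:
        exceptions.append("multiple locations – needs human decision")

    return fixed, exceptions

record = {"name": " acme  corp ", "country": "USA", "locations": ["Plant 1", "Plant 2"]}
fixed, exceptions = auto_correct(record)
print(fixed["name"], fixed["country"])  # Acme Corp US
print(exceptions)
```

The design point: automation handles the high-volume, rule-expressible corrections, so the team’s time is spent only on the cases that genuinely require business judgment.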
Technology Driven Data Quality & Governance
Organizations in this category recognize the importance of data quality and invest in technology to try to achieve it. However, many times the organization believes that the technology itself is sufficient and does not wrap it with the necessary data quality processes, governance and education needed to achieve a comprehensive solution.
Organizations that have purchased technology and are struggling to achieve their objectives should consider the following steps to progress:
- Align the technology with the data quality and control processes by confirming which process steps are enabled by the technology and which will remain with the employees
- Ensure the technology is capable and configured to enable the chosen process steps
- Stay business driven – technology is an enabler, not the solution
- Recognize that technology can propagate incorrect data just as quickly as good data. The people defining how the technology is used must understand why the data quality decisions are being made so they can ensure the downstream impacts are evaluated and communicated.
Comprehensive Data Quality Processes & Governance
Organizations in this category view their data and information as a strategic asset to be leveraged in driving future value for the organization. They have invested in a comprehensive data quality and control solution that is integrated into their core business processes. The employees understand the business objectives and are incentivized accordingly. The data supports the operational needs of the business as well as the strategic reporting, analytics, and key performance indicators (KPIs) used to drive the business.
Our Challenge for You
Manufacturing good information begins with ensuring that you have quality raw materials, i.e., data, and the ability to sustain that quality throughout the manufacturing process via a governance and control structure. It also includes bringing context to your data through business processes, industry/company knowledge, and business metadata, which we will cover in a future write-up.
Our challenge to you is to pursue excellence and quality in your data by considering the steps outlined in this write-up.
“You can have data without information, but you cannot have information without data.” Daniel Keys Moran