Industrial Manufacturer Improves Supply Chain Visibility With New Supplier Master Data Process
A large, global, industrial manufacturing company improved its supply network visibility by creating a governance process and a centralized corporate master database for its supplier information. Supply chain leaders can use this case study when improving their supplier management capabilities.
- Duplicate master records for suppliers prevented the company from gaining consolidated visibility into corporatewide spend by supplier and from identifying its strategic key suppliers.
- Poor data quality further prevented this manufacturer from generating standardized supplier scorecards.
- A lack of visibility into affected suppliers hindered the company's ability to react and respond in a timely way to supply disruptions.
- Assign supplier master data ownership to the business. It is not an "IT thing."
- Develop and implement supplier business data governance at the company level.
- Establish master data quality metrics and a process to continually monitor the quality of data.
Although supplier master data management (MDM) is not a new or innovative concept, it is a foundational capability when a company is looking to improve its supply chain visibility into its supplier base. This case study looks at how a large, industrial, discrete manufacturing company's business leaders understood the significant risk from the poor state of supplier data management, and took action to develop this foundational capability to correct the situation.
With organic growth, and mergers and acquisitions (M&As), the company's supplier master data had become a collection of disconnected databases and offline systems, leaving the company with no readily accessible system of record (SOR) for supplier information. To complicate the situation, the company's operations had reduced vertical integration and increased its dependency on suppliers over the past several years. This manufacturer depended on suppliers not only for raw materials, but also for component assemblies and manufacturing processes.
At the start of the project, the company had 180,000 supplier records with several problems, all of which would have significantly reduced supply chain visibility into the supplier base:
- The data was stored in multiple systems. There was no single SOR.
- Data maintenance was a manual, time-consuming process.
- There was duplication. For example, a single supplier could exist in the systems multiple times, with a unique supplier code for each instance.
- Supplier records were incomplete and fragmented.
- Supplier metadata was inconsistent.
- Any visibility into suppliers was limited to direct suppliers, with a total lack of visibility into Tier N suppliers.
After outsourcing important parts of its supply chain, the company had to manage the flow of incoming materials and components, and their alignment with production schedules. The third-party services had to be there when needed. For the company's operations to perform well, the performance of its suppliers had to be in step with this manufacturer, reliably and predictably.
The company's key objective was to make a step change improvement in its supplier management and collaboration capabilities. From the initiative, the company expected to accomplish the following:
- Develop a "one-stop shop" for supplier information (that is, one SOR).
- Improve supply chain visibility into the supplier base.
- Consolidate supplier spend analysis capabilities.
- Improve risk management and compliance tracking.
- Standardize supplier reporting and analytics.
The project was a success and delivered the results the company was looking for. With greater supply chain visibility, this manufacturing company now has an improved ability to manage and collaborate with its suppliers and service providers. Supply chain leaders can use this best-practice example when improving or extending their supply chain visibility to the supplier base.
This industrial manufacturing company had been growing its business for nearly a century without too much regard for its supplier MDM. A lack of data governance, organic company growth and M&As resulted in "dirty" and unusable supplier data. The poor data quality affected strategic and secondary suppliers alike, and extended to purchased materials, component assemblies and services.
The roles of raw material and component suppliers, as well as service providers, have increased significantly over the past few years within the manufacturer's operations, since the company has become less vertically integrated. The need for clear visibility into its supplier base became obvious to the corporate procurement and supply chain leaders. They came to recognize their poor supplier master data quality and management process was keeping the company from achieving the needed improvements.
It is often very difficult to identify a quantifiable ROI justification for data-related programs. Usually, the return is realized with increased visibility from analytics capabilities that data projects make possible. At this company, the business case and justification came from the business itself. The company had identified great opportunities to grow into new markets, but could not assess the impact of future growth on its extended supply chain. This planned growth required clear visibility into the supplier base. As a result, the manufacturer needed to identify the key suppliers, how they were performing for the company and how much the company was spending on each supplier.
To ensure the success of the supplier management initiative, the company's business leaders and the core supplier master data project team concluded that all supplier business data had to be governed at the company level. A corporate mandate backed the project. This meant the business units or functions could not opt out of the initiative or deprioritize project tasks.
A business leader led the core team, which was supported by offshore data quality analysts and business process optimization resources. Since this was the first time the company had embarked on a full-scale MDM project, it decided to seek outside help. The vendor selection process led the business to partner with Infosys, which provided the expertise on the data quality management process for the project. Infosys brought in a data manager to be part of the project core team. This team had the full support of executive leadership, and reported to the vice president (VP) of corporate supply chain, who reported directly to the CEO.
Prior to its supplier MDM project, the manufacturer had no formal standardized supplier performance reports, supplier scorecards or standard reports required for regulatory purposes. The company was sending out a great number of scorecards and other communications to its suppliers. However, the reports were developed independently of each other, with a different look and content structure. In addition, the data came from a whole range of sources, from tribal knowledge and spreadsheets on individual employees' desktop computers to ERP systems. The process reliability, repeatability and flexibility did not meet the requirements of the leading supplier management process this company was aiming for.
The project team started by conducting a deep-dive analysis into the state of the data. It discovered that there were several problem areas:
- There was no single SOR.
- There was duplication of supplier records.
- Data records were incomplete and inconsistent.
- There was no visibility into second-tier suppliers.
Not only was the quality of supplier data poor, but the link from the suppliers to the products and/or services they were providing was also missing.
The company became aware of this when the earthquake and resulting tsunami hit Japan in March 2011. The disaster disrupted supply chains across today's globally connected supply networks, and the team at this discrete manufacturing company suspected it would not be an exception. However, it was unaware of the magnitude of the impact that poor supplier master data had on its supply chain.
The procurement and supply chain teams took on the urgent and critical tasks of analyzing which suppliers were likely to be offline as a result of the catastrophe. They needed to identify the purchase orders that would not be delivered as originally scheduled, as well as what raw materials, components or services were affected. They had to find all this based only on the geographical location of the disaster. The sourcing team also had to identify alternate supply options for the affected materials. The company discovered that it had no way to provide the needed information easily or quickly. It took two weeks to collect data from a multitude of sources to come up with answers. Only then could the company start working on contingency plans.
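The analysis described above amounts to a single join across a consolidated supplier SOR. A minimal sketch of that query, with entirely illustrative field names and records (the company's actual schema is not described in this case study):

```python
# Hypothetical sketch: the kind of exposure query a consolidated supplier
# SOR makes trivial, and which took this company two weeks to answer by hand.
# All field names and sample records here are illustrative assumptions.

suppliers = [
    {"supplier_id": "S001", "name": "Acme Metals", "region": "Tohoku, JP"},
    {"supplier_id": "S002", "name": "Delta Components", "region": "Ohio, US"},
]
open_pos = [
    {"po": "PO-7781", "supplier_id": "S001", "material": "steel plate"},
    {"po": "PO-7790", "supplier_id": "S002", "material": "cast housing"},
]

def exposure(suppliers, open_pos, affected_region):
    """Return open purchase orders placed with suppliers in the affected region."""
    hit_ids = {s["supplier_id"] for s in suppliers if affected_region in s["region"]}
    return [po for po in open_pos if po["supplier_id"] in hit_ids]

at_risk = exposure(suppliers, open_pos, "JP")
print(at_risk)  # the one open PO placed with a supplier in the affected region
```

With one clean SOR, this is a filter and a join; with 180,000 duplicated records spread across disconnected systems, the same answer took two weeks of manual collection.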
The company had already started the supplier management initiative when the earthquake hit Japan. The project was not far enough along to significantly help with the analysis, but it certainly helped the team to stay focused and mobilize a large number of volunteers for the data-cleansing activities later on.
This initiative represented the first formal master data governance structure in the company's history, and it did not want to try to do this alone. As a result, it formed a three-way partnership with Oracle and Infosys to support the initiative. Oracle provided supplier management software (Oracle Supplier Hub) and consulting to set up one clearly defined SOR for all supplier master data. Infosys provided the techniques, templates and expertise in designing the master data rules. Infosys also helped with the data cleanup efforts, and was the system integrator for Oracle Supplier Hub.
The team developed a supplier data quality management approach, a clearly structured road map for the journey (see Figure 1).
Source: Gartner (November 2013)
The project team developed a strategy and road map for the new supplier MDM process. The team established a governance process to ensure that there was a structured method for addressing process decisions that were related to supplier master data. To clearly assign ownership, the project team identified the specific roles in the organization that would be responsible for maintaining specific data fields.
The team started by looking at attribute usage when it analyzed the master data. It needed to understand how, if at all, staff were using the more than 360 different descriptions of a supplier. With a clear understanding of how often and where each data attribute was used across the business, the team set out to gather business requirements for the new supplier data structure. It used the Six Sigma voice of the customer (VOC) approach to interview over 250 individuals across the company and collect the requirements for the supplier information master (SIM) process.
Once the project team understood the requirements, it developed and documented definitions and descriptions for the data attributes (metadata). Infosys brought its experience in setting up clearly defined rules for the attribute data. The rules define:
- The format and structure for data elements
- What to do if attribute data is not provided or available
- How to consistently manage local versus global data elements
The team also wrote processes and instructions to identify duplicate records, roll up supplier location data to the parent organization, and link purchased products and services to supplier locations.
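The kinds of rules described above can be sketched in a few lines. This is an illustrative example only: the rule names, field names, defaults and match key below are assumptions, not the company's actual Infosys-designed rule set.

```python
def apply_rules(record):
    """Normalize one raw supplier record against hypothetical metadata rules."""
    cleaned = dict(record)
    # Rule 1: enforce format and structure (e.g., uppercase country codes).
    cleaned["country"] = cleaned.get("country", "").strip().upper()
    # Rule 2: substitute a defined placeholder when attribute data
    # is not provided or available, instead of leaving it blank.
    for attr in ("duns_number", "tax_id"):
        cleaned.setdefault(attr, "NOT_PROVIDED")
    # Rule 3: derive a normalized match key, so duplicate records and
    # supplier locations can later be rolled up to one parent record.
    cleaned["match_key"] = (cleaned["name"].strip().lower(), cleaned["country"])
    return cleaned

raw = {"name": " Acme Metals ", "country": "jp"}
print(apply_rules(raw)["match_key"])  # → ('acme metals', 'JP')
```

The value of writing the rules down this explicitly is that preprocessing can then apply them automatically, leaving only genuine judgment calls for the manual cleansing phase.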
With data management rules and processes in place, Infosys developed data templates to help the company's internal team in the data-cleansing phase of the project.
Infosys extracted the entire supplier dataset from the existing systems and databases, and cleaned up the data on spreadsheets. Once the data had been consolidated into a single list, the team did an initial assessment of the data quality and captured a baseline snapshot. Information on 80,000 suppliers was spread across 180,000 records of data, with up to 360 data attributes associated with each record. The problems they found with the data included the following:
- Incomplete and disparate data — The data was scattered across several systems, and it was fragmented and incomplete.
- Duplicate records — Supplier records were duplicated across the databases, and even within the same systems.
- Unreliable data — Although the database contained some good supplier records, they were not reliable for supplier analytics when mixed with lower-quality information. As a result, the supply chain and procurement communities had little confidence in the supplier data.
With a clear understanding of the initial state of the data quality, the project team developed a strategy for the cleansing efforts, and identified the resource requirements for the undertaking. The project team was very successful in promoting the project across the company. By the time the data cleansing started, the team had been able to recruit about 150 volunteers across the business and functions to take part in the six-month data-cleansing effort.
When data cleansing started, Infosys preprocessed the data by applying the known rules. The manual portion of the data-cleansing work involved verifying each record, reconciling and consolidating duplicate data, and filling in data for incomplete records.
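The automated portion of that preprocessing can be sketched as grouping candidate duplicates by a normalized key and merging their attributes. The matching key and the "first non-empty value wins" merge policy below are assumptions for illustration, not the company's documented procedure.

```python
from collections import defaultdict

def normalize_key(rec):
    # Assumed match key: case- and whitespace-insensitive name plus country.
    return (rec["name"].strip().lower(), rec.get("country", "").upper())

def consolidate(records):
    """Group candidate duplicates by key and merge them into one survivor each."""
    groups = defaultdict(list)
    for rec in records:
        groups[normalize_key(rec)].append(rec)
    merged = []
    for dupes in groups.values():
        survivor = {}
        for rec in dupes:  # earlier records win for each populated field
            for field, value in rec.items():
                if value and not survivor.get(field):
                    survivor[field] = value
        merged.append(survivor)
    return merged

records = [
    {"name": "Acme Metals", "country": "JP", "phone": ""},
    {"name": " acme metals ", "country": "jp", "phone": "+81-3-5555"},
]
print(len(consolidate(records)))  # → 1
```

In the case study's terms, this is the mechanical part of collapsing 180,000 records toward roughly 80,000 actual suppliers; verifying each surviving record remained manual work.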
The new supplier hub enables additional supplier information to be captured, enhancing the company master data. This information consists of third-party content that is linked to the company's own supplier master data. Such content included information from the likes of D&B and Trillium Software. With the new system, the manufacturer can also conduct supplier surveys to capture additional information directly from suppliers and store the results within its structured master data.
In its project plan, the team first cleansed existing master data. Once the company had a clean version of the supplier data, it followed with the enrichment activities.
The company promoted the project extremely well, and was able to recruit over 270 volunteers across all business units, including the following:
- Purchasing functional excellence teams
- A global purchasing team and sourcing managers
- Finance and intercompany teams
The data-scrubbing work started in late 2011. As good data became available during the data-cleansing phase, the team refined the business rules and processes. The clean data was uploaded into the new Oracle Supplier Hub six months after the cleansing began. This massive data effort was only the first step in the process. The company has also implemented methods and processes to monitor and maintain data quality on an ongoing basis.
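Ongoing monitoring of this kind typically tracks a few simple metrics against the cleansed baseline. The two metrics below, completeness and duplicate rate, are illustrative assumptions in the spirit of the process described, not the company's actual metric definitions.

```python
def completeness(records, required_fields):
    """Share of records with every required field populated."""
    ok = sum(1 for r in records if all(r.get(f) for f in required_fields))
    return ok / len(records)

def duplicate_rate(records, key_fields):
    """Share of records whose normalized key already appeared earlier."""
    seen, dupes = set(), 0
    for r in records:
        key = tuple(str(r.get(f, "")).strip().lower() for f in key_fields)
        if key in seen:
            dupes += 1
        seen.add(key)
    return dupes / len(records)

records = [
    {"name": "Acme Metals", "country": "JP", "duns_number": "123456789"},
    {"name": "acme metals", "country": "jp", "duns_number": ""},
]
print(completeness(records, ["name", "duns_number"]))  # → 0.5
print(duplicate_rate(records, ["name", "country"]))    # → 0.5
```

Running such checks on a schedule, and alerting when a metric degrades, is what turns a one-time cleansing project into the continuous quality process the case study's recommendations call for.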
Oracle Supplier Hub is accessed via a supplier portal. Suppliers can maintain their information directly in Supplier Hub. In the old process, suppliers would contact their sourcing manager at the company with changes to their contact information, and the changes would then be transcribed into the database. The new process is streamlined, and places the ownership of the information where it originates — with the suppliers themselves. In addition to maintaining contact information, suppliers can also respond to online supplier surveys that the company can now easily send to registered suppliers in the hub.
This large, industrial manufacturer set out to build a supplier data management process that would support overall supplier management by improving supply chain visibility. Over the course of three years of focused and dedicated effort, the company implemented its first working data governance process and a cleansed supplier master database.
The project yielded clear benefits to the business, according to the expectations the leadership had set for the initiative. Table 1 shows the deliverables that the program achieved and the business impact of each.
Source: Gartner (November 2013)
In addition to initial expectations, this project established a foundation on which to build new capabilities for supplier management. The company has plans to further leverage its supplier data quality management approach for other master data quality improvement projects, notably customer data and product data masters.
The company had an opportunity to put its new system to the test when Hurricane Sandy was heading toward the East Coast of North America. Within a few hours, the company sourcing team had analyzed the impact on 15 states. The team knew which suppliers it had worked with in the six months before the storm. The company reached out to them to assess the impact and determine if they needed help. Before the introduction of this new system, responding to a supply chain disruption depended on an informal process and access to disparate supplier data. But when Hurricane Sandy hit, the company could look at cleansed, deduplicated supplier spend; identify and collaborate with affected suppliers; and make informed decisions.