Getting Lean

A new system weighs the impact of proposed changes.

A basic premise of Lean manufacturing is the elimination of non-value-added activity by minimizing variation. However, standardizing systems while accommodating the needs of 30 or more customers can be a significant challenge for EMS providers. Customer satisfaction measurement is one area where standardization can improve efficiency. Yet the more standardized the format, the less meaningful it may be at the customer level.

Like many contract manufacturers, Epic ran a dual customer satisfaction system: it participated in its customers’ rating systems and conducted monthly and annual web-based customer satisfaction surveys. However, only 8% of customers were sending formal monthly scorecards defining their expectations. Of the remaining customers, only 20% were filling out monthly surveys, and the surveys tended to generate subjective responses that did not necessarily tie to specific improvement activities or clearly defined goals.

In the fourth quarter of 2007, the company’s management team decided to put in place a new system that would tie customer feedback to clearly defined expectations, corrective actions and internal management review.

The new system was fully implemented in 2008, then re-evaluated and fine-tuned last March.

System overview. We manage projects using a Customer Focus Team model. Each CFT includes a program manager, account manager, quality engineer, product engineer, test engineer, material analyst and inside salesperson.

Early in our operational strategy formulation, management developed a methodology for measuring and sharing performance information, known as the Plant Operational Review (POR) system. The original version monitored approximately 60 metrics company-wide, down to the floor level. These metrics were formally reviewed daily or weekly by project personnel, monthly by plant managers, and quarterly by senior management. Over time, the system has evolved to include the original metrics list, external benchmarks and longer-term performance trends. The POR process starts with a summary of overall company financial performance metrics, then focuses on specific productivity and operational performance in human resources, quality, manufacturing, engineering, sales, purchasing and finance. The functional managers responsible for performance against these metrics are also responsible for defining the external benchmarks relevant to their areas.
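To make that structure concrete, here is a minimal sketch in Python of how POR-style metrics, their owners, external benchmarks and review cadence might be represented and grouped by review level. The metric names, owners and benchmark values are hypothetical illustrations, not Epic's actual schema.

```python
# Illustrative sketch only: one way to represent POR-style metrics with an
# owner, an external benchmark and a review cadence. All values are invented.
from dataclasses import dataclass

@dataclass
class PORMetric:
    name: str          # e.g., "First-pass yield" (hypothetical)
    area: str          # human resources, quality, manufacturing, etc.
    owner: str         # functional manager responsible for the metric
    benchmark: float   # external benchmark defined by that manager
    cadence: str       # "daily/weekly", "monthly", or "quarterly"

por = [
    PORMetric("First-pass yield", "quality", "Quality Manager", 0.98, "daily/weekly"),
    PORMetric("On-time delivery", "manufacturing", "Plant Manager", 0.95, "monthly"),
]

# Group metrics by review cadence for the corresponding review meeting.
for level in ("daily/weekly", "monthly", "quarterly"):
    due = [m.name for m in por if m.cadence == level]
    print(level, due)
```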

During the gap analysis of the former Customer Survey System, we identified opportunities for improvement that could tie into the management review cycle (POR) to close our internal loop. One such opportunity was to create a working tool, shared by the CFT and the customer, that would define expectations based on the monthly survey. The redesigned customer satisfaction measurement tool was named the Customer Expectation Worksheet. One goal for the new system was that it link to the POR, showing both customer issues and the status of corrective actions related to those issues. Another goal was to link the customer satisfaction survey closely with other program management tools. One key tool developed to support this was the CFT Tracker.

The CFT Tracker is a living diary of each customer. It is an Excel workbook, resident on the company’s intranet, with tabs for the core customer team contact list, product/part number lists, NPI planning, meeting agendas, CFT open action items, the continuous improvement team (CIT) tracker, the CFT Paynter chart, customer PPM tracking, scrap analysis, the closed CFT action list and the Customer Expectation Worksheet. In short, the CFT Tracker puts the entire account history and current status at the fingertips of anyone within the organization.

Because the CFT Tracker stores trend information related to quality and continuous improvement initiatives, it enables real-time analysis of customer issues identified through the Customer Expectation Worksheet and makes it easy for the CFT to respond with specific data related to the issues the customer raises.
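As an illustration, here is a minimal Python sketch of how such a workbook's tabs could be pulled into one place for analysis. The file name is a hypothetical placeholder and the pandas-based approach is ours; it is not a description of Epic's actual tooling.

```python
# A minimal sketch, assuming the CFT Tracker is an .xlsx workbook on a share.
# The file name is a hypothetical placeholder.
import pandas as pd

# sheet_name=None loads every tab into a dict of DataFrames keyed by tab name.
tracker = pd.read_excel("cft_tracker_customer_x.xlsx", sheet_name=None)

# Quick inventory of what the workbook contains.
for tab, df in tracker.items():
    print(f"{tab}: {len(df)} rows")
```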

For example, if our team has made DfM recommendations that bear on current manufacturing, on-time delivery or quality performance, this will be tracked in the CFT Tracker through the Paynter charts. If the customer has opted not to adopt the recommendations, but indicates in its monthly survey that defects exceed predefined limits, the CFT can look at the CIT tracker, PPM data and CFT Paynter chart to determine what percentage of defects relate to the unadopted DfM recommendation. With the Paynter chart, the team can show a weighted analysis of the impact of adopting the proposed changes. Similarly, if the defects relate to an out-of-control process, we would have the data to drive internal improvements. The result: a hyper-focused corrective action tool. Training on the system was conducted at all Epic facilities; in Mexico, training was conducted in Spanish to ensure full understanding among all CFT members.
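The weighted analysis itself is straightforward arithmetic. Here is a minimal sketch, using entirely hypothetical defect categories and counts, of how the share of defects attributable to an unadopted DfM recommendation could be computed from Paynter-style weekly data.

```python
# Hypothetical Paynter-style data: weekly defect counts by category.
weekly_defects = {
    "wk1": {"solder bridging": 12, "tombstoning": 3, "missing component": 1},
    "wk2": {"solder bridging": 9,  "tombstoning": 5, "missing component": 2},
}
# Categories the unadopted DfM recommendation is assumed to address.
dfm_related = {"solder bridging"}

total = sum(n for wk in weekly_defects.values() for n in wk.values())
dfm = sum(n for wk in weekly_defects.values()
          for cat, n in wk.items() if cat in dfm_related)

print(f"{dfm / total:.0%} of defects trace to the unadopted DfM recommendation")
```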

The survey process. The Customer Expectation Worksheet was designed as a relatively simple tool. We defined a series of ratings tied to quality, delivery performance and partnership. Each of the three sections has three to four defined performance indicators, and respondents rate each indicator on a 1 to 5 scale.

The CFTs work with their customer to establish a customer-specific metric for each of these defined performance indicators. Customers then rate on the 1 to 5 scale against their predefined performance metrics. Although a rating of 3 indicates requirements are met, it is coded yellow and reported as an opportunity for improvement. Ratings of 1 or 2 are coded red and generate a corrective action requirement, which is tracked at both the CFT and POR level.
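These rating rules reduce to a small decision table. The sketch below assumes, for illustration, that ratings of 4 and 5 are coded green; the coding of 3 (yellow, improvement opportunity) and 1 or 2 (red, corrective action) follows the description above.

```python
# A minimal sketch of the scoring rules: 1-2 red with a corrective action,
# 3 yellow (requirements met, flagged as improvement opportunity),
# 4-5 assumed green here.
def classify(rating: int) -> tuple[str, bool]:
    """Return (color code, corrective-action-required) for a 1-5 rating."""
    if rating <= 2:
        return "red", True
    if rating == 3:
        return "yellow", False
    return "green", False

for r in (1, 3, 5):
    print(r, classify(r))
```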

Each CFT provides a list of key contacts with whom Epic interacts on a day-to-day basis. The survey is sent to customer contacts as a web-based choice board form. Requests are rotated among the total list of core contacts so that each contact only gets a request a couple of times a year. If there is no response to the initial survey request, up to two reminders are sent. If the customer contact still does not return a survey, the CFT will follow up to determine why.
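A minimal sketch of that rotate-and-remind flow follows; the contact addresses and the send_survey stub are hypothetical placeholders, and the reminder limit of two mirrors the process described above.

```python
# Illustrative sketch of rotating survey requests among core contacts,
# with an initial request plus up to two reminders before the CFT follows up.
from itertools import cycle

contacts = cycle(["buyer@customer.com", "sqe@customer.com", "npi@customer.com"])

def send_survey(contact: str) -> bool:
    """Stand-in for the web survey request; returns True if a response came back."""
    return False  # placeholder

def run_monthly_survey() -> None:
    contact = next(contacts)        # rotate through the core contact list
    for _ in range(3):              # initial request plus up to two reminders
        if send_survey(contact):
            return
    print(f"No response from {contact}; CFT to follow up directly")
```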

Also, a more detailed annual Customer Loyalty Survey is conducted via email.

This annual survey is sent to multiple contacts at each customer and includes areas for detailed comments and suggestions for improvement. Survey data are reviewed at the plant and corporate level.

Results and lessons learned. When the new survey was deployed in 2008, it consistently generated a 45 to 50% response rate, compared with the prior 20% response rate. In 2009, that has dropped to about 30%, but that percentage typically includes 100% of our largest customers. In determining survey improvements for 2009, one issue has stood out: customer project teams prefer to do the survey as a group. When individuals are contacted for a survey, they often solicit feedback from other members of the core customer team; if they do not get that feedback, they often do not return the form. As a result, a 2009 change to the survey method will be to offer each core customer group the opportunity to complete the survey as a team, rather than attempting to rotate among individual team members.

An additional indicator of the robustness of the process is that we won all five individual service category awards in our revenue size class in Circuits Assembly’s 2009 Service Excellence Awards for EMS providers.

Tony Bellitto is quality manager-US Operations at Epic Technologies (Epictech.com); tony.bellitto@Epictech.com. 
