In the transition to Pb-free, we resolved everything. Except the soldering process, that is.

Pb-Free Lessons Learned

We chose alloys. We selected chemistries. We performed reliability tests. We upgraded equipment. We qualified laminates and components. We segregated materials. We assigned new part numbers. We scrubbed bills of materials. We trained. We documented. We prepared due diligence defenses. We declared RoHS compliance. And we did it all in time for the deadline!

With all the activity surrounding the process change, it’s easy to understand why we couldn’t focus on the soldering process itself and why yields are suffering. It is estimated that only about 10% of electronics assemblies currently run Pb-free, with projections for that number to ramp up to 90% over the next four years.

With the bulk of the “paperwork” behind us and an aggressive ramp ahead, the sooner new processes get optimized, the faster assemblers can realize a payback on their (often mandated) investments in Pb-free technology. One of the best ways to optimize soldering processes is with design of experiments (DoE) tools. DoEs are perfectly suited for process optimization because they permit the study of multiple variables at once and reveal potential interactions among those variables. In other words, they are efficient, and efficiency is paramount to manufacturing operations.
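To make that efficiency argument concrete, here is a minimal sketch, in Python, of a two-level, two-factor matrix. The factor names and yield numbers are purely hypothetical; the point is that the same four runs estimate both main effects and the interaction between the factors.

```python
from itertools import product

# Two factors at coded low/high levels (-1/+1); names are illustrative only.
factors = {"peak_temp": (-1, +1), "conveyor_speed": (-1, +1)}
runs = list(product(*factors.values()))   # 2**2 = 4 coded runs
yields = [88.0, 91.0, 94.0, 95.0]         # hypothetical first-pass yields (%)

# Each effect is estimated from the same four runs using simple contrasts
# (average response at the high level minus average at the low level).
temp_effect  = sum(y * t for (t, s), y in zip(runs, yields)) / 2
speed_effect = sum(y * s for (t, s), y in zip(runs, yields)) / 2
interaction  = sum(y * t * s for (t, s), y in zip(runs, yields)) / 2

print("peak_temp main effect:   ", temp_effect)    # 5.0
print("conveyor_speed effect:   ", speed_effect)   # 2.0
print("temp x speed interaction:", interaction)    # -1.0
```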

I’ve performed my share of DoEs to optimize production processes over the years, and learned a few lessons along the way.

Lesson 1: Think hard. I took a number of DoE classes; the best was on Taguchi Methods by the American Supplier Institute (amsup.com). It provided one of the most important lessons a young engineer can learn, one that applies regardless of the statistical methods being employed: spend as much time thinking about the experiment as you spend executing and analyzing it. This is clearly the most difficult part of the entire process. I’ve seen too many large DoEs go awry because there were simply too many parameters, and one or two of them swamped the results, either because their main effects were so great or because one parameter setting was so far out on the edge of the window that half the data turned up useless (poor paste release, cold joints, unfilled holes, etc.).

It’s great to have computer software that does all the statistical work, but these tools tend to remove some degree of thought from the process. Prior to the widespread availability of statistical software, engineers set up their own data reduction spreadsheets, which forced more thought into the process. And although we are immensely grateful for the automation of what was once a tedious manual task, we sometimes get lazy and rely more on the tool than on our own brains. There’s no substitute for human thought.

Lesson 2: Step away from the edge. Weird things can happen at the edges of the process window. Because two-endpoint DoEs assume linearity between the high and low settings, keep the settings within a range where the response is well-behaved. If an input variable’s response is not linear near the edge of the window, its effects on the output can be over- or underestimated. If several absolute endpoints are combined, the results can be highly misleading. How much will we learn if we use the minimum amount of solder paste, maximum placement offset, highest possible peak temperature, and longest allowable time above liquidus? We will likely learn this is not a great combination. We don’t really need a DoE to figure that one out, and it certainly won’t help us find an optimum within the prescribed operating window.
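A toy calculation illustrates the danger. The quadratic yield model and window limits below are invented purely for illustration; the pattern is what matters: sampling only the two endpoints implies a straight line that underestimates the middle of the window and misses the curvature entirely.

```python
# Hypothetical yield response that peaks inside an assumed 235-260C window.
def true_yield(peak_temp_c):
    return 99.0 - 0.02 * (peak_temp_c - 245) ** 2   # invented model

lo, hi = 235, 260   # absolute edges of the assumed window

# A two-endpoint experiment sees only these two points and assumes a line:
slope = (true_yield(hi) - true_yield(lo)) / (hi - lo)
mid = (lo + hi) / 2

print("endpoint yields:        ", true_yield(lo), true_yield(hi))   # 97.0, 94.5
print("linear guess at midpoint:", true_yield(lo) + slope * (mid - lo))  # 95.75
print("actual yield at midpoint:", true_yield(mid))                 # 98.875
```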

When selecting parameters, ask yourself whether you are trying to simulate worst-case scenarios, find a process window, or optimize an established process. If you are trying to simulate worst-case scenarios, as in a reliability test, then the edges are likely the right place to be. If you are developing a brand-new process in a laboratory and trying to characterize it, then pushing the limits is absolutely necessary. But if your goal is process optimization, consider stepping in from the edges. Not only does stepping in from the edges provide more accurate response information, it also “zooms in” to give a better snapshot of the region in which you are most likely to operate.

The rule of thumb I use for process optimization is: Imagine your window runs from 0 to 100. Set the low and high parameters at 10 and 90, or at 20 and 80. As an example, consider the reflow window. Suppose ramp rates are recommended to run as fast as 1.8˚C/sec or as slow as 0.7˚C/sec. Stepping back from the edge would set us around 1.6˚C/sec and 0.8˚C/sec. If soak times and TALs are recommended in the 35-90 sec. range, 45 and 80 sec. may be reasonable choices. If recommended soak temperatures are 150-180˚C, then 155˚C and 175˚C should be good indicators. It is important at this juncture to consider what is reasonable for the particular process and product at hand.
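For those who like to automate the arithmetic, here is a small sketch of that 0-to-100 rule applied to the reflow windows above. The 10% and 20% pull-in fractions mirror the 10/90 and 20/80 guidance; the specific settings quoted in the text reflect engineering judgment rather than a strict formula.

```python
# "Step in from the edges": pull DoE settings in from a recommended window.
def stepped_levels(low, high, fraction):
    """Return (low, high) DoE settings pulled in from the window edges."""
    span = high - low
    return round(low + fraction * span, 2), round(high - fraction * span, 2)

# Recommended windows quoted in the text.
windows = {
    "ramp rate (C/sec)":   (0.7, 1.8),
    "soak/TAL time (sec)": (35, 90),
    "soak temp (C)":       (150, 180),
}

for name, (lo, hi) in windows.items():
    for frac in (0.10, 0.20):
        print(f"{name}, {int(frac * 100)}% pull-in: {stepped_levels(lo, hi, frac)}")
```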

Lesson 3: Less is more. It is sometimes better to do a series of small DoEs instead of one big one. The benefits include faster data analysis, which should yield quicker incremental process improvements; cleaner, more manageable data sets; and the natural down-selection of input variables as the experimental series progresses.

Avoid DoE creep. Once the parameters in a DoE have been pared down to yield a manageable number of runs, it is important to maintain the experimental matrix. Consider a basic two-level factorial experiment: the number of runs doubles every time another factor is added. For the time and expense required to add one more factor, a second, equivalently sized matrix could be run to refine the initial results or study other factors. The costs and benefits of adding that variable should be carefully weighed before expanding the test matrix.
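A quick back-of-the-envelope calculation shows why creep is so costly: in a two-level full factorial, the run count grows as 2 raised to the number of factors.

```python
# "DoE creep" in a two-level full factorial: each added factor doubles the runs.
for k in range(2, 8):
    print(f"{k} factors -> {2**k} runs")
# Adding a 5th factor to a 4-factor matrix costs 16 more runs -- the same
# 16 runs could instead replicate or refine the original 4-factor design.
```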

Lesson 4: Secure extra materials. Regardless of the DoE’s size or complexity, the results will certainly answer some questions but will invariably raise others. Nothing is more frustrating than finding something really intriguing in the data but not having the materials available to investigate it. If the DoE uses a special test vehicle, it is wise to order a few extras. In my experience, extra test vehicles never go to waste.

If production PWBs are used for the DoE, more may be available if questions arise, but keep a lookout for variations in the incoming materials. If more than one fabricator supplies raw PWBs, repeat experiments should be performed on boards from the same supplier as the original runs, preferably from the same date code or lot number. The same goes for key components on which measurements are being taken, such as BGAs, QFNs or connectors. If production materials are used, a discussion with the planner can help determine the status of the inventory, how quickly it turns over, and what can be done to keep the raw materials supply stable should future testing be required.

Lesson 5: Follow the confirmation run downstream. The DoE results that precipitate a process change should always be verified before making widespread changes. During the confirmation run, the assemblies should be followed attentively through downstream production stages. Sometimes improvements in one step of a process can cause adverse effects in another. Common examples include changes to soldering parameters that affect pin testing by leaving heavier residues or impair final assembly by inducing excessive warp in PWBs or connectors.

There are other lessons, of course; I chose the five that I felt were most important (see Lesson 3). If I had to choose the most important consideration, it would be to think hard. Every subsequent point in my top five is based on thinking hard before executing and using common sense all the way. Seasoned engineers can disagree on DoE methodologies, but can usually agree that there are no substitutes for experience and common sense. Anyone who can navigate an Excel-type software interface is technically capable of setting up a DoE, but it takes ample forethought to construct a truly effective one.

Chrys Shea is an R&D applications engineering manager at Cookson Electronics (cooksonelectronics.com); chrysshea@cooksonelectronics.com. Her column appears monthly.
