issue: July 2006 APPLIANCE Magazine

The Open Door
Testing: Emotional or Scientific?



by Alexander J. Porter, testing and engineering development manager, Intertek ETL Entela

Have you ever argued with someone who just won’t give up? No matter how logical you are, no matter how many times you prove your point, they still insist on being wrong. And the amazing thing is they are dead certain they are right. So here is a mental challenge: How can you tell the difference between being right and sticking to your guns, and thinking you’re right and sticking to your guns? Inside your mind, there seems to be no difference between being right and thinking you’re right; they feel the same. The reality is you need a trusted outside reference to provide confirmation.
This is where testing and validation come in. Sometimes viewed as the necessary evil of bringing a product to market, testing doesn’t create revenue, reduce cost, or speed things up. But it is necessary because, in the consumer products world, we can’t be the only ones who think we are right; testing is the outside reference.
Now layer on another challenge: How do you know that the performance and certification testing you are doing is really the most efficient? In the last 15 years, I have seen many different test plans proposed, and many reasons given for proposing them. Often the reasons range from the emotional—“We’ve always done this, so I would hate to not have this data”—to the reactionary—“We had that warranty claim and this test was cobbled together to reproduce it...now management says we have to do it all the time.” Some tests are conceived and verified by benchmarking the test against what actually happens in the field. But often that is not the case.
The fact is we are not the highly efficient scientific creatures we would like to believe we are. We do things out of expediency, ego, apathy, loyalty, and for many other emotional or reactionary reasons. And if someone tries to challenge us on why we are doing what we are doing, we refuse to change. After all, the way we are testing may not be the best, but it is familiar.
So what do we need? There are two ways to effectively improve the way testing and validation is conducted in business. One is for the organizations that stubbornly hold onto old ways of doing things to go out of business and let new companies take their place. Not exactly the ideal solution.
The other way is to allow an objective outside reference to take a dispassionate look at how a product is tested in the company and synthesize an efficient set of tests to meet the needs of the product. The outside reference should shorten the overall validation time (both resources and calendar); manage the risk of differing result formats and their impact on decision making and decision makers; be traceable and defensible under management scrutiny or internal/external audit; and be adaptable as the technology and business structures change.
This can be accomplished by using information from a design failure modes and effects analysis (DFMEA) or a similar document. Regardless of the document used, it must contain a list of the potential failure modes that are known, anticipated, or suspected in the product. Also, the document must list all of the mechanisms that could precipitate failure in the product. In a DFMEA, these are the “potential failure” column and the “mechanism of failure” column. From these columns, a hypothesis matrix can be created. The hypothesis matrix contains all of the hypotheses that must be true for the product to work and serves as a cross-reference of all failure modes versus all mechanisms. For the product to work, it must be true that no mechanism could cause any failure.
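To make the idea concrete, here is a minimal sketch in Python of how such a hypothesis matrix could be assembled. The failure modes and mechanisms shown are hypothetical placeholders, not entries from any particular DFMEA.

from itertools import product

# Hypothetical DFMEA columns (placeholders, not from a real analysis).
failure_modes = ["door seal leak", "motor stall", "cracked housing"]
mechanisms = ["thermal cycling", "vibration", "humidity exposure"]

# Each cell of the matrix is one hypothesis: for the product to work,
# the mechanism must not be able to cause the failure mode.
hypothesis_matrix = {
    (failure, mechanism): f"'{mechanism}' cannot cause '{failure}'"
    for failure, mechanism in product(failure_modes, mechanisms)
}

for statement in hypothesis_matrix.values():
    print(statement)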
There are several things that can be done with the matrix. One is synthesizing a concise set of tests that satisfies all of the hypotheses. This is accomplished by adding to the matrix all of the tests that could be used to check the hypotheses. From these, the one test that covers as many of the hypotheses as possible can be identified, and the hypotheses that it checks are circled. Then the next test that covers most of the remaining hypotheses is identified, and those hypotheses are circled. This is continued until all hypotheses are checked at least once. This provides a concise list of tests that traceably satisfy all of the hypotheses.
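The selection step described above is essentially a greedy covering loop: pick the test that checks the most unchecked hypotheses, circle those hypotheses, and repeat. Below is a minimal sketch of that loop in Python; the test names, hypothesis labels (H1 through H6), and the mapping between them are invented purely for illustration.

# Hypothetical candidate tests and the hypotheses each one checks.
candidate_tests = {
    "thermal shock": {"H1", "H2", "H5"},
    "random vibration": {"H2", "H3", "H4"},
    "humidity soak": {"H5", "H6"},
    "drop test": {"H4", "H6"},
}

uncovered = set().union(*candidate_tests.values())
selected = []

# Repeatedly pick the test that checks the most hypotheses still unchecked,
# "circle" those hypotheses, and continue until every hypothesis is covered.
while uncovered:
    best = max(candidate_tests, key=lambda t: len(candidate_tests[t] & uncovered))
    newly_checked = candidate_tests[best] & uncovered
    if not newly_checked:
        break  # any remaining hypotheses would need tests outside this list
    selected.append(best)
    uncovered -= newly_checked

print("Concise test list:", selected)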
This technique has been used successfully to drastically restructure how testing and validation are accomplished, and it can provide the necessary objective outside reference for evaluating test and validation plans. But this is only part of the solution. Until companies are willing to take a long, hard look in the mirror and realize that they are the irrational party in the argument, nothing will change.
Don’t believe me? Ask yourself this question: “Do I feel comfortable not conducting a test we have conducted for the last 20 years?” You see, the response to that question has nothing to do with science and a lot to do with taking risks. Most of us would rather take an inferior but comfortable course of action than suggest to all of our colleagues that what we have been doing for 20 years is not the best way. Do you want to save 50 to 60 percent on validation time and material? Then stop arguing, set aside your emotions and find an objective reference to gauge your testing and validation plans.

About the Author

Alexander J. Porter is the testing and engineering development manager of Intertek ETL Entela, where he has worked since 1992. He holds a B.S. in Aircraft Engineering and an M.S. in Mechanical Engineering, both from Western Michigan University. If you wish to contact Porter, e-mail editor@appliance.com.

 
