Issue: Volume 7, Number 1
Date: January 2007
From: Mark J. Anderson, Stat-Ease, Inc., Statistics Made Easy® Blog

Dear Experimenter,

Here's another set of frequently asked questions (FAQs) about doing design of experiments (DOE), plus alerts to timely information and free software updates. If you missed the previous DOE FAQ Alert, please click on the links at the bottom of this page. If you have a question that needs answering, click the Search tab and enter the key words. This finds not only answers from previous Alerts, but also other documents posted to the Stat-Ease web site.

Feel free to forward this newsletter to your colleagues. They can subscribe via the sign-up link below. If this newsletter prompts you to ask your own questions about DOE, please address them to me via e-mail.

For an assortment of appetizers to get this Alert off to a good start, see these new blogs at
— "Close encounters with improbable events ('Goofers') and implausible beliefs (Martians)"
— "Do mental workouts keep your mind sharp?" (See comment, too)
— "Stress as factor for cardiac arrest felled along with author who did not sweat the small stuff?"
— "Murderous statistics?" Also check out the thoughtful feedback on this prior blog:
— "Drinking twice as much reduces heart attack by factor of three?"

Yesterday the citizens of our United States exhibited their mercurial mood by voting for a transit in their political landscape from the right to the left. Today they can see the planet Mercury transit the sun, something that happens only about once a decade. For an astronomer's-eye view (Kitt Peak, Arizona), see the webcast from 11 am to 4 pm PST by San Francisco's Exploratorium. I expect they will archive the record of this astronomical event in case you miss Mercury actually in transit this time around.

Topics in the body text of this DOE FAQ Alert are headlined below (the "Expert" ones, if any, delve into statistical details).

1. FAQ: Why do my point predictions vary so much from those shown in the diagnostics on actual runs (blocked)?
2. Expert FAQ: Investigating aliased interactions
3. Reader Response: Jeff Hybarger's article, "The Ten Most Common Designed Experiment Mistakes"
4. Reader Comments: Thoughts on design and analysis of computer experiments (DACE)
5. Events Alert: Biomedical Focus
6. Workshop alert: "Experiment Design Made Easy" in California

PS. Quote for the month: Be fearless about failing. (Page through to the end of this e-mail to enjoy the actual quote.)


1. FAQ: Why do my point predictions vary so much from those shown in the diagnostics on actual runs (blocked)?

-----Original Question-----
From: Texas
"I performed a response surface method (RSM) experiment using your new D-optimal blocking feature in Design-Expert® version 7 (DX7) software.* This was very helpful for dealing with differences that I anticipated over the three-day period required to complete all the runs. It turned out that one set-up produced a very desirable 'overall liking' response. However, when I enter the specific factor levels in the point prediction tool offered by DX7, it deviates quite a bit from the actual result. I went back to the diagnostics produced by the program's thorough statistical analysis, but this particular run does not exhibit any significant abnormalities. What is going on?"

Presumably experimenters like you choose to block out variables that cannot normally be controlled, such as day-to-day or lot-to-lot differences. Thus it makes sense not to account for blocks in responses generated by point prediction. However, when looking back over the actual experimental responses for diagnostic purposes, Stat-Ease software includes block corrections in the predicted values. Think of it this way: in the one case you look forward to what will happen in the future, whereas in the other you are going back over what already occurred. In your case, the block variation from day to day proved to be relatively large, about five times that of the residuals from the model fitting. The shifts in response caused by your blocking variable exceeded those caused by some of the statistically significant factors! I suggest you look for special causes of the shifts from one day to the next. However, if this is simply a state of nature, you had best accept it and move ahead as best you can.
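The distinction can be illustrated with a small simulation (a hypothetical one-factor experiment blocked by day, not the questioner's actual data): the point-prediction path leaves the block effects averaged out, while the diagnostic path adds the run's own day correction back in.

```python
import numpy as np

rng = np.random.default_rng(1)
n_per_day, days = 6, 3
x = rng.uniform(-1, 1, n_per_day * days)       # one coded factor
day = np.repeat(np.arange(days), n_per_day)    # block = day of the run
block_shift = np.array([0.0, 2.5, -2.0])       # large day-to-day swings
y = 3.0 * x + block_shift[day] + 0.1 * rng.standard_normal(x.size)

# fit intercept, factor effect, and sum-coded block effects
B = np.zeros((x.size, days - 1))
for j in range(days - 1):
    B[:, j] = (day == j).astype(float) - (day == days - 1).astype(float)
A = np.c_[np.ones(x.size), x, B]
beta, *_ = np.linalg.lstsq(A, y, rcond=None)

i = 10                                  # a run from day 1 (indices 6-11)
point_pred = beta[0] + beta[1] * x[i]   # future prediction: no block term
diag_pred = A[i] @ beta                 # diagnostic: includes day-1 correction
print(point_pred, diag_pred, y[i])
```

With day-to-day swings this large, the diagnostic prediction sits close to the observed response while the block-free point prediction misses by roughly the day-1 block effect, which is the behavior the questioner observed.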

(Learn more about blocking and diagnostics by attending the three-day computer-intensive workshop "Experiment Design Made Easy." See for a description of this class and then link from this page to the course outline and schedule. Then, if you like, enroll online.)

*See for details and link from there to a free 45-day fully functional trial of version 7 of Design-Expert software.


2. Expert FAQ: Investigating aliased interactions

-----Original Question-----
From: France
"Using DX7 I screened 8 potentially influential factors via a 16-run, resolution IV, fractional-factorial design [2^(8-4)]. The analysis revealed two large main effects, C and H, and a lesser, but significant, interaction labelled 'AF.' The software suggested that I add terms A and F to the predictive model to preserve hierarchy. However, an earlier screen explained to me that AF is aliased with CH. This makes much more sense, since C and H are so large whereas A and F are not. Can DX7 substitute CH for AF?"

Design-Expert and Design-Ease® software make it really easy to substitute aliased interactions: Simply right-click on the particular point in the half-normal plot of effects or bar in the Pareto plot (AF in your case) to get a drop-down list of appropriate choices. Then choose the one that you assume to be correct. We provide guidance for cases like yours in the article posted at
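For readers who want to see where such alias chains come from, they can be checked numerically: build the design from its generators and find which interaction columns are identical. The generators below (E=BCD, F=ACD, G=ABC, H=ABD) are a common textbook choice for the 2^(8-4) design, assumed here for illustration only; Design-Expert's catalog may use a different set, which is why the questioner's chain contained CH while this one does not.

```python
from itertools import combinations, product

import numpy as np

# full 2^4 design in the base factors A, B, C, D
base = np.array(list(product([-1, 1], repeat=4)))
col = {f: base[:, j] for j, f in enumerate("ABCD")}

# assumed generators for the 2^(8-4) fraction (software catalogs may differ)
col["E"] = col["B"] * col["C"] * col["D"]
col["F"] = col["A"] * col["C"] * col["D"]
col["G"] = col["A"] * col["B"] * col["C"]
col["H"] = col["A"] * col["B"] * col["D"]

def interaction(pair):
    """Elementwise product of two factor columns, e.g. the AF column."""
    return col[pair[0]] * col[pair[1]]

target = interaction("AF")
aliases = sorted(
    "".join(p) for p in combinations("ABCDEFGH", 2)
    if "".join(p) != "AF" and np.array_equal(interaction(p), target)
)
print("AF is aliased with:", aliases)  # two-factor aliases of AF
```

With these generators the two-factor aliases of AF turn out to be BE, CD, and GH: their columns are indistinguishable in the 16 runs, so the data alone cannot say which interaction is active, and subject-matter knowledge (like the large C and H main effects in the question) must break the tie.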


3. Reader response: Jeff Hybarger's article, "The Ten Most Common Designed Experiment Mistakes"

-----Original Response-----
From: John DeLuca, Quality Champion — Certified Six Sigma Master Black Belt, Micro and Nano Structures Technologies, GE Global Research, Niskayuna NY

"Jeff Hybarger's article, "The Ten Most Common Designed Experiment Mistakes"
[] addresses a very important aspect of applying the DOE methodology, an aspect that is often not properly addressed in either introductory or advanced courses. I commend him for helping to steer those less experienced in the application of DOE away from the shoals of sloppy planning, design, and execution. A very good paper on this topic by Coleman and Montgomery, titled 'A Systematic Approach to Planning for a Designed Industrial Experiment,' appeared in 'Technometrics,' Feb. 1993, Vol. 35, No. 1, pp. 1-12."

Thanks for the heads-up. We have this article on file with my handwritten notes made in 1993 — interesting for me to revisit!


4. Reader Comments: Thoughts on design and analysis of computer experiments (DACE)

-----Original Contribution-----
From: Professor Ermanno Oberrauch, Scuola Universitaria Professionale della Svizzera Italiana, Department of Innovative Technologies, The iCIMSI Institute, Manno, Switzerland

"I would like to update you on my work with design and analysis of computer experiments (DACE) for computational fluid dynamics (CFD) simulations, and add some suggestions for future versions of Design-Expert software. Currently, without taking advantage of DACE, people run individual CFD simulations using expert-chosen levels of about 30 factors, all continuous. Each run takes 24-48 hours of computer time. My objective is to build a reasonably approximate RSM design by means of DACE. Ideally, the responses predicted by RSM would be within some specified interval of the responses calculated by the CFD software. With our resources we can afford about 70 runs maximum. I found in your "RSM Simplified"* book an interesting chapter on DACE, with good hints to start. [See the relevant material posted at]

After browsing other books and searching the internet, I have come to the following (partial) conclusions regarding:
(a) Screening designs can be the same as for physical experiments: fractional factorials, Plackett-Burman (PB), and orthogonal arrays (OA). However, because DACE is often used in connection with extremely complex and computer-intensive deterministic engineering calculations, resolution IV and V designs are rarely used. As for criteria to identify important factors, I could not go beyond the common-sense rule that important factors are those which affect the response most.
(b) The RSM designs, such as central composite (CCD), Box-Behnken (BB), and D-optimal, are all suitable in principle, but often the number of important factors prevents their use. Instead, space-filling designs (uniform designs, sphere-packing, Latin hypercubes) are often used. The distance-based designs offered by DX7 are probably also good. As I see it, there are two important issues beyond the selection of a proper design:
— The selection of an appropriate fitter: ordinary least squares (OLS) is not the best choice, because of the absence of error. The best linear unbiased predictor (BLUP) algorithm (also known as kriging) is better, because it 'honors' all the design points with zero error. My understanding is that without such a fitter, (prohibitively) high-order polynomials seem to be needed. My only experience (5 years ago) was that two factors required a full 15x15 grid and a 5th-order polynomial!
— The use of kriging (or the like) as a fitter raises a validation problem, because the predictive power of the model cannot be assessed at the design points. So predicted residual sum of squares (PRESS, i.e., leave-one-out), cross-validation, jackknifing, bootstrapping, or an external test set must be used.
You know I have been a user of Design-Expert since the DOS days. As I love Design-Expert and owe so much to it in my career, I am taking the liberty of suggesting that, since DACE is becoming increasingly important, it be supported more fully in your software."
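The two issues the professor raises (a space-filling design, then validation without residual error at the design points) can be sketched in a few lines of generic code; this is a hypothetical illustration, not Design-Expert functionality. The 70-run, 30-factor sizes echo the budget mentioned above, the response is purely synthetic, and the surrogate is a simple linear model rather than a kriging fitter.

```python
import numpy as np

rng = np.random.default_rng(7)
n_runs, n_factors = 70, 30  # budget from the letter above

# Latin hypercube: for each factor, exactly one point per 1/n stratum,
# with the strata independently permuted across factors and jittered within
strata = np.tile(np.arange(n_runs), (n_factors, 1))
X = (rng.permuted(strata, axis=1).T + rng.random((n_runs, n_factors))) / n_runs

# synthetic "simulator" response standing in for an expensive CFD run
y = X @ rng.uniform(-1, 1, n_factors) + 0.05 * rng.standard_normal(n_runs)

# PRESS: refit the surrogate with each run held out, predict the held-out
# run, and accumulate the squared prediction errors
press = 0.0
for i in range(n_runs):
    keep = np.arange(n_runs) != i
    A = np.c_[np.ones(n_runs - 1), X[keep]]
    beta, *_ = np.linalg.lstsq(A, y[keep], rcond=None)
    press += (y[i] - np.r_[1.0, X[i]] @ beta) ** 2
print(f"PRESS = {press:.4f}")
```

Because each held-out run never influences its own prediction, PRESS gives an honest estimate of predictive power even when the fitter interpolates the design points exactly, which is the professor's concern with kriging.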

Thank you for your thoughtful suggestions regarding DACE and Design-Expert. Note that our software now offers RSM designs for up to 50 factors, which makes it more suitable for DACE. Engineers at General Electric and NASA, just to name a few, have had success applying RSM aided by Design-Expert to produce useful transfer functions from computer simulations based on finite element analysis.

PS. Our Italian colleagues provided a 'heads-up' about the Joint ENBIS-DEINDE (European Network for Business and Industrial Statistics & DEsign of INDustrial Experiments) 2007 Conference on Computer Experiments versus Physical Experiments in Torino on April 11-13. It is targeted at scientists, researchers and users of DoE techniques. See for more information.

*See an overview of "RSM Simplified: Optimizing Processes Using Response Surface Methods for Design of Experiments" at and link from there to purchase it on line.


5. Event Alert: Biomedical Focus

See the Stat-Ease display at the Biomedical Focus Conference 2007, February 12-13 in Brooklyn Center, Minnesota. Details are provided online; click for a list of appearances by Stat-Ease professionals. We hope to see you sometime in the near future!

PS. Do you need a speaker on DOE for a learning session within your company or technical society at regional, national, or even international levels? If so, contact me. It may not cost you anything if Stat-Ease has a consultant close by. However, for presentations involving travel, we appreciate reimbursements for airfare, hotel and meals — expenses only. In any case, it never hurts to ask Stat-Ease for a speaker on this topic — we are at the foremost ranks of practical expertise on design of experiments for process and product improvement. Contact me at if you have an event coming up with an open slot for a presentation.


6. Workshop Alert: "Experiment Design Made Easy" in California

If you work near the West Coast (or want to visit there this winter) and you want to get going on DOE, attend our three-day computer-intensive "Experiment Design Made Easy" workshop this January 16-18 in San Jose, California. See our web site for schedule and site information on all Stat-Ease workshops open to the public. To enroll, click the "register online" link on our web site or call Stat-Ease at 1.612.378.9449. If spots remain available, bring along several colleagues and take advantage of quantity discounts in tuition, or consider bringing in an expert from Stat-Ease to teach a private class at your site.* Call us to get a quote.

*Believe it or not, it only takes a class of 4 students to make it economical for Stat-Ease to come and teach at your site versus sending them out to one of our public presentations. The economics are detailed in the July 2006 issue of the Stat-Teaser newsletter at


I hope you learned something from this issue. Address your general questions and comments to me at:



Mark J. Anderson, PE, CQE
Principal, Stat-Ease, Inc. (
2021 East Hennepin Avenue, Suite 480
Minneapolis, Minnesota 55413 USA

PS. Quote for the month: Be fearless about failure.

"Ever tried. Ever failed. No matter. Try again. Fail again. Fail better."
Samuel Beckett, Irish novelist and playwright ("Waiting for Godot")
Trademarks: Design-Ease, Design-Expert and Stat-Ease are registered trademarks of Stat-Ease, Inc.

Acknowledgements to contributors:
—Students of Stat-Ease training and users of Stat-Ease software
—Stat-Ease consultants Pat Whitcomb, Shari Kraber and Wayne Adams (see for resumes)
—Statistical advisor to Stat-Ease: Dr. Gary Oehlert (
—Stat-Ease programmers, especially Tryg Helseth and Neal Vaughn (
—Heidi Hansel, Stat-Ease marketing director, and all the remaining staff


Interested in previous FAQ DOE Alert e-mail newsletters?
To view a past issue, choose it below.

#1 Mar 01, #2 Apr 01, #3 May 01, #4 Jun 01, #5 Jul 01, #6 Aug 01, #7 Sep 01, #8 Oct 01, #9 Nov 01, #10 Dec 01, #2-1 Jan 02, #2-2 Feb 02, #2-3 Mar 02, #2-4 Apr 02, #2-5 May 02, #2-6 Jun 02, #2-7 Jul 02, #2-8 Aug 02, #2-9 Sep 02, #2-10 Oct 02, #2-11 Nov 02, #2-12 Dec 02, #3-1 Jan 03, #3-2 Feb 03, #3-3 Mar 03, #3-4 Apr 03, #3-5 May 03, #3-6 Jun 03, #3-7 Jul 03, #3-8 Aug 03, #3-9 Sep 03, #3-10 Oct 03, #3-11 Nov 03, #3-12 Dec 03, #4-1 Jan 04, #4-2 Feb 04, #4-3 Mar 04, #4-4 Apr 04, #4-5 May 04, #4-6 Jun 04, #4-7 Jul 04, #4-8 Aug 04, #4-9 Sep 04, #4-10 Oct 04, #4-11 Nov 04, #4-12 Dec 04, #5-1 Jan 05, #5-2 Feb 05, #5-3 Mar 05, #5-4 Apr 05, #5-5 May 05, #5-6 Jun 05, #5-7 Jul 05, #5-8 Aug 05, #5-9 Sep 05, #5-10 Oct 05, #5-11 Nov 05, #5-12 Dec 05, #6-1 Jan 06, #6-2 Feb 06, #6-3 Mar 06, #6-4 Apr 06, #6-5 May 06, #6-6 Jun 06, #6-7 Jul 06, #6-8 Aug 06, #6-9 Sep 06, #6-10 Oct 06, #6-11 Nov 06, #6-12 Dec 06, #7-1 Jan 07 (see above)

Click here to add your name to the DOE FAQ Alert newsletter list server.

Statistics Made Easy™

DOE FAQ Alert ©2007 Stat-Ease, Inc.
All rights reserved.



Stat-Ease, Inc.
2021 E. Hennepin Avenue, Ste 480
Minneapolis, MN 55413-2726
p: 612.378.9449, f: 612.378.2152