Multifactor design of experiments software

Multifactor design of experiments software is software used to design and analyze factorial experiments. It plays an important role in scientific experimentation generally and provides a practical route to implementing design of experiments (DOE) procedures that derive from statistical and combinatorial theory. In September 2009, at the International Industrial Statistics in Action Conference at Newcastle University in England, statisticians from SmithKline Beecham Pharmaceuticals presented a poster arguing that easy-to-use DOE software (the product name is omitted here to maintain article neutrality) must be made available to all experimenters to foster the use of DOE.[1]

Background

The term "design of experiments" (DOE) derives from early statistical work performed by Sir Ronald Fisher. He was described by Anders Hald as "a genius who almost single-handedly created the foundations for modern statistical science."[2] Fisher initiated the principles of design of experiments and elaborated on his studies of "analysis of variance". He also concurrently furthered his studies of the statistics of small samples.

Perhaps even more important, Fisher began his systematic approach to the analysis of real data as the springboard for the development of new statistical methods. He paid particular attention to the labour involved in the necessary computations performed by hand, and developed methods that were as practical as they were rigorously founded. In 1925, this work culminated in the publication of his first book, Statistical Methods for Research Workers.[3] The book went through many editions and translations in later years and became a standard reference work for scientists in many disciplines. In 1935 it was followed by The Design of Experiments, which also became a standard.

Before Fisher's multifactor DOE breakthrough, experiments were commonly conducted one factor at a time (OFAT). The naval surgeon James Lind, who discovered a cure for scurvy in 1747, is often cited today as an early one-factor-at-a-time experimenter.

One-factor-at-a-time (OFAT) experimentation reached its zenith with Thomas Edison’s “trial and error” methods.[4] OFAT remained the basis of scientific experimental design until the need to feed growing city populations, combined with a shrinking farming population, demanded something better.

Advances in agricultural science were needed to feed larger city populations from fewer farms. Because crop scientists had to contend with widely differing growing climates and needs, it became important to account for local growing conditions and, if local crop trials were to guide the feeding of entire populations, to extend sample testing economically to those populations. As statistical methods advanced (primarily through the efficacy of designed experiments rather than one-factor-at-a-time experimentation), representative factorial designs began ensuring that inferences and conclusions could profitably be extended from the experimental sample to the population as a whole. A major problem remained in determining the extent to which a chosen crop sample was truly representative; factorial DOE provided methods to estimate and correct for random trends both within the sample and in the data-collection procedures.

Use of software

Factorial experimental design software drastically simplifies the laborious hand calculations that were required before the advent of computers.

During World War II, a more sophisticated form of DOE, called factorial design, became a big weapon for speeding up industrial development for the Allied forces. These designs can be quite compact, involving as few as two levels of each factor and only a fraction of all the possible combinations, and yet they are quite powerful for screening purposes. After the war, George Box, a statistician at Imperial Chemical Industries, described how to generate response surfaces for process optimization.[5] From this point forward, DOE took hold in the chemical process industry, where factors such as time, temperature, pressure, concentration, flow rate and agitation are easily manipulated. Later, Box co-authored a textbook[6] that formed the basis for the original version of DOE software by Stat-Ease, Inc., called Design-Ease®.
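
The structure of such a two-level design can be sketched in a few lines of ordinary Python. The fragment below is offered only as a hedged illustration: the factor names and the 2^3 example are assumptions made for the sketch, not output from any particular software product.

  # Minimal sketch (plain Python, no DOE package assumed) of the two-level
  # factorial design matrices that DOE software generates automatically.
  # The factor names and the 2^3 example are illustrative assumptions.
  from itertools import product

  factors = ["time", "temperature", "pressure"]  # hypothetical process factors

  # Full 2^3 factorial: every combination of the coded levels -1 (low) and +1 (high).
  full_factorial = list(product([-1, +1], repeat=len(factors)))
  print(f"full factorial: {len(full_factorial)} runs")

  # Half-fraction 2^(3-1) with generator C = A*B: half the runs, at the cost of
  # confounding the main effect of C with the A*B interaction, which is the
  # economy that makes fractional designs attractive for screening.
  half_fraction = [run for run in full_factorial if run[2] == run[0] * run[1]]
  print(f"half fraction: {len(half_fraction)} runs")
  for run in half_fraction:
      print(dict(zip(factors, run)))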

Design of experiments results, when obtained accurately with DOE software, strengthen the ability to discern truths about the sample populations being tested: see Sampling (statistics). Statisticians[7][8] describe the stronger multifactorial DOE methods as being more “robust”: see Experimental design.

As advances in design of experiments software made it practical to solve the complex equations of factorial statistics, statisticians began in earnest to design experiments in which more than one factor (multifactorial components) is tested at a time. Simply stated, computerized multifactor design of experiments began supplanting one-factor-at-a-time experimentation. Computer software designed specifically for designed experiments became a commercial reality in the 1980s, available from various leading software companies such as the aforementioned Design-Ease,[9] JMP and Minitab.
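
A toy numerical example makes the contrast concrete. In the sketch below, the "true" process and its numbers are entirely hypothetical; it simply shows that an OFAT study anchored at one corner of the design space can misjudge both factors when an interaction is present, while a four-run factorial recovers the main effects and the interaction.

  # Toy illustration of why multifactor designs supplanted OFAT.  The "true"
  # process below is hypothetical; it contains an A*B interaction that a
  # one-factor-at-a-time study is structurally unable to detect.
  def response(a, b):
      """Hypothetical process: mean 10, main-effect coefficients 2 and 3, interaction 4."""
      return 10 + 2 * a + 3 * b + 4 * a * b

  # OFAT: change A with B held low, then change B with A held low.
  ofat_effect_A = response(+1, -1) - response(-1, -1)   # = -4, A looks harmful
  ofat_effect_B = response(-1, +1) - response(-1, -1)   # = -2, B looks harmful

  # 2x2 factorial: all four combinations; each effect is averaged over the
  # levels of the other factor, and the interaction becomes estimable.
  ys = {(a, b): response(a, b) for a in (-1, +1) for b in (-1, +1)}
  effect_A = (ys[(+1, -1)] + ys[(+1, +1)]) / 2 - (ys[(-1, -1)] + ys[(-1, +1)]) / 2      # = +4
  effect_B = (ys[(-1, +1)] + ys[(+1, +1)]) / 2 - (ys[(-1, -1)] + ys[(+1, -1)]) / 2      # = +6
  interaction_AB = ((ys[(+1, +1)] + ys[(-1, -1)]) - (ys[(+1, -1)] + ys[(-1, +1)])) / 2  # = +8
  print(ofat_effect_A, ofat_effect_B, effect_A, effect_B, interaction_AB)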

Notable benefits of design of experiments software include avoiding laborious hand calculations when:

  • Identifying key factors for process or product improvements.
  • Setting up and analyzing general factorial, two-level factorial, fractional factorial (up to 31 variables) and Plackett–Burman designs (up to 31 variables), as illustrated in the sketch following this list.
  • Performing numerical optimizations.
  • Screening for critical factors and their interactions.
  • Analyzing process factors or mixture components.
  • Combining mixture and process variables in designs.
  • Rotating 3D plots to visualize response surfaces.
  • Exploring 2D contours with a computer mouse, setting flags along the way to identify coordinates and predict responses.
  • Precisely locating where all specified requirements are met, using the numerical optimization functions within DOE software.
  • Finding the most desirable factor settings for multiple responses simultaneously.
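
As a rough illustration of the arithmetic these features automate, the following Python sketch estimates the main effects of a coded two-level (2^3) factorial by least squares. It is a minimal sketch only: the design is in standard order, the response values are invented for illustration, and no particular DOE package is assumed.

  # Sketch of the main-effect estimation that factorial DOE software automates.
  # The coded 2^3 design is in standard (Yates) order; the response values are
  # hypothetical numbers used only to make the arithmetic concrete.
  import numpy as np

  # Columns A, B, C in coded units; A varies fastest (standard order).
  X = np.array([[a, b, c] for c in (-1, 1) for b in (-1, 1) for a in (-1, 1)])
  y = np.array([45.0, 71.0, 48.0, 65.0, 68.0, 60.0, 80.0, 65.0])  # made-up yields

  # Fit y ~ b0 + b1*A + b2*B + b3*C by least squares.  With -1/+1 coding, the
  # classical "effect" of a factor equals twice its regression coefficient.
  model = np.column_stack([np.ones(len(y)), X])
  coefs, *_ = np.linalg.lstsq(model, y, rcond=None)

  print(f"intercept (grand mean): {coefs[0]:.2f}")
  for name, b in zip("ABC", coefs[1:]):
      print(f"main effect of {name}: {2 * b:+.2f}")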

Today, factorial DOE software is a notable tool that engineers, scientists, geneticists, biologists, and virtually all other experimenters and creators, ranging from agriculturists to zoologists, rely upon. DOE software is most applicable to controlled, multi-factor experiments in which the experimenter is interested in the effect of some process or intervention on objects such as crops, jet engines, demographics, marketing techniques, materials, adhesives, and so on. Design of experiments software is therefore a valuable tool with broad applications for all natural, engineering, and social sciences.

Notes

  1. ^ Marion Chatfield and Gillian Smith, SmithKline Beecham Pharmaceuticals, "Experiences of Promoting the Use of Design of Experiments in Synthetic Chemistry."
  2. ^ Hald, Anders (1998). A History of Mathematical Statistics. New York: Wiley.
  3. ^ Box, R. A. Fisher, pp. 93–166.[citation needed]
  4. ^ Douglas C. Montgomery and Scott M. Kowalski (2009). Design and Analysis of Experiments, 7th ed. John Wiley & Sons. ISBN 978-0-470-12866-4, pp. 7, 9–11.
  5. ^ Box, G. E. P. and Wilson, K. B. (1951). "On the Experimental Attainment of Optimum Conditions," Journal of the Royal Statistical Society, Series B, 13(1).
  6. ^ Box, Hunter and Hunter (1978). Statistics for Experimenters. John Wiley.
  7. ^ Mark J. Anderson & Patrick J. Whitcomb (2000). DOE Simplified: Practical Tools for Effective Experimentation, 2nd ed. ISBN 1563272253.[page needed]
  8. ^ Mark J. Anderson & Patrick J. Whitcomb (2004). RSM Simplified: Optimizing Processes Using Response Surface Methods for Design of Experiments. Productivity Press. ISBN 1563272970.[page needed]
  9. ^ http://www.statease.com/software.html
