- Optimal Experimental Design Methods
- Advanced Statistical Process Monitoring
- Advanced Statistical Methods and Models
- Statistical Methods and Bayesian Inference
- Probabilistic and Robust Engineering Design
- Reliability and Maintenance Optimization
- Statistical Distribution Estimation and Applications
- Statistical Methods and Inference
- Advanced Multi-Objective Optimization Algorithms
- Risk and Safety Analysis
- Statistical Methods in Clinical Trials
- Software Reliability and Analysis Research
- Fault Detection and Control Systems
- Scientific Measurement and Uncertainty Evaluation
- Manufacturing Process and Optimization
- Nuclear Physics and Applications
- Spectroscopy and Chemometric Analyses
- Industrial Vision Systems and Defect Detection
- Nuclear and Radioactivity Studies
- Radiation Detection and Scintillator Technologies
- Bayesian Methods and Mixture Models
- Radioactive Contamination and Transfer
- Statistics Education and Methodologies
- Pesticide Residue Analysis and Safety
- Radioactivity and Radon Measurements
Los Alamos National Laboratory
2016-2025
Horiba (Japan)
2016
Iowa State University
2010
Japan Broadcasting Corporation (Japan)
2008
John Wiley & Sons (United States)
2006
Bowling Green State University
2004
NTT (Japan)
2002
University of Michigan
1997
Al Ain Hospital
1995-1996
University of Wisconsin–Madison
1996
Traditionally, Plackett-Burman (PB) designs have been used in screening experiments for identifying important main effects. PB designs whose run sizes are not a power of two have been criticized for their complex aliasing patterns, which, according to conventional wisdom, give confusing results. This paper goes beyond the traditional approach by proposing an analysis strategy that entertains interactions in addition to main effects. Based on the precepts of effect sparsity and effect heredity, the proposed procedure exploits these designs' complex aliasing patterns, thereby...
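As a concrete illustration of the designs discussed above, the 12-run Plackett-Burman design can be built by cyclically shifting a known generator row and appending a row of minus ones; the small check at the end confirms the balance and orthogonality that make PB designs useful for screening. This is a generic sketch of the standard construction, not code from the paper.

```python
# Sketch: the 12-run Plackett-Burman design from its standard cyclic
# generator. Row/column ordering conventions vary across references.

def plackett_burman_12():
    gen = [+1, +1, -1, +1, +1, +1, -1, -1, -1, +1, -1]
    rows = [gen[i:] + gen[:i] for i in range(11)]  # 11 cyclic shifts
    rows.append([-1] * 11)                         # final all-minus row
    return rows

design = plackett_burman_12()

# Each column is balanced (six +1s, six -1s) and any two columns are
# orthogonal, the defining properties of the PB design.
for j in range(11):
    assert sum(row[j] for row in design) == 0
for j in range(11):
    for k in range(j + 1, 11):
        assert sum(design[i][j] * design[i][k] for i in range(12)) == 0
```

The complex aliasing the abstract refers to lives in the two-factor-interaction columns, which are partially correlated with every main-effect column rather than fully aliased with a single one.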
Experiments using designs with complex aliasing patterns are often performed: for example, two-level nongeometric Plackett-Burman designs, multilevel and mixed-level fractional factorials, two-level fractional factorials with hard-to-control factors, and supersaturated designs. Hamada and Wu proposed an iterative guided stepwise regression strategy for analyzing the data from such designs that allows the entertainment of interactions. Their strategy provides only a restricted search in a rather large model space, however. This article proposes an efficient methodology...
Statistically designed experiments have been employed extensively to improve product or process quality and to make products and processes robust. In this paper, we consider experiments with correlated multiple responses whose means, variances, and correlations depend on experimental factors. Analysis of these experiments consists of modeling the distributional parameters in terms of the factors and finding factor settings which maximize the probability of being in a specification region, i.e., all responses simultaneously meeting their respective...
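The optimization target described above, the probability that all responses land in the specification region, is easy to estimate by Monte Carlo once the distributional parameters are modeled. A minimal sketch for two correlated normal responses at one candidate factor setting follows; all means, standard deviations, the correlation, and the spec limits are invented stand-ins for fitted models.

```python
import math
import random

# Hypothetical sketch: P(both responses in spec) for one factor setting,
# with a bivariate normal generated via a Cholesky-style construction.

def prob_in_spec(mu1, mu2, sd1, sd2, rho, spec1, spec2, n=100_000, seed=1):
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        z1, z2 = rng.gauss(0, 1), rng.gauss(0, 1)
        y1 = mu1 + sd1 * z1
        y2 = mu2 + sd2 * (rho * z1 + math.sqrt(1 - rho ** 2) * z2)
        if spec1[0] <= y1 <= spec1[1] and spec2[0] <= y2 <= spec2[1]:
            hits += 1
    return hits / n

p = prob_in_spec(10.0, 5.0, 1.0, 0.5, 0.6, (8.0, 12.0), (4.0, 6.0))
```

In the setting of the paper this estimate would be computed at each candidate factor setting and the setting with the largest probability chosen.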
While statistically designed experiments have been employed extensively to improve product or process quality, they have been used infrequently for improving reliability. In this paper, we present a case study in which an experiment was used to improve the reliability (or lifetime) of fluorescent lamps. The effect of three factors, chosen from among many potentially important manufacturing factors, was investigated using a fractional factorial design. For fluorescent lamps, failures occur when their luminosity or light intensity falls below a certain level. An...
Today's manufacturers face increasingly intense global competition. To remain profitable, they are challenged to design, develop, test, and manufacture high-reliability products in ever-shorter product-cycle times and, at the same time, within stringent cost constraints. Design, manufacturing, and reliability engineers have developed an impressive array of tools for producing reliable products. These tools will continue to be important. However, due to changes in the way that new product concepts are being brought to market, there...
Scientific investigations frequently involve data from computer experiment(s) as well as related physical experimental data on the same factors and response variable(s). There may also be one or more expert opinions regarding the response of interest. Traditional statistical approaches consider each of these datasets separately, with corresponding separate analyses and fitted models. A compelling argument can be made that better, more precise models are obtained if the combined data are analyzed simultaneously using a hierarchical Bayesian...
This article shows how a genetic algorithm can be used to find near-optimal Bayesian experimental designs for regression models. The design criterion considered is the expected Shannon information gain of the posterior distribution obtained from performing a given experiment compared with the prior distribution. Genetic algorithms are described and then applied to experimental design. The methodology is illustrated with a wide range of examples: linear and nonlinear regression, single and multiple factors, normal and Bernoulli...
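To make the idea concrete, here is a deliberately tiny genetic algorithm, not the article's algorithm, that searches for a six-point design on [-1, 1] maximizing the simpler D-criterion det(X'X) for the straight-line model y = b0 + b1*x; the expected-information criterion in the article plays the same role as the fitness function below. Grid, population size, and operators are all illustrative choices.

```python
import random

# Hypothetical GA sketch: chromosomes are candidate designs (lists of
# design points); fitness is det(X'X) for the model matrix [1, x].
# The D-optimum here puts all points at the endpoints -1 and +1.

CANDIDATES = [i / 10 for i in range(-10, 11)]    # candidate grid on [-1, 1]
N_POINTS, POP, GENS = 6, 40, 60
rng = random.Random(0)

def fitness(design):
    n, sx = len(design), sum(design)
    sxx = sum(x * x for x in design)
    return n * sxx - sx * sx                      # det(X'X) for [1, x]

def mutate(design):
    d = design[:]
    d[rng.randrange(len(d))] = rng.choice(CANDIDATES)
    return d

def crossover(a, b):
    cut = rng.randrange(1, N_POINTS)
    return a[:cut] + b[cut:]

pop = [[rng.choice(CANDIDATES) for _ in range(N_POINTS)] for _ in range(POP)]
for _ in range(GENS):
    pop.sort(key=fitness, reverse=True)
    elite = pop[: POP // 4]                       # keep the best quarter
    children = []
    while len(elite) + len(children) < POP:
        a, b = rng.sample(elite, 2)
        children.append(mutate(crossover(a, b)))
    pop = elite + children

best = max(pop, key=fitness)
```

The attraction of the GA, as in the article, is that the same loop works unchanged when the criterion (here a closed-form determinant) is replaced by an expensive simulation-based expected information gain.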
The systems that statisticians are asked to assess, such as nuclear weapons, infrastructure networks, supercomputer codes, and munitions, have become increasingly complex. It is often costly to conduct full system tests. As such, we present a review of methodology that has been proposed for addressing system reliability with limited full-system testing. The first set of approaches presented in this paper concerns the combination of multiple sources of information to assess the reliability of a single component. The second general set of approaches addresses multiple levels of data...
The performance of Radio-Isotope Identification (RIID) algorithms using NaI-based γ spectroscopy is increasingly important. For example, sensors at locations that screen for illicit nuclear material rely on isotope identification with NaI detectors to distinguish innocent nuisance alarms, arising from naturally occurring radioactive material, from alarms due to threat isotopes. Recent data collections for RIID testing consist of repeat measurements for each of several measurement scenarios in order to test the algorithms. It is anticipated...
Since the early 1980s, industry has embraced the use of designed experiments as an effective means for improving quality. For quality characteristics that are not normally distributed, the practice of first transforming the data and then analyzing them by standard normal-based methods is well established. There is a natural alternative called generalized linear models (GLMs). This paper explains how GLMs achieve the intended goal of the transformation while at the same time giving a wider class of models that can handle a broad range of applications...
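The GLM alternative described above can be sketched with a toy Poisson regression (log link) fitted by iteratively reweighted least squares; the log-transform-then-OLS fit appears only as the starting value that IRLS then refines. The data are invented for illustration and this is a generic IRLS sketch, not code from the paper.

```python
import math

# Toy Poisson GLM, log link, one covariate, fitted by IRLS.
x = [0, 1, 2, 3, 4, 5, 6, 7]
y = [1, 2, 2, 4, 6, 9, 13, 21]          # counts, roughly exp(0.2 + 0.4 x)

# Start from OLS of log(y + 0.5) on x (the "transform" approach).
n = len(x)
lz = [math.log(yi + 0.5) for yi in y]
xbar, zbar = sum(x) / n, sum(lz) / n
b1 = sum((xi - xbar) * (zi - zbar) for xi, zi in zip(x, lz)) / \
     sum((xi - xbar) ** 2 for xi in x)
b0 = zbar - b1 * xbar

for _ in range(20):                      # IRLS refinement toward the MLE
    eta = [b0 + b1 * xi for xi in x]
    mu = [math.exp(e) for e in eta]
    w = mu                               # Poisson: weight = Var(y) = mu
    z = [e + (yi - mi) / mi for e, yi, mi in zip(eta, y, mu)]
    sw = sum(w)
    swx = sum(wi * xi for wi, xi in zip(w, x))
    swxx = sum(wi * xi * xi for wi, xi in zip(w, x))
    swz = sum(wi * zi for wi, zi in zip(w, z))
    swxz = sum(wi * xi * zi for wi, xi, zi in zip(w, x, z))
    det = sw * swxx - swx * swx          # weighted least squares of z on [1, x]
    b0, b1 = (swxx * swz - swx * swxz) / det, (sw * swxz - swx * swz) / det
```

The payoff of the GLM route is that inference is done on the original count scale with the correct mean-variance relationship, rather than on an ad hoc transformed scale.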
In addition to the number of functional chips on a silicon wafer, spatial patterns of nonfunctional chips can provide important information for improving integrated-circuit fabrication processes. In this article, we consider a binary (functional/nonfunctional) response for each chip and propose a method to detect clustering effects in such processes through factorial experimentation. By using a measure of spatial dependence, process factors (parameters) that influence clustering can be identified. The proposed method, which assumes the wafer...
This tutorial explains statistically designed experiments, which provide a proactive means to improve reliability, as advocated by Genichi Taguchi. That is, through systematic experimentation, the important parameters (factors) affecting reliability can be identified along with the parameter values that yield reliability gains. In addition to improving reliability, Taguchi's robust design can be used to achieve reliability: to make a process or product insensitive to factors that are hard or impossible to control. Robust design is also implemented using designed experiments...
We present a Bayesian model for assessing the reliability of multicomponent systems. Novel features of this model are the natural manner in which lifetime data collected at either the component, subsystem, or system level are integrated with prior information at any level. The model allows pooling between similar components, incorporation of expert opinion, and straightforward handling of censored data. The methodology is illustrated with two examples.
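In the spirit of the component/system integration described above, here is a minimal conjugate sketch: independent Beta priors on two component reliabilities, binomial pass/fail test data, and Monte Carlo draws of the series-system reliability from the posteriors. The priors and data are made up, and the full model in the paper also handles lifetime and system-level data, which this sketch does not.

```python
import random

# Series system of two components; reliability R = p1 * p2.
rng = random.Random(42)

def posterior_draws(prior_a, prior_b, successes, failures, n=20_000):
    # Beta prior + binomial data -> Beta posterior (conjugacy).
    a, b = prior_a + successes, prior_b + failures
    return [rng.betavariate(a, b) for _ in range(n)]

p1 = posterior_draws(2, 1, successes=48, failures=2)   # component 1 tests
p2 = posterior_draws(2, 1, successes=29, failures=1)   # component 2 tests
system = [x * y for x, y in zip(p1, p2)]               # system reliability draws

post_mean = sum(system) / len(system)
lower = sorted(system)[int(0.05 * len(system))]        # 5th percentile bound
```

Propagating full posterior draws, rather than point estimates, through the system structure is what yields honest uncertainty statements about system reliability.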
The Lenth method is an objective method for testing effects from unreplicated factorial designs and eliminates the subjectivity of using a half-normal plot. The Lenth statistics are computed and compared to corresponding critical values. Since the distribution of the Lenth statistic is not mathematically tractable, we propose a simple simulation method to estimate the critical values. Confidence intervals for the estimated critical values can also easily be obtained. Tables are provided for a large number of designs, and their use is demonstrated with data from three experiments. The proposed method can be adapted...
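The simulation idea is simple enough to sketch: under the null, the contrasts from an unreplicated two-level design are i.i.d. normal, so the distribution of Lenth's t-like statistic can be estimated by Monte Carlo and its quantiles used as critical values. This is a simplified illustration (individual-error critical value only); the paper's tables and confidence intervals go further.

```python
import random
import statistics

rng = random.Random(7)

def lenth_pse(contrasts):
    # Lenth's pseudo standard error: trimmed median of |contrasts|.
    s0 = 1.5 * statistics.median(abs(c) for c in contrasts)
    trimmed = [abs(c) for c in contrasts if abs(c) < 2.5 * s0]
    return 1.5 * statistics.median(trimmed)

def simulate_critical_value(m=15, alpha=0.05, reps=5_000):
    # Null distribution: m i.i.d. N(0,1) contrasts; statistic = |c|/PSE.
    stats = []
    for _ in range(reps):
        c = [rng.gauss(0, 1) for _ in range(m)]
        stats.append(abs(c[0]) / lenth_pse(c))
    stats.sort()
    return stats[int((1 - alpha) * reps)]

cv = simulate_critical_value()    # for m = 15 contrasts at alpha = 0.05
```

An effect is then declared active when its |contrast|/PSE exceeds the simulated critical value, replacing the eyeball judgment of the half-normal plot.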
Good measurement systems are an important requirement for a successful quality improvement or statistical process control program. A measurement system is assessed by performing a designed experiment known as a gauge repeatability and reproducibility (R&R) study. Confidence intervals for the parameters which describe the measurement system are part of analyzing the data from an R&R study. In this paper, we show how confidence intervals can easily be obtained using the recently developed generalized inference methodology, calculated by exact numerical integration...
Gauge repeatability and reproducibility (R&R) studies are used to assess the precision of measurement systems. In particular, they quantify the importance of various sources of variability in a measurement system. We take a Bayesian approach to the data analysis and show how to estimate the variance components, and relevant functions of them, using gauge R&R data together with prior information. We then provide worked examples for types of studies common in industrial applications. With each example we provide WinBUGS code to illustrate how easy it is...
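To fix ideas about the variance components these R&R analyses estimate, here is a classical (ANOVA-based) sketch for a balanced crossed study with p parts, o operators, and r replicates; the Bayesian analyses in the papers above place priors over the same components. The data are simulated with known, invented variance components, and this is a generic sketch, not the generalized-inference or WinBUGS code from the papers.

```python
import random

# Balanced crossed gauge R&R: p parts x o operators x r replicates.
rng = random.Random(3)
p, o, r = 10, 3, 3
sd_part, sd_oper, sd_err = 2.0, 0.5, 0.3          # invented true components

part_eff = [rng.gauss(0, sd_part) for _ in range(p)]
oper_eff = [rng.gauss(0, sd_oper) for _ in range(o)]
y = [[[10 + part_eff[i] + oper_eff[j] + rng.gauss(0, sd_err)
       for _ in range(r)] for j in range(o)] for i in range(p)]

grand = sum(sum(sum(cell) for cell in row) for row in y) / (p * o * r)
ybar_ij = [[sum(y[i][j]) / r for j in range(o)] for i in range(p)]
ybar_i = [sum(ybar_ij[i]) / o for i in range(p)]
ybar_j = [sum(ybar_ij[i][j] for i in range(p)) / p for j in range(o)]

# Mean squares for the two-way random-effects ANOVA.
ms_p = o * r * sum((m - grand) ** 2 for m in ybar_i) / (p - 1)
ms_o = p * r * sum((m - grand) ** 2 for m in ybar_j) / (o - 1)
ms_po = r * sum((ybar_ij[i][j] - ybar_i[i] - ybar_j[j] + grand) ** 2
                for i in range(p) for j in range(o)) / ((p - 1) * (o - 1))
ms_e = sum((y[i][j][k] - ybar_ij[i][j]) ** 2
           for i in range(p) for j in range(o) for k in range(r)) / (p * o * (r - 1))

# Method-of-moments variance-component estimates (truncated at zero).
var_repeat = ms_e                                  # repeatability
var_po = max((ms_po - ms_e) / r, 0.0)              # part-by-operator
var_oper = max((ms_o - ms_po) / (p * r), 0.0)      # operator
var_part = max((ms_p - ms_po) / (o * r), 0.0)      # part-to-part
var_repro = var_oper + var_po                      # reproducibility
```

A capable gauge is one where part-to-part variation dominates repeatability plus reproducibility; the Bayesian route replaces these moment estimates with full posteriors, which handle the truncation at zero and small numbers of operators more gracefully.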