Michael S. Hamada

ORCID: 0000-0003-3206-1695
Research Areas
  • Optimal Experimental Design Methods
  • Advanced Statistical Process Monitoring
  • Advanced Statistical Methods and Models
  • Statistical Methods and Bayesian Inference
  • Probabilistic and Robust Engineering Design
  • Reliability and Maintenance Optimization
  • Statistical Distribution Estimation and Applications
  • Statistical Methods and Inference
  • Advanced Multi-Objective Optimization Algorithms
  • Risk and Safety Analysis
  • Statistical Methods in Clinical Trials
  • Software Reliability and Analysis Research
  • Fault Detection and Control Systems
  • Scientific Measurement and Uncertainty Evaluation
  • Manufacturing Process and Optimization
  • Nuclear Physics and Applications
  • Spectroscopy and Chemometric Analyses
  • Industrial Vision Systems and Defect Detection
  • Nuclear and radioactivity studies
  • Radiation Detection and Scintillator Technologies
  • Bayesian Methods and Mixture Models
  • Radioactive contamination and transfer
  • Statistics Education and Methodologies
  • Pesticide Residue Analysis and Safety
  • Radioactivity and Radon Measurements

Los Alamos National Laboratory
2016-2025

Horiba (Japan)
2016

Iowa State University
2010

Japan Broadcasting Corporation (Japan)
2008

John Wiley & Sons (United States)
2006

Bowling Green State University
2004

NTT (Japan)
2002

University of Michigan
1997

Al Ain Hospital
1995-1996

University of Wisconsin–Madison
1996

10.1093/tropej/47.2.126 article EN Journal of Tropical Pediatrics 2001-04-01

Traditionally, Plackett-Burman (PB) designs have been used in screening experiments for identifying important main effects. PB designs whose run sizes are not a power of two have been criticized for their complex aliasing patterns, which, according to conventional wisdom, give confusing results. This paper goes beyond the traditional approach by proposing an analysis strategy that entertains interactions in addition to main effects. Based on the precepts of effect sparsity and effect heredity, the proposed procedure exploits these designs' complex aliasing, thereby...

10.1080/00224065.1992.11979383 article EN Journal of Quality Technology 1992-07-01
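The effect-sparsity and effect-heredity search strategy described above can be sketched in code. This is a minimal illustrative reimplementation, not the authors' published procedure: greedy least-squares forward selection over a 12-run Plackett-Burman design, where a two-factor interaction may enter only if a parent main effect is already in the model (weak heredity). The example response is synthetic.

```python
import itertools
import numpy as np

def forward_select_heredity(X, y, max_terms=4):
    """Greedy forward selection over main effects and two-factor
    interactions; an interaction i*j may enter only if at least one
    parent main effect is already in the model (weak heredity)."""
    n, k = X.shape
    model, inter = [], []
    def rss(cols, ints):
        D = np.hstack([np.ones((n, 1))]
                      + [X[:, [c]] for c in cols]
                      + [(X[:, i] * X[:, j]).reshape(-1, 1) for i, j in ints])
        beta, *_ = np.linalg.lstsq(D, y, rcond=None)
        r = y - D @ beta
        return float(r @ r)
    for _ in range(max_terms):
        best = None
        for c in range(k):                      # candidate main effects
            if c not in model:
                cand = (rss(model + [c], inter), ('main', c))
                best = cand if best is None else min(best, cand)
        for i, j in itertools.combinations(range(k), 2):
            if (i, j) not in inter and ({i, j} & set(model)):
                cand = (rss(model, inter + [(i, j)]), ('int', (i, j)))
                best = cand if best is None else min(best, cand)
        kind, term = best[1]
        (model if kind == 'main' else inter).append(term)
    return sorted(model), sorted(inter)

# 12-run Plackett-Burman design: cyclic shifts of the standard
# generator row plus a final all-minus run
gen = [1, 1, -1, 1, 1, 1, -1, -1, -1, 1, -1]
X = np.array([gen[-i:] + gen[:-i] for i in range(11)] + [[-1] * 11], float)
y = 2 * X[:, 0] + 3 * X[:, 0] * X[:, 2]   # true model: one main effect, one interaction
mains, inters = forward_select_heredity(X, y, max_terms=2)
print(mains, inters)   # → [0] [(0, 2)]
```

Because the interaction column is only partially aliased with the main effects in a PB design, the heredity-restricted search recovers both active terms, which a main-effects-only screen would miss.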

Experiments using designs with complex aliasing patterns are often performed: for example, two-level nongeometric Plackett-Burman designs, multilevel and mixed-level fractional factorial designs, two-level designs with hard-to-control factors, and supersaturated designs. Hamada and Wu proposed an iterative guided stepwise regression strategy for analyzing the data from such experiments that allows the entertainment of interactions. Their strategy provides only a restricted search in a rather large model space, however. This article presents an efficient methodology...

10.2307/1271501 article EN Technometrics 1997-11-01

Statistically designed experiments have been employed extensively to improve product or process quality and to make products and processes robust. In this paper, we consider experiments with correlated multiple responses whose means, variances, and correlations depend on experimental factors. Analysis of these experiments consists of modeling the distributional parameters in terms of the factors and finding factor settings that maximize the probability of being in a specification region, i.e., of all responses simultaneously meeting their respective...

10.1080/00224065.2001.11980104 article EN Journal of Quality Technology 2001-10-01
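The probability-of-conformance criterion described above can be illustrated with a short Monte Carlo sketch. All numbers below are made up for illustration (not the paper's case study): given a fitted multivariate normal model for two correlated responses at one candidate factor setting, estimate the probability that both responses fall inside their specification limits.

```python
import numpy as np

def prob_in_spec(mean, cov, lower, upper, n_draws=200_000, seed=0):
    """Monte Carlo estimate of P(all responses within spec) under a
    fitted multivariate normal model at one factor setting."""
    rng = np.random.default_rng(seed)
    y = rng.multivariate_normal(mean, cov, size=n_draws)
    return float(np.all((y >= lower) & (y <= upper), axis=1).mean())

# two correlated responses at a candidate setting (hypothetical numbers)
p = prob_in_spec(mean=[10.0, 5.0], cov=[[1.0, 0.6], [0.6, 1.0]],
                 lower=[8.0, 3.0], upper=[12.0, 7.0])
```

Optimizing the factor settings then amounts to maximizing this estimated probability over candidate settings, with the mean and covariance each modeled as functions of the factors.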

While statistically designed experiments have been employed extensively to improve product or process quality, they have been used infrequently for improving reliability. In this paper, we present a case study in which an experiment was performed to improve the reliability (or lifetime) of fluorescent lamps. The effect of three factors from among many potentially important manufacturing factors was investigated using a fractional factorial design. For lamps, failures occur when their luminosity or light intensity falls below a certain level. An...

10.1080/00224065.1995.11979618 article EN Journal of Quality Technology 1995-10-01

Today's manufacturers face increasingly intense global competition. To remain profitable, they are challenged to design, develop, test, and manufacture high-reliability products in ever-shorter product-cycle times and, at the same time, within stringent cost constraints. Design, manufacturing, and reliability engineers have developed an impressive array of tools for producing reliable products. These tools will continue to be important. However, due to changes in the way that new product concepts are being brought to market, there...

10.1109/24.387370 article EN IEEE Transactions on Reliability 1995-06-01

Scientific investigations frequently involve data from computer experiment(s) as well as related physical experimental data on the same factors and response variable(s). There may also be one or more expert opinions regarding the response(s) of interest. Traditional statistical approaches consider each of these datasets separately, with corresponding separate analyses and fitted models. A compelling argument can be made that better, more precise models can be obtained if the combined data are analyzed simultaneously using a hierarchical Bayesian...

10.1198/004017004000000211 article EN Technometrics 2004-04-06

This article shows how a genetic algorithm can be used to find near-optimal Bayesian experimental designs for regression models. The design criterion considered is the expected Shannon information gain of the posterior distribution obtained from performing a given experiment compared with the prior distribution. Genetic algorithms are described and then applied to experimental design. The methodology is illustrated with a wide range of examples: linear and nonlinear regression, single and multiple factors, and normal and Bernoulli...

10.1198/000313001317098121 article EN The American Statistician 2001-08-01
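A minimal sketch of the genetic-algorithm idea above, under simplifying assumptions that are mine rather than the article's: for a normal linear model with a flat prior, the expected Shannon information gain of an n-run design is an increasing function of log det(F'F), where F is the model matrix, so that quantity serves as the fitness here. The quadratic single-factor model, grid, and GA settings are all hypothetical.

```python
import numpy as np

def model_matrix(x):
    """Quadratic regression in a single factor: 1, x, x^2."""
    return np.column_stack([np.ones_like(x), x, x ** 2])

def fitness(x):
    """log det of the information matrix; for a normal linear model the
    expected Shannon information gain is increasing in this quantity."""
    sign, logdet = np.linalg.slogdet(model_matrix(x).T @ model_matrix(x))
    return logdet if sign > 0 else -np.inf

def ga_design(n_runs=6, pop=40, gens=60, seed=1):
    rng = np.random.default_rng(seed)
    grid = np.linspace(-1.0, 1.0, 21)        # candidate design points
    P = rng.choice(grid, size=(pop, n_runs)) # initial random designs
    for _ in range(gens):
        fit = np.array([fitness(x) for x in P])
        P = P[np.argsort(fit)[::-1]]         # elitism: keep the top half
        children = []
        for _ in range(pop // 2):
            a, b = P[rng.integers(0, pop // 2, size=2)]
            cut = rng.integers(1, n_runs)    # one-point crossover
            child = np.concatenate([a[:cut], b[cut:]])
            mut = rng.random(n_runs) < 0.1   # point mutation
            child[mut] = rng.choice(grid, size=int(mut.sum()))
            children.append(child)
        P = np.vstack([P[: pop // 2]] + children)
    fit = np.array([fitness(x) for x in P])
    return np.sort(P[np.argmax(fit)]), float(np.max(fit))

best, bf = ga_design()
print(best)   # runs tend to concentrate near -1, 0, 1 (the D-optimal support)
```

For this model the exact-design optimum places two runs at each of -1, 0, and 1 (log det = log 32), which gives a convenient upper bound for checking how close the GA gets.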

The systems that statisticians are asked to assess, such as nuclear weapons, infrastructure networks, supercomputer codes, and munitions, have become increasingly complex. It is often too costly to conduct full system tests. As such, we present a review of methodology that has been proposed for addressing system reliability with limited full-system testing. The first approaches presented in this paper are concerned with the combination of multiple sources of information to assess the reliability of a single component. The second general set of approaches addresses multiple levels of data...

10.1214/088342306000000439 article EN Statistical Science 2006-11-01

The performance of Radio-Isotope Identification (RIID) algorithms using NaI-based γ spectroscopy is increasingly important. For example, sensors at locations that screen for illicit nuclear material rely on isotope identification by NaI detectors to distinguish innocent nuisance alarms, arising from naturally occurring radioactive material, from alarms due to threat isotopes. Recent data collections for RIID testing consist of repeat measurements of each of several measurement scenarios to test the algorithms. It is anticipated...

10.3390/a2010339 article EN cc-by Algorithms 2009-03-03

Since the early 1980s, industry has embraced the use of designed experiments as an effective means for improving quality. For quality characteristics that are not normally distributed, the practice of first transforming the data and then analyzing them by standard normal-based methods is well established. There is a natural alternative called generalized linear models (GLMs). This paper explains how GLMs achieve the intended goal of transformation while at the same time giving a wider class of models that can handle a broader range of applications...

10.1080/00224065.1997.11979770 article EN Journal of Quality Technology 1997-07-01
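The GLM alternative described above can be sketched for count data. This is a generic illustration, not the paper's examples: Poisson regression with a log link models the counts directly, where the transform-then-OLS habit would instead regress, say, sqrt(y) or log(y + c) on the factors. The fit below uses iteratively reweighted least squares in plain NumPy, with made-up simulated data.

```python
import numpy as np

def poisson_irls(X, y, n_iter=25):
    """Poisson regression with a log link, E[y] = exp(X beta),
    fitted by iteratively reweighted least squares (Fisher scoring)."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        mu = np.exp(X @ beta)
        z = X @ beta + (y - mu) / mu     # working response
        WX = X * mu[:, None]             # Poisson working weights are mu
        beta = np.linalg.solve(X.T @ WX, WX.T @ z)
    return beta

# simulated defect counts with a log-linear mean (hypothetical parameters)
rng = np.random.default_rng(7)
x = rng.uniform(-1.0, 1.0, 400)
X = np.column_stack([np.ones(400), x])
y = rng.poisson(np.exp(0.5 + x))
beta = poisson_irls(X, y)   # estimates near the true (0.5, 1.0)
```

The GLM keeps the analysis on the original count scale, so factor effects and predictions need no back-transformation.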

In addition to the number of functional chips on a silicon wafer, the spatial patterns of nonfunctional chips can provide important information for improving integrated-circuit fabrication processes. In this article, we consider the binary (functional/nonfunctional) response of each chip and propose a method to detect spatial effects in such processes through factorial experimentation. By using a measure of dependence, process factors (parameters) that influence clustering can be identified. The proposed method, which assumes the wafer...

10.1080/00401706.1993.10485037 article EN Technometrics 1993-05-01

This tutorial explains statistically designed experiments, which provide a proactive means to improve reliability, as advocated by Genichi Taguchi. That is, through systematic experimentation, the important parameters (factors) affecting reliability can be identified, along with parameter values that yield reliability gains. In addition to improving reliability, Taguchi's robust design can be used to achieve robust reliability: to make a process or product insensitive to factors that are hard or impossible to control. Robust design is also implemented using designed experiments...

10.1109/24.387372 article EN IEEE Transactions on Reliability 1995-06-01

We present a Bayesian model for assessing the reliability of multicomponent systems. Novel features of this model are the natural manner in which lifetime data collected at either the component, subsystem, or system level are integrated with prior information at any level. The model allows pooling between similar components, the incorporation of expert opinion, and straightforward handling of censored data. The methodology is illustrated with two examples.

10.1080/00224065.2011.11917851 article EN Journal of Quality Technology 2011-04-01
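A stripped-down sketch of the component-to-system idea above, far simpler than the paper's full model: a two-component series system with independent pass/fail data at the component level, conjugate Beta priors, and Monte Carlo propagation of the posterior draws up to system reliability. The pass/fail counts are made up.

```python
import numpy as np

def system_reliability_posterior(comp_data, a0=1.0, b0=1.0,
                                 n_draws=100_000, seed=0):
    """Posterior draws of series-system reliability from independent
    pass/fail data per component, with conjugate Beta(a0, b0) priors.
    comp_data is a list of (successes, trials) pairs, one per component."""
    rng = np.random.default_rng(seed)
    draws = np.ones(n_draws)
    for s, n in comp_data:
        # Beta(a0 + s, b0 + n - s) posterior for this component's reliability
        draws *= rng.beta(a0 + s, b0 + n - s, size=n_draws)
    return draws

# 48/50 passes on component 1, 29/30 on component 2 (hypothetical data)
draws = system_reliability_posterior([(48, 50), (29, 30)])
lo, hi = np.quantile(draws, [0.05, 0.95])   # 90% credible interval for R_sys
```

System-level test data, expert opinion, and censoring, which the paper handles, would replace the simple conjugate updates here with a full hierarchical posterior.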

The Lenth method is an objective method for testing effects from unreplicated factorial designs and eliminates the subjectivity of using a half-normal plot. The Lenth statistics are computed and compared to corresponding critical values. Since the distribution of the Lenth statistic is not mathematically tractable, we propose a simple simulation method to estimate the critical values. Confidence intervals for the estimated values can also easily be obtained. Tables are provided for a large number of designs, and their use is demonstrated with data from three experiments. The proposed method is adapted...

10.1080/00224065.2000.11979971 article EN Journal of Quality Technology 2000-01-01
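The simulation idea described above can be sketched directly; this is an illustrative reimplementation in the spirit of the paper, not its tables. Lenth's pseudo standard error (PSE) is computed from the effect contrasts, and critical values for the max |contrast|/PSE statistic are estimated by simulating null normal contrasts.

```python
import numpy as np

def pse(contrasts):
    """Lenth's pseudo standard error of a set of effect contrasts."""
    c = np.abs(contrasts)
    s0 = 1.5 * np.median(c)                 # initial scale estimate
    return 1.5 * np.median(c[c < 2.5 * s0]) # trimmed re-estimate

def lenth_critical_value(n_effects, alpha=0.05, n_sim=10_000, seed=0):
    """Simulate the null distribution of max_i |c_i| / PSE for n_effects
    independent normal contrasts and return its upper-alpha quantile
    (an experimentwise critical value)."""
    rng = np.random.default_rng(seed)
    stats = np.empty(n_sim)
    for s in range(n_sim):
        c = rng.standard_normal(n_effects)
        stats[s] = np.max(np.abs(c)) / pse(c)
    return float(np.quantile(stats, 1 - alpha))

print(pse(np.array([0.5, -0.4, 0.3, 8.0])))   # ≈ 0.6: the 8.0 is trimmed out
cv = lenth_critical_value(15)                 # e.g., a 2^4 design has 15 effects
```

An effect is then declared active when its |contrast|/PSE exceeds the simulated critical value; individual-error-rate versions use the per-effect distribution instead of the maximum.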

Good measurement systems are an important requirement for a successful quality improvement or statistical process control program. A measurement system is assessed by performing a designed experiment known as a gauge repeatability and reproducibility (R&R) study. Confidence intervals for the parameters which describe the measurement system are part of analyzing the data from an R&R study. In this paper, we show how confidence intervals can easily be obtained using the recently developed generalized inference methodology, calculated by exact numerical integration...

10.1080/00224065.2000.11980000 article EN Journal of Quality Technology 2000-07-01

Gauge repeatability and reproducibility (R&R) studies are used to assess the precision of measurement systems. In particular, they quantify the importance of the various sources of variability in a measurement system. We take a Bayesian approach to the data analysis and show how to estimate the variance components, and relevant functions of these components, using gauge R&R data together with prior information. We then provide worked examples for types of studies common in industrial applications. With each example we include WinBUGS code to illustrate how easy it is...

10.1080/08982112.2012.702381 article EN Quality Engineering 2012-09-24
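The variance decomposition that a gauge R&R study targets can be sketched with classical ANOVA method-of-moments estimates; note this plain-NumPy stand-in is not the paper's Bayesian/WinBUGS analysis, which would add priors and full posterior uncertainty. The balanced crossed layout (parts × operators × replicates) and the simulated variances are hypothetical.

```python
import numpy as np

def gauge_rr(y):
    """Method-of-moments variance components for a balanced crossed
    gauge R&R study; y has shape (parts, operators, replicates)."""
    p, o, r = y.shape
    grand = y.mean()
    part_m = y.mean(axis=(1, 2))
    oper_m = y.mean(axis=(0, 2))
    cell_m = y.mean(axis=2)
    msp = o * r * np.sum((part_m - grand) ** 2) / (p - 1)
    mso = p * r * np.sum((oper_m - grand) ** 2) / (o - 1)
    mspo = (r * np.sum((cell_m - part_m[:, None] - oper_m[None, :] + grand) ** 2)
            / ((p - 1) * (o - 1)))
    mse = np.sum((y - cell_m[:, :, None]) ** 2) / (p * o * (r - 1))
    return {"part": max((msp - mspo) / (o * r), 0.0),
            "operator": max((mso - mspo) / (p * r), 0.0),
            "part_x_operator": max((mspo - mse) / r, 0.0),
            "repeatability": mse}

# simulated study: 20 parts, 3 operators, 3 replicates (made-up variances)
rng = np.random.default_rng(3)
y = (rng.normal(0.0, 2.0, (20, 1, 1))     # part-to-part, sd 2.0
     + rng.normal(0.0, 0.7, (1, 3, 1))    # operator (reproducibility), sd 0.7
     + rng.normal(0.0, 0.3, (20, 3, 3)))  # repeatability, sd 0.3
vc = gauge_rr(y)
```

One motivation for the Bayesian route the paper takes is visible even here: the moment estimators can go negative (truncated to zero above) and give no honest interval for the operator component when only a few operators are observed.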

10.1080/08982112.2025.2460033 article Quality Engineering 2025-02-11
Coming Soon ...