- Statistical Distribution Estimation and Applications
- Reliability and Maintenance Optimization
- Software Reliability and Analysis Research
- Probabilistic and Robust Engineering Design
- Statistical Methods and Inference
- Software Engineering Research
- Risk and Safety Analysis
- Bayesian Methods and Mixture Models
- Bayesian Modeling and Causal Inference
- Statistical Methods and Bayesian Inference
- Advanced Statistical Process Monitoring
- Advanced Statistical Methods and Models
- Fault Detection and Control Systems
- Forecasting Techniques and Applications
- Insurance, Mortality, Demography, Risk Management
- Optimal Experimental Design Methods
- Probability and Risk Models
- Multi-Criteria Decision Making
- Financial Risk and Volatility Modeling
- Software Testing and Debugging Techniques
- Neural Networks and Applications
- Complex Systems and Decision Making
- Manufacturing Process and Optimization
- Target Tracking and Data Fusion in Sensor Networks
- Control Systems and Identification
George Washington University
2006-2020
City University of Hong Kong
2013-2019
Defense Advanced Research Projects Agency
2009
Office of Naval Research
2006
Decision Sciences (United States)
2006
United States Army Research Office
1978-2006
Umeå University
2002
Nuffield Foundation
2002
Carnegie Mellon University
1991-1998
National Institute of Standards and Technology
1996
Methods for Statistical Analysis of Reliability and Life Data
Abstract This is an expository article. Here we show how the successfully used Kalman filter, popular with control engineers and other scientists, can be easily understood by statisticians if we use a Bayesian formulation and some well-known results in multivariate statistics. We also give a simple example illustrating the use of the filter for quality control work.
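A minimal sketch of the conjugate normal update that this Bayesian reading rests on (the scalar model, notation, and numbers are ours, not the article's):

```python
import numpy as np

def kalman_update(m, C, y, F, G, V, W):
    """One Bayesian Kalman step for the scalar model
    y_t = F*theta_t + v_t,  theta_t = G*theta_{t-1} + w_t,
    with v_t ~ N(0, V) and w_t ~ N(0, W)."""
    a = G * m                 # prior mean for theta_t
    R = G * C * G + W         # prior variance for theta_t
    f = F * a                 # one-step forecast of y_t
    Q = F * R * F + V         # forecast variance
    K = R * F / Q             # Kalman gain = posterior regression coefficient
    return a + K * (y - f), R - K * K * Q   # posterior mean and variance

# Example: track a slowly drifting level from noisy readings.
m, C = 0.0, 1.0
for y in [1.1, 0.9, 1.4, 1.2]:
    m, C = kalman_update(m, C, y, F=1.0, G=1.0, V=0.25, W=0.01)
print(m, C)
```

The update is exactly a normal prior-to-posterior calculation, which is the point of the article.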
This expository paper is an overview of a relatively new class of failure models, both univariate and multivariate, that are suitable for describing the lifelengths of items that operate in dynamic environments. Many currently used models were developed under the premise that the operating environment is static; these turn out to be special cases of the models overviewed here. The models are derived by describing the underlying failure-causing mechanisms, such as degradation and wear, using stochastic processes; this mathematical theme drives their development...
Summary Probability models and statistical methods are a popular technique for evaluating the reliability of computer software. This paper reviews the literature concerning these methods, with an emphasis on historical perspective. The use of stochastic techniques is justified, and the various probability models that have been proposed, along with any associated estimation and inference procedures, are described. Examples applied to real software failure data are given. A classic development problem, namely how long software should be tested...
In assessing the reliability of a system of components, it is usual to suppose that the components fail independently of each other. Often this is inappropriate, because a common environment acting on all the components induces correlation. For example, a harsh environment will encourage the early failure of all components. A simple model that incorporates such dependencies is described, and several of its properties are investigated. Calculations are carried out for a parallel system of two components, and inequalities for multicomponent systems are suggested. The results generalize easily.
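A Monte Carlo sketch of this kind of shared-environment dependence, assuming (our choice, for illustration) a gamma-distributed environment that scales exponential failure rates:

```python
import numpy as np

rng = np.random.default_rng(1)

def parallel_reliability(t, lam1, lam2, a, b, n=200_000):
    """P(max(T1, T2) > t) when, given a shared environment
    eta ~ Gamma(a, scale=1/b), the lifetimes are conditionally
    independent exponentials with rates eta*lam1 and eta*lam2."""
    eta = rng.gamma(a, 1.0 / b, size=n)        # common environment
    t1 = rng.exponential(1.0 / (eta * lam1))   # correlated unconditionally
    t2 = rng.exponential(1.0 / (eta * lam2))
    return np.mean(np.maximum(t1, t2) > t)

print(parallel_reliability(2.0, 1.0, 1.0, a=2.0, b=2.0))
```

Under these assumptions the answer is also available in closed form, P(max(T1, T2) > t) = (b/(b+lam1*t))^a + (b/(b+lam2*t))^a - (b/(b+(lam1+lam2)*t))^a, which the simulation can be checked against.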
In this paper we show how several models used to describe the reliability of computer software can be comprehensively viewed by adopting a Bayesian point of view. We first provide an alternative motivation for a commonly used model, the Jelinski–Moranda model, using notions from shock models. We then show that some alternate models proposed in the literature can be derived by assigning specific prior distributions to the parameters of the above model. We also obtain other structural results, such as stochastic inequalities and association, and discuss how these can be interpreted.
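For reference, a short simulation of the Jelinski–Moranda model that the paper starts from (parameter values are ours): each fix removes one of N residual bugs, so the failure rate after i-1 fixes is phi*(N - i + 1).

```python
import numpy as np

rng = np.random.default_rng(7)

def jelinski_moranda_times(N, phi):
    """Inter-failure times under Jelinski-Moranda: the i-th time is
    exponential with rate phi*(N - i + 1), for i = 1, ..., N."""
    return np.array([rng.exponential(1.0 / (phi * (N - i)))
                     for i in range(N)])

times = jelinski_moranda_times(N=30, phi=0.05)
print(times.round(2))   # gaps lengthen as bugs are removed
```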
Abstract The notion of fuzzy sets has proven useful in the context of control theory, pattern recognition, and medical diagnosis. However, it has also spawned the view that classical probability theory is unable to deal with uncertainties in natural language and machine learning, so that alternatives are needed. One such alternative is what is known as “possibility theory.” Such alternatives have come into being because past attempts at making fuzzy set theory and probability theory work in concert have been unsuccessful. The purpose of this article is to develop a line of argument...
Abstract Kalman filter models based on the assumption of multivariate Gaussian distributions are known to be nonrobust. This means that when a large discrepancy arises between the prior distribution and the observed data, the posterior distribution becomes an unrealistic compromise between the two. In this article we discuss the rationale for, and a method of, robustifying the Kalman filter. Specifically, we develop a model wherein the filter will revert to the prior when extreme outlying observations are encountered, and point out that this can be achieved by assuming distributions with Student-t marginals. To achieve fully...
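The reversion-to-prior behavior is easy to see numerically. A sketch under simplified assumptions (scalar state on a grid, our parameter choices), comparing a Gaussian likelihood with a Student-t one:

```python
import numpy as np
from scipy import stats

theta = np.linspace(-10, 30, 4001)        # grid for the scalar state
prior = stats.norm(0, 1).pdf(theta)       # N(0, 1) prior

def posterior_mean(y, like):
    w = prior * like.pdf(y - theta)       # unnormalized posterior on the grid
    return np.sum(theta * w) / np.sum(w)

for y in [2.0, 20.0]:                     # moderate datum vs. extreme outlier
    print(y,
          posterior_mean(y, stats.norm(0, 1)),  # Gaussian: splits the difference
          posterior_mean(y, stats.t(df=3)))     # heavy tails: reverts to prior
```

With the Gaussian likelihood the posterior mean for y = 20 is the unrealistic compromise 10; with t3 tails it stays near the prior mean of 0.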
Abstract In this article we introduce the problem of computer software reliability and discuss a probabilistic model for describing the failure of software. We suggest a procedure for estimating the model's parameters, propose a stopping rule for debugging, and apply our procedures to some published data on software failures. Key Words: Stopping rule; Debugging; Burn-in; Software reliability; Testing
Most of the familiar time-to-failure distributions used today are derived from hazard functions whose parameters are assumed constant. An unconditional distribution is derived here by assuming that a parameter of a classical failure distribution (viz., the exponential and the Weibull) is a random variable with a known distribution. With the use of compound and Bayesian techniques, it is possible to join test data with prior information and arrive at a combined, possibly superior, estimate of reliability. The prior distributions considered are the two-point, the uniform, and the gamma. Conceptually,...
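A worked instance of this compounding for the gamma case (our notation): take an exponential lifetime with rate \lambda and let \lambda itself be gamma distributed.

```latex
% T | lambda ~ Exp(lambda),  lambda ~ Gamma(alpha, beta).
% Integrating lambda out gives the unconditional survival function:
\bar{F}(t) \;=\; \int_0^\infty e^{-\lambda t}\,
  \frac{\beta^{\alpha}\lambda^{\alpha-1}e^{-\beta\lambda}}{\Gamma(\alpha)}\,d\lambda
  \;=\; \Bigl(\frac{\beta}{\beta+t}\Bigr)^{\alpha}.
```

This is a Pareto-type (Lomax) distribution, and by conjugacy n observed lifetimes t_1, ..., t_n simply update (alpha, beta) to (alpha + n, beta + t_1 + ... + t_n), which is how the test data and the prior information are joined.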
Much of the literature in reliability and survival analysis considers failure models indexed by a single scale. There are situations which require that failure be described by several scales. An example from reliability is items under warranty whose failures are recorded by time and amount of use. The death of a mine worker may be noted by age and duration of exposure to dust. This paper proposes an approach for developing probabilistic failure models indexed by two scales: time, and usage, a quantity related to time. The relationship between the scales is captured by an additive hazards model. The evolution of usage...
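A sketch of the kind of two-scale relationship the abstract refers to, in our own notation (the paper's specific formulation may differ):

```latex
% Let U(t) be accumulated usage by time t (e.g., mileage), modeled as a
% stochastic process. An additive hazards model ties the two scales together:
r(t \mid U) \;=\; h_0(t) + \eta\, U(t),
\qquad
\bar{F}(t \mid U) \;=\; \exp\!\Bigl(-\int_0^t \bigl[h_0(s) + \eta\,U(s)\bigr]\,ds\Bigr),
% with the unconditional survival function obtained by averaging over
% the probability law of the usage process U.
```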
Manufacturers of consumer products, such as automobiles, usually offer warranties guaranteeing the product, or its parts, for example, for five years or 50,000 miles, whichever comes first. There are at least two issues of interest to applied mathematicians and statisticians that arise from warranty considerations. The first is the specification of an optimum price-warranty combination; the second is forecasting a reserve fund to meet claims against the product. The former involves consideration of the item’s reliability, its rate of usage,...
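A Monte Carlo sketch of the "whichever comes first" coverage calculation that a reserve-fund forecast would build on. All the distributions below are illustrative assumptions of ours, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(3)

def warranty_claim_prob(t_max=5.0, m_max=50_000.0, n=100_000):
    """Fraction of units failing inside a '5 years / 50,000 miles,
    whichever comes first' warranty, with buyer-specific usage rates."""
    miles_per_year = rng.gamma(4.0, 2_500.0, size=n)   # heterogeneous usage
    t_fail = rng.exponential(8.0, size=n)              # failure time, in years
    coverage_end = np.minimum(t_max, m_max / miles_per_year)
    return np.mean(t_fail < coverage_end)

print(warranty_claim_prob())   # expected claim rate per unit sold
```

Multiplying the claim probability by units sold and average cost per claim gives a first-cut reserve figure.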
In this paper we motivate a random coefficient autoregressive process of order 1 for describing reliability growth or decay. We introduce several ramifications of this process, some of which reduce it to a Kalman filter model. We illustrate the usefulness of our approach by applying these processes to real-life data on software failures. Finally, we make a pairwise comparison of the models in terms of the ratio of the likelihoods of their predictive distributions, and identify the "best" one.
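A minimal simulation of a random coefficient AR(1) process (parameter values are ours; a mean coefficient above 1 produces growth, below 1 decay):

```python
import numpy as np

rng = np.random.default_rng(11)

def rca1(n, theta_mean=1.05, theta_sd=0.05, noise_sd=0.1, x0=1.0):
    """X_t = theta_t * X_{t-1} + eps_t, with theta_t ~ N(theta_mean,
    theta_sd^2) drawn independently at each step: the random
    coefficient is what distinguishes this from an ordinary AR(1)."""
    x, prev = np.empty(n), x0
    for t in range(n):
        prev = rng.normal(theta_mean, theta_sd) * prev + rng.normal(0, noise_sd)
        x[t] = prev
    return x

print(rca1(10).round(2))   # e.g., successive times between failures
```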
Our objective in writing this paper is to point out some interesting relationships between the commonly referred to indices of economic inequality and a central notion used in reliability theory. Specifically, we show that the “Lorenz curve” and the “Gini index” of economics are related to the “total time on test” transform and the “mean residual life” of reliability theory. An advantage of observing these relationships is that we are now able to consolidate knowledge that has independently evolved in two apparently diverse areas and, furthermore, to examine a recently proposed test for exponentiality based on the Gini...
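The two empirical objects are simple to compute side by side (our implementation, using standard estimators):

```python
import numpy as np

def gini(x):
    """Empirical Gini index via the identity
    G = sum_i (2i - n - 1) * x_(i) / (n^2 * mean(x))."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    return np.sum((2 * np.arange(1, n + 1) - n - 1) * x) / (n * n * x.mean())

def scaled_ttt(x):
    """Scaled total time on test at each order statistic: cumulative
    sum of (n - j + 1) * (x_(j) - x_(j-1)), divided by sum(x)."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    spacings = np.diff(np.concatenate(([0.0], x)))
    return np.cumsum((n - np.arange(n)) * spacings) / x.sum()

rng = np.random.default_rng(5)
sample = rng.exponential(1.0, 10_000)
print(gini(sample))            # close to 0.5, the exponential's Gini index
print(scaled_ttt(sample)[:5])  # near the diagonal for exponential data
```

For the exponential distribution the scaled TTT transform is the diagonal and the Gini index is 1/2, which is what makes a Gini-based test for exponentiality natural.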
Abstract Percentage points are given for the Kolmogorov-Smirnov statistics D+, D-, and D, and for the Kuiper statistic V, for testing fit to the extreme-value distribution with unknown parameters. The tables may also be used for tests of fit to the Weibull distribution.
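Computing the statistics themselves is straightforward; it is the percentage points that require tables like the paper's, because the parameters are estimated from the data, so the standard Kolmogorov-Smirnov critical values do not apply. A sketch, assuming the smallest-extreme-value (Gumbel) form and maximum likelihood fitting:

```python
import numpy as np
from scipy import stats

def ev_fit_statistics(x):
    """D+, D-, D, and Kuiper's V for fit to the smallest-extreme-value
    distribution with parameters estimated from the sample. For a
    Weibull sample, apply to log(x): the log of a Weibull variate has
    an extreme-value distribution."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    loc, scale = stats.gumbel_l.fit(x)       # MLE of location and scale
    z = stats.gumbel_l.cdf(x, loc, scale)    # fitted probability transform
    i = np.arange(1, n + 1)
    d_plus = np.max(i / n - z)
    d_minus = np.max(z - (i - 1) / n)
    return d_plus, d_minus, max(d_plus, d_minus), d_plus + d_minus

rng = np.random.default_rng(2)
weibull_sample = 3.0 * rng.weibull(2.0, 200)
print(ev_fit_statistics(np.log(weibull_sample)))  # V = D+ + D-
```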