- Statistical Methods and Inference
- Advanced Statistical Methods and Models
- Advanced Statistical Process Monitoring
- Statistical Methods and Bayesian Inference
- Advanced Proteomics Techniques and Applications
- Metabolomics and Mass Spectrometry Studies
- Mass Spectrometry Techniques and Applications
- Probabilistic and Robust Engineering Design
- Optimal Experimental Design Methods
- Scientific Measurement and Uncertainty Evaluation
- Fluid Dynamics and Vibration Analysis
- Fluid Dynamics and Turbulent Flows
- Statistical Distribution Estimation and Applications
- Bayesian Methods and Mixture Models
- Model Reduction and Neural Networks
- Pesticide Residue Analysis and Safety
- Statistical Methods in Clinical Trials
- Statistics Education and Methodologies
- Probability and Risk Models
- Advanced Multi-Objective Optimization Algorithms
- Data Analysis with R
- Transgenic Plants and Applications
- Financial Risk and Volatility Modeling
- Spatial and Panel Data Analysis
- Software Reliability and Analysis Research
Memorial University of Newfoundland
2010-2023
St. John's University
2020
Texas A&M University
2009
National Institute of Standards and Technology
2009
New York University
2009
NCCOS Hollings Marine Laboratory
2009
Vanderbilt University Medical Center
2009
University of Arizona
2009
Broad Institute
2009
University of British Columbia
2008
The complexity of proteomic instrumentation for LC-MS/MS introduces many possible sources of variability. Data-dependent sampling of peptides constitutes a stochastic element at the heart of discovery proteomics. Although this variation impacts the identification of peptides, identifications are far from completely random. In this study, we analyzed interlaboratory data sets from the NCI Clinical Proteomic Technology Assessment for Cancer to examine repeatability and reproducibility in peptide and protein identifications....
Computing a profile empirical likelihood function, which involves constrained maximization, is a key step in applications of empirical likelihood. However, in some situations, the required numerical problem has no solution. In this case, the convention is to assign a zero value to the profile empirical likelihood. This strategy has at least two limitations. First, it is numerically difficult to determine that there is no solution; second, no information is provided on the relative plausibility of the parameter values where the likelihood is set to zero. In this article, we propose a novel adjustment that retains all...
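A minimal sketch of the adjustment idea for a scalar mean, assuming the common pseudo-observation form g_{n+1} = -(a_n/n) * sum(g_i) with a_n = max(1, log(n)/2); the function name and tuning constant are illustrative, not necessarily the paper's exact formulation:

```python
import numpy as np
from scipy.optimize import brentq

def ael_log_ratio(x, mu, a_n=None):
    """-2 log adjusted empirical likelihood ratio for a scalar mean.

    One pseudo-observation is appended so the inner constrained optimization
    always has a solution; a_n = max(1, log(n)/2) is one common choice,
    assumed here for illustration.
    """
    g = x - mu                            # estimating function for the mean
    n = len(g)
    if a_n is None:
        a_n = max(1.0, np.log(n) / 2.0)
    g = np.append(g, -a_n * g.mean())     # pseudo-observation; g now spans zero
    # Solve sum_i g_i / (1 + lam * g_i) = 0 on the feasible interval for lam
    lam = brentq(lambda l: np.sum(g / (1.0 + l * g)),
                 -1.0 / g.max() + 1e-10, -1.0 / g.min() - 1e-10)
    return 2.0 * np.sum(np.log1p(lam * g))

rng = np.random.default_rng(1)
x = rng.normal(loc=2.0, scale=1.0, size=50)
print(ael_log_ratio(x, mu=2.0))   # approximately chi-square(1) under the null
```

Because the pseudo-observation always sits on the opposite side of zero from the sample mean of the estimating function, the root-finding step is well defined even for parameter values outside the convex hull of the data, which is exactly where the conventional zero assignment loses information.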
A major unmet need in LC-MS/MS-based proteomics analyses is a set of tools for quantitative assessment of system performance and evaluation of technical variability. Here we describe 46 metrics for monitoring chromatographic performance, electrospray source stability, MS1 and MS2 signals, dynamic sampling of ions for MS/MS, and peptide identification. Applied to data sets from replicate LC-MS/MS analyses, these metrics displayed consistent, reasonable responses to controlled perturbations. The metrics typically showed variations of less than...
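A hedged illustration of the kind of run-level summaries such metrics produce; the column values and the three metric definitions below are invented stand-ins for illustration, not the actual 46 metrics:

```python
import numpy as np

# Hypothetical per-peptide records from one LC-MS/MS run: retention-time peak
# widths (s), MS1 precursor intensities, and an "identified" flag. All values
# and metric definitions here are illustrative stand-ins.
peak_width_s  = np.array([18.2, 22.5, 19.9, 30.1, 21.4, 25.0])
ms1_intensity = np.array([2.1e6, 8.7e5, 3.3e6, 1.2e6, 5.5e6, 9.0e5])
identified    = np.array([True, True, False, True, True, False])

metrics = {
    "median_peak_width_s": float(np.median(peak_width_s)),        # chromatography
    "ms1_intensity_iqr_log10": float(np.subtract(*np.percentile(
        np.log10(ms1_intensity), [75, 25]))),                     # signal spread
    "msms_id_rate": float(identified.mean()),                     # identification
}
print(metrics)
```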
Optimal performance of LC-MS/MS platforms is critical to generating high-quality proteomics data. Although individual laboratories have developed quality control samples, there is no widely available standard of biological complexity (and associated reference data sets) for benchmarking platform performance in the analysis of complex proteomes across different laboratories in the community. Individual preparations of the yeast Saccharomyces cerevisiae proteome have been used extensively by the community to characterize LC-MS performance. The uniquely...
To monitor a multivariate process, the classical Hotelling's T2 control chart is often used. However, it is well known that such charts are very sensitive to the presence of outlying observations in the historical Phase I data used to set the control limit. In this paper, we propose a robust T2-type control chart for individual observations based on the highly robust and efficient reweighted minimum covariance determinant (RMCD) estimators of the mean vector and covariance matrix. We illustrate how to set the limit of the proposed chart, study its performance using simulations,...
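A sketch of a robust T2-type statistic using scikit-learn's MinCovDet (which applies a reweighting step by default); the chi-square cutoff below is a rough illustration, whereas the paper derives proper Phase I limits:

```python
import numpy as np
from scipy.stats import chi2
from sklearn.covariance import MinCovDet

rng = np.random.default_rng(0)
X = rng.multivariate_normal(mean=[0, 0, 0], cov=np.eye(3), size=100)
X[:5] += 6.0                     # a few outliers contaminating Phase I data

# Reweighted MCD estimates of location and scatter; a classical chart would
# use X.mean(0) and np.cov(X.T) instead, and the outliers would inflate both.
mcd = MinCovDet(random_state=0).fit(X)
d2 = mcd.mahalanobis(X)          # robust T2-type statistics

# Illustrative limit from the chi-square approximation only; the paper sets
# Phase I limits by simulation, which this sketch does not reproduce.
ucl = chi2.ppf(0.99, df=X.shape[1])
print("flagged points:", np.flatnonzero(d2 > ucl))
```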
Introduction: In many practical situations, we are interested in the effect of covariates on correlated multiple responses. In this paper, we focus on estimation and variable selection in multi-response regression models. Correlation among the response variables must be modeled for valid inference. Method: We used an extension of the generalized estimating equation (GEE) methodology to simultaneously analyze binary, count, and continuous outcomes with nonlinear link functions. Variable selection plays an important role in modeling correlated responses...
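A sketch of a standard GEE fit with an exchangeable working correlation using statsmodels; the joint modeling of mixed binary/count/continuous responses described above is an extension that statsmodels does not provide, so a single count outcome stands in here:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Simulated clustered count data: "cluster" plays the role of a subject with
# correlated responses; the shared random effect u induces the correlation.
rng = np.random.default_rng(2)
n_clusters, m = 40, 4
cluster = np.repeat(np.arange(n_clusters), m)
u = np.repeat(rng.normal(0, 0.3, n_clusters), m)
x = rng.normal(size=n_clusters * m)
y = rng.poisson(np.exp(0.5 + 0.8 * x + u))

df = pd.DataFrame({"y": y, "x": x, "cluster": cluster})
model = sm.GEE.from_formula("y ~ x", groups="cluster", data=df,
                            family=sm.families.Poisson(),
                            cov_struct=sm.cov_struct.Exchangeable())
print(model.fit().summary())
```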
Abstract: The use of Hotelling's T2 charts with high-breakdown robust estimates to monitor multivariate individual observations is a recent trend in control chart methodology. Vargas (J. Qual. Tech. 2003; 35: 367-376) introduced charts based on the minimum volume ellipsoid (MVE) and the minimum covariance determinant (MCD) estimators to identify outliers in Phase I data. Studies carried out by Jensen et al. (Qual. Rel. Eng. Int. 2007; 23: 615-629) indicated that the performance of these charts heavily depends on the sample size, the amount of outliers, and the dimensionality...
Multivariate control charts are widely used in various industries to monitor shifts in the process mean and variability. In Phase I monitoring, control limits are computed using historical data, and limits based on the classical estimators (the sample mean and the sample covariance) are highly sensitive to outliers in the data. We propose robust control charts with the high-breakdown reweighted minimum covariance determinant and minimum volume ellipsoid estimators to monitor the variability of multivariate individual observations in Phase I data under exponentially weighted mean square error and exponentially weighted moving variance schemes. The...
Hotelling's T2 control charts are widely used in industries to monitor multivariate processes. The classical estimators, the sample mean and the sample covariance matrix, used in T2 control charts are highly sensitive to outliers in the data. In Phase-I monitoring, the control limits are arrived at using historical data...
In large eddy simulation (LES) of turbulent flows, which of the most critical dynamical processes must be considered so that dynamic subgrid models account for an average cascade of kinetic energy from the largest to the smallest scales of the flow is not fully clear. Furthermore, the evidence for vortex stretching being the primary mechanism of the cascade is not out of the question. In this article, we study some essential statistical characteristics of vortex stretching and its role in approaches to modeling subgrid-scale turbulence. We have compared the interaction of the subgrid stresses with the filtered...
Abstract: In any fishery, it is important to know whether management decisions have an impact on catch and effort. This study demonstrates that bag limits can be an effective tool for reducing effort when dealing with a retention-oriented angling population. Since 1997, four different management regimes (MRs) have been applied to the recreational Atlantic Salmon Salmo salar fishery on Harry's River in Newfoundland, Canada. The MRs include release only (MR 1); release at the start of the season, with retention allowed after an in-season review...
Proportional hazard regression models are widely used in survival analysis to understand and exploit the relationship between survival time and covariates. For left-censored failure times, reversed hazard rate functions are more appropriate. In this paper, we develop a parametric proportional reversed hazard rates model using an inverted Weibull distribution. The estimation and construction of confidence intervals for the parameters are discussed. We assess the performance of the proposed procedure based on a large number of Monte Carlo simulations. We illustrate...
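A short numerical check, under the usual proportional reversed hazards form F(t | x) = F0(t)^{exp(beta x)}, that an inverted-Weibull baseline is closed under the model, plus a maximum-likelihood fit with scipy; all parameter values are illustrative:

```python
import numpy as np
from scipy.stats import invweibull

# With baseline F0(t) = exp(-(t/s)**(-c)), raising F0 to the power
# eta = exp(beta * x) gives another inverted Weibull with scale s * eta**(1/c).
c, s, beta, x = 2.0, 1.5, 0.7, 1.0
eta = np.exp(beta * x)

t = np.linspace(0.5, 5.0, 10)
lhs = invweibull.cdf(t, c, scale=s) ** eta
rhs = invweibull.cdf(t, c, scale=s * eta ** (1.0 / c))
print(np.allclose(lhs, rhs))     # True: the family is closed under the model

# Maximum-likelihood fit of the baseline from a sample (location fixed at 0).
sample = invweibull.rvs(c, scale=s, size=500, random_state=3)
c_hat, loc_hat, s_hat = invweibull.fit(sample, floc=0)
print(c_hat, s_hat)
```

This closure property is what makes the inverted Weibull a convenient baseline choice: the covariate effect acts as a pure rescaling, so estimation stays within one parametric family.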
Experimental designs with performance measures as responses are common in industrial applications. The existing analysis methods often regard the performance measures as the sole response variables without replicates. Consequently, no degrees of freedom are left for error variance estimation in these methods. In reality, performance measures are obtained from replicated primary-response variables. Precious information is hence lost. In this paper, we suggest a jackknife-based approach on the primary responses to provide an error estimate for the performance measures. The resulting tests for factor...
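A minimal sketch of the jackknife idea on replicated primary responses, assuming a Taguchi-style signal-to-noise ratio as the performance measure (the paper's measures and resulting tests may differ):

```python
import numpy as np

def jackknife_se(primary, stat):
    """Leave-one-out jackknife standard error of a performance measure
    computed from replicated primary responses."""
    n = len(primary)
    loo = np.array([stat(np.delete(primary, i)) for i in range(n)])
    return np.sqrt((n - 1) / n * np.sum((loo - loo.mean()) ** 2))

# Hypothetical replicates of the primary response at one design point; the
# signal-to-noise ratio below is an assumed example of a performance measure.
y = np.array([10.1, 9.8, 10.4, 10.0, 9.9])
sn = lambda v: 10 * np.log10(v.mean() ** 2 / v.var(ddof=1))
print(sn(y), jackknife_se(y, sn))
```

The point of the approach is visible here: the replicates that would be collapsed into a single performance value instead yield an error estimate for it, restoring degrees of freedom for testing factor effects.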
SYNOPTIC ABSTRACT: The additive reversed hazards model relates the conditional reversed hazard function of the lifetime linearly to the covariates. It describes the association between the lifetime and the covariates in terms of the risk difference. In the present work, we introduce an additive reversed hazards model for the modeling and analysis of lifetime data in the presence of left censoring. We develop a closed-form semiparametric estimator of the regression parameter. We also provide a Breslow-type estimator of the cumulative baseline reversed hazard function. Asymptotic properties of the estimators are studied. Simulation studies are conducted...
In large eddy simulation (LES) of turbulent flows, dynamic subgrid models would account for an average cascade of kinetic energy from the largest to the smallest scales of the flow. Yet, it is unclear which of the most critical dynamical processes can ensure the criterion mentioned above. Furthermore, the evidence for vortex stretching being the primary mechanism of the cascade is not out of the question. In this article, we study essential statistical characteristics of vortex stretching. Our numerical results demonstrate that the vortex stretching rate provides the dissipation necessary...
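A sketch of the basic quantity involved: the vortex stretching rate computed from a single velocity gradient tensor. Real LES/DNS statistics aggregate this over full filtered fields, which this toy example does not do:

```python
import numpy as np

def vortex_stretching_rate(grad_u):
    """Stretching rate omega_hat . S . omega_hat from a 3x3 velocity
    gradient tensor grad_u[i, j] = du_i/dx_j."""
    S = 0.5 * (grad_u + grad_u.T)                      # strain-rate tensor
    omega = np.array([grad_u[2, 1] - grad_u[1, 2],     # vorticity = curl(u)
                      grad_u[0, 2] - grad_u[2, 0],
                      grad_u[1, 0] - grad_u[0, 1]])
    ohat = omega / np.linalg.norm(omega)
    return ohat @ S @ ohat

# One synthetic velocity gradient tensor stands in for a single grid point.
rng = np.random.default_rng(4)
A = rng.normal(size=(3, 3))
A -= np.eye(3) * np.trace(A) / 3.0                     # enforce incompressibility
print(vortex_stretching_rate(A))
```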
In longitudinal data analysis, our primary interest is in the estimation of the regression parameters for the marginal expectations of the responses; the correlation parameters are of secondary interest. Specifying the joint likelihood function is challenging, particularly due to the correlated responses. Marginal models, such as generalized estimating equations (GEEs), have received much attention; they are based on assumptions about the first two moments and a working correlation structure. Confidence regions and hypothesis tests are constructed based on asymptotic normality. This...
SYNOPTIC ABSTRACT: The Box-Pierce and Ljung-Box tests are portmanteau tests generally used to test for independence in time series data. These tests can also be applied to the squares of the observations to detect dependence. Because most financial data show heavy-tailed behavior, these tests may incorrectly reject the null hypothesis of no correlation when there is volatility clustering. A modified version has been introduced to capture this behavior and has been shown to perform better in the presence of an autoregressive conditional heteroscedasticity (ARCH) effect....
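An illustration with statsmodels' Ljung-Box test applied to raw and squared returns from a simulated ARCH(1) process with heavy-tailed innovations; the modified statistic proposed in the paper is not implemented here:

```python
import numpy as np
from statsmodels.stats.diagnostic import acorr_ljungbox

# ARCH(1)-type returns: serially uncorrelated, but with volatility clustering
# and Student-t innovations. Parameter values are illustrative only.
rng = np.random.default_rng(5)
n, a0, a1 = 2000, 0.2, 0.5
r = np.zeros(n)
for t in range(1, n):
    r[t] = np.sqrt(a0 + a1 * r[t - 1] ** 2) * rng.standard_t(df=5)

print(acorr_ljungbox(r, lags=[10]))        # raw returns: little autocorrelation
print(acorr_ljungbox(r ** 2, lags=[10]))   # squared returns: strong dependence
```

The contrast between the two outputs is the setting the abstract describes: the squares reveal dependence that the raw series hides, and heavy tails are what the classical statistics handle poorly.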
Abstract: In survival analysis, interval censoring case I, or current status data, arises when each subject is observed only once for the occurrence of the event of interest. Current status data often appear along with covariates in cross-sectional studies and tumorigenicity studies. Cox's proportional hazards model has been widely used to explore the relationship between the lifetime variable and covariates. In this paper, we propose a novel and easy-to-implement Bayesian approach for analyzing current status data. Under the model, the baseline function...
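A toy random-walk Metropolis sampler for current status data under a proportional hazards model, assuming an exponential baseline for simplicity; the paper's baseline specification and priors differ from this sketch:

```python
import numpy as np

# Current status likelihood under PH with exponential baseline:
# P(T <= c | x) = 1 - exp(-lam * c * exp(b * x)). Priors and step sizes below
# are simplifying assumptions for illustration.
rng = np.random.default_rng(6)
n, lam_true, b_true = 300, 0.5, 1.0
x = rng.normal(size=n)
T = rng.exponential(1.0 / (lam_true * np.exp(b_true * x)))
c = rng.uniform(0.1, 4.0, size=n)           # one inspection time per subject
d = (T <= c).astype(float)                  # current status indicator

def loglik(log_lam, b):
    H = np.exp(log_lam) * c * np.exp(b * x) # cumulative hazard at c
    F = 1.0 - np.exp(-H)
    return np.sum(d * np.log(F + 1e-12) + (1 - d) * (-H))

theta = np.array([0.0, 0.0])                # (log lam, b), N(0, 10^2) priors
lp = loglik(*theta) - theta @ theta / 200.0
draws = []
for it in range(5000):
    prop = theta + rng.normal(scale=0.15, size=2)
    lp_prop = loglik(*prop) - prop @ prop / 200.0
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    draws.append(theta)
post = np.array(draws)[2500:]
print("lam, b posterior means:", np.exp(post[:, 0].mean()), post[:, 1].mean())
```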
The success of the implementation of a control chart depends upon the assumptions made on the distribution of the quality characteristics. If the distributional assumption deviates too much from the true one, or if it is misspecified, the performance of the chart is seriously affected and may lead to wrong conclusions about the process. To avoid such situations, we propose a new class of control charts based on empirical likelihood (EL). We propose to monitor the EL ratio statistic for the mean and use a resampling method to arrive at its distribution, which is inverted to obtain the control limits. Our simulation results...
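A sketch of the resampling scheme for a scalar mean: bootstrap the in-control distribution of the EL ratio statistic and invert its upper quantile into a control limit. Subgroup size, false-alarm rate, and the in-control model are all illustrative assumptions:

```python
import numpy as np
from scipy.optimize import brentq

def el_log_ratio(x, mu):
    """-2 log empirical likelihood ratio for a scalar mean; returns inf when
    mu lies outside the data range (EL is zero there by convention)."""
    g = x - mu
    if g.max() <= 0 or g.min() >= 0:
        return np.inf
    lam = brentq(lambda l: np.sum(g / (1.0 + l * g)),
                 -1.0 / g.max() + 1e-10, -1.0 / g.min() - 1e-10)
    return 2.0 * np.sum(np.log1p(lam * g))

rng = np.random.default_rng(7)
phase1 = rng.gamma(shape=2.0, scale=1.0, size=500)   # skewed in-control data
mu0, m, B = phase1.mean(), 20, 2000

# Bootstrap the statistic's in-control distribution, then take its 99.5%
# quantile as an upper control limit (no normality assumption needed).
boot = [el_log_ratio(rng.choice(phase1, size=m), mu0) for _ in range(B)]
ucl = np.quantile(boot, 0.995)

shifted = rng.gamma(shape=2.0, scale=2.0, size=m)    # shifted subgroup
print(el_log_ratio(shifted, mu0) > ucl)              # does the chart signal?
```

The distribution-free limit is the point of the proposal: on skewed in-control data like this, a limit from the usual chi-square approximation would miscalibrate the false-alarm rate, while the bootstrap quantile adapts to the actual sampling distribution.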