Asokan Mulayath Variyath

ORCID: 0000-0003-3467-7872
Research Areas
  • Statistical Methods and Inference
  • Advanced Statistical Methods and Models
  • Advanced Statistical Process Monitoring
  • Statistical Methods and Bayesian Inference
  • Advanced Proteomics Techniques and Applications
  • Metabolomics and Mass Spectrometry Studies
  • Mass Spectrometry Techniques and Applications
  • Probabilistic and Robust Engineering Design
  • Optimal Experimental Design Methods
  • Scientific Measurement and Uncertainty Evaluation
  • Fluid Dynamics and Vibration Analysis
  • Fluid Dynamics and Turbulent Flows
  • Statistical Distribution Estimation and Applications
  • Bayesian Methods and Mixture Models
  • Model Reduction and Neural Networks
  • Pesticide Residue Analysis and Safety
  • Statistical Methods in Clinical Trials
  • Statistics Education and Methodologies
  • Probability and Risk Models
  • Advanced Multi-Objective Optimization Algorithms
  • Data Analysis with R
  • Transgenic Plants and Applications
  • Financial Risk and Volatility Modeling
  • Spatial and Panel Data Analysis
  • Software Reliability and Analysis Research

Memorial University of Newfoundland
2010-2023

St. John's University
2020

Texas A&M University
2009

National Institute of Standards and Technology
2009

New York University
2009

NCCOS Hollings Marine Laboratory
2009

Vanderbilt University Medical Center
2009

University of Arizona
2009

Broad Institute
2009

University of British Columbia
2008

The complexity of proteomic instrumentation for LC-MS/MS introduces many possible sources of variability. Data-dependent sampling of peptides constitutes a stochastic element at the heart of discovery proteomics. Although this variation impacts the identification of peptides, identifications are far from completely random. In this study, we analyzed interlaboratory data sets from the NCI Clinical Proteomic Technology Assessment for Cancer to examine repeatability and reproducibility in peptide and protein identifications....

10.1021/pr9006365 article EN Journal of Proteome Research 2009-11-19

Computing a profile empirical likelihood function, which involves constrained maximization, is a key step in applications of empirical likelihood. However, in some situations, the required numerical problem has no solution. In this case, the convention is to assign a zero value to the profile empirical likelihood. This strategy has at least two limitations. First, it is numerically difficult to determine that there is no solution; second, no information is provided on the relative plausibility of the parameter values where the likelihood is set to zero. In this article, we propose a novel adjustment that retains all...

10.1198/106186008x321068 article EN Journal of Computational and Graphical Statistics 2008-05-28
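The adjustment described above can be sketched for a simple univariate mean. The following is a minimal illustration in Python, assuming NumPy and SciPy are available; the function names and the default adjustment level are illustrative choices, not the authors' code.

```python
import numpy as np
from scipy.optimize import brentq

def log_el_ratio(g):
    """Profile log empirical likelihood ratio for E[g] = 0, given the
    estimating-function values g (one per observation)."""
    n = len(g)
    if g.min() >= 0 or g.max() <= 0:
        return -np.inf  # zero outside the convex hull: no solution exists
    # The Lagrange multiplier solves sum g_i / (1 + lam * g_i) = 0 and lies
    # strictly inside these bounds (they keep every weight in (0, 1))
    lo = (1.0 / n - 1.0) / g.max() + 1e-10
    hi = (1.0 / n - 1.0) / g.min() - 1e-10
    lam = brentq(lambda l: np.sum(g / (1.0 + l * g)), lo, hi)
    return -np.sum(np.log1p(lam * g))

def adjusted_log_el_ratio(g, a=None):
    """Adjusted EL: append a pseudo-observation -a * mean(g) so that the
    constrained maximization always has a solution."""
    n = len(g)
    if a is None:
        a = max(1.0, np.log(n) / 2.0)  # one adjustment level used in practice
    return log_el_ratio(np.append(g, -a * g.mean()))

rng = np.random.default_rng(3)
x = rng.standard_normal(30)
mu = x.max() + 1.0                    # a mean value outside the sample's range
print(log_el_ratio(x - mu))           # -inf under the zero-value convention
print(adjusted_log_el_ratio(x - mu))  # finite under the adjustment
```

The pseudo-observation pulls zero back inside the convex hull of the estimating-function values, so the profile likelihood is finite and informative for every candidate parameter value.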

A major unmet need in LC-MS/MS-based proteomics analyses is a set of tools for quantitative assessment of system performance and evaluation of technical variability. Here we describe 46 metrics for monitoring chromatographic performance, electrospray source stability, MS1 and MS2 signals, dynamic sampling of ions for MS/MS, and peptide identification. Applied to data sets from replicate LC-MS/MS analyses, these metrics displayed consistent, reasonable responses to controlled perturbations. The metrics typically showed variations of less than...

10.1074/mcp.m900223-mcp200 article EN cc-by Molecular & Cellular Proteomics 2009-10-18

Optimal performance of LC-MS/MS platforms is critical to generating high-quality proteomics data. Although individual laboratories have developed quality control samples, there is no widely available standard of biological complexity (and associated reference data sets) for benchmarking platform performance in the analysis of complex proteomes across different laboratories in the community. Individual preparations of the yeast Saccharomyces cerevisiae proteome have been used extensively by the community to characterize LC-MS performance. The uniquely...

10.1074/mcp.m900222-mcp200 article EN cc-by Molecular & Cellular Proteomics 2009-10-27

To monitor a multivariate process, the classical Hotelling's T2 control chart is often used. However, it is well known that such charts are very sensitive to the presence of outlying observations in the historical Phase I data used to set the control limit. In this paper, we propose a robust T2-type chart for individual observations based on highly robust and efficient estimators of the mean vector and covariance matrix, namely the reweighted minimum covariance determinant (RMCD) estimators. We illustrate how to set the limit of the proposed chart, study its performance using simulations,...

10.1080/00224065.2009.11917781 article EN Journal of Quality Technology 2009-07-01
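The masking effect that motivates the robust chart can be demonstrated with scikit-learn's `MinCovDet`, a reweighted MCD estimator. This is a simplified sketch on simulated Phase I data, not the authors' implementation; the planted-outlier setup is illustrative.

```python
import numpy as np
from sklearn.covariance import MinCovDet

rng = np.random.default_rng(0)
# Phase I historical data: 50 bivariate observations, two planted outliers
X = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.5], [0.5, 1.0]], size=50)
X[:2] += 8.0

def t2_scores(X, location, scale):
    """Hotelling's T2 statistic for each row of X."""
    d = X - location
    return np.einsum("ij,jk,ik->i", d, np.linalg.inv(scale), d)

# Classical estimates: the outliers inflate the covariance and mask themselves
t2_classical = t2_scores(X, X.mean(axis=0), np.cov(X, rowvar=False))

# Reweighted MCD estimates resist the outliers, so their T2 values stand out
mcd = MinCovDet(random_state=0).fit(X)
t2_robust = t2_scores(X, mcd.location_, mcd.covariance_)
print(t2_classical[:2], t2_robust[:2])
```

Under the robust estimates the two contaminated observations receive far larger T2 values than under the classical ones, which is what makes outlier identification in Phase I feasible.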

Introduction In many practical situations, we are interested in the effect of covariates on correlated multiple responses. In this paper, we focus on estimation and variable selection in multi-response regression models. Correlation among response variables must be modeled for valid inference. Method We used an extension of the generalized estimating equation (GEE) methodology to simultaneously analyze binary, count, and continuous outcomes with nonlinear link functions. Variable selection plays an important role in modeling correlated responses...

10.1371/journal.pone.0236067 article EN cc-by PLoS ONE 2020-07-17

10.1016/j.jspi.2009.09.025 article EN Journal of Statistical Planning and Inference 2009-10-05

Abstract Use of Hotelling's T2 charts with high-breakdown robust estimates to monitor multivariate individual observations is the recent trend in control chart methodology. Vargas (J. Qual. Tech. 2003; 35: 367‐376) introduced charts based on the minimum volume ellipsoid (MVE) and minimum covariance determinant (MCD) estimators to identify outliers in Phase I data. Studies carried out by Jensen et al. (Qual. Rel. Eng. Int. 2007; 23: 615‐629) indicated that the performance of these charts heavily depends on the sample size, the amount of outliers, and the dimensionality...

10.1002/qre.1169 article EN Quality and Reliability Engineering International 2010-12-20

Multivariate control charts are widely used in various industries to monitor shifts in the process mean and variability. In Phase I monitoring, the control limits are computed using historical data; limits based on the classical estimators (sample mean and sample covariance) are highly sensitive to outliers in the data. We propose robust control charts based on the high-breakdown reweighted minimum covariance determinant and minimum volume ellipsoid estimators to monitor the variability of multivariate individual observations in Phase I data under exponentially weighted mean square error and moving variance schemes. The...

10.1002/qre.1559 article EN Quality and Reliability Engineering International 2013-12-27

Hotelling's <mml:math xmlns:mml="http://www.w3.org/1998/Math/MathML" id="M1"><mml:mrow><mml:msup><mml:mi>T</mml:mi><mml:mn>2</mml:mn></mml:msup></mml:mrow></mml:math> control charts are widely used in industries to monitor multivariate processes. The classical estimators, the sample mean and the sample covariance matrix, make the <mml:math xmlns:mml="http://www.w3.org/1998/Math/MathML" id="M2"><mml:mrow><mml:msup><mml:mi>T</mml:mi><mml:mn>2</mml:mn></mml:msup></mml:mrow></mml:math> chart highly sensitive to outliers in the data. In Phase-I monitoring, the control limits are arrived at using historical data...

10.1155/2013/542305 article EN Journal of Quality and Reliability Engineering 2013-06-09

In large eddy simulation (LES) of turbulent flows, which of the most critical dynamical processes must be considered so that dynamic subgrid models account for an average cascade of kinetic energy from the largest to the smallest scales of the flow is not fully clear. Furthermore, the evidence for vortex stretching being the primary mechanism of the cascade is not out of question. In this article, we study some essential statistical characteristics of vortex stretching and its role in dynamic approaches to modeling subgrid-scale turbulence. We have compared the interaction of subgrid stresses with the filtered...

10.20944/preprints202109.0438.v1 preprint EN 2021-09-24

10.1198/004017002188618644 article Technometrics 2003-02-01

Abstract In any fishery, it is important to know whether management decisions have an impact on catch and effort. This study demonstrates that bag limits can be an effective tool for reducing effort when dealing with a retention‐oriented angling population. Since 1997, four different management regimes (MRs) have been applied to the recreational Atlantic Salmon Salmo salar fishery on Harry's River in Newfoundland, Canada. The MRs include release only (MR 1); retention allowed at the start of the season, with retention after an in‐season review...

10.1002/nafm.10011 article EN North American Journal of Fisheries Management 2018-02-01

Proportional hazard regression models are widely used in survival analysis to understand and exploit the relationship between survival time and covariates. For left-censored survival times, reversed hazard rate functions are more appropriate. In this paper, we develop a parametric proportional reversed hazard rates model using an inverted Weibull distribution. The estimation and construction of confidence intervals for the parameters are discussed. We assess the performance of the proposed procedure based on a large number of Monte Carlo simulations. We illustrate...

10.1155/2014/645719 article EN cc-by Journal of Probability and Statistics 2014-01-01
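The structure of the model can be verified numerically: under proportional reversed hazards, a covariate scales the baseline reversed hazard rate by a constant. This sketch uses SciPy's `invweibull` (the inverted Weibull / Fréchet distribution with unit scale); the shape and coefficient values are illustrative, not taken from the paper.

```python
import numpy as np
from scipy.stats import invweibull

c = 2.0                        # inverted Weibull (Frechet) shape parameter
t = np.linspace(0.5, 5.0, 100)

# Reversed hazard rate r(t) = f(t) / F(t); for the inverted Weibull with
# F(t) = exp(-t**(-c)) this reduces to c * t**(-c - 1)
r0 = invweibull.pdf(t, c) / invweibull.cdf(t, c)

# Proportional reversed hazards with covariate effect theta = exp(beta * x):
# F(t | x) = F0(t)**theta, hence f(t | x) = theta * F0(t)**(theta - 1) * f0(t)
beta, x = 0.8, 1.5
theta = np.exp(beta * x)
F_x = invweibull.cdf(t, c) ** theta
f_x = theta * invweibull.cdf(t, c) ** (theta - 1) * invweibull.pdf(t, c)
r_x = f_x / F_x

# The covariate scales the baseline reversed hazard by the constant theta
print(np.allclose(r_x, theta * r0))
```

Because the reversed hazard is f/F, raising the baseline distribution function to a power multiplies the reversed hazard by exactly that power, which is what makes the model convenient for left-censored data.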

Experimental designs with performance measures as responses are common in industrial applications. The existing analysis methods often regard performance measures as sole response variables without replicates. Consequently, no degrees of freedom are left for error variance estimation in these methods. In reality, performance measures are obtained from replicated primary-response variables, and precious information is hence lost. In this paper, we suggest a jackknife-based approach on the primary responses to provide an error variance estimate for the performance measures. The resulting tests for factor...

10.1080/00224065.2005.11980308 article EN Journal of Quality Technology 2005-04-01
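The jackknife idea behind the approach can be sketched generically: leave out one replicate at a time, recompute the performance measure, and use the spread of the leave-one-out values as an error variance estimate. The data and the log-variance performance measure below are illustrative, not the paper's design.

```python
import numpy as np

def jackknife_se(data, statistic):
    """Leave-one-out jackknife standard error of a statistic computed
    from replicated primary responses."""
    n = len(data)
    reps = np.array([statistic(np.delete(data, i)) for i in range(n)])
    return np.sqrt((n - 1) / n * np.sum((reps - reps.mean()) ** 2))

rng = np.random.default_rng(4)
# Replicated primary responses at a single design point (simulated)
y = rng.normal(10.0, 2.0, size=20)

# A performance measure derived from the replicates: the log sample variance
log_var = lambda d: np.log(np.var(d, ddof=1))
se = jackknife_se(y, log_var)
print(se)
```

Recovering a standard error this way restores degrees of freedom for testing factor effects that are lost when the performance measure is treated as a single unreplicated response.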

SYNOPTIC ABSTRACT The additive reversed hazards model relates the conditional reversed hazard function of the lifetime linearly to the covariates. It describes the association between the lifetime and covariates in terms of a risk difference. In the present work, we introduce an additive reversed hazards model for the modeling and analysis of lifetime data in the presence of left censoring. We develop a closed-form semiparametric estimator of the regression parameter. We also provide a Breslow-type estimator of the cumulative baseline reversed hazard function. Asymptotic properties of the estimators are studied. Simulation studies are conducted...

10.1080/01966324.2014.943600 article EN American Journal of Mathematical and Management Sciences 2014-10-02

In large eddy simulation (LES) of turbulent flows, dynamic subgrid models would account for an average cascade of kinetic energy from the largest to the smallest scales of the flow. Yet, it is unclear which of the most critical dynamical processes can ensure the criterion mentioned above. Furthermore, the evidence for vortex stretching being the primary mechanism of the cascade is not out of question. In this article, we study essential statistical characteristics of vortex stretching. Our numerical results demonstrate that the vortex stretching rate provides the dissipation necessary...

10.3390/aerospace8120375 article EN cc-by Aerospace 2021-12-03

In longitudinal data analysis, our primary interest is in the estimation of regression parameters for the marginal expectations of the responses, while the correlation parameters are of secondary interest. The joint likelihood function is challenging to work with, particularly due to the correlated responses. Marginal models, such as generalized estimating equations (GEEs), have received much attention; they rely only on assumptions about the first two moments and a working correlation structure. The confidence regions and hypothesis tests are constructed based on asymptotic normality. This...

10.4236/ojs.2020.104037 article EN Open Journal of Statistics 2020-01-01

SYNOPTIC ABSTRACT The Box–Pierce and Ljung–Box tests are portmanteau tests generally used to test for independence in time series data. These tests can also be applied to the squares of the observations to detect dependence. Because most financial data show heavy-tailed behavior, these tests may incorrectly reject the null hypothesis of no correlation when there is volatility clustering. A modified version introduced to capture this behavior showed that it performs better when an autoregressive conditional heteroscedasticity (ARCH) effect...

10.1080/01966324.2015.1082451 article EN American Journal of Mathematical and Management Sciences 2015-12-09

Abstract In survival analysis, interval censoring case I, or current status data, happens if each subject is observed only once for the occurrence of the event of interest. Current status data often appear along with covariates in cross-sectional studies and tumorigenicity studies. Cox's proportional hazards model has been widely used to explore the relationship between the lifetime variable and covariates. In this paper we propose a novel and easy-to-implement Bayesian approach for analyzing current status data. Under the model, the baseline hazard function...

10.1080/03610918.2023.2266153 article EN Communications in Statistics - Simulation and Computation 2023-10-09

The success of the implementation of a control chart depends upon the assumptions made on the distribution of the quality characteristics. If the distributional assumption deviates too much from the true one, or if it is misspecified, the performance of the chart is seriously affected and we may draw wrong conclusions about the process. To avoid such situations, we propose a new class of control charts based on empirical likelihood (EL). We propose to monitor the EL ratio statistic for the mean and use a resampling method to arrive at its distribution, which is inverted to obtain the control limits. Our simulation results...

10.1515/eqc-2013-0011 article EN Economic Quality Control 2013-01-01