- Spectroscopy and Chemometric Analyses
- Face and Expression Recognition
- Machine Learning and Data Classification
- Advanced Statistical Methods and Models
- Advanced Multi-Objective Optimization Algorithms
- Fault Detection and Control Systems
- Control Systems and Identification
- Blind Source Separation Techniques
- Sparse and Compressive Sensing Techniques
- Digital Marketing and Social Media
- Sensory Analysis and Statistical Methods
- Impact of Technology on Adolescents
- Advanced Statistical Process Monitoring
- Gene Expression and Cancer Classification
- Income, Poverty, and Inequality
- Consumer Behavior in Brand Consumption and Identification
- Bayesian Modeling and Causal Inference
- Education and Technology Integration
- Privacy-Preserving Technologies in Data
- Statistical Distribution Estimation and Applications
- Collaborative Teaching and Inclusion
- Technology Adoption and User Behaviour
- Manufacturing Process and Optimization
- Media Influence and Health
- Rough Sets and Fuzzy Logic
- MIT World Peace University (2023)
- Xi’an Jiaotong-Liverpool University (2017-2021)
- RMIT Vietnam (2015-2016)
- Office for National Statistics (2004)
- University of Waterloo (2001)
Due to the penetration of smartphones and associated mobile devices, gaming has become a ubiquitous industry worldwide. Players now have access to games at all times. Extending previous research based on the Uses and Gratifications approach, this paper presents an alternative conceptual model that can offer explanations towards understanding why players play the game they play most frequently.
Asset indices have been used since the late 1990s to measure wealth in developing countries. We extend the standard methodology for estimating asset indices using principal component analysis in two ways: by introducing constraints that force the index to take increasing values as the number of assets owned increases, and by making the index sparse, so that it is defined by a few key assets. This is achieved by combining constrained and sparse categorical principal component analysis. We also apply this estimation to per capita level indices. Using household survey data from northwest Vietnam and northeast Laos, we...
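The standard construction that this abstract extends is the familiar one: an asset index taken as the first principal component of household asset-ownership indicators. A minimal sketch with made-up data is below; the constrained, sparse and categorical extensions the paper introduces are not reproduced here.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

# Illustrative binary asset-ownership matrix: rows are households,
# columns are assets (radio, bicycle, motorbike, ...), 1 = owned.
rng = np.random.default_rng(0)
assets = rng.integers(0, 2, size=(200, 6)).astype(float)

# Baseline asset index: first principal component of the standardized
# indicators; higher scores proxy greater household wealth.
index = PCA(n_components=1).fit_transform(StandardScaler().fit_transform(assets))
print(index[:5].ravel())
```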
Sparse principal components analysis (SPCA) is a technique for finding principal components with a small number of non-zero loadings. Our contribution to this methodology is twofold. First, we derive the sparse solutions that minimise the least squares criterion subject to sparsity requirements. Second, recognising that sparsity is not the only requirement for achieving simplicity, we suggest a backward elimination algorithm that computes sparse solutions with large loadings. This algorithm can be run without specifying the number of non-zero loadings in advance. It is also possible to impose a minimum amount of variance...
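To make the backward elimination idea concrete, here is a generic greedy variant (my sketch, not the authors' exact algorithm): repeatedly drop the variable with the smallest absolute loading on the first principal component, stopping before the retained subset's variance falls below a user-set floor, so no target number of loadings is needed in advance.

```python
import numpy as np

def pc1_variance(X):
    # Variance of the first principal component of X.
    return np.linalg.eigvalsh(np.atleast_2d(np.cov(X.T)))[-1]

def backward_elimination_pc1(X, min_var_ratio=0.9):
    X = X - X.mean(axis=0)
    full_var = pc1_variance(X)
    keep = list(range(X.shape[1]))
    while len(keep) > 1:
        _, vecs = np.linalg.eigh(np.cov(X[:, keep].T))
        drop = int(np.argmin(np.abs(vecs[:, -1])))      # smallest PC1 loading
        trial = keep[:drop] + keep[drop + 1:]
        if pc1_variance(X[:, trial]) / full_var < min_var_ratio:
            break                                       # variance floor reached
        keep = trial
    return keep

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 8))
print(backward_elimination_pc1(X))   # indices of retained variables
```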
The purpose of this study was to create a conceptual framework and collect some pilot data in order to underpin future research on how the Vietnamese use Facebook in their day-to-day lives. A number of key points were observed in the study, which informed the framework. Firstly, there is a paucity of research on the topic, despite the fact that users in Vietnam (population 90 million) rank among the heaviest consumers in the world, and cultural traditions and values need to be acknowledged, given that these differences, when compared with other nations, might influence use. Given studies...
The authors consider dimensionality reduction methods used for prediction, such as reduced rank regression, principal component regression and partial least squares. They show how it is possible to obtain intermediate solutions by estimating simultaneously the latent variables for the predictors and for the responses. They obtain a continuum of solutions that goes from reduced rank regression to principal component regression via maximum likelihood and least squares estimation. The different solutions are compared using simulated and real data.
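The continuum estimator itself is not in standard libraries, but its two familiar reference points, principal component regression (latent variables from the predictors alone) and partial least squares (latent variables using the responses too), can be sketched with scikit-learn on illustrative data:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

# Synthetic multivariate regression problem with a 2-dimensional signal.
rng = np.random.default_rng(2)
X = rng.normal(size=(150, 10))
Y = X[:, :2] @ rng.normal(size=(2, 3)) + 0.1 * rng.normal(size=(150, 3))

pcr = make_pipeline(PCA(n_components=2), LinearRegression()).fit(X, Y)
pls = PLSRegression(n_components=2).fit(X, Y)
print("PCR R^2:", pcr.score(X, Y), "| PLS R^2:", pls.score(X, Y))
```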
The Zero Re-Burn-In methodology presented in this paper is based on statistical analysis of historical manufacturing data from the burn-in (BI) and re-burn-in (REBI) tests employed in semiconductor device manufacturing. The goal is to reduce (or, if possible, eliminate) the REBI test so as to lower its associated cost and time while preserving the required low failure rate of the manufactured devices. The processing and analysis of the production data sets are performed employing the JMP software. The research has led to the development of a logistic regression model capable...
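A minimal sketch of the modeling step described here, on synthetic stand-in data (the study's actual BI/REBI features and JMP workflow are not public): fit a logistic regression to predict failure and use the predicted probabilities to flag which lots still warrant re-burn-in.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Stand-in for historical manufacturing data: each row is a device lot
# with a few in-line measurements; the target is REBI failure (1/0).
rng = np.random.default_rng(3)
X = rng.normal(size=(1000, 4))
y = (X @ np.array([1.5, -1.0, 0.5, 0.0]) + rng.normal(size=1000) > 1).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_tr, y_tr)

# Predicted failure probabilities: lots below a chosen risk threshold
# could skip REBI, lowering cost and time as the paper aims to do.
print(model.predict_proba(X_te)[:5, 1])
```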
We propose an algorithmic framework for computing sparse components from rotated principal components. This methodology, called SIMPCA, is useful to replace the unreliable practice of ignoring small coefficients when interpreting rotated principal components. The algorithm computes genuinely sparse components by projecting the rotated components onto subsets of the variables. The components so simplified are highly correlated with the corresponding rotated components. By choosing different simplification strategies, different solutions can be obtained, which can be used to compare alternative interpretations and give some...
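A toy version of the projection step, assuming an unrotated first principal component and a hand-picked variable subset for simplicity (SIMPCA works on rotated components and chooses the subsets algorithmically): the fitted values of the component regressed on the subset form a sparse component that stays highly correlated with the original.

```python
import numpy as np

rng = np.random.default_rng(4)
X = rng.normal(size=(200, 8))
X = X - X.mean(axis=0)

# First principal component scores of the full data.
_, _, Vt = np.linalg.svd(X, full_matrices=False)
pc1 = X @ Vt[0]

# Project (regress) the component onto a subset of the variables; the
# fitted values are a genuinely sparse component.
subset = [0, 2, 5]
coef, *_ = np.linalg.lstsq(X[:, subset], pc1, rcond=None)
sparse_pc = X[:, subset] @ coef
print("correlation with PC1:", np.corrcoef(sparse_pc, pc1)[0, 1])
```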
Sparse Principal Components Analysis aims to find principal components with few non-zero loadings. We derive such sparse solutions by adding a genuine sparsity requirement to the original principal components analysis (PCA) objective function. This approach differs from others because it preserves PCA's optimality properties: uncorrelatedness of the components and Least Squares approximation of the data. To identify the best subset of loadings we propose a Branch-and-Bound search and an iterative elimination algorithm. This last algorithm finds solutions with large loadings and can be run without specifying...
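In my own notation (a paraphrase of the least-squares optimality the abstract invokes, not the paper's formulas), the first sparse component $z = Xa$ of a centered data matrix $X$ with covariance $S$ can be sought as

$$\max_{a \neq 0}\; \frac{a^{\top} S^{2} a}{a^{\top} S a} \quad \text{subject to} \quad \lVert a \rVert_{0} \le c,$$

where $c$ caps the number of non-zero loadings; this ratio is the proportion of the data's sum of squares recovered by regressing $X$ on $z$, and later components add uncorrelatedness constraints $a_j^{\top} S a_k = 0$. Searching over the subsets of non-zero positions is what the Branch-and-Bound and elimination algorithms do.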
This study compares final grade results across two different cohorts of accounting students, one taught using a traditional lecture model and the other using inter-teaching, an innovative pedagogy. Inter-teaching was designed by Boyce and Hineline (2002) to engage students in their learning and enhance academic performance. Accounting courses historically have had a record of high failure rates at the offshore campus of an Australian university in Vietnam. Final grade comparisons were made between students exposed to inter-teaching and those taught under the lecture-tutorial model. The treatments...
The semiconductor device manufacturing process involves a large number of sophisticated production steps that have to be controlled precisely so as to achieve and maintain the required yield and product quality specifications. Therefore, it is critical to employ extensive data collection from numerous process-associated sensors. The data are then utilized by a process control system (PCS) to monitor the process. This paper presents a real-world study performed at one of the major semiconductor manufacturers in which massive in-line sensor data were...
We propose a new sparse principal component analysis (SPCA) method in which the solutions are obtained by projecting the full cardinality components onto subsets of the variables. The resulting components are guaranteed to explain a given proportion of variance, and their computation is very efficient. The proposed method compares well with the optimal least squares sparse components. We show that other SPCA methods fail to identify the best sparse approximations and explain less variance than our solutions. We illustrate and compare the methods on a real dataset containing socioeconomic data...
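A rough numeric check of the comparison the abstract describes, using the least squares measure of variance explained by a single component (the subset is chosen by hand here; the method's actual subset selection is not reproduced):

```python
import numpy as np

def ls_explained(X, z):
    # Proportion of the total sum of squares of centered X explained by
    # the least squares fit of X on the single component z.
    return np.sum((X.T @ z) ** 2) / (z @ z) / np.sum(X ** 2)

rng = np.random.default_rng(5)
X = rng.normal(size=(300, 10))
X = X - X.mean(axis=0)

_, _, Vt = np.linalg.svd(X, full_matrices=False)
pc1 = X @ Vt[0]                       # full first principal component

subset = [1, 3, 4]                    # illustrative subset of variables
coef, *_ = np.linalg.lstsq(X[:, subset], pc1, rcond=None)
sparse1 = X[:, subset] @ coef         # PC1 projected onto the subset

print("explained, full PC1:", ls_explained(X, pc1))
print("explained, sparse:  ", ls_explained(X, sparse1))
```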
Hyperparameter tuning plays a crucial role in optimizing the performance of predictive learners. Cross-validation (CV) is a widely adopted technique for estimating the error of different hyperparameter settings. Repeated cross-validation (RCV) has been commonly employed to reduce the variability of CV errors. In this paper, we introduce a novel approach called blocked cross-validation (BCV), where the repetitions are blocked with respect to both the CV partition and the random behavior of the learner. Theoretical analysis and empirical experiments demonstrate...
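One plausible minimal reading of the blocking idea, sketched with scikit-learn (illustrative, not the paper's exact procedure): every hyperparameter setting is scored on the same sequence of CV partitions with the same learner seeds, so comparisons between settings are paired rather than confounded by random variation.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import KFold, cross_val_score

X, y = make_classification(n_samples=300, random_state=0)

settings = [5, 50]                # candidate n_estimators values
scores = {s: [] for s in settings}
for rep in range(5):              # repetitions of the whole CV
    cv = KFold(n_splits=5, shuffle=True, random_state=rep)  # shared partition
    for s in settings:
        clf = RandomForestClassifier(n_estimators=s, random_state=rep)  # shared seed
        scores[s].append(cross_val_score(clf, X, y, cv=cv).mean())

for s in settings:
    print(f"n_estimators={s}: mean CV accuracy {np.mean(scores[s]):.3f}")
```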
The topic of this tutorial is Least Squares Sparse Principal Components Analysis (LS SPCA), a simple method for computing approximated principal components that are combinations of only a few of the observed variables. Analogously to Principal Components, these components are uncorrelated and sequentially best approximate the dataset. The derivation of LS SPCA is intuitive for anyone familiar with linear regression. Since it is based on a different optimality criterion from other SPCA methods, it does not suffer from their serious drawbacks. I will demonstrate on two datasets how...