- Inertial Sensors and Navigation
- Machine Learning and Data Classification
- Statistical and Numerical Algorithms
- Target Tracking and Data Fusion in Sensor Networks
- Structural Health Monitoring Techniques
- Market Dynamics and Volatility
- Financial Markets and Investment Strategies
- Sepsis Diagnosis and Treatment
- COVID-19 Clinical Research Studies
- Monetary Policy and Economic Impact
- Explainable Artificial Intelligence (XAI)
- Cancer-Related Molecular Mechanisms Research
- Gene Expression and Cancer Classification
- GNSS Positioning and Interference
- MicroRNA in Disease Regulation
- Advanced Multi-Objective Optimization Algorithms
- Decision-Making and Behavioral Economics
- Long-Term Effects of COVID-19
- Forecasting Techniques and Applications
- Data Stream Mining Techniques
- Scientometrics and Bibliometrics Research
- Music and Audio Processing
- Data Analysis with R
- Circular RNAs in Diseases
- Scientific Measurement and Uncertainty Evaluation
Auburn University
2021-2024
École de management de Lyon
2020-2023
University of Geneva
2017-2021
University of Lausanne
2020
Abstract In statistics, samples are drawn from a population in a data-generating process (DGP). Standard errors measure the uncertainty in estimates of parameters. In science, evidence is generated to test hypotheses in an evidence-generating process (EGP). We claim that EGP variation across researchers adds uncertainty: nonstandard errors (NSEs). We study NSEs by letting 164 teams test the same hypotheses on the same data. NSEs turn out to be sizable, but smaller for more reproducible or higher rated research. Adding peer-review stages reduces NSEs....
We propose a new approach for designing personalized treatment for colorectal cancer (CRC) patients by combining ex vivo organoid efficacy testing with mathematical modeling of the results. The validated phenotypic approach called Therapeutically Guided Multidrug Optimization (TGMO) was used to identify four low-dose synergistic optimized drug combinations (ODC) in 3D human CRC models of cells that are either sensitive or resistant to first-line chemotherapy (FOLFOXIRI). Our findings were obtained using second...
The calibration of low-cost inertial sensors has become increasingly important over the last couple of decades, especially when dealing with sensor stochastic errors. This procedure is commonly performed on a single error measurement from a sensor taken over a certain amount of time, although it is extremely frequent for different replicates to be taken for the same sensor, thereby delivering information which is often left unused. In order to address this latter problem, this paper presents a general wavelet variance-based framework...
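As a rough illustration of the idea, here is a minimal Python sketch (not the paper's actual framework) that computes a non-overlapping Haar wavelet variance per replicate and then pools the replicates by simple averaging; the scale grid, the normalization, and the equal-weight pooling are all assumptions made for the example.

```python
import numpy as np

def haar_wavelet_variance(x, max_level=None):
    """Non-overlapping Haar wavelet variance at dyadic scales tau_j = 2^j."""
    x = np.asarray(x, dtype=float)
    J = max_level or int(np.log2(x.size)) - 1
    scales, wv = [], []
    for j in range(1, J + 1):
        tau = 2 ** j
        k = x.size // tau
        if k < 2:
            break
        halves = x[: k * tau].reshape(k, 2, tau // 2).mean(axis=2)
        w = 0.5 * (halves[:, 0] - halves[:, 1])   # Haar coefficients
        scales.append(tau)
        wv.append(np.mean(w ** 2))
    return np.array(scales), np.array(wv)

def pooled_wavelet_variance(replicates):
    """Equal-weight pooling across replicates: a crude stand-in for a
    principled joint framework, just to show the otherwise unused information."""
    results = [haar_wavelet_variance(r) for r in replicates]
    J = min(len(s) for s, _ in results)
    return results[0][0][:J], np.mean([v[:J] for _, v in results], axis=0)

# Five replicates of white noise; theory gives wv(tau) = sigma^2 / tau.
rng = np.random.default_rng(0)
reps = [rng.normal(0.0, 1.0, 2 ** 14) for _ in range(5)]
scales, wv = pooled_wavelet_variance(reps)
print(np.c_[scales, wv, 1.0 / scales])    # estimate vs. theoretical value
```

Averaging per-replicate estimates already stabilizes the variance curve at large scales, which is where a single replicate has very few wavelet coefficients.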
We develop a penalized two-pass regression with time-varying factor loadings. The penalization in the first pass enforces sparsity for the time-variation drivers while also maintaining compatibility with the no-arbitrage restrictions by regularizing appropriate groups of coefficients. The second pass delivers risk premia estimates to predict equity excess returns. Our Monte Carlo results and our empirical results on a large cross-sectional data set of US individual stocks show that penalization without grouping can yield nearly all...
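For intuition only, the following is a minimal two-pass sketch on synthetic data with constant loadings: a plain lasso first pass stands in for the paper's grouped penalization of time-variation drivers, and an OLS second pass recovers the risk premia. All sizes, penalty levels, and the scikit-learn estimators are illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import Lasso, LinearRegression

rng = np.random.default_rng(1)
T, N, K = 600, 40, 3                      # periods, assets, factors (toy sizes)
factors = rng.standard_normal((T, K))     # mean-zero factor realizations
true_beta = rng.normal(1.0, 0.5, (N, K))  # constant loadings (simplification)
true_premia = np.array([0.5, 0.3, 0.0])   # third factor is not priced
returns = true_beta @ true_premia + factors @ true_beta.T \
    + rng.normal(0.0, 1.0, (T, N))        # r_it = beta_i'lambda + beta_i'f_t + eps

# First pass: penalized time-series regressions, one per asset.
betas = np.vstack([
    Lasso(alpha=0.01).fit(factors, returns[:, i]).coef_ for i in range(N)
])

# Second pass: cross-sectional regression of average returns on loadings.
premia = LinearRegression(fit_intercept=False).fit(
    betas, returns.mean(axis=0)).coef_
print("estimated risk premia:", premia.round(2))
print("true risk premia:     ", true_premia)
```

The paper's contribution lies precisely in what this sketch omits: loadings driven by time-varying instruments and a group structure on the penalty that keeps the first pass consistent with no-arbitrage.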
The task of inertial sensor calibration has required the development of various techniques to take into account the sources of measurement error coming from such devices. The stochastic errors of these sensors have been the focus of an increasing amount of research in which the method of reference is the so-called “Allan variance (AV) slope method” which, in addition to not having appropriate statistical properties, requires a subjective input which makes it prone to mistakes. To overcome this, recent research has started proposing “automatic” approaches where...
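To make the criticized baseline concrete, here is a small sketch of the non-overlapping Allan variance and the log-log slope fit on which the "AV slope method" relies; the scale grid and the white-noise example are assumptions for illustration.

```python
import numpy as np

def allan_variance(x, taus):
    """Non-overlapping Allan variance of a rate signal x at cluster sizes taus."""
    x = np.asarray(x, dtype=float)
    av = []
    for m in taus:
        k = x.size // m
        means = x[: k * m].reshape(k, m).mean(axis=1)  # cluster averages
        av.append(0.5 * np.mean(np.diff(means) ** 2))
    return np.array(av)

# White noise has AV(tau) = sigma^2 / tau, i.e. a log-log slope of -1; the
# "slope method" reads the noise magnitude off a line fitted to that region.
rng = np.random.default_rng(0)
x = rng.normal(0.0, 0.5, 2 ** 16)
taus = 2 ** np.arange(1, 12)
av = allan_variance(x, taus)
slope = np.polyfit(np.log10(taus), np.log10(av), 1)[0]
print(f"fitted log-log slope: {slope:.2f} (theory: -1 for white noise)")
# The subjective input the abstract refers to is the choice of which
# scales to include in this fit.
```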
Inertial sensor calibration plays a progressively important role in many areas of research, among which navigation engineering. By performing this task accurately, it is possible to significantly increase general navigation performance by correctly filtering out the deterministic and stochastic measurement errors that characterize such devices. While different techniques are available to model and remove these errors, there has been considerable research over the past years with respect to modelling errors that have complex structures. In...
In statistics, samples are drawn from a population in a data-generating process (DGP). Standard errors measure the uncertainty in sample estimates of population parameters. In science, evidence is generated to test hypotheses in an evidence-generating process (EGP). We claim that EGP variation across researchers adds uncertainty: non-standard errors. To study them, we let 164 teams test six hypotheses on the same sample. We find them to be sizeable, on par with standard errors. Their size (i) co-varies only weakly with team merits, reproducibility, or peer rating, (ii)...
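A toy numerical sketch of the distinction: the standard error measures the sampling uncertainty of one team's estimate, while the non-standard error is the dispersion of point estimates across teams making different, equally defensible analytic choices on the same data. The trimming rules below stand in for researcher degrees of freedom; all numbers are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(42)
data = rng.normal(0.5, 2.0, 1_000)   # one shared sample

# Standard error: sampling uncertainty of a single team's estimate.
standard_error = data.std(ddof=1) / np.sqrt(data.size)

# Non-standard error: dispersion across "teams" that analyze the SAME
# data with different, defensible choices; here, symmetric trimming rules.
def trimmed_mean(x, p):
    lo, hi = np.quantile(x, [p, 1.0 - p])
    return x[(x >= lo) & (x <= hi)].mean()

team_estimates = [trimmed_mean(data, p) for p in (0.0, 0.01, 0.02, 0.05, 0.10)]
non_standard_error = np.std(team_estimates, ddof=1)

print(f"standard error:     {standard_error:.4f}")
print(f"non-standard error: {non_standard_error:.4f}")
```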
The use of Inertial Measurement Units (IMU) for navigation purposes is constantly growing and they are increasingly being considered as the core dynamic sensing device of Inertial Navigation Systems (INS). However, these systems are characterized by sensor errors that can affect the precision of these devices and consequently a proper calibration of the sensors is required. The first step in this direction is usually taken by evaluating deterministic types of errors, such as bias and scale factor, which can be taken into account through known physical models....
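A minimal sketch of this deterministic first step, assuming the common per-axis affine error model y = s * x + b estimated from a calibration run with known reference inputs (the rate-table values and noise level below are made up):

```python
import numpy as np

def calibrate_affine(measured, reference):
    """Least-squares fit of scale s and bias b from a calibration run with
    known reference inputs; returns a function that corrects new readings."""
    A = np.c_[reference, np.ones_like(reference)]
    s, b = np.linalg.lstsq(A, measured, rcond=None)[0]
    return lambda y: (y - b) / s, s, b

rng = np.random.default_rng(3)
ref = np.linspace(-10, 10, 200)                    # e.g. rate-table inputs
meas = 1.02 * ref + 0.15 + rng.normal(0, 0.05, ref.size)
correct, s, b = calibrate_affine(meas, ref)
print(f"scale={s:.3f}, bias={b:.3f}")              # ~1.02 and ~0.15
print("corrected residual std:", np.std(correct(meas) - ref))
```

Whatever remains after this correction is the stochastic error signal that the modelling techniques in the abstracts above are designed to handle.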
The task of inertial sensor calibration has become increasingly important due to the growing use of low-cost inertial measurement units which are however characterized by measurement errors. Being widely employed in a variety of mass-market applications, there is considerable focus on compensating for these errors by taking into account the deterministic and stochastic factors that characterize them. In this paper we focus on the stochastic part of the error signal, for which it is customary to register the latter and, based on the observed signal, identify and estimate models, often of a complex nature,...
Predictive power has always been the main research focus of learning algorithms, with the goal of minimizing the test error for supervised classification and regression problems. While the general approach to these problems is to consider all possible attributes in a dataset to best predict the response of interest, an important branch of research has focused on sparse learning in order to avoid overfitting, which can greatly affect the accuracy of out-of-sample prediction. However, in many practical settings we believe that only an extremely small combination of different...
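In that spirit, here is a brute-force sketch that scores every attribute subset up to a small size by cross-validated accuracy and keeps the best ones; algorithms in this line of work typically grow combinations greedily rather than exhaustively, and the dataset and learner below are placeholders.

```python
import numpy as np
from itertools import combinations
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=300, n_features=12, n_informative=3,
                           random_state=0)

def screen_small_subsets(X, y, max_size=2, keep=5):
    """Score every attribute subset up to max_size by 5-fold CV accuracy
    and keep the best ones: a brute-force stand-in for a greedy wrapper
    that only grows promising low-dimensional combinations."""
    scored = []
    for k in range(1, max_size + 1):
        for subset in combinations(range(X.shape[1]), k):
            acc = cross_val_score(LogisticRegression(max_iter=1000),
                                  X[:, list(subset)], y, cv=5).mean()
            scored.append((acc, subset))
    return sorted(scored, reverse=True)[:keep]

for acc, subset in screen_small_subsets(X, y):
    print(f"attributes {subset}: CV accuracy {acc:.3f}")
```

The output is not one model but a shortlist of small, near-equivalent attribute combinations, which is exactly the kind of object the interpretability-oriented abstracts below argue for.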
The task of inertial sensor calibration has always been challenging, especially when dealing with the stochastic errors that remain after the deterministic ones have been filtered out. Among others, the number of observations is becoming increasingly high since measurements are taken at high frequencies over longer periods of time, thereby placing considerable limitations on the estimation of the complex models that characterize these errors (without considering testing and model selection procedures). Moreover, before estimating these models, there is a...
Abstract Non-coding micro RNA (miRNA) dysregulation seems to play an important role in the pathways involved in breast cancer occurrence and progression. In different studies, opposite functions may be assigned to the same miRNA, either promoting the disease or protecting from it. Our research tackles the following issues: (i) why aren’t there any concordant findings across many studies regarding the role of miRNAs in the progression of cancer? (ii) could a miRNA have an activating effect or an inhibiting one according to the other miRNAs with which it...
Abstract Breast cancer is one of the most frequent cancers affecting women. Non-coding micro RNAs (miRNAs) seem to play an important role in the regulation of pathways involved in tumor occurrence and progression. Extending the research of Haakensen et al., where significant miRNAs were selected as being associated with the progression from normal breast tissue to cancer, in this work we put forward 112 sets of miRNA combinations, each including at most 5 miRNA expressions, with high accuracy in discriminating healthy tissue from carcinoma. Our...
Abstract Objective To set up simple and reliable predictive scores for intensive care admissions and deaths in COVID-19 patients. These scores adhere to the TRIPOD (transparent reporting of a multivariable prediction model for individual prognosis or diagnosis) guidelines. Design Monocentric retrospective cohort study run from early March to the end of May at the Clinique Saint-Pierre Ottignies, a secondary hospital located in Ottignies-Louvain-la-Neuve, Belgium. The outcomes are (i) admission to the Intensive Care Unit and (ii) death....
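For illustration of how such scores are typically constructed (this is not the paper's score, cohort, or variable set), here is a sketch that fits a multivariable logistic model on synthetic admission variables and rounds its coefficients into integer points, as is commonly done to make a model usable at the bedside.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic stand-ins for admission variables; the paper's actual
# predictors and cohort are not reproduced here.
rng = np.random.default_rng(7)
n = 500
X = np.c_[rng.normal(62, 15, n),        # age (years)
          rng.normal(94, 4, n),         # SpO2 (%)
          rng.normal(80, 40, n)]        # CRP (mg/L)
logit = 0.04 * (X[:, 0] - 62) - 0.20 * (X[:, 1] - 94) \
    + 0.01 * (X[:, 2] - 80) - 1.5
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))   # toy ICU-admission outcome

model = LogisticRegression(max_iter=1000).fit(X, y)

# Points-based score: rescale and round the fitted coefficients.
points = np.round(model.coef_[0] / np.abs(model.coef_[0]).min())
print("integer points per variable:", points)
print("risk for a 75-year-old, SpO2 88%, CRP 150:",
      model.predict_proba([[75, 88, 150]])[0, 1].round(2))
```

TRIPOD-compliant reporting would additionally require stating how the model was derived, validated, and calibrated, which the code above does not attempt.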
Abstract Background Simple and reliable predictive scores for intensive care admissions and death based on clinical data are still lacking. The goal of this study is to implement such scores for patients coming from our population catchment area and to compare them with available ones. These scores adhere to the TRIPOD (transparent reporting of a multivariable prediction model for individual prognosis or diagnosis) guidelines. Methods Monocentric retrospective cohort study run from early March to the end of May in the Clinique Saint-Pierre Ottignies,...
The majority of machine learning methods and algorithms give high priority to prediction performance, which may not always correspond to the needs of the users. In many cases, practitioners and researchers in different fields, going from engineering to genetics, require interpretability and replicability of the results, especially in settings where, for example, not all attributes may be available to them. As a consequence, there is a need to make the outputs more interpretable and to deliver a library of "equivalent" learners (in terms of prediction performance) that users...