Lei Shi

ORCID: 0000-0002-9512-5273
Research Areas
  • Sparse and Compressive Sensing Techniques
  • Numerical methods in inverse problems
  • Statistical Methods and Inference
  • Neural Networks and Applications
  • Distributed Sensor Networks and Detection Algorithms
  • Face and Expression Recognition
  • Stochastic Gradient Optimization Techniques
  • Image and Signal Denoising Methods
  • Advanced Algorithms and Applications
  • Microwave Imaging and Scattering Analysis
  • Control Systems and Identification
  • Model Reduction and Neural Networks
  • Machine Learning and Algorithms
  • Fault Detection and Control Systems
  • Blind Source Separation Techniques
  • Matrix Theory and Algorithms
  • Numerical methods in engineering
  • Network Security and Intrusion Detection
  • Bayesian Methods and Mixture Models
  • Anomaly Detection Techniques and Applications
  • Bayesian Modeling and Causal Inference
  • Photoacoustic and Ultrasonic Imaging
  • Cardiac electrophysiology and arrhythmias
  • Flavonoids in Medical Research
  • Diverse Scientific and Engineering Research

Fudan University
2014-2024

Hong Kong Baptist University
2024

North China Electric Power University
2015

Guangxi University
2015

KU Leuven
2013-2014

iMinds
2014

City University of Hong Kong
2009-2011

University of Science and Technology of China
2009-2010

Suzhou University of Science and Technology
2010

Zhongyuan University of Technology
2009

10.1016/j.acha.2011.01.001 article EN publisher-specific-oa Applied and Computational Harmonic Analysis 2011-01-13

Within the statistical learning framework, this paper studies the regression model associated with correntropy-induced losses. The correntropy, as a similarity measure, has been frequently employed in signal processing and pattern recognition. Motivated by its empirical successes, this paper aims at presenting some theoretical understanding of maximum correntropy criterion problems in regression. Our focus is two-fold: first, we are concerned with the connections between the correntropy-induced loss and the least squares regression model. Second, we study its convergence property. A...

10.5555/2789272.2886783 article EN Journal of Machine Learning Research 2015-01-01
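
A minimal sketch of a correntropy-induced loss of the kind studied above, assuming the common parameterization ℓ_σ(y, t) = σ²(1 − exp(−(y − t)²/σ²)); the function name and example values are illustrative only:

```python
import numpy as np

def correntropy_loss(y, y_hat, sigma=1.0):
    """Correntropy-induced loss with scale parameter sigma (one common form).

    For large sigma it behaves like the least squares loss; for small sigma
    large residuals are down-weighted, and the loss is bounded by sigma**2,
    which is the source of its robustness to outliers.
    """
    r = y - y_hat
    return sigma**2 * (1.0 - np.exp(-r**2 / sigma**2))

# A gross outlier (residual 10) contributes at most sigma**2 = 1.
print(correntropy_loss(np.array([0.1, 0.5, 10.0]), 0.0, sigma=1.0))
```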

The ramp loss is a robust but non-convex loss for classification. Compared with other non-convex losses, a local minimum of the ramp loss can be effectively found. The effectiveness of the local search comes from the piecewise linearity of the ramp loss. Motivated by the fact that the l1-penalty is piecewise linear as well, the l1-penalty is applied to the ramp loss, resulting in a ramp loss linear programming support vector machine (ramp-LPSVM). The proposed ramp-LPSVM is a piecewise linear minimization problem and the related piecewise linear optimization techniques are applicable. Moreover, the l1-penalty can enhance sparsity. In this paper, the corresponding misclassification error...

10.5555/2627435.2670321 article EN Journal of Machine Learning Research 2014-01-01
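
A small illustration of the ramp loss versus the hinge loss, assuming the standard truncation form R_s = H_1 − H_s with H_s(u) = max(0, s − u); names and values are illustrative:

```python
import numpy as np

def hinge_loss(margin):
    # Convex but unbounded: a single outlier can dominate the objective.
    return np.maximum(0.0, 1.0 - margin)

def ramp_loss(margin, s=0.0):
    """Ramp loss R_s = H_1 - H_s with H_s(u) = max(0, s - u) and s < 1.

    It is the hinge loss truncated at level 1 - s, so badly misclassified
    points contribute a bounded amount; the price is non-convexity, but the
    loss stays piecewise linear, which keeps local search effective.
    """
    return hinge_loss(margin) - np.maximum(0.0, s - margin)

margins = np.array([-5.0, -0.5, 0.5, 2.0])   # margin = y * f(x)
print(hinge_loss(margins))   # [6.  1.5 0.5 0. ]
print(ramp_loss(margins))    # [1.  1.  0.5 0. ]
```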

Noise-inclusive fully unsupervised anomaly detection (FUAD) holds significant practical relevance. Although various methods exist to address this problem, they are limited in both performance and scalability. Our work seeks to overcome these obstacles, enabling broader adaptability of unsupervised anomaly detection (UAD) models to FUAD. To achieve this, we introduce the Synergy Scoring Filter (SSFilter), the first approach to leverage sample-level filtering. SSFilter facilitates end-to-end robust training and applies filtering to the complete...

10.48550/arxiv.2502.13992 preprint EN arXiv (Cornell University) 2025-02-18
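
A generic sample-level filtering step of the kind the abstract alludes to (an illustrative sketch only, not the SSFilter algorithm itself): score the unlabelled training samples and drop those most likely to be anomalies before (re)training.

```python
import numpy as np

def filter_training_samples(anomaly_scores, contamination=0.1):
    """Keep samples whose scores fall below a quantile threshold, discarding
    the assumed fraction of contaminating anomalies in the training set.
    (Illustrative only; the paper's SSFilter uses its own scoring scheme.)
    """
    threshold = np.quantile(anomaly_scores, 1.0 - contamination)
    return anomaly_scores < threshold

scores = np.array([0.05, 0.10, 0.90, 0.20, 0.85, 0.15])
print(filter_training_samples(scores, contamination=0.3))
```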

By selecting different filter functions, spectral algorithms can generate various regularization methods to solve statistical inverse problems within the learning-from-samples framework. This paper combines distributed spectral algorithms with Sobolev kernels to tackle the functional linear regression problem. The design and mathematical analysis of the algorithms require only that the functional covariates are observed at discrete sample points. Furthermore, the hypothesis function spaces are generated by the Sobolev kernels, optimizing both approximation...

10.1088/1361-6420/adbd6b article EN Inverse Problems 2025-03-06
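
A sketch of the filter-function idea behind spectral algorithms, shown for plain kernel regression rather than the functional linear regression setting of the paper; the normalization used here is one common convention and is an assumption:

```python
import numpy as np

def spectral_coefficients(K, y, filter_fn):
    """Spectral algorithm: build expansion coefficients by applying a filter
    function (an approximation of t -> 1/t on the spectrum) to the
    eigenvalues of the normalized kernel matrix K / n.
    """
    n = len(y)
    evals, evecs = np.linalg.eigh(K / n)
    return evecs @ (filter_fn(evals) * (evecs.T @ y)) / n

# Different filters give different regularization methods, e.g. the Tikhonov
# filter g_lam(t) = 1 / (t + lam) reproduces kernel ridge regression.
lam = 0.1
tikhonov = lambda t: 1.0 / (t + lam)
```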

10.1016/j.acha.2012.05.001 article EN publisher-specific-oa Applied and Computational Harmonic Analysis 2012-05-03

Applying the pinball loss in a support vector machine (SVM) classifier results in pin-SVM. The pin-SVM is characterized by a parameter τ. Its value is related to the quantile level, and different τ values are suitable for different problems. In this paper, we establish an algorithm to find the entire solution path for pin-SVM with different τ values. This algorithm is based on the fact that the optimal solution is continuous and piecewise linear with respect to τ. We also show that the nonnegativity constraint on τ is not necessary, i.e., τ can be extended to negative values. First, in some applications, a negative τ leads to better...

10.1109/tnnls.2016.2547324 article EN IEEE Transactions on Neural Networks and Learning Systems 2016-04-09
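
A minimal sketch of the pinball loss that defines pin-SVM, written on the margin variable u = 1 − y f(x); function name and example values are illustrative:

```python
import numpy as np

def pinball_loss(u, tau=0.5):
    """Pinball loss: pin_tau(u) = u for u >= 0 and -tau * u for u < 0.

    With tau = 0 it reduces to the hinge loss max(0, u); negative tau values
    are also admissible, as the paper discusses.
    """
    return np.where(u >= 0.0, u, -tau * u)

u = np.array([-2.0, -0.5, 0.0, 1.5])   # u = 1 - y * f(x)
print(pinball_loss(u, tau=0.5))        # [1.   0.25 0.   1.5 ]
```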

10.1007/s10208-023-09616-9 article EN Foundations of Computational Mathematics 2023-07-26

10.1016/j.acha.2017.06.001 article EN publisher-specific-oa Applied and Computational Harmonic Analysis 2017-06-22

The ranking problem aims at learning real-valued functions to order instances, which has attracted great interest in statistical learning theory. In this paper, we consider the regularized least squares ranking algorithm within the framework of reproducing kernel Hilbert spaces. In particular, we focus on the analysis of the generalization error for this algorithm, and improve the existing learning rates by virtue of an error decomposition technique from regression and Hoeffding's inequality for U-statistics.

10.1142/s0219530517500063 article EN Analysis and Applications 2017-04-11
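
A sketch of the empirical objective behind regularized least squares ranking, assuming the usual pairwise formulation (squared errors of predicted versus observed differences, averaged over ordered pairs); the exact normalization is an assumption:

```python
import numpy as np

def ranking_rls_objective(f_vals, y, lam, f_norm_sq):
    """Pairwise least squares ranking objective (sketch): compare predicted
    differences f(x_i) - f(x_j) with observed differences y_i - y_j over all
    pairs, then add an RKHS norm penalty lam * ||f||^2.
    """
    n = len(y)
    diff_pred = f_vals[:, None] - f_vals[None, :]
    diff_true = y[:, None] - y[None, :]
    empirical = ((diff_true - diff_pred) ** 2).sum() / (n * (n - 1))
    return empirical + lam * f_norm_sq
```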

10.1007/s40305-014-0069-4 article EN Journal of the Operations Research Society of China 2015-02-25

10.1007/s10444-012-9288-6 article EN Advances in Computational Mathematics 2012-11-29

In this paper, we study the gradient descent algorithm generated by a robust loss function over a reproducing kernel Hilbert space (RKHS). The loss is defined by a windowing function G and a scale parameter σ, which can include a wide range of commonly used robust losses for regression. There is still a gap between the theoretical analysis of the optimization process and that of empirical risk minimization based on the loss: the estimator needs to be globally optimal in the latter, while the gradient descent method does not ensure the global optimality of its solutions. We aim to fill this gap by developing a novel performance...

10.1088/1361-6420/aabe55 article EN Inverse Problems 2018-04-16
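
A sketch of gradient descent in an RKHS for one robust loss of the windowed form the abstract describes, here the correntropy-type choice loss(r) = σ²(1 − exp(−r²/σ²)); step size, iteration count, and names are illustrative assumptions:

```python
import numpy as np

def kernel_gd_robust(K, y, sigma=1.0, eta=0.5, steps=200):
    """Gradient descent over the RKHS with a correntropy-type robust loss.

    The iterate is kept as f(x) = sum_i alpha[i] * k(x_i, x); each step moves
    alpha along the derivative of the loss evaluated at the current residuals.
    """
    n = len(y)
    alpha = np.zeros(n)
    for _ in range(steps):
        residual = K @ alpha - y
        # loss(r) = sigma^2 (1 - exp(-r^2/sigma^2))  =>  loss'(r) = 2 r exp(-r^2/sigma^2)
        grad = 2.0 * residual * np.exp(-residual**2 / sigma**2)
        alpha -= eta * grad / n
    return alpha
```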

The main contribution of this paper is the derivation of non-asymptotic convergence rates for Nyström kernel canonical correlation analysis (CCA) in a setting of statistical learning. Our theoretical results reveal that, under certain conditions, Nyström kernel CCA can achieve a convergence rate comparable to that of standard kernel CCA, while offering significant computational savings. This finding has important implications for the practical application of kernel CCA, particularly in scenarios where computational efficiency is crucial. Numerical experiments are...

10.1088/1361-6420/ad2900 article EN Inverse Problems 2024-02-13
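
A sketch of the Nyström step that makes kernel CCA scalable: map each view to finite-dimensional features built from a small landmark set, then run ordinary linear CCA on those features. The RBF kernel and landmark choice below are assumptions for illustration.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def nystrom_features(X, landmarks, gamma=1.0, reg=1e-8):
    """Nystrom feature map: inner products of these features approximate the
    full kernel matrix, K ~ K_nm K_mm^{-1} K_nm^T, at cost O(n m^2) for m
    landmarks instead of the cost of exact kernel CCA on all n samples.
    """
    K_nm = rbf_kernel(X, landmarks, gamma)
    K_mm = rbf_kernel(landmarks, landmarks, gamma)
    evals, evecs = np.linalg.eigh(K_mm + reg * np.eye(len(landmarks)))
    return K_nm @ (evecs / np.sqrt(evals)) @ evecs.T   # equals K_nm K_mm^{-1/2}

# Applying linear CCA to nystrom_features(X_view1, ...) and
# nystrom_features(X_view2, ...) approximates kernel CCA on the two views.
```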

10.1016/j.acha.2017.11.005 article EN publisher-specific-oa Applied and Computational Harmonic Analysis 2017-12-02

10.1007/s10444-024-10165-0 article Advances in Computational Mathematics 2024-07-10

10.1016/j.mcm.2011.03.042 article EN publisher-specific-oa Mathematical and Computer Modelling 2011-04-19

In recent years there has been massive interest in precision medicine, which aims to tailor treatment plans to the individual characteristics of each patient. This paper studies the estimation of individualized treatment rules (ITR) based on functional predictors such as images or spectra. We consider a reproducing kernel Hilbert space (RKHS) approach to learn the optimal ITR that maximizes the expected clinical outcome. The algorithm can be conveniently implemented although it involves infinite-dimensional functional data. We provide...

10.3934/mfc.2019012 article EN Mathematical Foundations of Computing 2019-01-01
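
A small sketch of the quantity an optimal ITR maximizes, the expected clinical outcome under the rule, estimated here with a standard inverse-probability-weighted estimator (a generic choice, not necessarily the estimator used in the paper):

```python
import numpy as np

def itr_value(outcomes, treatments, propensities, rule_assignments):
    """Inverse-probability-weighted estimate of the value of a treatment rule:
    the mean outcome expected if every patient were treated according to the
    rule. Larger outcomes are assumed to be better.
    """
    agree = (treatments == rule_assignments).astype(float)
    return float(np.mean(outcomes * agree / propensities))
```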

We investigate distributed learning with a coefficient-based regularization scheme under the framework of kernel regression methods. Compared with the classical kernel ridge regression (KRR), the algorithm under consideration does not require the kernel function to be positive semi-definite and hence provides a simple paradigm for designing indefinite kernel methods. The approach partitions a massive data set into several disjoint subsets, and then produces a global estimator by taking an average of the local estimators trained on each subset. Easily implemented by performing the subset training in parallel...

10.1142/s021953051850032x article EN Analysis and Applications 2019-01-09
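
A sketch of the divide-and-conquer scheme described above, using an l2 penalty on the expansion coefficients as one concrete coefficient-based regularizer (so the kernel matrix never needs to be positive semi-definite); kernel, partitioning, and parameter choices are assumptions:

```python
import numpy as np

def local_coefficient_estimator(K_sub, y_sub, lam):
    """Coefficient-based regularized least squares on one subset:
    minimize ||K alpha - y||^2 / n + lam * ||alpha||^2 over the coefficients,
    which is well defined for indefinite kernels as well.
    """
    n = len(y_sub)
    return np.linalg.solve(K_sub.T @ K_sub + lam * n * np.eye(n), K_sub.T @ y_sub)

def distributed_prediction(subsets, kernel, lam, x_new):
    """Global estimator: average the local predictions from each disjoint
    subset; the local problems can be solved in parallel.
    """
    preds = []
    for X_sub, y_sub in subsets:
        alpha = local_coefficient_estimator(kernel(X_sub, X_sub), y_sub, lam)
        preds.append(kernel(x_new[None, :], X_sub) @ alpha)
    return float(np.mean(preds))
```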

10.1016/j.cam.2009.11.059 article EN publisher-specific-oa Journal of Computational and Applied Mathematics 2009-12-07