Alexandre Belloni

ORCID: 0000-0001-9368-8833
Research Areas
  • Statistical Methods and Inference
  • Statistical Methods and Bayesian Inference
  • Advanced Statistical Methods and Models
  • Advanced Causal Inference Techniques
  • Monetary Policy and Economic Impact
  • Control Systems and Identification
  • Sparse and Compressive Sensing Techniques
  • Advanced Optimization Algorithms Research
  • Auction Theory and Applications
  • Consumer Market Behavior and Pricing
  • Markov Chains and Monte Carlo Methods
  • Bayesian Methods and Mixture Models
  • Spatial and Panel Data Analysis
  • Bayesian Modeling and Causal Inference
  • Advanced Statistical Process Monitoring
  • Health Systems, Economic Evaluations, Quality of Life
  • Probabilistic and Robust Engineering Design
  • Point processes and geometric inequalities
  • Supply Chain and Inventory Management
  • Fault Detection and Control Systems
  • Matrix Theory and Algorithms
  • Machine Learning and Algorithms
  • Game Theory and Applications
  • Statistical Methods in Clinical Trials
  • Complexity and Algorithms in Graphs

Duke University
2014-2024

Amazon (United States)
2024

Institute for Fiscal Studies
2009-2018

Columbia University
2018

University of California, Los Angeles
2018

Cornell University
2014-2017

Boston University
2016-2017

University of Warwick
2017

The Econometric Society
2016

University of Maryland, College Park
2015

We propose robust methods for inference about the effect of a treatment variable on a scalar outcome in the presence of very many regressors, in a model with possibly non-Gaussian and heteroscedastic disturbances. We allow the number of regressors to be larger than the sample size. To make informative inference feasible, we require the model to be approximately sparse; that is, we require that the confounding factors can be controlled for, up to a small approximation error, by including a relatively small number of variables whose identities are unknown. The latter condition makes it possible to estimate...

10.1093/restud/rdt044 article EN The Review of Economic Studies 2013-11-24
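The procedure above is often implemented as a "post-double-selection" step: select controls that predict the outcome, select controls that predict the treatment, and refit by OLS on the union of the two selected sets. The sketch below is an illustrative Python version, not the authors' code; the cross-validated penalty (LassoCV) and the helper name post_double_selection are my assumptions, whereas the paper develops a specific plug-in penalty and the accompanying robust inference theory.

```python
# Illustrative post-double-selection sketch (assumptions: LassoCV penalty, helper name).
import numpy as np
from sklearn.linear_model import LassoCV
import statsmodels.api as sm

def post_double_selection(y, d, X):
    """y: outcome (n,), d: treatment (n,), X: high-dimensional controls (n, p)."""
    sel_y = np.flatnonzero(LassoCV(cv=5).fit(X, y).coef_)   # controls predicting the outcome
    sel_d = np.flatnonzero(LassoCV(cv=5).fit(X, d).coef_)   # controls predicting the treatment
    union = np.union1d(sel_y, sel_d)                         # union of the two selected sets
    Z = sm.add_constant(np.column_stack([d, X[:, union]]))
    fit = sm.OLS(y, Z).fit(cov_type="HC3")                   # heteroscedasticity-robust OLS refit
    return fit.params[1], fit.bse[1]                         # treatment coefficient and its s.e.
```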

Square-root lasso: pivotal recovery of sparse signals via conic programming. A. Belloni (Fuqua School of Business, Duke University, Durham, North Carolina, U.S.A., abn5@duke.edu), V. Chernozhukov (Department of Economics, Massachusetts Institute of Technology, Cambridge, U.S.A., vchern@mit.edu), L. Wang (Department of Mathematics, Massachusetts Institute of Technology, Cambridge, U.S.A.)

10.1093/biomet/asr043 article EN Biometrika 2011-11-24
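The square-root lasso replaces the lasso's squared-error criterion with the root of the average squared residuals, so the penalty level can be set without knowing the noise level σ. A minimal sketch of that conic objective using cvxpy follows; the constant 1.1 and the normal-quantile penalty are an illustrative choice in the spirit of this literature, and sqrt_lasso is a hypothetical helper name.

```python
# Minimal square-root lasso sketch via cvxpy (penalty choice is illustrative).
import numpy as np
import cvxpy as cp
from scipy.stats import norm

def sqrt_lasso(X, y, lam=None, alpha=0.05):
    n, p = X.shape
    if lam is None:
        # pivotal-style penalty: does not involve the noise level sigma (assumed form)
        lam = 1.1 * np.sqrt(n) * norm.ppf(1 - alpha / (2 * p))
    b = cp.Variable(p)
    # objective: ||y - Xb||_2 / sqrt(n) + (lam / n) * ||b||_1, a second-order cone program
    objective = cp.norm(y - X @ b, 2) / np.sqrt(n) + (lam / n) * cp.norm1(b)
    cp.Problem(cp.Minimize(objective)).solve()
    return b.value
```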

Data with a large number of variables relative to the sample size ("high-dimensional data") are readily available and increasingly common in empirical economics. High-dimensional data arise through a combination of two phenomena. First, the data may be inherently high dimensional in that many different characteristics per observation are available. For example, the US Census collects information on hundreds of individual characteristics, and scanner datasets record transaction-level data for households across a wide range of products. Second, even...

10.1257/jep.28.2.29 article EN The Journal of Economic Perspectives 2014-05-01

We consider median regression and, more generally, a possibly infinite collection of quantile regressions in high-dimensional sparse models. In these models, the number of regressors p is very large, possibly larger than the sample size n, but only at most s regressors have a nonzero impact on each conditional quantile of the response variable, where s grows slowly with n. Since ordinary quantile regression is not consistent in this case, we consider ℓ1-penalized quantile regression (ℓ1-QR), which penalizes the ℓ1-norm of the coefficients, as well as the post-penalized QR estimator (post-ℓ1-QR), which applies ordinary QR to the model...

10.1214/10-aos827 article EN The Annals of Statistics 2010-12-03
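A small illustration of ℓ1-penalized quantile regression and its post-penalized refit, using scikit-learn's QuantileRegressor (check loss plus an ℓ1 penalty). The penalty level and the simulated design below are placeholders, not the theoretically tuned choice analyzed in the paper.

```python
# Illustrative l1-QR and post-l1-QR on simulated high-dimensional data.
import numpy as np
from sklearn.linear_model import QuantileRegressor

rng = np.random.default_rng(0)
n, p, s = 200, 500, 5
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:s] = 1.0                                   # only s regressors matter
y = X @ beta + rng.standard_normal(n)

# l1-penalized median regression (quantile 0.5); alpha is an illustrative penalty level
l1_qr = QuantileRegressor(quantile=0.5, alpha=0.05, solver="highs").fit(X, y)
support = np.flatnonzero(l1_qr.coef_)            # selected regressors

# post-l1-QR: refit an unpenalized median regression on the selected support
if support.size:
    post_qr = QuantileRegressor(quantile=0.5, alpha=0.0, solver="highs").fit(X[:, support], y)
```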

In this article we study post-model-selection estimators that apply ordinary least squares (OLS) to the model selected by first-step penalized estimators, typically Lasso. It is well known that Lasso can estimate the nonparametric regression function at a nearly oracle rate and is thus hard to improve upon. We show that the OLS post-Lasso estimator performs at least as well in terms of the rate of convergence, and has the advantage of a smaller bias. Remarkably, this performance occurs even if the Lasso-based model selection "fails" in the sense of missing some components of the "true"...

10.3150/11-bej410 article EN Bernoulli 2013-03-13
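The post-Lasso idea itself is short to state in code: run Lasso for selection, then refit OLS on the selected support to remove the shrinkage bias. A minimal sketch follows, with a fixed illustrative penalty and a hypothetical helper name post_lasso.

```python
# Minimal post-Lasso sketch: Lasso for selection, OLS refit on the selected support.
import numpy as np
from sklearn.linear_model import Lasso, LinearRegression

def post_lasso(X, y, alpha=0.1):
    support = np.flatnonzero(Lasso(alpha=alpha).fit(X, y).coef_)
    if support.size == 0:                 # nothing selected: return the empty model
        return support, np.array([])
    ols = LinearRegression().fit(X[:, support], y)
    return support, ols.coef_             # selected indices and unpenalized coefficients
```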

In this paper, we provide efficient estimators and honest confidence bands for a variety of treatment effects, including the local average treatment effect (LATE) and the local quantile treatment effect (LQTE), in data-rich environments. We can handle very many control variables, endogenous receipt of treatment, heterogeneous effects, and function-valued outcomes. Our framework covers the special case of exogenous receipt of treatment, either conditional on controls or unconditionally as in randomized trials. In the latter case, our approach produces estimators and bands for (functional) average treatment effects (ATE) and quantile treatment effects (QTE). To make...

10.3982/ecta12723 article EN Econometrica 2017-01-01
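For the exogenous special case mentioned above, a doubly robust (augmented inverse-propensity) estimator with penalized nuisance fits conveys the flavor of the approach. The sketch below is only that flavor, not the paper's estimator for LATE/LQTE with endogenous treatment; the LassoCV/LogisticRegressionCV nuisance fits and the name dr_ate are illustrative choices.

```python
# Rough doubly robust ATE sketch with high-dimensional controls (illustrative only).
import numpy as np
from sklearn.linear_model import LassoCV, LogisticRegressionCV

def dr_ate(y, d, X):
    """y: outcome (n,), d: 0/1 treatment indicator (n,), X: controls (n, p)."""
    m1 = LassoCV(cv=5).fit(X[d == 1], y[d == 1]).predict(X)   # E[y | X, d = 1]
    m0 = LassoCV(cv=5).fit(X[d == 0], y[d == 0]).predict(X)   # E[y | X, d = 0]
    ps = LogisticRegressionCV(cv=5, penalty="l1", solver="saga",
                              max_iter=5000).fit(X, d).predict_proba(X)[:, 1]
    # augmented inverse-propensity (orthogonal) score for the ATE
    psi = m1 - m0 + d * (y - m1) / ps - (1 - d) * (y - m0) / (1 - ps)
    return psi.mean(), psi.std(ddof=1) / np.sqrt(len(y))       # ATE estimate and s.e.
```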

Uniform post-selection inference for least absolute deviation regression and other Z-estimation problems. A. Belloni (Fuqua School of Business, Duke University, Durham, North Carolina, U.S.A., abn5@duke.edu), V. Chernozhukov (Department of Economics, Massachusetts Institute of Technology, Cambridge, U.S.A., vchern@mit.edu), K. Kato (University of Tokyo)

10.1093/biomet/asu056 article EN Biometrika 2014-12-24

Alexandre Belloni (The Fuqua School of Business, Duke University, Durham, NC 27708), Victor Chernozhukov (Department of Economics, Massachusetts Institute of Technology, Cambridge, MA 02139) & Ying Wei (Department of Biostatistics, Columbia University, New York, NY 10032)

10.1080/07350015.2016.1166116 article EN Journal of Business and Economic Statistics 2016-03-22

We consider estimation and inference in panel data models with additive unobserved individual-specific heterogeneity in a high-dimensional setting. The setting allows the number of time-varying regressors to be larger than the sample size. To make informative inference feasible, we require that the overall contribution of the time-varying variables, after eliminating the individual-specific heterogeneity, can be captured by a relatively small number of the available variables whose identities are unknown. This restriction allows the problem to proceed as a variable selection problem. Importantly, we treat the fixed...

10.1080/07350015.2015.1102733 article EN Journal of Business and Economic Statistics 2015-11-18
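One way to read the panel setting is: remove the additive individual effects with a within (demeaning) transformation and then run a selection method such as lasso on the transformed data. The sketch below follows that reading under a cross-validated penalty; it is a simplification of the paper's procedure, and both helper names are hypothetical.

```python
# Within-transformation followed by lasso selection (simplified panel sketch).
import numpy as np
from sklearn.linear_model import LassoCV

def within_transform(A, ids):
    """Subtract unit-specific means from each row of A (A: (n, k), ids: (n,) unit labels)."""
    out = A.astype(float).copy()
    for i in np.unique(ids):
        mask = ids == i
        out[mask] -= out[mask].mean(axis=0)
    return out

def panel_lasso(y, X, ids):
    y_til = within_transform(y.reshape(-1, 1), ids).ravel()   # demeaned outcome
    X_til = within_transform(X, ids)                          # demeaned regressors
    return LassoCV(cv=5, fit_intercept=False).fit(X_til, y_til)
```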

We propose a self-tuning $\sqrt{\mathrm{Lasso}}$ method that simultaneously resolves three important practical problems in high-dimensional regression analysis: namely, it handles the unknown scale, heteroscedasticity, and (drastic) non-Gaussianity of the noise. In addition, our analysis allows for badly behaved designs, for example, perfectly collinear regressors, and generates sharp bounds even in extreme cases, such as the infinite-variance case and the noiseless case, in contrast to Lasso. We establish various...

10.1214/14-aos1204 article EN The Annals of Statistics 2014-04-01

We take advantage of recent advances in optimization methods and computer hardware to identify globally optimal solutions of product line design problems that are too large for complete enumeration. We then use this guarantee of global optimality to benchmark the performance of more practical heuristic methods. We use two sources of data: (1) a conjoint study previously conducted for a real product line design problem, and (2) simulated problems of various sizes. For both data sources, several heuristic methods consistently find optimal or near-optimal solutions, including...

10.1287/mnsc.1080.0864 article EN Management Science 2008-07-02

We propose a pivotal method for estimating high-dimensional sparse linear regression models, where the overall number of regressors p is large, possibly much larger than n, but only s regressors are significant. The method is a modification of the lasso, called the square-root lasso. It is pivotal in that it neither relies on knowledge of the standard deviation σ nor does it need to pre-estimate σ. Moreover, it does not rely on normality or sub-Gaussianity of the noise. It achieves near-oracle performance, attaining the convergence rate σ{(s/n) log p}^{1/2} in the prediction...

10.2139/ssrn.1910753 article EN SSRN Electronic Journal 2011-01-01

We develop results for the use of LASSO and Post-LASSO methods to form first-stage predictions and estimate optimal instruments in linear instrumental variables (IV) models with many instruments, p, that apply even when p is much larger than the sample size, n. We rigorously develop asymptotic distribution and inference theory for the resulting IV estimators and provide conditions under which these estimators are asymptotically oracle-efficient. In simulation experiments, a LASSO-based IV estimator with a data-driven penalty performs well...

10.2139/ssrn.1910169 article EN SSRN Electronic Journal 2011-01-01
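The first-stage idea can be sketched directly: predict the endogenous regressor from the many instruments with lasso, then use the fitted values as the instrument in the second stage. The code below is an illustrative simplification (cross-validated penalty, no standard errors), with lasso_iv a hypothetical name.

```python
# Lasso-based optimal-instrument sketch: first-stage lasso prediction, then IV second stage.
import numpy as np
from sklearn.linear_model import LassoCV

def lasso_iv(y, d, Z):
    """y: outcome (n,), d: endogenous regressor (n,), Z: many instruments (n, p)."""
    d_hat = LassoCV(cv=5).fit(Z, d).predict(Z)        # first-stage prediction of d
    d_hat_c = d_hat - d_hat.mean()
    # IV estimate with a constant, using the fitted first stage as the instrument
    return (d_hat_c @ (y - y.mean())) / (d_hat_c @ (d - d.mean()))
```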

In this paper, we develop procedures to construct simultaneous confidence bands for $\tilde{p}$ potentially infinite-dimensional parameters after model selection in general moment condition models, where $\tilde{p}$ is much larger than the sample size of the available data, $n$. This allows us to cover settings with functional response data, where each parameter is a function. The procedure is based on the construction of score functions that satisfy Neyman orthogonality, at least approximately. The proposed bands rely on uniform central limit theorems...

10.1214/17-aos1671 article EN The Annals of Statistics 2018-09-11
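Simultaneous bands in this line of work are typically built from a multiplier bootstrap applied to (approximately) Neyman-orthogonal scores. A stripped-down sketch of the critical-value computation, assuming the score matrix psi has already been estimated, is given below; the function name and the Gaussian multipliers are illustrative choices.

```python
# Multiplier-bootstrap critical value for simultaneous bands over many parameters.
import numpy as np

def multiplier_critical_value(psi, alpha=0.05, B=2000, seed=0):
    """psi: (n, p_tilde) matrix of estimated score evaluations, one column per parameter."""
    rng = np.random.default_rng(seed)
    n, _ = psi.shape
    psi_c = psi - psi.mean(axis=0)                    # center the scores
    sd = psi_c.std(axis=0, ddof=1)
    sups = np.empty(B)
    for b in range(B):
        xi = rng.standard_normal(n)                   # Gaussian multipliers
        sups[b] = np.max(np.abs(xi @ psi_c) / (np.sqrt(n) * sd))
    return np.quantile(sups, 1 - alpha)               # simultaneous critical value
```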

Summary: We consider the linear regression model with observation error in the design. In this setting, we allow the number of covariates to be much larger than the sample size. Several new estimation methods have recently been introduced for this model. Indeed, the standard lasso estimator or Dantzig selector turns out to become unreliable when only noisy regressors are available, which is quite common in practice. In this work, we propose and analyse a new errors-in-variables estimator. Under suitable sparsity assumptions, we show that it attains...

10.1111/rssb.12196 article EN Journal of the Royal Statistical Society Series B (Statistical Methodology) 2016-07-02

We consider median regression and, more generally, quantile regression in high-dimensional sparse models. In these models the overall number of regressors p is very large, possibly larger than the sample size n, but only s of them have a non-zero impact on the conditional quantile of the response variable, where s grows slower than n. Since in this case ordinary quantile regression is not consistent, we consider quantile regression penalized by the 1-norm of the coefficients (L1-QR). First, we show that L1-QR converges, up to a logarithmic factor, at the oracle rate which is achievable when the minimal true model is known. The overall number of regressors affects...

10.2139/ssrn.1394734 article EN SSRN Electronic Journal 2009-01-01

In this note, we propose the use of sparse methods (e.g., LASSO, Post-LASSO, √LASSO and Post-√LASSO) to form first-stage predictions and estimate optimal instruments in linear instrumental variables (IV) models with many instruments in the canonical Gaussian case. The methods apply even when the number of instruments is much larger than the sample size. We derive asymptotic distributions for the resulting IV estimators and provide conditions under which these sparsity-based estimators are asymptotically oracle-efficient. In simulation experiments, an estimator with a data-driven...

10.2139/ssrn.1908409 article EN SSRN Electronic Journal 2011-01-01

Quantile regression (QR) is a principal method for analyzing the impact of covariates on outcomes. The impact is described by the conditional quantile function and its functionals. In this paper we develop a nonparametric QR-series framework, covering many regressors as a special case, for performing inference on the entire conditional quantile function and its linear functionals. We approximate the conditional quantile function by a linear combination of series terms with quantile-specific coefficients and estimate the function-valued coefficients from the data. We develop large sample theory for the coefficient process; namely, we obtain uniform strong...

10.1920/wp.cem.2016.4616 preprint EN 2016-08-30
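A toy version of the QR-series idea for a scalar covariate: expand x in a polynomial series and fit one quantile regression per quantile, giving quantile-specific series coefficients. The polynomial basis, degree, and helper name qr_series are illustrative assumptions; the paper's framework covers general series bases and uniform inference over quantiles.

```python
# QR-series toy example: polynomial series with quantile-specific coefficients.
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import QuantileRegressor

def qr_series(x, y, taus=(0.25, 0.5, 0.75), degree=4):
    """x: scalar covariate (n,), y: outcome (n,). Returns one fitted QR per quantile."""
    Z = PolynomialFeatures(degree=degree, include_bias=False).fit_transform(x.reshape(-1, 1))
    return {tau: QuantileRegressor(quantile=tau, alpha=0.0, solver="highs").fit(Z, y)
            for tau in taus}
```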