STATISTICS LAB FOR CAUSAL & ROBUST MACHINE LEARNING

Journal Articles


29. Detangling robustness in high-dimensions: composite vs model-averaged estimation |  2019+
Showed that composite estimation has benefits over model-averaged
estimation in high dimensions under sparsity

(with Jing Zhou and Gerda Claeskens) 

28. Sparsity double robust inference of average treatment effects |  2019+
Proposed a new balancing approach via moment-targeting for average
treatment effect estimation and inference

(with Stefan Wager and Yinchu Zhu) 

27. Causal quantile learner: robust structural equations |  2019+
Proposed quantile invariance for learning cause-effect relationships among many
variables with interventional and observational data

(with Denise Rava) 

26. Synthetic learner: model-free inference on treatments over time |  2019+
Inference for counterfactuals using machine learning
where the true model may not be captured in the dictionary

(with Davide Viviano) 

25. Estimating treatment effects under additive hazards models with high-dimensional covariates |  JASA:T&M, revision
New score developed for confidence intervals of treatment effects
in the presence of censored outcomes

(with Jue Hou and Ronghui Xu) 

24. Censored quantile regression forest |  AIStats, to appear
Censored quantile random forests estimate
quantiles of censored responses in a regression setting, nonparametrically

(with Hanbo Li)

23. High-dimensional semi-supervised learning: in search of optimal inference of the mean  |  Biometrika, major revision
Semi-supervised inference for the response mean when covariates
are high-dimensional and the model is not necessarily correctly specified

(with Yuqian Zhang) 

22. Tuning-free robust and efficient approach to high-dimensional regression |  JASA:T&M, major revision
Variable selection without tuning that is both robust and efficient
(with Lan Wang, Runze Li, Bo Peng and Yunan Wu)

21. Confidence intervals for high-dimensional Cox models |  Statistica Sinica, to appear
Assumption-lean asymptotic theory for inference in the Cox model
(with Yi Yu and Richard J. Samworth)

20. Minimax rates and adaptivity of tests in high-dimensional linear models with non-sparse structures |  AOS, to appear
New minimax optimality results regarding confidence intervals -- no sparsity restrictions needed 
(with Jianqing Fan and Yinchu Zhu)


19. Asymptotic theory of rank estimation for high-dimensional accelerated failure time models |  AOS, revision
Finite-sample estimation properties of a new regularizer for AFT models
(with Lan Wang)


18. High-dimensional classification with errors in variables using high-confidence sets |  2018+
Classifiers for noisy data with possibly non-sparse classification boundaries
(with Emre Barut, Jianqing Fan and Jiancheng Jiang)


17. Inference under Fine-Gray competing risks model with high-dimensional covariates |  EJS, 13(2), 4449-4507 (2019)
Regularization and testing for the Fine-Gray model with many more parameters than samples
(with Jue Hou and Ronghui Xu)


16. Testing fixed effects in high-dimensional misspecified linear mixed models  | JASA:T&M, to appear
Estimators and tests adaptive to misspecification of random effects
(with Gerda Claeskens and Thomas Gueuning)


15. Breaking the curse of dimensionality in high-dimensions  |  JMLR, revision
Tests of multivariate parameters in a high-dimensional setting that do not rely
on sparsity of the underlying linear model
(with Yinchu Zhu)


14. A projection pursuit framework for testing general high-dimensional hypothesis  |  2017+
New projection estimator and test statistic based on the contrast of two competing estimators 
(with Yinchu Zhu)


13. Two-sample testing in high-dimensional and dense models   |  2016+
New algorithm, TIERS, for distinguishing between coefficients of two
high-dimensional regressions
(with Yinchu Zhu)


12. Uniform inference for high-dimensional quantile process:  linear testing and regression rank scores   |  AOS, revision
Uniform Bahadur representation, dual problems and uniform multivariate tests
(with Mladen Kolar)


11. Generalized M-estimators for high-dimensional Tobit I models |  EJS, 13(1), 582-645 (2019)
Mallows', Hill-Ryan's and Schweppe's one-step estimators for
left (fixed) censored data: the Tobit I model and its variants

(with Jiaqi Guo)

10. Linear hypothesis testing in dense high-dimensional linear models   | JASA:T&M, 113(524), 1583-1600, (2018)
New restructured regression method for testing hypotheses with
dense parameters and dense loadings in high dimensions
(with Yinchu Zhu)


9. Significance testing in non-sparse high-dimensional linear models   |  EJS, 12(2), 3312-3364 (2018)
New method, CorrT, that preserves Type I and Type II error control even in
dense (non-sparse) and ultra high-dimensional models

(with Yinchu Zhu)

8. Boosting in the presence of outliers: adaptive classification |  JASA:T&M, 113(512), 660-674 (2018)
New boosting method, ArchBoost, that is robust to data or label perturbations
-- adversarial or not
(with Alexander Hanbo Li)

7. Comment on "High dimensional simultaneous inference via bootstrap" by R. Dezeure, P. Buhlmann and C-H. Zhang |  TEST, 26(4), 720-728 (2017)
Discussed residual bootstrap efficiency and proposed a new residual bootstrap
for mixtures of sparse and dense models
(with Yinchu Zhu)


6. Robustness in sparse high-dimensional models: relative efficiency based on approximate message passing |  EJS, 10(2), 3894-3944 (2016)
A new AMP algorithm, RAMP, is proposed and shown to be
efficient regardless of the error distribution

5. Randomized maximum contrast selection: subagging for large-scale regression |  EJS, 10(1), 121-170, (2016)
Model selection for big data: naive selection of variables fails
whereas maximum contrast selection succeeds 

4. Cultivating disaster donors using data analytics |  Mgmt. Science, 62(3), 849-866, (2016)
Importance sampling and logistic regression in
a divide-and-conquer setting
(with Ilya Ryzhov and Bin Han)

3. Structured estimation in non-parametric Cox model | EJS, 9(1), 492-534, (2015)
Estimation in misspecified high-dimensional Cox model
(with Rui Song)


2. Regularization for Cox's proportional hazards model with NP-dimensionality |  AOS, 39(6), 3092-3120 (2011)
Lasso and SCAD model selection properties for ultra high-dimensional data
(with Jianqing Fan and Jiancheng Jiang)

1. Composite quasi-likelihood for high-dimensional variable selection | JRSSB, 73(3), 325-349, (2011)
Model selection robust and adaptive to the error distribution
(with Jianqing Fan and Weiwei Wang)
