Reinhard Heckel is a Rudolf Mößbauer assistant professor in the Department of Electrical and Computer Engineering at the Technical University of Munich. Before that, he was an assistant professor in the Department of Electrical and Computer Engineering at Rice University. Before that, he spent one and a half years as a postdoctoral researcher in the Department of Electrical Engineering and Computer Sciences at the University of California, Berkeley, and a year in the Cognitive Computing & Computational Sciences Department at IBM Research Zurich, where he co-designed a recommender system currently in use. He completed his PhD in 2014 at ETH Zurich and was a visiting PhD student in the Statistics Department at Stanford University. Reinhard Heckel works in machine learning; his current research focuses on active learning, deep generative models for learning and solving inverse problems, and DNA data storage.
2018, NSF IIS Small “Actively learning from the crowd”
2015, ETH Zurich medal for outstanding Ph.D. thesis
2015, IBM patent application invention achievement award
2014, Early Postdoc.Mobility fellowship from the Swiss National Science Foundation
2012, Best student paper award at the International Workshop on Computational Systems Biology
My research lies at the intersection of machine learning, statistics, and signal processing. Currently I am particularly interested in learning from few and noisy examples, fundamentals of deep learning, solving inverse problems with deep learning, and DNA data storage.
R. Heckel, N. B. Shah, K. Ramchandran, and M. J. Wainwright: Active ranking from pairwise comparisons and when parametric assumptions don’t help, Annals of Statistics, 2018.
R. Heckel and P. Hand: Deep decoder: Concise image representations from untrained non-convolutional networks, ICLR 2019 (International Conference on Learning Representations).
R. Heckel, M. Simchowitz, K. Ramchandran, and M. J. Wainwright: Approximate ranking from pairwise comparisons, AISTATS 2018 (International Conference on Artificial Intelligence and Statistics).
R. Heckel and K. Ramchandran: The sample complexity of online one-class collaborative filtering, ICML 2017 (International Conference on Machine Learning).
R. N. Grass, R. Heckel, M. Puddu, D. Paunescu, and W. J. Stark: Robust chemical preservation of digital information on DNA in silica with error-correcting codes, Angewandte Chemie International Edition, 2015.
R. Heckel and M. Soltanolkotabi: Generalized line spectral estimation via convex optimization, IEEE Transactions on Information Theory, 2018.
R. Heckel, M. Tschannen, and H. Bölcskei: Dimensionality-reduced subspace clustering, Information and Inference: A Journal of the IMA, 2017.
R. Heckel and H. Bölcskei: Robust subspace clustering via thresholding, IEEE Transactions on Information Theory.
M. Zalbagi Darestani, A. S. Chaudhari, and R. Heckel: Measuring robustness in deep learning based compressive sensing, 2021.
K. Donhauser, A. Ţifrea, M. Aerni, R. Heckel, and F. Yang: Interpolation can hurt robust generalization even when there is no noise, 2021.
Z. Fabian, R. Heckel, and M. Soltanolkotabi: Data augmentation for deep learning based accelerated MRI reconstruction with limited data, 2021.
W. Huang, P. Hand, R. Heckel, and V. Voroninski: A provably convergent scheme for compressive sensing under random generative priors, Journal of Fourier Analysis and Applications, 27(2), 2021.
I. Shomorony and R. Heckel: DNA-based storage: Models and fundamental limits, IEEE Transactions on Information Theory, 67(6), 3675-3689, 2021.
M. Zalbagi Darestani and R. Heckel: Accelerated MRI with un-trained neural networks, IEEE Transactions on Computational Imaging, 7, 724-733, 2021.
Z. Dai, A. Desai, R. Heckel, and A. Shrivastava: Active sampling count sketch (ASCS) for online sparse estimation of a trillion scale covariance matrix, 2020.
R. Heckel and F. F. Yilmaz: Early stopping in deep networks: Double descent and how to eliminate it, 2020.