Lennart Schneider
About
Since June 2021, I have been a PhD student at the Chair of Statistical Learning and Data Science. I hold an MSc in Statistics and an MSc in Psychology, having studied Statistics (LMU Munich) as well as Psychology and Mathematics (Eberhard Karls University of Tübingen).
Before starting my PhD, I worked as a student assistant on software development within the mlr3 ecosystem. I am also the author and maintainer of mlr3mbo, an R package for flexible Bayesian Optimization.
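To give a flavor of what mlr3mbo is for, here is a minimal sketch of a Bayesian Optimization loop on a toy objective. The objective function, search space, and budget are purely illustrative, defaults for the surrogate and acquisition function require additional packages to be installed, and exact class names may differ slightly across bbotk/mlr3mbo versions.

library(bbotk)     # optimization framework: objectives, instances, terminators
library(paradox)   # search space definitions
library(mlr3mbo)   # registers the "mbo" optimizer

# Toy single-objective function to minimize (illustrative only).
objective = ObjectiveRFun$new(
  fun = function(xs) list(y = (xs$x - 2)^2),
  domain = ps(x = p_dbl(lower = -5, upper = 5)),
  codomain = ps(y = p_dbl(tags = "minimize"))
)

# Optimization instance with a small evaluation budget.
instance = OptimInstanceSingleCrit$new(
  objective = objective,
  terminator = trm("evals", n_evals = 20)
)

# Run Bayesian Optimization with mlr3mbo's defaults
# (surrogate model and acquisition function are chosen automatically).
optimizer = opt("mbo")
optimizer$optimize(instance)

instance$result  # best configuration found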
My main research interests include Hyperparameter Optimization (HPO), Neural Architecture Search (NAS), Automated Machine Learning (AutoML), Black Box Optimization in general, and Benchmarking.
Contact
Institut für Statistik
Ludwig-Maximilians-Universität München
Ludwigstraße 33
D-80539 München
lennart.schneider [at] stat.uni-muenchen.de
Research Interests
- Hyperparameter Optimization
- Neural Architecture Search (NAS)
- Automated Machine Learning (AutoML)
- Black Box Optimization
- Multi-Objective and Quality Diversity Optimization
- Benchmarking
References
- Nagler T, Schneider L, Bischl B, Feurer M (2024) Reshuffling Resampling Splits Can Improve Generalization of Hyperparameter Optimization. arXiv:2405.15393 [stat.ML]. arXiv | PDF | Code.
- Hornung R, Nalenz M, Schneider L, Bender A, Bothmann L, Bischl B, Augustin T, Boulesteix A-L (2023) Evaluating Machine Learning Models in Non-Standard Settings: An Overview and New Findings. arXiv:2310.15108 [cs, stat]. link | pdf.
- Prager RP, Dietrich K, Schneider L, Schäpermeier L, Bischl B, Kerschke P, Trautmann H, Mersmann O (2023) Neural Networks as Black-Box Benchmark Functions Optimized for Exploratory Landscape Features. Proceedings of the 17th ACM/SIGEVO Conference on Foundations of Genetic Algorithms, pp. 129–139. link | pdf.
- Purucker L, Schneider L, Anastacio M, Beel J, Bischl B, Hoos H (2023) Q(D)O-ES: Population-based Quality (Diversity) Optimisation for Post Hoc Ensemble Selection in AutoML. AutoML Conference 2023. link | pdf.
- Schneider L, Bischl B, Thomas J (2023) Multi-Objective Optimization of Performance and Interpretability of Tabular Supervised Machine Learning Models. Proceedings of the Genetic and Evolutionary Computation Conference, pp. 538–547. link | pdf.
- Karl F, Pielok T, Moosbauer J, Pfisterer F, Coors S, Binder M, Schneider L, Thomas J, Richter J, Lang M, Garrido-Merchán EC, Branke J, Bischl B (2023) Multi-Objective Hyperparameter Optimization in Machine Learning – An Overview. ACM Transactions on Evolutionary Learning and Optimization 3, 1–50.
- Moosbauer J, Binder M, Schneider L, Pfisterer F, Becker M, Lang M, Kotthoff L, Bischl B (2022) Automated Benchmark-Driven Design and Explanation of Hyperparameter Optimizers. IEEE Transactions on Evolutionary Computation 26, 1336–1350. link | pdf.
- Pfisterer F, Schneider L, Moosbauer J, Binder M, Bischl B (2022) Yahpo Gym – An Efficient Multi-Objective Multi-Fidelity Benchmark for Hyperparameter Optimization. International Conference on Automated Machine Learning, pp. 3–1. PMLR. link | pdf.
- Schneider L, Pfisterer F, Kent P, Branke J, Bischl B, Thomas J (2022) Tackling Neural Architecture Search With Quality Diversity Optimization. International Conference on Automated Machine Learning, pp. 9–1. PMLR. link | pdf.
- Schneider L, Pfisterer F, Thomas J, Bischl B (2022) A Collection of Quality Diversity Optimization Problems Derived from Hyperparameter Optimization of Machine Learning Models. Proceedings of the Genetic and Evolutionary Computation Conference Companion, pp. 2136–2142. link | pdf.
- Schneider L, Schäpermeier L, Prager RP, Bischl B, Trautmann H, Kerschke P (2022) HPO X ELA: Investigating Hyperparameter Optimization Landscapes by Means of Exploratory Landscape Analysis. Parallel Problem Solving from Nature – PPSN XVII, pp. 575–589. link | pdf.
- Binder M, Pfisterer F, Lang M, Schneider L, Kotthoff L, Bischl B (2021) mlr3pipelines - Flexible Machine Learning Pipelines in R. Journal of Machine Learning Research 22, 1–7. link | pdf.
- Schneider L, Pfisterer F, Binder M, Bischl B (2021) Mutation is All You Need. 8th ICML Workshop on Automated Machine Learning. pdf.