Janek Thomas


I am the leader of the Fraunhofer IIS group AutoML & XAI, funded by the Ada Lovelace Center. The group is closely connected to the Working Group Computational Statistics as well as the Chair of Database Systems and Data Mining. Group members conduct part of their PhD research in collaboration with the working group.

I finished my PhD at the Working Group Computational Statistics in April 2019, focusing on Automated Machine Learning and Gradient Boosting. During my PhD, I completed research internships at the Microsoft Cloud and Information Services Lab and at H2O.ai.


Fraunhofer Institute for Integrated Circuits (IIS)

Hansastraße 32

D-80686 München

Room 04.24

+49 160 94493668

janek.thomas [at] iis.fraunhofer.de

(old) janek.thomas [at] stat.uni-muenchen.de




Publications
  1. Goschenhofer J, Pfister FMJ, Yuksel KA, Bischl B, Fietzek U, Thomas J (2019) Wearable-based Parkinson’s Disease Severity Monitoring using Deep Learning. In: Joint European Conference on Machine Learning and Knowledge Discovery in Databases.
  2. Pfister FMJ, Schumann A von, Bemetz J et al. (2019) Recognition of subjects with early-stage Parkinson from free-living unilateral wrist-sensor data using a hierarchical machine learning model. Journal of Neural Transmission, p. 663. Springer, Wien.
  3. Thomas J (2019) Gradient boosting in automatic machine learning: feature selection and hyperparameter optimization. PhD thesis, LMU München.
  4. Gijsbers P, LeDell E, Thomas J, Poirier S, Bischl B, Vanschoren J (2019) An Open Source AutoML Benchmark. In: ICML AutoML Workshop.
  5. Pfisterer F, Coors S, Thomas J, Bischl B (2019) Multi-Objective Automatic Machine Learning with AutoxgboostMC. In: ECML PKDD 2019 Workshop on Automating Data Science (ADS 2019).
  6. Pfisterer F, Thomas J, Bischl B (2019) Towards Human Centered AutoML. arXiv preprint arXiv:1911.02391.
  7. Binder M, Moosbauer J, Thomas J, Bischl B (2019) Model-Agnostic Approaches to Multi-Objective Simultaneous Hyperparameter Tuning and Feature Selection. arXiv preprint arXiv:1912.12912.
  8. Thomas J, Mayr A, Bischl B, Schmid M, Smith A, Hofner B (2018) Gradient boosting for distributional regression: faster tuning and improved variable selection via noncyclical updates. Statistics and Computing 28, 673–687.
  9. Kühn D, Probst P, Thomas J, Bischl B (2018) Automatic Exploration of Machine Learning Experiments on OpenML. arXiv preprint arXiv:1806.10961.
  10. Thomas J, Coors S, Bischl B (2018) Automatic Gradient Boosting. In: ICML AutoML Workshop.
  11. Schalk D, Thomas J, Bischl B (2018) compboost: Modular Framework for Component-Wise Boosting. Journal of Open Source Software 3, 967.
  12. Rijn JN van, Pfisterer F, Thomas J, Müller A, Bischl B, Vanschoren J (2018) Meta Learning for Defaults: Symbolic Defaults. In: Neural Information Processing Systems (NeurIPS) Workshop on Meta-Learning.
  13. Thomas J, Hepp T, Mayr A, Bischl B (2017) Probing for sparse and fast variable selection with model-based boosting. Computational and Mathematical Methods in Medicine 2017, 8 pages.
  14. Bischl B, Richter J, Bossek J, Horn D, Thomas J, Lang M (2017) mlrMBO: A modular framework for model-based optimization of expensive black-box functions. arXiv preprint arXiv:1703.03373.
  15. Kotthaus H, Richter J, Lang A et al. (2017) RAMBO: Resource-aware model-based optimization with scheduling for heterogeneous runtimes and a comparison with asynchronous model-based optimization. In: International Conference on Learning and Intelligent Optimization, pp. 180–195. Springer, Cham.
  16. Schiffner J, Bischl B, Lang M et al. (2016) mlr Tutorial. arXiv preprint arXiv:1609.06146.
  17. Rietzler M, Geiselhart F, Thomas J, Rukzio E (2016) FusionKit: a generic toolkit for skeleton, marker and rigid-body tracking. In: Proceedings of the 8th ACM SIGCHI Symposium on Engineering Interactive Computing Systems, pp. 73–84.