My goal is to build machine learning models that approach the performance of biological brains in their flexibility to changes in tasks and environments. Drawing inspiration from the adaptation behavior of biological systems, I study methods for domain adaptation and self-supervised learning and build machine learning tools for robust scientific inference in neuroscience.
I am pursuing my doctoral studies at the International Max Planck Research School for Intelligent Systems and the Swiss Federal Institute of Technology Lausanne (EPFL), advised by Matthias Bethge and Mackenzie Mathis in the ELLIS PhD & PostDoc program.
During my PhD, I also worked as a Research Scientist Intern on the FAIR team at Meta NYC, advised by Laurens van der Maaten and Ishan Misra, on multimodal representation learning, and as an Applied Science Intern at Amazon Web Services in Tübingen, where I worked on self-learning and object-centric representations with Matthias Bethge, Bernhard Schölkopf and Peter Gehler.
Prior to starting my PhD, I worked on wav2vec and vq-wav2vec, two self-supervised representation learning algorithms for speech processing with Michael Auli, Alexei Baevski and Ronan Collobert at Facebook AI Research in Menlo Park, CA.
Aside from my research, I’m a strong supporter of exposing children to modern computer science topics early in their school education. That’s why I co-founded and advised IT4Kids, which teaches CS in elementary school, and KI macht Schule, which teaches AI and machine learning fundamentals in high school, and helped organize the German National Competition in AI for high school students. If you want to join our team at KI macht Schule and bring AI education to every school in Germany, Austria and Switzerland, don’t hesitate to reach out!
Likewise, if you are a student looking for an internship or a Bachelor’s or Master’s thesis, have a look at my past work and current student projects, and ping me if you’re interested in working with me.
Visiting PhD Student (ELLIS)
Swiss Federal Institute of Technology Lausanne (EPFL)
2021 - now
PhD Candidate, Machine Learning
Intl. Max Planck Research School, Tübingen
2019 - now
Research Scientist Intern
FAIR at Meta, New York City
Applied Science Intern
Amazon Web Services, Tübingen
AI Resident, Self-Supervised Learning for Speech Recognition
Facebook AI Research, Menlo Park, CA
2018 - 2019
MSc in Neuroengineering
Technical University of Munich
2016 - 2018
BSc in Electrical Engineering, Information Technology and Computer Engineering
RWTH Aachen University
2013 - 2016
Check out robusta, our PyTorch library for robustness & adaptation, on GitHub. Beyond ImageNet, we demonstrate improved out-of-distribution (OOD) performance of pre-trained pose estimation networks and discuss principles, pitfalls and perspectives of pose estimation algorithms in our recent Neuron primer.
robusta software package on GitHub.
My full publication list is available on Google Scholar.
* denotes co-first authorship.
I am always looking for motivated students interested in joining me in the Bethge and/or Mathis lab. If you’re interested in working with me on topics around robustness, domain adaptation, reinforcement learning and self-supervised learning at the intersection of neuroscience and machine learning, please contact me at email@example.com.
I am currently working with:
In the past, I also worked with: