My goal is to build machine learning models that approach the performance of biological brains in their flexibility to changes in tasks and environments. Drawing inspiration from the adaptation behavior of biological systems, I study methods for domain adaptation and self-supervised learning, and I build machine learning tools for robust scientific inference in neuroscience.

I pursue my doctoral studies at the International Max Planck Research School for Intelligent Systems and the Swiss Federal Institute of Technology Lausanne (EPFL), advised by Matthias Bethge and Mackenzie Mathis in the ELLIS PhD & PostDoc program.

During my PhD, I also worked as a Research Scientist Intern in the FAIR team at Meta in New York City, advised by Laurens van der Maaten and Ishan Misra, on multimodal representation learning, and as an Applied Science Intern at Amazon Web Services in Tübingen, where I worked on self-learning and object-centric representations with Matthias Bethge, Bernhard Schölkopf and Peter Gehler.

Prior to starting my PhD, I worked on wav2vec and vq-wav2vec, two self-supervised representation learning algorithms for speech processing, with Michael Auli, Alexei Baevski and Ronan Collobert at Facebook AI Research in Menlo Park, CA.

Aside from my research, I’m a strong supporter of exposing children to modern computer science topics early in their school education. That’s why I co-founded and advised IT4Kids, which teaches CS in elementary school, and KI macht Schule, which teaches AI and machine learning fundamentals in high school, and I helped organize the German National Competition in AI for high school students. If you want to join our team at KI macht Schule and bring AI education to every school in Germany, Austria and Switzerland, don’t hesitate to reach out!

Likewise, if you are a student looking for an internship or a Bachelor’s or Master’s thesis, have a look at my past work and current student projects, and ping me if you’re interested in working with me.

Interests

  • Self-Supervised Learning
  • Sensorimotor Adaptation
  • Domain Adaptation
  • Computational Neuroscience

Education & Research

  • Visiting PhD Student (ELLIS)

    Swiss Federal Institute of Technology Lausanne (EPFL)

    2021 - now

  • PhD Candidate, Machine Learning

    Intl. Max Planck Research School, Tübingen

    2019 - now

  • Research Scientist Intern

    FAIR at Meta, New York City

    Spring 2022

  • Applied Science Intern

    Amazon Web Services, Tübingen

    Fall 2020

  • AI Resident, Self-Supervised Learning for Speech Recognition

    Facebook AI Research, Menlo Park, CA

    2018 - 2019

  • MSc in Neuroengineering

    Technical University of Munich

    2016 - 2018

  • BSc in Electrical Engineering, Information Technology and Computer Engineering

    RWTH Aachen University

    2013 - 2016

News

Latest Research

  • Machine Learning for Neuroscience: Growing datasets in neuroscience require tools for jointly analyzing high-dimensional behavioral and neural recordings. We built CEBRA, a contrastive learning framework for estimating Consistent EmBeddings of high-dimensional Recordings using Auxiliary variables. Stay tuned for the code release.
  • Robustness & Adaptation in Computer Vision: We demonstrated that robustness estimates of ImageNet-scale architectures can be drastically improved by adapting models with simple domain adaptation methods. Batch norm adaptation yields consistent 5-15% points gains across various model architectures. Self-learning can further improve scores, even for models pre-trained on large amounts of data. We obtain scores as low as 22% mCE on ImageNet-C, 17.4% top-1 error on ImageNet-R and 14.8% top-1 error on ImageNet-A. Check out robusta, our PyTorch library for robustness & adaptation on Github. Beyond ImageNet, we demonstrate improved ood. performance of pre-trained pose estimation networks and discuss principles, pitfalls and perspectives of pose estimation algorithms in our recent Neuron primer.
  • Self-Supervised Learning: wav2vec and vq-wav2vec demonstrated the effectiveness of contrastive learning for reducing the need for labeled data in speech recognition models, and contrastive learning has since driven substantial progress in other fields of machine learning. We worked on understanding the effectiveness of contrastive pre-training and found that contrastive learning can invert the data-generating process (the shared objective is sketched below).
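
To make the batch norm adaptation idea above concrete, here is a minimal PyTorch sketch, assuming a standard torchvision ResNet-50 and a stand-in test loader. This is not the robusta API; it only illustrates the principle of discarding the batch norm statistics estimated on clean training data and re-estimating them on shifted test data.

```python
import torch
from torch.utils.data import DataLoader, TensorDataset
import torchvision.models as models

model = models.resnet50(pretrained=True)

# Discard the batch norm statistics estimated on clean training data.
# With momentum=None, BatchNorm accumulates a cumulative moving average
# over all batches it sees next, instead of an exponential average.
for module in model.modules():
    if isinstance(module, torch.nn.BatchNorm2d):
        module.reset_running_stats()
        module.momentum = None

# Stand-in for a loader over shifted/corrupted test data.
test_images = torch.randn(64, 3, 224, 224)
test_loader = DataLoader(TensorDataset(test_images, torch.zeros(64)), batch_size=32)

# In train mode, batch norm layers update their running statistics;
# only forward passes are needed, no gradient updates.
model.train()
with torch.no_grad():
    for images, _ in test_loader:
        model(images)

model.eval()  # evaluate with the adapted statistics
```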
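
The contrastive objective common to these methods is the InfoNCE loss. Below is a minimal sketch, assuming already-encoded anchor, positive and negative samples; names, shapes and the temperature value are illustrative, not the actual wav2vec or CEBRA implementation.

```python
import torch
import torch.nn.functional as F

def info_nce(anchor, positive, negatives, temperature=0.1):
    """InfoNCE: classify each anchor's positive against k negatives.

    anchor, positive: (batch, dim); negatives: (batch, k, dim).
    """
    anchor = F.normalize(anchor, dim=-1)
    positive = F.normalize(positive, dim=-1)
    negatives = F.normalize(negatives, dim=-1)

    # Cosine similarity to the positive: (batch, 1)
    pos_sim = (anchor * positive).sum(dim=-1, keepdim=True)
    # Cosine similarity to each negative: (batch, k)
    neg_sim = torch.einsum('bd,bkd->bk', anchor, negatives)

    logits = torch.cat([pos_sim, neg_sim], dim=1) / temperature
    labels = torch.zeros(logits.size(0), dtype=torch.long)  # positive at index 0
    return F.cross_entropy(logits, labels)

# Toy usage with random embeddings (batch=8, dim=128, k=16 negatives):
loss = info_nce(torch.randn(8, 128), torch.randn(8, 128), torch.randn(8, 16, 128))
```

The difference between the methods lies in how positives are chosen: wav2vec draws them from future timesteps of the same utterance, while CEBRA-style methods select them via auxiliary variables such as behavior or time; the objective itself stays the same.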

Events

2022
  • October 2022: We updated the CEBRA pre-print with additional experiments and identifiability results. Make sure to watch the repository to stay up to date on the code release.
  • May 2022: I joined the FAIR team at Meta in New York as a research scientist intern! I will be working with Laurens van der Maaten and Ishan Misra on multimodal representation learning.
  • April 2022: We released the pre-print of CEBRA, our contrastive representation learning method for joint behavioral and neural data. Read more at cebra.ai!
2021
  • Jan 2021: I started as a visiting PhD student at Campus Biotech, EPFL, in Geneva.
2020
  • Sep 2020: I joined Amazon Web Services in Tübingen on September 1st as a full-time Applied Science Intern, advised by Matthias Bethge, Bernhard Schölkopf and Peter Gehler.
  • Aug 2020: Our team at KI macht Schule organized a four-day AI & ML bootcamp for students; learn more at KI-Camp.de.
  • Feb 2020: Together with fellow doctoral students of the Tübingen AI Competence Center, I am organizing a one-day doctoral symposium in February.
  • Feb 2020: I joined the Mouse Motor Lab led by Mackenzie Mathis as an ELLIS PhD student. I’ll be working at the Rowland Institute at Harvard in February and March.
  • Jan 2020: I co-organized a course on quantum machine learning with Luisa Eck (LMU Munich) and Lucas Stoffl (TU Munich) at the CdE winter school 2020 in Kaub and Oberwesel, Germany.

Student Projects

I am always looking for motivated students interested in joining me in the Bethge and/or Mathis lab. If you’re interested in working with me on topics around robustness, domain adaptation, reinforcement learning and self-supervised learning at the intersection of neuroscience and machine learning, please contact me at steffen@bethgelab.org.

I am currently working with:

  • Jin Hwa Lee (now PhD student at UCL) joined the Bethge & Mathis labs for a summer internship in August 2020, and continued working in the Mathis lab as a Master’s thesis student and research assistant on self-supervised representation learning for neural data analysis [Paper].
  • … a few incoming Master’s students I’ll announce here soon …

In the past, I also worked with:

  • Shubham Krishna (now DevOps/MLOps) started as a research assistant in the Bethge lab in April 2020. Shubham worked on topics around invariant representation learning and is currently doing a Master’s thesis with Bosch AI [Paper].
  • Mert Yüksekgönül (now PhD student at Stanford) worked on 3D lifting approaches for analyzing behavioral data.
  • Khushdeep Singh (now Research Engineer at INRIA) joined the Bethge lab as a Master’s thesis student in April 2020. Khushdeep worked on benchmarking and improving the robustness of reinforcement learning algorithms with self-supervised learning [Paper].
  • Xingying Chen was a summer intern in the Bethge & Mathis labs from August to November 2020, modeling adaptation paradigms in neuroscience using reinforcement learning.
  • Jan Hansen-Palmus from the Bringmann group at Uni Tübingen was co-advised by Evgenia Rusak and me. He wrote his Bachelor’s thesis on constrained optimization approaches for pseudo-labeling.

Projects

KI macht Schule

KI macht Schule provides classes in AI & Machine Learning for German high school students.

NeuBtracker

An Imaging Platform for Neurobehavioral Research

Biomodels Retreat

Establishing fruitful collaborations between biologists, computer scientists and mathematicians in a yearly one-week retreat.

MSNE Blog

M.Sc. Neuroengineering student blog with the latest information about our study program and events.

Ecurie Aix eace04/05

Contributions include the design of electronic control units.

Campus Weggemeinschaft

Homepage of the Campus Weggemeinschaft.

IT4Kids

IT4Kids provides computer science classes to elementary school pupils, offering software, teaching materials and easy communication …