My goal is to build machine learning models that approach the flexibility of biological brains in adapting to changes in tasks and environments. Drawing inspiration from the adaptation behavior of biological systems, I study methods for domain adaptation and self-supervised learning.

I pursue my doctoral studies at the International Max Planck Research School for Intelligent Systems and the Swiss Federal Institute of Technology Lausanne (EPFL), advised by Matthias Bethge and Mackenzie Mathis in the ELLIS PhD & PostDoc program.

I previously spent time at Amazon Web Services in Tübingen as an Applied Science Intern, where I worked on self-learning and object-centric representations with Matthias Bethge, Bernhard Schölkopf and Peter Gehler. Prior to starting my PhD, I worked on wav2vec, a self-supervised representation learning algorithm for speech processing, with Michael Auli, Alexei Baevski and Ronan Collobert at Facebook AI Research in Menlo Park, CA.

Aside from my research, I’m a strong supporter of exposing children to modern computer science topics early in their school education. That’s why I co-founded and advised IT4Kids, which teaches CS in elementary school, and KI macht Schule, which teaches AI and machine learning fundamentals in high school, and helped organize the German National Competition in AI for high school students. If you want to join our team at KI macht Schule and bring AI education to every school in Germany, don’t hesitate to reach out!

Likewise, if you are a student looking for an internship or a Bachelor’s or Master’s thesis, have a look at my past work and current student projects and ping me if you’re interested in working with me.

Interests

  • Self-Supervised Learning
  • Domain Adaptation
  • Sensorimotor Adaptation
  • Information Theory
  • Computational Neuroscience

Education & Research

  • Visiting PhD Student (ELLIS)

    Swiss Federal Institute of Technology Lausanne (EPFL)

    2021 - now

  • PhD Candidate, Machine Learning

International Max Planck Research School for Intelligent Systems, Tübingen

    2019 - now

  • Applied Science Intern

    Amazon Web Services, Tübingen

    Fall 2020

  • AI Resident, Self-Supervised Learning for Speech Recognition

    Facebook AI Research, Menlo Park, CA

    2018 - 2019

  • MSc in Neuroengineering

    Technical University of Munich

    2016 - 2018

  • BSc in Electrical Engineering, Information Technology and Computer Engineering

    RWTH Aachen University

    2013 - 2016

News

Latest Research

  • Robustness & Adaptation at ImageNet scale: We demonstrated that the robustness of ImageNet-scale architectures can be drastically improved by adapting models with simple domain adaptation methods. Batch norm adaptation yields consistent gains of 5-15 percentage points across various model architectures. Self-learning further improves scores, even for models pre-trained on large amounts of data. We obtain scores as low as 22% mCE on ImageNet-C, 17.4% top-1 error on ImageNet-R and 14.8% top-1 error on ImageNet-A. Check out robusta, our PyTorch library for robustness & adaptation, on GitHub: https://github.com/bethgelab/robustness.
  • Self-supervised Learning: wav2vec and vq-wav2vec demonstrated the effectiveness of contrastive learning for reducing the need for labeled data in speech recognition, and contrastive learning has since driven progress in many other areas of machine learning. We worked on understanding why contrastive pre-training is so effective and found that contrastive learning can invert the data-generating process.
  • Tools for neuroscience: Approaches for markerless animal tracking have become invaluable for behavioral recording in neuroscience research. We demonstrated improved out-of-distribution performance of pre-trained pose estimation networks and discuss principles, pitfalls and perspectives in our recent Neuron primer.

Events

  • Jan 2021: I started as a visiting PhD student at Campus Biotech, EPFL, in Geneva.
  • Sep 2020: I joined Amazon Web Services in Tübingen on September 1st as a full-time Applied Science intern, advised by Matthias Bethge, Bernhard Schölkopf and Peter Gehler.
  • Aug 2020: Our team at KI macht Schule organized a four day AI & ML bootcamp for students; learn more at KI-Camp.de.
  • Feb 2020: Together with fellow doctoral students of the Tübingen AI Competence Center, I am organizing a one-day doctoral symposium in February.
  • Feb 2020: I joined the Mouse Motor Lab led by Mackenzie Mathis as an ELLIS PhD student. I’ll be working at the Rowland Institute at Harvard in February and March.
  • Jan 2020: I co-organized a course on quantum machine learning with Luisa Eck (LMU Munich) and Lucas Stoffl (TU Munich) at the CdE winter school 2020 in Kaub and Oberwesel, Germany.

Student Projects

I am always looking for motivated students interested in joining me in the Bethge and/or Mathis lab. If you’re interested in working with me on topics around robustness, domain adaptation, reinforcement learning and self-supervised learning at the intersection of neuroscience and machine learning, please contact me at steffen@bethgelab.org.

I am currently working with:

  • Shubham Krishna started as a research assistant in the Bethge lab in April 2020. Shubham works on topics around invariant representation learning and is currently doing a Master’s thesis with Bosch AI. [Paper]
  • Jin Hwa Lee joined the Bethge & Mathis labs in August 2020. Jin is building self-supervised representation learning algorithms for analyzing neuroscience datasets. [Paper]

In the past, I also worked with:

  • Mert Yüksekgönül (now a PhD student at Stanford). Mert worked on 3D lifting approaches for analyzing behavioral data.
  • Khushdeep Singh (now Research Engineer at INRIA) joined the Bethge lab as a Master thesis student in April 2020. Khushdeep worked on benchmarking and improving the robustness of reinforcement learning algorithms with self-supervised learning. [Paper]
  • Xingying Chen was a summer intern in the Bethge & Mathis labs from August to November 2020, modeling adaptation paradigms in neuroscience using reinforcement learning.
  • Jan Hansen-Palmus from the Bringmann group at Uni Tübingen was co-advised by Evgenia Rusak and me. He wrote his Bachelor’s thesis on constrained optimization approaches for pseudo-labeling.

Publications

A full and up-to-date list is also available on Google Scholar.

Projects

KI macht Schule

KI macht Schule provides classes in AI & Machine Learning for German high school students

NeuBtracker

An Imaging Platform for Neurobehavioral Research

Biomodels Retreat

Establishing fruitful collaborations between biologists, computer scientists and mathematicians in a yearly one-week retreat.

MSNE Blog

M.Sc. Neuroengineering Student Blog with latest information about our study program and events.

Ecurie Aix eace04/05

Contributions include the design of electric control units

Campus Weggemeinschaft

Homepage of the Campus Weggemeinschaft

IT4Kids

IT4Kids brings computer science classes to elementary school pupils - providing software, teaching materials and easy communication …