Stefano Sarao Mannelli
Postdoctoral Positions in Theoretical Foundations of AI Safety at Chalmers University of Technology in Sweden
Degree Level
Postdoc
Field of study
Computer Science
Funding
Full funding available
Deadline
December 31, 2026
Country
Sweden
University
Chalmers University of Technology

About this position
This postdoctoral opportunity at Chalmers University of Technology in Gothenburg, Sweden, focuses on the theoretical foundations of AI safety and alignment. The position is part of a collaborative project with University College London (UCL) and is funded by OpenAI's Alignment Team through the UK AI Security Institute. The research group, led by Dr. Stefano Sarao Mannelli, operates within the Division of Data Science and AI at the Department of Computer Science and Engineering, a joint department of Chalmers and the University of Gothenburg.
The project, titled "Theoretical Model Organisms of Misalignment," aims to transform AI alignment research from reactive approaches to a predictive science. By leveraging tools from statistical physics and high-dimensional probability, the team seeks to develop tractable models that explain the emergence of misalignment in AI systems. The research will focus on understanding the learning dynamics that lead to harmful capabilities and deriving quantitative laws for alignment. The project benefits from close collaboration with UCL and engagement with industrial advisors from leading AI labs such as Anthropic, Meta, and Google DeepMind/Mila.
As a postdoc, you will investigate inductive bias, fine-tuning, and mitigation strategies in AI systems. Your work will involve both theoretical modelling and empirical validation, and you are expected to participate in conferences and regular meetings with the UCL team. The position includes a teaching component (20% of your time), which may involve lecturing, TA duties, or student supervision. The role is intended to strengthen your qualifications for a future career in academia, industry, or the public sector.
Applicants must have a doctoral degree in Physics, Mathematics, Computer Science, or Machine Learning by the time of employment. Essential qualifications include strong English communication skills, a solid background in mathematical modelling and analysis, and programming proficiency. Experience with statistical physics of disordered systems, control theory, high-dimensional probability, or theoretical frameworks such as teacher-student models and deep linear networks is advantageous. The doctoral degree should preferably have been obtained within the last three years. A valid residence permit is required at the start of employment.
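To give a flavour of the theoretical frameworks named above, here is a minimal, purely illustrative teacher-student sketch in Python (assuming NumPy); it is not taken from the project itself. A fixed linear "teacher" generates labels, and a linear "student" learns them by online SGD while we track the generalization error, which has a closed form for Gaussian inputs.

```python
import numpy as np

# Toy teacher-student setup (illustrative; not the project's actual model):
# a linear student learns a fixed linear teacher from fresh Gaussian inputs
# via online SGD, and we track the generalization error over training.

rng = np.random.default_rng(0)
d = 100                                    # input dimension
teacher = rng.normal(size=d) / np.sqrt(d)  # fixed teacher weights, ||teacher||^2 ~ 1
student = np.zeros(d)                      # student starts at zero
lr = 0.01                                  # learning rate (stable since lr * d < 2)

errors = []
for step in range(2000):
    x = rng.normal(size=d)                 # fresh Gaussian input each step
    y = teacher @ x                        # noise-free teacher label
    student += lr * (y - student @ x) * x  # online SGD on the squared loss
    # For Gaussian inputs the generalization error has a closed form:
    # E_x[(teacher.x - student.x)^2] = ||teacher - student||^2
    errors.append(float(np.sum((teacher - student) ** 2)))

print(f"error after 1 step: {errors[0]:.3f}, after 2000 steps: {errors[-1]:.2e}")
```

Tractability of this kind is exactly what makes teacher-student models attractive: the generalization error reduces to a deterministic quantity whose dynamics can be analysed in closed form in the high-dimensional limit.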
The position is a full-time, temporary contract for two years, with the possibility of a one-year extension. Funding is provided by OpenAI's Alignment Team, and the role includes all Chalmers employee benefits. Chalmers offers a dynamic and inclusive work environment, with support for international staff, including Swedish language courses and information on relocation benefits. The university is committed to gender equality and diversity through initiatives like the GENIE project.
To apply, submit your application in English via the provided online form, including a comprehensive CV, publication list, personal letter, and contact details for three references. The application deadline is April 1, 2026. For further information, contact Dr. Stefano Sarao Mannelli at [email protected].
Funding details
The position is fully funded by OpenAI's Alignment Team through the UK AI Security Institute. It is a salaried, full-time employment contract rather than a scholarship, and includes all standard Chalmers employee benefits.
How to apply
Please submit your application in English via the online portal before the deadline, including a comprehensive CV, a publication list, a personal letter, and contact details for three references.