AI Safety Training

A database of training programs, courses, conferences, and other events for AI existential safety. Book a free call with AI Safety Quest if you want to get into AI safety!

Open Applications

Programs Timeline

Exact dates may be inaccurate if a program was added before its dates were announced; refer to the program websites for reliable information.

Upcoming Table


Facilitated courses are usually heavily oversubscribed. However, materials are openly available and lots of other people want to learn, so you can form your own study groups! Pick your preferred course, then introduce yourself in #study-buddies on the AI Alignment Slack to make a group, or go to AI Safety Quest and form a Quest Party.

AGI Safety Fundamentals

8-week courses by BlueDot Impact covering the foundations of the field and ongoing research directions.

Intro to ML Safety

40 hours of recorded lectures, written assignments, coding assignments, and readings by the Center for AI Safety, used in the ML Safety Scholars program.

Alignment Forum Curated Sequences

Sequences of blog posts by researchers on the Alignment Forum covering diverse topics.

Arkose's Resources List

A curated and tested list of resources that Arkose sends to AI researchers; excellent for getting a grounding in the problem.

Reading What We Can

A collection of books and articles for a 20-day reading challenge.

CHAI Bibliography

Extensive annotated reading recommendations from the Center for Human-Compatible AI.


AI Safety Communities
A living document of online and offline communities.

AI Safety Info
An interactive, crowdsourced FAQ on AI safety.

Alignment Ecosystem Development
Volunteering opportunities for devs and organizers.

© AI Safety Support, released under CC-BY.