We find and support top talent working to address risks from advanced AI.
The largest research fellowship focused on risks from advanced AI.
A part-time, remote research fellowship matching aspiring AI safety researchers with experts in the field to work together on impactful projects.
Research conducted through SPAR has been accepted at ICML and NeurIPS, covered by TIME, and led to full-time job offers for mentees.
Our Fall 2025 round features 80 projects and 309 mentees across 55 countries.
Mentorship and financial support for university clubs focused on AI safety and policy.
A fellowship providing funding, mentorship, community, and other resources to organizers of AI safety student groups at universities around the world.
Pathfinder enables organizers to run ambitious programming, helping their members upskill for careers in the field.
Our Fall 2025 round features 65 Pathfinder Fellows at 51 universities across 11 countries.
A three-month residency building the infrastructure and capacity the AI safety ecosystem needs.
Open to highly agentic executors from any background (operations, engineering, policy, design, or writing), who ship real projects with mentorship and placement support.
Hosted in person at the Constellation office in Berkeley, June 15 – August 28, 2026, with a $6,000/month stipend; housing and travel are fully covered.
Run in partnership with Constellation. Applications are open through April 27, 2026 for our inaugural cohort of 15–30 residents.
Intensive three-day workshops on how to think critically about AI safety and biosecurity.
Residential workshops for 20–35 attendees featuring foundational lectures, guest speakers, and 1:1 career conversations with experts.
Open to applicants worldwide, with travel support available for those who need it.
Running again in June 2026, with meals and accommodation included.
We're a nonprofit focused on accelerating talent into the fields of AI safety and policy. We believe the development of transformative AI will be one of the most consequential events in history. Getting it right requires building a robust ecosystem of researchers, policymakers, technical professionals, and skilled operators who can navigate the complex challenges ahead.
We operate with urgency. If transformative AI arrives soon, we need to move fast to build this ecosystem. For us, that means staying lean, shipping quickly, and constantly reassessing whether our priorities are the right ones.
Our current strategy focuses on interventions that can scale flexibly in response to public interest, allowing the fields of AI safety and policy to rapidly absorb large amounts of talent. We execute on this strategy by facilitating scalable research mentorship through SPAR, supporting decentralized community building at universities through Pathfinder, building the ecosystem's operational capacity through the Generator Residency, and introducing students to AI safety and biosecurity through the Global Challenges Project.
Co-Director
Co-Director
Founding Generalist
Operations Associate
We're always looking for talented individuals who are passionate about AI safety. If you'd like to contribute to our mission, we'd love to hear from you.
Express your interest