Generator Residency

A three-month residency for AI safety generalists. Berkeley, Summer 2026. A Kairos × Constellation program. Apply by April 27.
3 months: June 15 – August 28, 2026
$6k/mo stipend, plus housing & travel
15–30 residents this summer

AI safety has a lot of people who are great at thinking about ideas. We need more people who are great at thinking about people and projects.

Generator is a 3-month residency where residents pitch, build, and ship projects that build capacity and infrastructure across the AI safety ecosystem, and then get support landing full-time roles at the orgs that need them.

Applications close April 27, 2026. Apply here.

Who Should Apply

Highly agentic, conscientious executors.

You identify gaps and fill them without being asked. You're mission-driven, comfortable with ambiguity, and willing to do whichever job the project needs this week.

We welcome any background: technical, operations, policy, engineering, design, writing. What matters is your track record of execution and genuine commitment to making AI go well for humanity. That might look like running an AI safety group, organizing a large event, shipping a novel project, or starting an initiative from scratch.

Advisors

Residents are mentored by experienced generalists across core AI safety organizations.

Alexandra Bates
Program Manager, Constellation
Aryan Bhatt
Senior Member of Technical Staff, Redwood
Agus Covarrubias
Co-Director, Kairos
Lauren Mangla
COO, AI Futures Project
Henry Sleight
Astra Lead, Constellation
Neav Topaz
Co-Director, Kairos
Sydney Von Arx
Member of Technical Staff, METR
Rachel Weinberg
Events, AI Futures Project

Pick a gap. Build the fix. Ship in 3 months.

Some ideas of what residents might build:

Workshops & conferences

Run an impactful domain-specific conference like ControlConf, or one that brings new talent into AI safety, like GCP. Focus on reaching high-leverage, new audiences, or covering emerging subfields of AI safety.

AI comms fellowship

Design and manage a short fellowship for skilled writers and communicators to produce content about AI safety. Draft a curriculum, identify mentors, acquire funding, and prepare a pilot cohort.

Recruiting pipelines

Work with two or three small AI safety orgs and build the systems they need to scale quickly: work tests, candidate sourcing, referral pipelines. Solve recruiting coordination challenges between orgs.

Travel grants program

Design a program to fund visits to AI safety hubs by promising students and professionals. Set admission criteria, build an application flow, line up partner referrals, and run a pilot round.

Shared compute fund

Scope a fund that can rapidly cover the compute needs of independent safety researchers. Assess whether a dedicated cluster is needed. Acquire compute, deliver a plan, and distribute a pilot round of grants.

Strategic awareness tools

Reduce adversarial pressure during takeoff by scaling AI-powered superforecasting and scenario planning in safety infrastructure. Build support among impactful stakeholders and run a pilot.

Human data collection

Build robust systems to collect thousands of hours of human data (RCTs, uplift studies) in just weeks. Work with multiple organizations to scale these systems as part of core research workflows.

Your idea

These are examples. Residents will receive a longer list of scoped ideas, or can pitch their own projects to build capacity or infrastructure across the AI safety ecosystem.

We offer

Project budgets

In addition to the $6k/mo stipend, we provide generous funding to execute your project: events, contractors, tools, travel.

Placement support

Support landing a full-time role at an AI safety org, spinning your project into a new org, or handing it off to one that can keep it going.

Constellation office

Full access to the AI safety coworking space, including meals on working days.

Mentorship & support

1:1s with successful generalists and deep dives on the state of the field.

Timeline
Scope (weeks 1–2)

Pick a project from our list or pitch your own, meet the Constellation network, build context on the field and its gaps.

Build (weeks 3–11)

Execute individually or in groups with generous budgets and mentorship from generalists across our partner organizations.

Place (week 12+)

Land a full-time role at a serious AI safety org, spin up a new org, or hand your project to one that can keep it going. We aim to place the majority of residents seeking jobs within 12 months.

Extend (weeks 13–24)

Selected residents continue their projects for another three months—full-time in-person or part-time remote. Stipend, office access, and housing (for in-person extenders) all continue.


FAQ

When do applications close?
Regular applications close April 27, 2026. If you need an earlier answer, you can opt into early decision on the application form; early decision applications close April 17, 2026. Click here to apply.
When will I hear back about my application?
Early decision applicants will hear back by around May 1, 2026. Regular applicants will hear back by around May 10, 2026.
What is early decision?
If you have a hard deadline to commit somewhere else, like another program, a job offer, or a visa timeline, you can opt into early decision on the application form. Early decision applicants need to apply by April 17, 2026 and will hear back by around May 1, 2026. Everyone else goes through regular decision (apply by April 27, hear back by around May 10). Opting into early decision doesn't affect your chances; it just moves your timeline up.
What is the application process?
You will submit an initial application. If invited to continue, you will complete a ~2-hour trial task followed by a short (~15-minute) interview, after which we will make final acceptance decisions.
What is the program timeline?
June 15 – August 28, 2026 for the core residency. The first two weeks match residents to projects and help them build context on the field. The remaining nine weeks are project execution. Selected residents are then invited into a three-month extension period, roughly September – November 2026.
What happens after the core three months?
We help residents land a full-time role at an AI safety org, spin their project into a new org, or hand it off to one that can keep it going. Selected residents are also invited into a three-month extension period to keep building—full-time in-person at the Constellation office, or part-time remote. The stipend continues throughout the extension (prorated for part-time), along with office access and, for full-time in-person extenders, housing.
Is this a paid opportunity?
Yes. Residents receive a $6,000/month stipend. On top of that, Kairos covers housing in Berkeley for the full residency and pays for travel to and from Berkeley — neither comes out of the stipend. Residents invited into the extension period continue receiving the stipend (prorated for part-time), and full-time in-person extenders keep their housing too.
Where is the program? Is in-person work required?
Generator takes place in the Constellation office in Berkeley, California. In-person work is expected for the full duration of the core three months. Kairos covers travel to and from Berkeley and housing in Berkeley for the entire residency, on top of the stipend. Residents invited into the extension period can then either continue full-time in-person or switch to part-time remote. If you have a special circumstance, please apply anyway and let us know.
Am I eligible if I'm not a U.S. citizen?
Yes. We welcome applicants from anywhere in the world, and for most people we'll be able to sponsor J-1 visas for the program's duration (though we can't guarantee obtaining such a visa).
How will I be matched with a project and advisor?
The first two weeks of the program will focus on matching. Kairos and Constellation will organize residents into groups of 1–4 and match each group with a project and a mentor. Residents may work on projects from our list, or on their own ideas.
What kinds of roles does this opportunity lead to?
Generator is designed as a launchpad into full-time generalist roles in AI safety: program managers, fieldbuilders, operators, chiefs of staff, COOs, and founders. During and after the program we actively help residents land these roles, turn their projects into new orgs, or hand them off to existing ones.
What are some example projects?
See the project ideas listed above: workshops and conferences, an AI comms fellowship, recruiting pipelines, a travel grants program, a shared compute fund, strategic awareness tools, and human data collection. Residents can also pitch their own projects.

How can I refer someone?
If you know someone who'd be a great fit for Generator, send us a referral through this short form. We follow up on every referral.

Still have questions? Email us at contact@generatorresidency.org.

Applications close April 27.

Three months. Generous budgets. A cohort of 15–30. Berkeley, June 15 – August 28.

Apply now