This program aims to provide support – in the form of funding for graduate study, unpaid internships, independent study, career transition and exploration periods, and other activities relevant to building career capital – for individuals at any career stage who want to pursue careers that could help to reduce global catastrophic risks or otherwise improve the long-term future.
Apply for funding here.
We’re especially interested in supporting individuals who want to pursue careers that are in some way related to mitigating potential risks posed by future advances in artificial intelligence or global catastrophic biological risks.
Applications are open until further notice and will be assessed on a rolling basis.
Generally speaking, we aim to review proposals within six weeks of receiving them, though this may not be possible for all applications. Candidates who need a faster decision can indicate this in their application form, and we may be able to expedite the decision process in such cases.
Until recently, this program was known as the “early-career funding program”, but we’ve decided to broaden its scope to explicitly include later-career individuals.
Scope
We’re open to receiving applications from individuals who are already pursuing careers related to reducing global catastrophic risk (or otherwise improving the long-term future), looking to transition into such careers from other lines of work, or just starting their careers. We think many career tracks are potentially promising from this perspective (including many of those in this list from 80,000 Hours), so there is a correspondingly wide range of proposals we would consider funding.
We’re open to supporting a variety of career development and transition activities, including (but not limited to) graduate study, unpaid internships, independent study, career transition and exploration periods, postdocs, obtaining professional certifications, online courses, and other one-off career-capital-building activities.
To name a few concrete examples of the kinds of applicants we’re open to funding, in no particular order:
- A final-year undergraduate student who wants to pursue a master’s degree or PhD in machine learning in order to contribute to technical research that helps mitigate risks from advanced artificial intelligence.
- An individual who wants to do an unpaid internship at a think tank focused on biosecurity, with the aim of pursuing a career dedicated to reducing global catastrophic biological risk.
- A former senior ML engineer at an AI company who wants to spend six months on independent study and career exploration in order to gain context on AI risk mitigation and investigate career options in the field.
- An individual who wants to attend law school or obtain an MPP, with the aim of working in government on policy issues relevant to improving the long-term future.
- A recent physics PhD who wants to spend six months going through a self-guided ML curriculum and working on projects in interpretability, in order to transition to contributing to technical research that helps mitigate risks from advanced AI systems.
- A software engineer who wants to spend the next three months doing independent study in order to gain relevant certifications for a career in information security, with the longer-term goal of working for an organization focused on reducing global catastrophic risk.
- An experienced management consultant who wants to spend three months exploring different ways to apply their skill set to reducing global catastrophic risk and applying to relevant jobs, with an eye to transitioning to a related career.
- A PhD graduate in a sub-area of computational biology unrelated to biosecurity who wants to spend four months getting up to speed on DNA synthesis screening in order to transition to working on this topic.
- A professor in machine learning, theoretical computer science, or another technical field who wants funding to take a one-year sabbatical to explore ways to contribute to technical AI safety or AI governance.
Funding criteria
- This program aims to provide support for individuals who want to pursue careers that could help to reduce global catastrophic risk or otherwise improve the long-term future. We are particularly interested in funding people who have deeply engaged with questions about global catastrophic risk and/or the long-term future, and who have skills and abilities that could allow them to make substantial contributions in the relevant areas.
- Candidates should describe how the activity for which they are seeking funding will help them enter or transition into a career path that plausibly allows them to make these contributions. We appreciate that candidates’ plans may be uncertain or even unlikely to work out, but we are looking for evidence that candidates have thought in a critical and reasonably detailed manner about those plans — not just about what career path(s) might open up for them, but also about how entering said career path(s) could allow them to reduce global catastrophic risk or otherwise positively impact the long-term future.
- We are looking to fund applications where our funding would make a difference — i.e. where the candidate is otherwise unable to find sufficient funding, or the funding they were able to secure imposes significant restrictions or requirements on them (for example, in the case of graduate study, restrictions on their research focus or teaching requirements). We may therefore turn down promising applicants who were able to secure equivalent support from other sources.
- If you receive a grant from us through this program, we will ask that you notify us about any other income or funding (e.g. paid work, fellowships, grant income, etc.) that you receive during the grant period. If your funding relates to a degree program, we will also ask you to notify us of any changes to your enrollment status. Depending on the nature of any additional income/enrollment status changes, we may alter the grant amount or timeframe in accordance with this policy.
- The program is open to applicants in any country.
Other information
- There is neither a maximum nor a minimum number of applications we intend to fund; rather, we intend to fund any application that seems sufficiently promising to clear our general funding bar for this program.
- In some cases, we may ask outside advisors to help us review and evaluate applications. By submitting your application, you agree that we may share your application with our outside advisors for evaluation purposes.
- We encourage individuals with diverse backgrounds and experiences to apply, especially self-identified women and people of color.
- We plan to respond to all applications.
- This program now subsumes what was previously called the Open Philanthropy Biosecurity Scholarship; for the time being, candidates who would previously have applied to that program should apply to this program instead. (We may decide to split out the Biosecurity Scholarship again as a separate program at a later point, but for practical purposes, current applicants can ignore this.)
- We may make changes to this program from time to time. Any such changes will be reflected on this page.