The Alignment Project

  • Opening date: 30th July 2025
  • Closing date: 10th September 2025 (Midnight)

Summary

What is The Alignment Project? 

An international, cross-sector coalition offering funding to advance the field of alignment. Advanced Transformative AI has the potential to deliver unprecedented benefits to humanity. But this future depends on ensuring powerful AI systems reliably act as we intend them to, without unintended or harmful behaviours. Without advances in alignment research, future systems risk operating in ways we cannot fully understand or control, with profound implications for global safety and security. The Alignment Project is a £15 million global fund aiming to accelerate progress in AI alignment research and ensure advanced AI systems are developed safely, reliably, and for society’s benefit.  

Who is involved? 

The Alignment Project is supported by an international coalition of government, industry, and philanthropic funders — including the UK AI Security Institute, the Canadian AI Safety Institute, Schmidt Sciences, Amazon Web Services, Halcyon Futures, Safe AI Fund, UKRI, Anthropic and the Advanced Research and Invention Agency — and a world-leading team of expert advisors. By fostering interdisciplinary collaboration and providing financial support and dedicated compute resources, we are tackling one of AI's most urgent problems: developing AI systems that are beneficial, reliable and remain under human control at every step. 

Why apply? 

The Alignment Project will provide funding typically ranging from £50,000 to £1 million per project; the coalition partners may consider funding higher-value projects. As a recipient you will receive: 

  • Funding: For researchers across disciplines from computer science to cognitive science. 

  • Compute access: Dedicated compute resources from AWS, enabling technical experiments beyond typical academic reach.     

  • Venture capital: Investment from private funders to accelerate commercial alignment solutions.   

What we are looking for 

Alignment research is complex and multi-disciplinary; we want to expand the scale and diversity of ideas and approaches being brought to the challenge. We are focused on funding projects that develop mitigations to safety and security risks from misaligned systems. Priority research questions include: 

  • How can we prevent AI systems from carrying out actions that pose risks to our collective security, even when they are attempting to carry out such actions? 

  • How can we design AI systems which do not attempt to carry out such actions in the first place? 

 We welcome applications from everyone—whether you are deeply versed in these research areas or just beginning to explore them. Solving this problem will mean bringing expertise to bear from researchers across many fields who may have never considered this topic before. 

Full information can be found here: The Alignment Project by AISI — The AI Security Institute (https://alignmentproject.aisi.gov.uk/)

Eligibility

Academic institution: defined as an institution that is eligible to receive funding from the research funding agency of the country in which it is based (e.g. eligible for Research Council funding in the UK, eligible for NSF/NIH funding in the USA, eligible for ERC funding in the EU, etc). 

Relevant non-profit: defined as an organisation that is a legal entity operated for a collective, public or social benefit and is registered as such in the country in which it is based (e.g. registered with the Charity Commission in the UK, registered as a 501(c)(3) in the USA, etc). 

Relevant for-profit: defined as an organisation that is primarily a commercial entity. These applications will be subject to additional terms according to the requirements of our coalition of funding partners.

Objectives

Transformative AI systems — future models more powerful than any we have today — could revolutionise the world. From medical breakthroughs and sustainable energy to solving the global housing crisis, advanced AI has the potential to deliver unprecedented benefits to humanity. But this future depends on making certain these systems reliably act as we intend them to. 

AI alignment means developing AI systems that operate according to our goals, without unintended or harmful behaviours. It’s about making certain that AI performs reliably in high-stakes environments while remaining under human oversight and control. As AI becomes more capable and autonomous, solving this problem is a critical generational challenge. 

Today’s methods for training and controlling AI systems are likely insufficient for the systems of tomorrow. We’re already seeing signs from small-scale experiments that advanced AI could act unpredictably or in ways that actively undermine its intended objectives. Without advances in alignment research, future systems risk operating in ways we cannot fully understand or control, with profound implications for global safety and security. Progress in this field won’t come from one discipline alone; it will require contributions from fields spanning the cognitive sciences to learning theory and information theory.

The Alignment Project was set up to close this gap. We provide funding of up to £1 million (and in some cases, more) to accelerate AI alignment research and innovation. Through our funding programme, we are building the tools and techniques needed to help make future AI systems beneficial, reliable, and aligned with human intent.

Dates

Key dates for Round 1 

  • Fund launch: 30th July 2025 

  • Applications open: 30th July 2025 

  • Information webinar: 11th August 2025, 4.00pm GMT 

  • End of clarification questions: 13th August 2025 

  • Application deadline: 10th September 2025 

  • Stage 1, Assessment of Eligibility and Expression of Interest (EOI): outcomes from the Stage 1 assessment on 17th September 2025 

  • Stage 2, Full Application Form Submission: shortlisted applicants will work with a Research Sponsor for 4 weeks to iterate and complete their full application, with submission due by 15th October 2025 

  • Funding decision: AISI will notify applicants of the outcome within 3-5 weeks of full application submission, between 5th and 19th November 2025 

  • Project kick-off meeting: within 2 weeks of signing the Funding Agreement

How to apply

Application process overview 

You can apply to The Alignment Project through the DSIT Funding Portal. To ensure applicants meet the eligibility criteria and that proposals align with the objectives of The Alignment Project, we have implemented a two-stage application process: 

Stage 1: Eligibility Statement and Expression of Interest (EOI) 

  • All prospective applicants are required to complete an Eligibility Statement and EOI* 

  • This allows us to assess preliminary eligibility and determine whether applicants’ proposals are suitable to proceed to Stage 2 

Stage 2: Shortlisting and Invitation to Submit a Full Application 

  • Following the review of eligibility and EOI, shortlisted applicants will be invited to submit a full application 

  • As part of this process, they will have the opportunity to work closely with an assigned Research Sponsor to refine their proposals and help strengthen their submissions before final assessment 

  • The full application will include: 

      • Eligibility Declaration – Confirmation that there have been no changes since Stage 1 (EOI) that would affect their eligibility 

      • Full Research Proposal – An opportunity for the applicant to further develop and refine initial ideas from the EOI stage 

      • Budget – This will include a breakdown of anticipated costs, ensuring clear justification and alignment with the project’s scope and objectives 

*Submission of an EOI does not guarantee progression to the full application stage.

Supporting information