This programme will fund researchers who will collaborate with the UK government to advance systemic approaches to AI Safety.
Systemic AI safety is focused on safeguarding the societal systems and critical infrastructure into which AI is being deployed—to make our world more resilient to AI-related risks and to enable its benefits.
The AI Safety Institute (AISI), in partnership with the Engineering and Physical Sciences Research Council (EPSRC) and Innovate UK, part of UK Research and Innovation (UKRI), is excited to announce support for impactful research that takes a systemic approach to AI safety. We are offering a round of seed grants of £200,000 for 12 months, and plan to follow with more substantial awards in future rounds. Successful applicants will receive ongoing support, computing resources where needed, and access to a community of AI and sector-specific domain experts.
We are seeking applications focused on a range of safety-related problems: this could involve monitoring and anticipating AI use and misuse in society, or the risks these expose in certain sectors. We want to see applications that could enhance understanding of how government could intervene where needed - with new infrastructure and technical innovations - to make society more resilient to AI-related risks.
We conceive of systemic AI safety as a very broad field of research and interventions. Below we introduce some examples of the kinds of research we are interested in. A longer list of example projects is available here.
We recognise that future risks from AI remain largely unknown. We are open to a range of plausible assumptions about how AI technologies will develop and be deployed in the next 2-5 years. We are excited about work that addresses both ongoing and anticipated risks, as long as it is credible and evidence based.
In the future, we will build on the outputs of this first phase and make larger, longer-term investments in specific interventions that show promise for increasing systemic safety. Projects in the first phase will be prioritised on their ability to inform these second-phase decisions.
Apply by filling out our application form.
Please note that by applying you agree to allow us to make public the answer to question 9 of the proposal (“What problem does your idea solve / what risk does it address / what question does it answer?”). To maximise the visibility of the problems in systemic AI safety, we reserve the right to publish all the proposed answers to this question (anonymously) on our website. Our goal is to create a repository of relevant questions in systemic AI safety that future researchers might choose to address. Note that we will not publish your proposed solution, or link researcher identities to any published text.
In addition to delivering their proposed projects, successful grantees will be expected to produce quarterly progress updates against financial and non-financial performance metrics, participate in regular progress meetings with AISI and UKRI, take part in workshops organised by AISI, and engage with the programme officer to increase the impact of their work. These expectations will be laid out in the grant agreement terms and conditions.
If you have any questions regarding the call, please email AISIgrants@dsit.gov.uk.
We invite proposals from a broad range of researchers. To be a project leader or co-lead (co-applicant), you must be employed at either:
We will run a two-stage review process consisting of a peer review and sift followed by an expert assessment panel. For the sift stage, we may use partial randomisation in the process of shortlisting which applications advance to the expert assessment panel.
Each application during the peer review and sift stage will be assessed by individuals outside of UKRI and AISI. All reviewers are expected to uphold principles of peer review assessment and decision making.
You will be notified of the outcome of the sift stage by mid-December 2024.
If your application advances to the expert assessment panel, then you must provide additional information to EPSRC by 9th January 2025; see list below.
The expert assessment panel will review all shortlisted applications against all the assessment criteria. Final selection will be made by AISI based on the expert assessment panel’s recommendations and in line with the programme’s objectives.
Depending on the level of demand for this opportunity, the delivery partners reserve the right to modify the assessment process as needed.
The shortlisting of proposals during the sift stage will be based exclusively on questions 7 and 9 of the application form, which are stated below:
7) Tell us about your background, your track record, and why you are a good person or team to conduct impactful work on systemic AI safety
9) What problem does your idea solve / what risk does it address / what question does it answer?
This step will be taken to exclude proposals for which the problem or risk being addressed is out of scope for our call, or where the applicants do not have a track record of conducting research relevant to the proposal. All proposals will be subjected to peer review based on the above criteria, to determine whether the proposal is likely to be competitive at the expert assessment panel. Depending on application volumes, we reserve the right to use partial randomisation when shortlisting applications for the expert panel selection stage, based on the ranking that emerges from reviewers’ scores.
You will be notified of the outcome of the initial sift process, whatever the result, by mid-December 2024. If your application advances, your full proposal will be evaluated by an expert assessment panel recruited by UKRI and AISI.
If you successfully pass to the panel stage, then you must provide the following additional information to UKRI by 9th January 2025 for the panel meeting; otherwise your application will not be considered further for funding:
UKRI and AISI will be available to answer your questions and provide advice where possible as you finalise your budget. Further information on eligible costs can be found on the application form and in the FAQ document.
Alongside an additional evaluation of the team’s track record and the requested budget, the expert assessment panel will judge research proposals on the following criteria, and expects successful applications to score highly on each of them:
1. Relevant. Proposals should address a pressing risk that arises when AI is deployed in society, for private, professional or industrial use.
2. Feasible. Proposals should be feasible and address risks that could plausibly arise from AI over the near term (e.g., a 2–5-year time horizon). We imagine that over this time horizon there will be significant progress in AI development, which will make frontier AI systems:
3. Innovative. Proposals that address risks or issues that are currently under-explored or under-considered are particularly welcome.
4. Actionable. Proposals should result in clear, actionable deliverables. For example, this could be:
Proposals for which the only deliverable is a literature review are unlikely to be successful. Please see the applicant guidance for more details on the assessment criteria and scoring.
Final selection will be made by AISI based on the expert assessment panel’s recommendations and in line with the programme’s objectives. Depending on the level of demand for this opportunity, UKRI reserves the right to modify the assessment process as needed.
If your proposal is selected for a potential award, AISI and the applicant will discuss, negotiate and agree on the project activities and milestones. We will conduct due diligence and assurance checks before an award is made, in accordance with our internal guidelines. This might include verifying the identity and institutional associations of applicants and project partners, the capacity within the awarding institution to administer and deliver the award, and any additional financial due diligence checks.
Successful applicants should be prepared to start their grants on 5th February 2025.
Grants are awarded under the standard UKRI grant terms and conditions and the terms and conditions of an Innovate UK grant award.
GAC 1: grant extensions
No slippage or grant extensions (beyond exceptional circumstances in line with the Equality Act 2010) will be allowed. We will not be responsible for any cost overrun incurred during the course of this grant. You will be required to make up any shortfall from alternative sources.
GAC 2: governance
We will nominate a member of our staff (the project officer) who will be your primary point of contact from UKRI and AISI. The project officer(s) will ensure that the project is being run in accordance with the terms and conditions and in line with financial due diligence. As funding administrators, all UKRI staff have agreed to maintain the confidentiality required by all parties involved in EPSRC-funded research.
GAC 3: monitoring and reporting
In addition to the requirements set out in standard grant condition RGC 7.4.3, you are responsible for providing to the project officer(s) quarterly progress updates against financial and non-financial performance metrics. A detailed list of performance metrics and instructions for reporting will be agreed with you upon commencement of the grant. You will also be expected to participate in regular check-in meetings with AISI and UKRI, participate in workshops organised by AISI, and produce an annual report at the end of the grant. We reserve the right to suspend the grant and withhold further payments if the requested performance metrics are not provided by the stated deadlines or are determined to be of an unacceptable standard by our project officer(s).
Additional financial or non-financial information may occasionally be requested outside of the standard annual and quarterly reporting cycle. You agree to undertake all reasonable endeavours to comply with these requests in a timely manner.
GAC 4: expenditure
At the start of the grant the financial spend profile will be agreed by us. In addition to any reporting requirements set out in GAC 3, you must immediately notify our project officer(s) in writing of any accumulation, slippage or variation in expenditure greater than 5% of the annual profiled funding.
We reserve the right to re-profile the grant if required. Any deviation from the agreed allocation of funding and profiled costs must be negotiated and approved through written consent by us. The approval of profile changes should not be assumed and will be dependent on spend across all associated grants. At the end of the grant period a breakdown of the expenditure should be submitted along with the final expenditure statement.
GAC 5: embedding trusted research
Projects are expected to embed trusted research and innovation throughout their activities. We reserve the right to suspend the grant and withhold further payments if trusted research is not embedded throughout the project or is deemed to be of an unacceptable standard by us.
The funding rates you can receive depend on your organisation’s size, type, and role in the project. Organisations are classified into three categories:
Each participant must determine which organisation type they fall under. Further details on organisation types can be found here, and the corresponding funding rates are available in the funding guidance.
Please note that any Je-S system requirements mentioned in the linked pages should be disregarded. At this stage, you are only required to provide an approximate budget in the application form (Question 14).
All Intellectual Property Rights generated from grants funded by the AISI Grant Programme will belong to the grant recipient that generates them, provided (unless otherwise agreed in writing with UKRI) that the recipient makes every reasonable effort to ensure that the knowledge and other intellectual assets obtained in the course of the project are disseminated to the public, and that any software is released under an open-source licence.
Your application, including personal information and your proposal idea, will be stored on secure computers within the information environment of the Department for Science, Innovation and Technology (DSIT) and at EPSRC and Innovate UK, as part of UKRI. We will handle personal data in line with UK data protection legislation and manage it securely. For more information, including how to exercise your rights, read UKRI's privacy notice, Innovate UK’s privacy notice and DSIT’s personal information charter.
We will share your answers to questions 7 (track record) and 9 (identified problem/risk) with a broad pool of reviewers, who will judge whether the proposal is in scope for systemic AI safety and whether the applicants have an appropriate track record. These reviewers will be drawn from academia, industry and government. You can also apply to be a reviewer yourself when submitting your application, or by emailing AISIgrants@dsit.gov.uk.
By applying, you agree to allow us to share, process, and publish responses to question 9. If we make responses public, we will not link them to applicants or teams. Our aim is to create a central repository of systemic risks that future research should address. Note that we will NOT make your proposed solutions public.
If you are shortlisted for panel, then your full application will be shared with an expert assessment panel composed of highly qualified evaluators, who will judge applications by the criteria outlined above (in “How will we evaluate proposals?”) and in the applicant guidance document.