Funding opportunities

We're funding research at the edge of what is technologically or scientifically possible.

Types of funding

Programmes (£10m–£100m)

Our programmes are designed to advance complex, large-scale ideas that require coordinated investment and management across disciplines and institutions. To build a programme, each Programme Director oversees the review, selection, and funding of a portfolio of projects that work in tandem to drive breakthroughs.

Opportunity seeds (up to £500k)

With smaller budgets and less structure than programmes, seeds support individual research teams to uncover new pathways that could inspire future programmes or might justify additional support as a standalone project.

Funding calls

There are no open funding calls at the moment. We will be putting out new calls in the coming months – sign up to be alerted when they go live.

The calls listed below are now closed.

AI Scientist

We're accepting proposals for a series of short exploratory projects to be undertaken by existing AI systems capable of performing the full end-to-end workflow of scientific knowledge creation. Specifically, we're looking for AI systems that can handle ideation and hypothesis generation, design experiments to test those hypotheses, run those experiments (ideally in fully automated labs), and interpret the results to draw conclusions.

Application Date

14 November 2025

Trust Everything, Everywhere: Pre-programme discovery

The Trust Everything, Everywhere opportunity space asks how we can develop new trust infrastructure for an increasingly cyber-physical world. We’re now looking to fund a number of short research projects to help guide the development of an ambitious research programme and build community within this space.

Application Date

12 November 2025

Opportunity space

Trust Everything, Everywhere

Programme Director

Alex Obadia

Programme Funding

Sustained Viral Resilience: Concept papers

We launched a funding call for our £46m Sustained Viral Resilience programme. The programme’s goal is to engineer the body's innate immune system, creating a new class of medicines we're calling ‘sustained innate immunoprophylactics’ (SIIPs). If successful, they’ll provide durable, broad-spectrum protection against multiple respiratory viruses, creating a foundational resilience to viral disease that complements traditional vaccines. We’ll not only fund efforts to create new kinds of prophylactics, but also activities that support adoption and those that unlock smoother translation and commercialisation pathways.

Application Date

10 November 2025

Opportunity space

Sculpting Innate Immunity

Programme Director

Brian Wang

Programme Funding

Precision Mitochondria: Concept papers

ARIA is launching a programme backed by at least £55 million to make the mitochondrial genome programmable in vivo. The immediate goal is to achieve persistent, reproducible expression of a novel gene from engineered mitochondrial DNA (mtDNA) in a vertebrate system. The programme will fund the creation of a versatile toolkit to empower researchers with new capabilities. This toolkit will enable the delivery of nucleic acids into the mitochondrial matrix, the expression of functional proteins from the introduced genetic code, and the maintenance of engineered genomes through cellular replication.

Application Date

27 October 2025

Opportunity space

Bioenergetic Engineering

Programme Director

Nathan Wolfe

Engineering Ecosystem Resilience: Pre-programme discovery

The Engineering Ecosystem Resilience opportunity space asks whether pairing advanced monitoring with resilience-boosting interventions could halt biodiversity loss and enable people and nature to thrive. We’re now looking to fund a number of short research projects to help guide the development of an ambitious research programme within this space.

Application Date

18 September 2025

Opportunity space

Engineering Ecosystem Resilience

Programme Director

Yannick Wurm

Programme Funding

Safeguarded AI: TA2 Machine Learning (Phase 1)

ARIA is launching a multi-phased solicitation for Technical Area 2 (TA2) to support the development of a general-purpose Safeguarded AI workflow. The programme aims to demonstrate that frontier AI techniques can be harnessed to create AI systems with verifiable safety guarantees. In TA2, we will award £18m to a non-profit entity to develop critical machine learning capabilities, requiring strong organisational governance and security standards. Phase 1, backed by £1m, will fund up to five teams for 3.5 months to develop full Phase 2 proposals. Phase 2, which will open on 25 June 2025, will fund a single group, for £18m, to deliver the research agenda. TA2 will explore leveraging securely-boxed AI to train autonomous control systems that can be verified against mathematical models, improving performance and robustness. The workflow will involve forking and fine-tuning mainstream pre-trained frontier AI models to create verifiably safeguarded AI solutions.

Application Date

30 April 2025

Opportunity space

Mathematics for Safe AI

Programme Director

David 'davidad' Dalrymple

Opportunity Seed

Programmable Plants

The Programmable Plants opportunity space asks if we can programme plants to remove more CO2, improve food security, and deliver medicines to those in need. This seed funding call is looking to fund projects to challenge assumptions, open up new research paths, and provide steps towards new capabilities within the space, with up to £500k each.

Application Date

09 April 2025

Opportunity space

Programmable Plants

Programme Director

Angie Burnett

Programme Funding

Scaling Compute: Benchmarking

Our Creators are driving towards one goal – dropping the hardware costs required to train large AI models by >1000x – but as AI hardware and techniques advance rapidly, our baseline metrics and the computational cost of MLPerf benchmark workloads shift, requiring a constant recalibration of our targets. Now, we're looking for a team who can help track these moving targets and publish their findings to the research community. Through this work, we'll create an accurate (and open) source of ground truth for programme targets, and ensure the ambitious technologies developed by our Creators are measured against the most up-to-date advances in the field.

Application Date

10 March 2025

Opportunity space

Nature Computes Better

Programme Director

Suraj Bramhavar

Opportunity Seed

Scalable Neural Interfaces

The Scalable Neural Interfaces opportunity space asks if we can transform our understanding of neurological disorders by interfacing in new ways – at scale – with the human brain. This seed funding call looked to fund projects to challenge assumptions, open up new research paths, and provide steps towards new capabilities within the space, with up to £500k each.

Application Date

13 February 2025

Opportunity space

Scalable Neural Interfaces

Programme Director

Jacques Carolan

Programme Funding

Safeguarded AI: TAs 1.2 Backend + 1.3 Human-Computer Interface

Backed by £14.2m, ARIA (the UK’s Advanced Research and Invention Agency) is looking to fund teams of software developers to build the scaffolding needed for the success of the Safeguarded AI programme. For TA 1.2, we are looking for Creators to develop the computational implementation of the theoretical frameworks being developed as part of TA 1.1 (the ‘Theory’). This implementation will involve version control, type checking, proof checking, security-by-design, and flexible paradigms for interaction between humans and AI assistants, among other elements. For TA 1.3, Creators will work on the ‘Human-Computer Interfaces’ that facilitate interaction between diverse human users and the systems being built in TA 1.2 and TA 2 (‘Machine Learning’). Examples of HCI use cases include AI assistants helping to author and review world models and safety specifications, or helping to review guarantees and sample trajectories for spot/sense-checking or more comprehensive red-teaming.

Application Date

11 February 2025

Opportunity space

Mathematics for Safe AI

Programme Director

David 'davidad' Dalrymple

Opportunity Seed

Mathematics for Safe AI

The Mathematics for Safe AI opportunity space asks if we can leverage mathematics – from scientific world-models to mathematical proofs – to ensure that powerful AI systems interact safely and as intended with real-world systems and populations. This seed funding call looked to fund projects to challenge assumptions, open up new research paths, and provide steps towards new capabilities within the space, with up to £500k each.

Application Date

11 February 2025

Opportunity space

Mathematics for Safe AI

Programme Director

David 'davidad' Dalrymple

Programme Funding

Safeguarded AI: TA1.4 Sociotechnical Integration

This solicitation is looking for individuals and teams to work on problems that are plausibly critical to ensuring that the technologies developed as part of the programme will be used in humanity's best interests, and that they are designed in a way that enables their governability through representative collective deliberation and decision-making processes.

Application Date

02 January 2025

Opportunity space

Mathematics for Safe AI

Programme Director

David ‘davidad’ Dalrymple

Programme Funding

Exploring Climate Cooling: Full proposals

This solicitation seeks to fund individuals and teams in a coordinated effort to explore whether approaches designed to delay, or avert, climate tipping points could be feasible, scalable, and safe.

Application Date

09 December 2024

Opportunity space

Future Proofing our Climate and Weather

Programme Director

Mark Symes

Programme Funding

Robot Dexterity: TA3 Call for Expert Committee

This solicitation targets TA3 of the Robot Dexterity programme and seeks an Expert Committee to investigate how to unlock breakthroughs in modularity, interoperability, and common standards in robotics.

Application Date

27 November 2024

Opportunity space

Smarter Robot Bodies

Programme Director

Jenny Read

Programme Funding

Synthetic Plants: Full proposals

This solicitation focuses on TA1 and TA2 of the Synthetic Plants programme and seeks individuals and teams who will develop and implement functioning synthetic plant units and look to understand the social and ethical considerations around synthetic plants.

Application Date

12 November 2024

Opportunity space

Programmable Plants

Programme Director

Angie Burnett

Programme Funding

Forecasting Tipping Points: Full proposals

This solicitation sought to fund R&D Creators in a coordinated effort to enhance our climate change response by developing an early warning system for tipping points.

Application Date

11 November 2024

Opportunity space

Scoping Our Planet

Programme Director

Gemma Bale + Sarah Bohndiek

Programme Funding

Safeguarded AI: TA3 Real-World Applications

The second funding call for Safeguarded AI focuses on TA3 and seeks potential individuals or organisations interested in using our gatekeeper AI to build safeguarded products for domain-specific applications, such as optimising energy networks, clinical trials, or telecommunications networks.

Application Date

02 October 2024

Opportunity space

Mathematics for Safe AI

Programme Director

David ‘davidad’ Dalrymple

Programme Funding

Robot Dexterity: Full proposals

The first solicitation for this programme targets TA1 and focuses on funding individuals and teams in a coordinated effort to create innovative components and new approaches to designing and building hardware, in order to enable robotic dexterity.

Application Date

19 September 2024

Opportunity space

Smarter Robot Bodies

Programme Director

Jenny Read

Programme Funding

Precision Neurotechnologies: Full proposals

This solicitation seeks individuals and teams focused on unlocking new methods to interface with the human brain at a cellular level to understand, identify, and treat neurological and neuropsychiatric disorders with unprecedented precision.

Application Date

09 September 2024

Opportunity space

Scalable Neural Interfaces

Programme Director

Jacques Carolan

Programme Funding

Forecasting Tipping Points: Future climate innovators programme

The discovery process for this programme highlighted systemic barriers that could impede its long-term impact. This solicitation seeks a partner organisation to deliver a youth-led competition as part of the Forecasting Tipping Points programme, encouraging the next generation of climate scientists and technologists to engage with our work.

Application Date

04 September 2024

Opportunity space

Scoping Our Planet

Programme Director

Gemma Bale + Sarah Bohndiek

Opportunity Seed

Smarter Robot Bodies

The Smarter Robot Bodies opportunity space, led by Programme Director Jenny Read, asks how we could create robots with the grace and robustness of biological organisms, to ease the labour challenges of tomorrow. We’re funding projects to challenge assumptions, open up new research paths, and provide steps towards new capabilities within the space, with up to £500k each.

Application Date

27 August 2024

Opportunity space

Smarter Robot Bodies

Programme Director

Jenny Read

Opportunity Seed

Scoping Our Planet

The Scoping Our Planet opportunity space, led by co-Programme Directors Gemma Bale + Sarah Bohndiek, asks how we can fill the gaps in Earth system measurement to respond confidently to the climate crisis. We're funding projects to challenge assumptions, open up new research paths, and provide steps towards new capabilities within the space, with up to £500k each.

Application Date

18 June 2024

Opportunity space

Scoping Our Planet

Programme Director

Gemma Bale + Sarah Bohndiek

Programme Funding

Safeguarded AI: TA1 Theory

The first solicitation for this programme targets TA1.1 Theory and seeks individuals and teams to research and construct computationally practicable mathematical representations and formal semantics to support world-models, specifications about state-trajectories, neural systems, proofs that neural outputs validate specifications, and “version control” (incremental updates or “patches”) thereof.

Application Date

28 May 2024

Opportunity space

Mathematics for Safe AI

Programme Director

David ‘davidad’ Dalrymple

Programme Funding

Scaling Compute: Full proposals

This solicitation seeks to fund individuals and teams in a coordinated effort to redefine our current compute paradigm. If successful, this programme will unlock a new technological lever for next-generation AI hardware, alleviate dependence on leading-edge chip manufacturing, and open up new avenues to scale AI hardware.

Application Date

07 May 2024

Opportunity space

Nature Computes Better

Programme Director

Suraj Bramhavar

Opportunity Seed

Nature Computes Better

The Nature Computes Better opportunity space, led by Programme Director Suraj Bramhavar, asks how we can redefine the way computers process information by exploiting principles found ubiquitously in nature. This seed funding call is now funding projects to challenge assumptions, open up new research paths, and provide steps towards new capabilities within the space, with up to £500k each.

Application Date

17 April 2024

Opportunity space

Nature Computes Better

Programme Director

Suraj Bramhavar

Accessibility support

If you are disabled or have a long-term health condition, ARIA can offer support to help you with our funding application process or when you are carrying out your project.

Learn more

How we work

We seek out exceptional scientists and engineers and empower them to turn their ideas into reality.

Find out more

Sign up for updates

Keep up to date on our latest opportunity spaces, programmes and funding calls.

Sign up