
Mathematics for Safe AI: Opportunity seeds

Opportunity seeds support ambitious research aligned to our opportunity spaces. We’re looking to challenge assumptions, open up new research paths, and provide steps towards new capabilities. We’ll fund high-potential proposals with up to £500k each.

Can we leverage mathematics – from scientific world-models to mathematical proofs – to ensure that powerful AI systems interact safely and as intended with real-world systems and populations?

We launched a call for bold ideas within the Mathematics for Safe AI opportunity space, offering to support projects with up to £500k each.

This seed funding call is now closed.

Selected R&D Creators will be announced in the coming weeks. Click below to receive updates on funded projects and follow their progress. 

Sign up for updates
What's in scope?

+ Ideas that sit within the Mathematics for Safe AI opportunity space. By this, we mean your proposal should show how your idea either aligns with or challenges the assumptions of the Summary, Beliefs, or Observations in the opportunity space;

+ Ideas that could change the conversation about what is possible or valuable;

+ Ideas that range from early stage curiosity-driven research through to pre-commercial science and technology.

What's out of scope?

+ Ideas that live outside the scope of the Mathematics for Safe AI opportunity space;

+ Ideas that are within scope of the Safeguarded AI programme;

+ Ideas that are undifferentiated or are likely to happen without ARIA’s support;

+ Commercial or close-to-commercial stage products.

Who are we looking to fund?

We welcome applications from across the R&D ecosystem, including individuals not affiliated with an organisation, universities (including proposals from students, postdocs and staff), research institutions, companies of all sizes, charities, and public sector research organisations.

What budget and timeline can you propose?

We provide funding from £10k up to £500k per project, inclusive of VAT (where applicable) and all associated costs (both direct and indirect).

There is no minimum length for a proposed project, but the maximum length is two years.

Applications open: 15 January 2025

Clarifying questions deadline: 04 February 2025

Application deadline: 11 February 2025 [13:00 GMT]

Shortlisting¹: 27 February 2025

Final award decision²: 07 March 2025

  1. Shortlisted/unsuccessful applicants notified. Shortlisted applicants will be invited to meet with the Programme Director.
  2. Successful/unsuccessful applicants notified.

Meet the programme team

Our Programme Directors are supported by a Programme Specialist (P-Spec) and Technical Specialist (T-Spec); this is the nucleus of each programme team. P-Specs co-ordinate and oversee the project management of their respective programmes, whilst T-Specs provide highly specialised and targeted technical expertise to support programmatic rigour.


David 'davidad' Dalrymple

Programme Director

davidad is a software engineer with a multidisciplinary scientific background. He’s spent five years formulating a vision for how mathematical approaches could guarantee reliable and trustworthy AI. Before joining ARIA, davidad co-invented the top-40 cryptocurrency Filecoin and worked as a Senior Software Engineer at Twitter.


Yasir Bakki

Programme Specialist

Yasir is an experienced programme manager whose background spans the aviation, tech, emergency services, and defence sectors. Before joining ARIA, he led transformation efforts at Babcock for the London Fire Brigade’s fleet and a global implementation programme at a tech start-up. He supports ARIA as an Operating Partner from Pace.


Nora Ammann

Technical Specialist

Nora is an interdisciplinary researcher with expertise in complex systems, philosophy of science, political theory and AI. She focuses on the development of transformative AI and on understanding intelligent behaviour in natural, social and artificial systems. Before ARIA, she co-founded and led PIBBSS, a research initiative exploring interdisciplinary approaches to AI risk, governance and safety.

Find out more

Insights

The Creator experience

What you can expect as an ARIA R&D Creator.

Find out more
Announcements · 07 August 2024

Yoshua Bengio joins Safeguarded AI as Scientific Director

MIT Technology Review

Read more
Insights

Applicant guidance

The process for applying for ARIA funding, key resources and FAQs.

Learn more
Insights · 05 September 2024

Towards Provably Safe AI with davidad

Gradient Podcast

Listen now
Insights · 10 May 2024

'Towards Guaranteed Safe AI: A framework for ensuring robust and reliable AI systems'

arXiv: 2405.06624

Read more
News · 02 October 2024

Can ARIA put the UK back on the scientific map?

Wired UK

Read more