
Mathematics for Safe AI: Opportunity seeds
Opportunity seeds support ambitious research aligned to our opportunity spaces. We’re looking to challenge assumptions, open up new research paths, and provide steps towards new capabilities. We’ll fund high-potential proposals with up to £500k each.
Can we leverage mathematics – from scientific world-models to mathematical proofs – to ensure that powerful AI systems interact safely and as intended with real-world systems and populations?
We launched a call for bold ideas within the Mathematics for Safe AI opportunity space, offering to support projects with up to £500k each.
This seed funding call is now closed.
Selected R&D Creators will be announced in the coming weeks.
What's in scope?
+ Ideas that sit within the Mathematics for Safe AI opportunity space. By this, we mean your proposal should show how your idea either aligns with or challenges the assumptions of the Summary, Beliefs, or Observations in the opportunity space;
+ Ideas that could change the conversation about what is possible or valuable;
+ Ideas that range from early stage curiosity-driven research through to pre-commercial science and technology.
What's out of scope?
+ Ideas that live outside the scope of the Mathematics for Safe AI opportunity space;
+ Ideas that are within scope of the Safeguarded AI programme;
+ Ideas that are undifferentiated or are likely to happen without ARIA’s support;
+ Commercial or close-to-commercial stage products.
Who are we looking to fund?
We welcome applications from across the R&D ecosystem, including individuals (those not affiliated with an organisation); universities (including proposals from students, postdocs and staff); research institutions; small, medium and large companies; charities; and public sector research organisations.
What budget and timeline can you propose?
We provide funding from £10k up to £500k per project, inclusive of VAT (where applicable) and all associated costs (both direct and indirect).
There is no minimum project length, but the maximum is two years.
| Applications open | Clarifying questions deadline | Application deadline | Shortlisting¹ | Final award decision² |
|---|---|---|---|---|
| 15 January 2025 | 04 February 2025 | 11 February 2025 [13:00 GMT] | 27 February 2025 | 07 March 2025 |

¹ Shortlisted/unsuccessful applicants notified. Shortlisted applicants will be invited to meet with the Programme Director.
² Successful/unsuccessful applicants notified.
Meet the programme team
Our Programme Directors are supported by a Programme Specialist (P-Spec) and a Technical Specialist (T-Spec); together they form the nucleus of each programme team. P-Specs co-ordinate and oversee the project management of their respective programmes, whilst T-Specs provide highly specialised and targeted technical expertise to support programmatic rigour.

David 'davidad' Dalrymple
Programme Director
davidad is a software engineer with a multidisciplinary scientific background. He’s spent five years formulating a vision for how mathematical approaches could guarantee reliable and trustworthy AI. Before joining ARIA, davidad co-invented the top-40 cryptocurrency Filecoin and worked as a Senior Software Engineer at Twitter.

Yasir Bakki
Programme Specialist
Yasir is an experienced programme manager whose background spans the aviation, tech, emergency services, and defence sectors. Before joining ARIA, he led transformation efforts at Babcock for the London Fire Brigade’s fleet and a global implementation programme at a tech start-up. He supports ARIA as an Operating Partner from Pace.

Nora Ammann
Technical Specialist
Nora is an interdisciplinary researcher with expertise in complex systems, philosophy of science, political theory and AI. She focuses on the development of transformative AI and on understanding intelligent behaviour in natural, social and artificial systems. Before ARIA, she co-founded and led PIBBSS, a research initiative exploring interdisciplinary approaches to AI risk, governance and safety.