Axiom Futures


What We Do

We connect neglected extraordinary minds with pioneers in AI Safety.

Axiom Futures is a field-building organization incubated and run by Impact Academy. We aim to enable extraordinary STEM talent from AIS-neglected regions to pursue AI Safety careers. We are looking for technically proficient aspiring researchers who have the potential to become Research Leads in AI Safety or Alignment. 

To date, we've run two kinds of programs: (i) part-time online courses that introduce top technical talent to AI Safety, and (ii) full-time research fellowships. We've run a wide range of fellowships in the past, from a full-time research fellowship for Indian technical talent to placement programs like the EU Tech Policy Fellowship.

 

What is AI Safety?

AI safety focuses on developing technology and governance interventions to prevent both short-term and long-term harm caused by AI systems.

"Mitigating the risk of extinction from AI should be a global priority alongside other societal-scale risks such as pandemics and nuclear war."
- Signed by Yoshua Bengio (one of the "Godfathers of AI") and other experts

AI might be the most transformative technology of all time. To make it go well for the future of humanity, we must seriously consider the types of risks transformative AI systems of the future might pose. Fortunately, there is a growing ecosystem of professionals and institutions dedicated to researching and solving these problems.

For instance, the International Dialogues on AI Safety have convened leading scientists worldwide to outline a set of "red lines" in AI development to prevent catastrophic risks. These include constraints on autonomous replication and enhancement, power-seeking behaviour, assistance in weapons development, cyberattacks, and deception.



Why do we do it?

We believe we can support extraordinary minds who would otherwise not have had the opportunity to play an important role in AI Safety through their careers.

There are several reasons for this belief, notably:

  • Talent from across the world can play an important role in AI Safety.
  • Career opportunities in AI Safety are concentrated in a few regions, such as the US and the UK.
  • We're capable of running cutting-edge educational and training courses in regions where AI Safety is still finding its footing.

 



Our Team


Varun Agrawal

Managing Director of Axiom Futures. Varun ran India's first AI safety fellowship and has been working on Indian AIS field building for the past 15 months, producing multiple talent surveys and a long-term strategy. He has a background in Econometrics and 7+ years of experience with entities ranging from the World Bank to J-PAL. LinkedIn


Mohammad Taufeeque

Research Manager at Axiom Futures. Taufeeque is an IIT Bombay graduate and is currently an Alignment Research Engineer at FAR AI. LinkedIn


Aditya SK

Project Manager at Axiom Futures. Aditya has been involved in building the Indian Effective Altruism community for many years, including leading the EA Hyderabad group. He previously worked at ALLFED and works part-time as Regional Coordinator (India) for Animal Ethics. LinkedIn


Sebastian Schmidt

Acting CEO of Axiom Futures. As a co-founder of Impact Academy, Sebastian has experience establishing teams that create and deliver high-quality educational programs, including Future Academy v2 (India edition), and he has taught courses since his second year of undergrad. Sebastian is also an executive coach for entrepreneurs and other impact-driven individuals. Previously, he trained as an M.D., did biosecurity research at Stanford, and co-authored a book. Importantly, he's a mediocre salsa dancer. LinkedIn


Vilhelm Skoglund

Vilhelm is the CEO of Impact Academy and Nordics/Baltics regional coordinator for the Centre for Effective Altruism. He holds board positions in the non-profit sector and co-founded Nema Problema, a non-profit aiming to make migration policies more effective. Previously, he worked as a consultant and studied law, developmental economics, and sustainability at Uppsala, Yale, and Cornell. LinkedIn


Jayat Joshi

Project Associate at Axiom Futures. Jayat graduated with an integrated Master's in Development Studies from the Indian Institute of Technology (IIT) Madras. LinkedIn

Frequently Asked Questions


Can you tell me more about your past programs?

In 2024, we ran the following programs:

The AI Safety Careers Course (June-August)

A part-time, online course for top technical talent (university students and working professionals) on the foundations of technical AI Safety. We received ~600 applications and >3000 expressions of interest. Through a rigorous selection process, we admitted 70 promising applicants to the program and also supported them with career coaching.

 

The Summer Pilot Fellowship (June-August)

A two-month, full-time, paid fellowship where mentors selected candidates to work on a project in technical AI Safety/Alignment. We provided Fellows with a co-living residence and access to a workspace in Bengaluru, as well as funding for a 10-day visit to the London Initiative for Safe AI (LISA). You can read more about it here.

What does the upcoming program look like?

Extraordinary technical talent is scattered across regions where AI Safety remains a neglected cause area. Our upcoming Research Fellowship aims to enable these Neglected, Extraordinary Minds (NEMs) to contribute to AI Safety through their careers. To bridge the gap between talent and opportunity, we will provide them with the chance to work full-time with some of the world's leading Alignment research organisations, AI Safety labs, and companies for up to 6 months.

What can I do after the programs?

We expect successful participants will pursue careers in the emerging field of AI safety by

  • Joining AI safety labs such as Google DeepMind, METR, and FAR AI.
  • Pursuing PhDs and post-docs at top institutes.
  • Contributing to policy and governance by joining think tanks or government agencies around the world.
  • Establishing independent projects and research agendas.
  • Upskilling further in the AI industry or adjacent think tanks.
  • Building professional connections through guest lectures and networking opportunities.

Have more questions?

E-mail us at axiomfutures@impactacademy.org or consider using this form.