Our content
The central hub for all of the content we have produced. Here you can browse our most popular content and find our most recent publications.
Essentials
Essential reading
We have written a few articles that we believe everyone interested in our cause areas should read. They provide a more thorough exploration than our cause area pages.
Benefits & Risks of Biotechnology
Over the past decade, progress in biotechnology has accelerated rapidly. We are poised to enter a period of dramatic change, in which the genetic modification of existing organisms, or the creation of new ones, will become effective, inexpensive, and pervasive.
November 14, 2018
The Risk of Nuclear Weapons
Despite the end of the Cold War over two decades ago, humanity still has ~13,000 nuclear weapons on hair-trigger alert. If detonated, they may cause a decades-long nuclear winter that could kill most people on Earth. Yet the superpowers plan to invest trillions in upgrading their nuclear arsenals.
November 16, 2015
Benefits & Risks of Artificial Intelligence
From Siri to self-driving cars, artificial intelligence (AI) is progressing rapidly. While science fiction often portrays AI as robots with human-like characteristics, AI can encompass anything from Google's search algorithms to IBM's Watson to autonomous weapons.
November 14, 2015
Archives
Explore our library of content
Looking for something specific?
You can search our site for any content items that contain your search term, including pages, posts, projects, people, and more.
Most popular
Our most popular content
Posts
Here are some of the most popular posts we have written:
Benefits & Risks of Artificial Intelligence
From Siri to self-driving cars, artificial intelligence (AI) is progressing rapidly. While science fiction often portrays AI as robots with human-like characteristics, AI can encompass anything from Google's search algorithms to IBM's Watson to autonomous weapons.
November 14, 2015
Benefits & Risks of Biotechnology
Over the past decade, progress in biotechnology has accelerated rapidly. We are poised to enter a period of dramatic change, in which the genetic modification of existing organisms, or the creation of new ones, will become effective, inexpensive, and pervasive.
November 14, 2018
90% of All the Scientists That Ever Lived Are Alive Today
The following paper was written and submitted by Eric Gastfriend. […]
November 5, 2015
As Six-Month Pause Letter Expires, Experts Call for Regulation on Advanced AI Development
This week will mark six months since the open letter calling for a six-month pause on giant AI experiments. Since then, a lot has happened. Our signatories reflect on what needs to happen next.
September 21, 2023
Artificial Photosynthesis: Can We Harness the Energy of the Sun as Well as Plants?
In the early 1900s, the Italian chemist Giacomo Ciamician recognized that […]
September 30, 2016
Existential Risk
An existential risk is any risk that […]
November 16, 2015
The Risk of Nuclear Weapons
Despite the end of the Cold War over two decades ago, humanity still has ~13,000 nuclear weapons on hair-trigger alert. If detonated, they may cause a decades-long nuclear winter that could kill most people on Earth. Yet the superpowers plan to invest trillions in upgrading their nuclear arsenals.
November 16, 2015
Exploration of secure hardware solutions for safe AI deployment
This collaboration between the Future of Life Institute and Mithril Security explores hardware-backed AI governance tools for transparency, traceability, and confidentiality.
November 30, 2023
Resources
Here are some of the most popular resources we have produced:
1100 Declassified U.S. Nuclear Targets
The National Security Archive recently published a declassified list of U.S. nuclear targets from 1956, which spanned 1,100 locations across Eastern Europe, Russia, China, and North Korea. The map below shows all 1,100 nuclear targets from that list, and we’ve partnered with NukeMap to demonstrate how catastrophic a nuclear exchange between the United States and Russia could be.
May 12, 2016
Global AI Policy
How countries and organizations around the world are approaching the benefits and risks of AI. Artificial intelligence (AI) holds great […]
December 16, 2022
Accidental Nuclear War: a Timeline of Close Calls
The most devastating military threat arguably comes from a nuclear war started not intentionally but by accident or miscalculation. Accidental […]
February 23, 2016
Responsible Nuclear Divestment
Only 30 companies worldwide are involved in the creation of nuclear weapons, cluster munitions and/or landmines. Yet a significant number […]
June 21, 2017
Trillion Dollar Nukes
Would you spend $1.2 trillion in tax dollars on nuclear weapons? How much are nuclear weapons really worth? Is upgrading the […]
October 24, 2016
The Top Myths About Advanced AI
Common myths about advanced AI distract from fascinating true controversies where even the experts disagree.
August 7, 2016
AI Policy Challenges
This page is intended as an introduction to the major challenges that society faces when attempting to govern Artificial Intelligence […]
July 17, 2018
Life 3.0
This New York Times bestseller tackles some of the biggest questions raised by the advent of artificial intelligence. Tegmark posits a future in which artificial intelligence has surpassed our own — an era he terms “life 3.0” — and explores what this might mean for humankind.
November 22, 2021
Recently added
Our most recent content
Here are the most recent items of content that we have published:
April 2, 2024
Future of Life Institute Newsletter: A pause didn’t happen. So what did?
newsletter
March 4, 2024
Future of Life Institute Newsletter: FLI x The Elders, and #BanDeepfakes
newsletter
February 14, 2024
Open letter calling on world leaders to show long-view leadership on existential threats (Portuguese)
open-letter
View all latest content
Latest documents
Here are our most recent policy papers:
Competition in Generative AI: Future of Life Institute’s Feedback to the European Commission’s Consultation
March 2024
European Commission Manifesto
March 2024
Chemical & Biological Weapons and Artificial Intelligence: Problem Analysis and US Policy Recommendations
February 2024
FLI Response to OMB: Request for Comments on AI Governance, Innovation, and Risk Management
February 2024
View all policy papers
Videos
Latest videos
Here are some of our recent videos:
How two films saved the world from nuclear war
November 13, 2023
Regulate AI Now
September 28, 2023
The AI Pause. What’s Next?
September 22, 2023
Artificial Escalation
July 17, 2023
View all videos on our YouTube channel
Future of Life Institute Podcast
Conversations with far-sighted thinkers.
Our namesake podcast series features the FLI team in conversation with prominent researchers, policy experts, philosophers, and a range of other influential thinkers.
March 14, 2024
Katja Grace on the Largest Survey of AI Researchers
January 6, 2024
Frank Sauer on Autonomous Weapon Systems
January 6, 2024
Darren McKee on Uncontrollable Superintelligence
More episodes
Newsletter
Regular updates about the technologies shaping our world
Every month, we bring 41,000+ subscribers the latest news on how emerging technologies are transforming our world. It includes a summary of major developments in our cause areas, and key updates on the work we do. Subscribe to our newsletter to receive these highlights at the end of each month.
Future of Life Institute Newsletter: A pause didn’t happen. So what did?
Reflections on the one-year Pause Letter anniversary, the EU AI Act passes in EU Parliament, updates from our policy team, and more.
Maggie Munro
April 2, 2024
Future of Life Institute Newsletter: FLI x The Elders, and #BanDeepfakes
Former world leaders call for action on pressing global threats, launching the campaign to #BanDeepfakes, new funding opportunities from our Futures program, and more.
Maggie Munro
March 4, 2024
Future of Life Institute Newsletter: The Year of Fake
Deepfakes are dominating headlines, with much more disruption expected; plus the Doomsday Clock's setting for 2024, AI governance updates, and more.
Maggie Munro
February 2, 2024
Read previous editions
Open letters
Add your name to the list of concerned citizens
We have written a number of open letters calling for action on our cause areas, some of which have gathered hundreds of prominent signatures. Most of these letters remain open today; add your signature to join the list of concerned citizens.
Open letter calling on world leaders to show long-view leadership on existential threats (Portuguese)
The Elders, the Future of Life Institute, and a diverse range of co-signatories urge decision-makers to urgently address the ongoing impact and escalating risks of the climate crisis, pandemics, nuclear weapons, and ungoverned AI.
February 14, 2024
Open letter calling on world leaders to show long-view leadership on existential threats (German)
The Elders, the Future of Life Institute, and a diverse group of co-signatories call on decision-makers to urgently address the ongoing impact and escalating risks of the climate crisis, pandemics, nuclear weapons, and uncontrolled AI.
February 14, 2024
Open letter calling on world leaders to show long-view leadership on existential threats (Arabic)
The Elders, the Future of Life Institute, and a diverse group of co-signatories urge decision-makers to urgently address the ongoing impact and escalating risks of the climate crisis, pandemics, nuclear weapons, and ungoverned AI.
February 14, 2024
Open letter calling on world leaders to show long-view leadership on existential threats (French)
The Elders, the Future of Life Institute, and a diverse range of co-signatories urge decision-makers to urgently address the ongoing impact and growing risks of the climate crisis, pandemics, nuclear weapons, and ungoverned AI.
February 14, 2024
All open letters