FLI Open Letters

We believe that scientists need to make their voices heard on matters of emerging technologies and their risks. The Future of Life Institute has facilitated this dialogue through many open letters over the years.

Featured letters

Add your signature to our most important published open letters:

AI Licensing for a Better Future: On Addressing Both Present Harms and Emerging Threats

This joint open letter by Encode Justice and the Future of Life Institute calls for three concrete US policies to address the current and future harms of AI.
October 25, 2023
Signatories: Closed

Pause Giant AI Experiments: An Open Letter

We call on all AI labs to immediately pause for at least 6 months the training of AI systems more powerful than GPT-4.
March 22, 2023
Signatories: 31,810

Lethal Autonomous Weapons Pledge

Artificial intelligence (AI) is poised to play an increasing role in military systems. There is an urgent opportunity and necessity for citizens, policymakers, and leaders to distinguish between acceptable and unacceptable uses of AI. In this light, we the undersigned agree that the decision to take a human life should never be delegated to a machine.
June 6, 2018
Signatories: 5,218

Asilomar AI Principles

The Asilomar AI Principles, coordinated by FLI and developed at the Beneficial AI 2017 conference, are one of the earliest and most influential sets of AI governance principles.
August 11, 2017
Signatories: 5,720

Autonomous Weapons Open Letter: AI & Robotics Researchers

Autonomous weapons select and engage targets without human intervention. Artificial Intelligence (AI) technology has reached a point where the deployment of such systems is practically, if not legally, feasible within years, not decades, and the stakes are high: autonomous weapons have been described as the third revolution in warfare, after gunpowder and nuclear arms.
February 9, 2016
Signatories: 34,378

Research Priorities for Robust and Beneficial Artificial Intelligence: An Open Letter

There is now a broad consensus that AI research is progressing steadily, and that its impact on society is likely to increase. The potential benefits are huge, since everything that civilization has to offer is a product of human intelligence. Because of the great potential of AI, it is important to research how to reap its benefits while avoiding potential pitfalls.
October 28, 2015
Signatories: 11,251

All our other open letters

Here are all of the other open letters we have published:

Open letter calling on world leaders to show long-view leadership on existential threats (Portuguese translation)

The Elders, the Future of Life Institute, and a diverse range of co-signatories call on decision-makers to urgently address the ongoing impact and escalating risks of the climate crisis, pandemics, nuclear weapons, and ungoverned AI.
February 14, 2024
Signatories: 1

Open letter calling on world leaders to show long-view leadership on existential threats (German translation)

The Elders, the Future of Life Institute, and a diverse group of co-signatories call on decision-makers to urgently address the ongoing impact and escalating risks of the climate crisis, pandemics, nuclear weapons, and ungoverned AI.
February 14, 2024
Signatories: 1

Open letter calling on world leaders to show long-view leadership on existential threats (Arabic translation)

The Elders, the Future of Life Institute, and a diverse group of co-signatories urge decision-makers to urgently address the ongoing impact and escalating risks of the climate crisis, pandemics, nuclear weapons, and ungoverned AI.
February 14, 2024
Signatories: 1

Open letter calling on world leaders to show long-view leadership on existential threats (French translation)

The Elders, the Future of Life Institute, and a diverse range of co-signatories urge decision-makers to urgently address the ongoing impact and escalating risks of the climate crisis, pandemics, nuclear weapons, and ungoverned AI.
February 14, 2024
Signatories: 1

Open letter calling on world leaders to show long-view leadership on existential threats (Spanish translation)

The Elders, the Future of Life Institute, and a wide range of co-signatories call on decision-makers to urgently address the ongoing impact and escalating risks of the climate crisis, pandemics, nuclear weapons, and ungoverned AI.
February 14, 2024
Signatories: 1

Open letter calling on world leaders to show long-view leadership on existential threats

The Elders, Future of Life Institute and a diverse range of co-signatories call on decision-makers to urgently address the ongoing impact and escalating risks of the climate crisis, pandemics, nuclear weapons, and ungoverned AI.
February 14, 2024
Signatories: 2,672

Open Letter Against Reckless Nuclear Escalation and Use

The abhorrent Ukraine war has the potential to escalate into an all-out NATO-Russia nuclear conflict that would be the greatest catastrophe in human history. More must be done to prevent such escalation.
October 18, 2022
Signatories: 998

Foresight in AI Regulation Open Letter

The emergence of artificial intelligence (AI) promises dramatic changes in our economic and social structures as well as everyday life […]
June 14, 2020
Signatories: Closed

Autonomous Weapons Open Letter: Global Health Community

Given our commitment to do no harm, the global health community has a long history of successful advocacy against inhumane weapons, and the World and American Medical Associations have called for bans on nuclear, chemical and biological weapons. Now, recent advances in artificial intelligence have brought us to the brink of a new arms race in lethal autonomous weapons.
March 13, 2019
Signatories: 276

2018 Statement to United Nations on Behalf of LAWS Open Letter Signatories

The following statement was read on the floor of the United Nations during the August 2018 CCW meeting, in which […]
September 4, 2018
Signatories: Closed

UN Ban on Nuclear Weapons Open Letter

Nuclear arms are the only weapons of mass destruction not yet prohibited by an international convention, even though they are the most destructive and indiscriminate weapons ever created. We scientists bear a special responsibility for nuclear weapons, since it was scientists who invented them and discovered that their effects are even more horrific than first thought.
June 19, 2018
Signatories: 3,789

An Open Letter to the United Nations Convention on Certain Conventional Weapons

As companies building the technologies in Artificial Intelligence and Robotics that may be repurposed to develop autonomous weapons, we feel […]
August 20, 2017
Signatories: 110

The Principles – Signatories List
January 11, 2017
Signatories: Closed

Autonomous Weapons Open Letter: AI & Robotics Researchers – Signatories List

See the Autonomous Weapons Open Letter for AI & Robotics Researchers.
February 9, 2016
Signatories: Closed

AI Open Letter – Signatories List

See the Research Priorities for Robust and Beneficial AI Open Letter.
February 4, 2016
Signatories: Closed

Digital Economy Open Letter

An open letter by a team of economists about AI’s future impact on the economy. It includes specific policy suggestions […]
January 25, 2016
Signatories: Closed

AI Economics Open Letter

Inspired by our Puerto Rico AI conference and open letter, a team of economists and business leaders has now launched […]
June 19, 2015
Signatories: Closed
