Fall 2022 Fellowships
All fellowships cover eight weeks of content between early October and early November. Each week, fellows will complete one to two hours of readings and/or exercises and meet for an hour-long discussion with a facilitator over lunch or dinner. Fellows will have the chance to hear directly from professionals relevant to their program, be invited to social events, and connect with other members.
Social Impact Career Fellowship
Some efforts to improve the world can be much more impactful than others. Smallpox eradication, the abolitionist movement, germ theory, and the Green Revolution demonstrate what’s possible when people come together to solve pressing problems. But these exceptional opportunities to do good are few and far between. At the other end of the spectrum, many well-intentioned initiatives are ineffective or even counterproductive. Doing good better is one of the defining challenges of our time. Recognizing this need, we established the Social Impact Career Fellowship to equip students with the knowledge to do good in the world more effectively, and the tools to think critically about their career ambitions. By the end of the term, fellows will have knowledge of a few key large-scale and neglected global problems, tools for prioritizing among them, and practical interventions for addressing them. Fellows will also have access to in-depth career advice and connections. We look forward to learning with you.
Long Term Future Fellowship
Designed for veteran EAs and complete newcomers alike, the Long Term Future Fellowship takes a focused look at the philosophical and practical considerations relating to existential risk and longtermism, a philosophy regarding our orientation towards the long-term future. In particular, we will explore the vast opportunities in humanity’s future, the possibility of a devastating extinction, and the actions we can take to protect our chance of realizing that potential. If you watched Don’t Look Up over winter break, you have a visceral appreciation for how terrible a catastrophe that killed all of humanity would be. While the film is intended as a satire of our response to climate change, it can also reflect our response to other sources of existential risk. From engineered pandemics and nuclear weapons to comets and supervolcanoes, our best guess of the total existential risk facing humanity this century is approximately 1 in 6. If we are really living in such a pivotal moment in history, what should we do about it? Are there any robustly good actions we can take?
Technical AI Alignment Fellowship
In the near future, artificial intelligence could bring about changes to our society more dramatic than those of the agricultural or industrial revolutions. If deliberate steps are not taken to ensure AI serves humanity’s best interests, this technology could pose serious threats to the wellbeing and stability of our society. The Technical AI Alignment Fellowship challenges participants to engage with this reality and explore potential solutions to the alignment problem. The fellowship will take participants from the foundations of neural networks and reinforcement learning to questions of human value alignment and AI governance strategy. Readings and exercises will deepen fellows’ understanding of potential risks from rapid technological development and model misalignment, while introducing specific threat models and potential solutions to these problems. In this vein, fellows will engage with current research on methods for learning from humans, multi-agent interactions, and other paradigms of AI safety work.