EA Forum Podcast (All audio) - Podcast

Audio narrations from the Effective Altruism Forum, including curated posts, posts with 30+ karma, and other great writing.

If you'd like fewer episodes, subscribe to the "EA Forum (Curated & Popular)" podcast instead.

Categories: Society & Culture, Philosophy, Technology
Update frequency: every day
Average duration: 12 minutes
Episodes: 476
Years active: 2025
“EA North 2025 retrospective” by matthes

intro

EA North was a one-day conference in Sheffield (UK) aimed at people in the North of England (Manchester, Liverpool, Sheffield, Leeds, etc.). The event had 35 attendees on the day.[1] The cost …

00:08:26  |   Fri 02 May 2025
“12x more cost-effective than EAG - how I organised EA North 2025 (and how you could, too)” by matthes

I put on a small one-day conference. The cost per attendee was £50 (vs £1.2k for EAGs) and the cost per new connection was £11 (vs £130 for EAGs).

intro

EA North was a one-day event for the North o…

00:14:23  |   Fri 02 May 2025
“Reflections on 7 years building the EA Forum — and moving on” by JP Addison🔸

I’m ironically not a very prolific writer. I’ve preferred to stay behind the scenes here and leave the writing to my colleagues who have more of a knack for it. But a goodbye post is something I mus…

00:04:44  |   Thu 01 May 2025
“Community Polls for the Community” by Will Aldred

The Meta Coordination Forum (MCF) is a place where EA leaders are polled on matters of EA community strategy. I thought it could be fun (and interesting) to run these same polls on EAs at large.[1]

00:02:43  |   Thu 01 May 2025
“Arkose is closing, but you can help” by Arkose

Arkose is an AI safety fieldbuilding organisation that supports experienced machine learning professionals — such as professors and research engineers — to engage with the field. We focus on those n…

00:04:40  |   Thu 01 May 2025
“Should we expect the future to be good?” by Neil Crawford

Audio note: this article contains 54 uses of latex notation, so the narration may be difficult to follow. There's a link to the original text in the episode description.

1. Introduction

'Should we…

00:36:52  |   Thu 01 May 2025
“Debate: should EA avoid using AI art outside of research?” by titotal

There is a growing movement to ban or discourage the use of AI art, citing ethical concerns over unethical data scraping, environmental cost, and harm to the incomes of real artists. This sentiment …

00:06:27  |   Thu 01 May 2025
“Prioritizing Work” by Jeff Kaufman 🔸

I recently read a blog post that concluded with:

When I'm on my deathbed, I won't look back at my life and wish I had worked harder. I'll look back and wish I spent more time with the p…

00:01:30  |   Thu 01 May 2025
“New Funding Round on Hardware-Enabled Mechanisms (HEMs)” by aog, Longview Philanthropy

Longview Philanthropy is launching a new request for proposals on hardware-enabled mechanisms (HEMs).

We think HEMs are a promising method to enforce export controls, secure model weights, and veri…

00:32:43  |   Wed 30 Apr 2025
“EMERGENCY CALL FOR SUPPORT: Mitigating Global Catastrophic Risks (GCRs)” by JorgeTorresC, JuanGarcia, Mónica Ulloa, Michelle Bruno Hz, Jaime Sevilla, Roberto Tinoco, Guillem Bas

Observatorio de Riesgos Catastróficos Globales (ORCG) is at a critical juncture. We have secured funds for AI governance projects, but we are at risk of discontinuing all projects in other GCR areas…

00:04:44  |   Wed 30 Apr 2025
“EA Funds and CEA are merging” by calebp, Zachary Robinson🔸, Oscar Howie

Caleb is Project Lead of EA Funds. Zach is CEO of the Centre for Effective Altruism, and Oscar is CEA's Chief of Staff.

EA Funds and CEA are currently separate projects within Effective Ventures. EV…

00:14:29  |   Wed 30 Apr 2025
“New EA-adjacent Philosophy Lab” by Walter Veit

Hi everyone,

I am a lecturer in philosophy at the University of Reading and currently trying to set up a lab focused on animal and AI sentience and welfare. Since many EAs are doing research in t…

00:01:34  |   Wed 30 Apr 2025
“My Research Process: Key Mindsets - Truth-Seeking, Prioritisation, Moving Fast” by Neel Nanda

This is post 2 of a sequence on my framework for doing and thinking about research. Start here.

Before I get into what exactly to do at each stage of the research process, it's worth reflecting on t…

00:20:10  |   Wed 30 Apr 2025
“Cultivating doubt: why I no longer believe cultivated meat is the answer” by Tom Bry-Chevalier🔸

Introduction

In this post, I present what I believe to be an important yet underexplored argument that fundamentally challenges the promise of cultivated meat. In essence, there are compelling reason…

00:24:01  |   Wed 30 Apr 2025
“Should people with more forum karma have more powerful votes?” by Henry Howard🔸

My upvotes/downvotes are worth 2 points each and my supervotes are worth 6. A person with between 10 and 100 karma on the forum has an upvote worth 1 and a supervote worth 2 (the scaling system is d…

00:01:37  |   Tue 29 Apr 2025
“University Groups Should Run Socials” by Avik Garg, Noah Birnbaum

This post is for university group student organizers. We start with the big reason you should run a social, three tips we’re highly certain of, and some additional thoughts.

We think ~40% of the va…

00:12:40  |   Tue 29 Apr 2025
“Aquaculture in space” by Ben Stevenson

The Guardian reported yesterday on 'The Lunar Hatch' project, which is aiming to send fertilised sea bass eggs into space, so they can farm fish for astronauts.

Lunar Hatch's ultimate aim is to crea…

00:04:49  |   Tue 29 Apr 2025
“The best Health Systems Strengthening Interventions... aren’t really that - My take on RP’s report” by NickLaing

TLDR: RP's best interventions barely qualify as Health Systems Strengthening - they focus directly on the health worker and their implementation. Not only these, but almost all HSS interventions ar…

00:14:56  |   Mon 28 Apr 2025
“Criticism on the EA Forum” by Toby Tremlett🔹

I'm writing this on behalf of the mod team. They've reviewed and commented on this post, but mistakes are mine.

We want and value criticism on the EA Forum. EA organisations often make their decisi…

00:06:08  |   Mon 28 Apr 2025
“The case for multi-decade timelines [Linkpost]” by Sharmake

At the request of @Vasco Grilo🔸 in a post that I can't get out of drafts, here's the full linkpost.

Original post is below:

So this post is an argument that multi-decade timelines are reasonable, an…

00:19:20  |   Mon 28 Apr 2025
Disclaimer: The podcast and artwork embedded on this page are the property of EA Forum Team ([email protected]). This content is not affiliated with or endorsed by eachpod.com.