Alignment is EASY and Roko's Basilisk is GOOD?!


Mar 16, 2025 · 119 mins

Roko Mijic has been an active member of the LessWrong and AI safety community since 2008. He’s best known for “Roko’s Basilisk”, a thought experiment he posted on LessWrong that made Eliezer Yudkowsky freak out, and years later became the topic that helped Elon Musk get interested in Grimes.

His view on AI doom is that:

* AI alignment is an easy problem
* But the chaos and fighting from building superintelligence poses a high near-term existential risk
* But humanity’s course without AI has an even higher near-term existential risk

While my own view is very different, I’m interested in learning more about Roko’s views and nailing down our cruxes of disagreement.

00:00 Introducing Roko

03:33 Realizing that AI is the only thing that matters

06:51 Cyc: AI with “common sense”

15:15 Is alignment easy?

21:19 What’s Your P(Doom)™

25:14 Why civilization is doomed anyway

37:07 Roko’s AI nightmare scenario

47:00 AI risk mitigation

52:07 Market Incentives and AI Safety

57:13 Are RL and GANs good enough for superalignment?

01:00:54 If humans learned to be honest, why can’t AIs?

01:10:29 Is our test environment sufficiently similar to production?

01:23:56 AGI Timelines

01:26:35 Headroom above human intelligence

01:42:22 Roko’s Basilisk

01:54:01 Post-Debate Monologue

Show Notes

Roko’s Twitter: https://x.com/RokoMijic

Explanation of Roko’s Basilisk on LessWrong: https://www.lesswrong.com/w/rokos-basilisk

Watch the Lethal Intelligence Guide, the ultimate introduction to AI x-risk! https://www.youtube.com/@lethal-intelligence

PauseAI, the volunteer organization I’m part of: https://pauseai.info

Join the PauseAI Discord — https://discord.gg/2XXWXvErfA — and say hi to me in the #doom-debates-podcast channel!

Doom Debates’ Mission is to raise mainstream awareness of imminent extinction from AGI and build the social infrastructure for high-quality debate.

Support the mission by subscribing to my Substack at https://doomdebates.com and to https://youtube.com/@DoomDebates



This is a public episode. If you would like to discuss this with other subscribers or get access to bonus episodes, visit lironshapira.substack.com