Gödel's Theorem Says Intelligence ≠ Power? AI Doom Debate with Alexander Campbell
Podcast: Doom Debates
Published: Fri Mar 21 2025

Description:

Alexander Campbell claims that having superhuman intelligence doesn’t necessarily translate into having vast power, and that Gödel's Incompleteness Theorem ensures AI can’t get too powerful. I strongly disagree.

Alex has a Master's of Philosophy in Economics from the University of Oxford and an MBA from the Stanford Graduate School of Business. He has worked as a quant trader at Lehman Brothers and Bridgewater Associates, and is the founder of Rose AI, a cloud data platform that leverages generative AI to help visualize data.

This debate was recorded in August 2023.

Timestamps:

00:00 Intro and Alex’s Background
05:29 Alex's Views on AI and Technology
06:45 Alex’s Non-Doomer Position
11:20 Goal-to-Action Mapping
15:20 Outcome Pump Thought Experiment
21:07 Liron’s Doom Argument
29:10 The Dangers of Goal-to-Action Mappers
34:39 The China Argument and Existential Risks
45:18 Ideological Turing Test
48:38 Final Thoughts

Show Notes:

Alexander Campbell’s Twitter: https://x.com/abcampbell

Watch the Lethal Intelligence Guide, the ultimate introduction to AI x-risk! https://www.youtube.com/@lethal-intelligence

PauseAI, the volunteer organization I’m part of: https://pauseai.info

Join the PauseAI Discord at https://discord.gg/2XXWXvErfA and say hi to me in the #doom-debates-podcast channel!

Doom Debates’ mission is to raise mainstream awareness of imminent extinction from AGI and build the social infrastructure for high-quality debate. Support the mission by subscribing to my Substack at https://doomdebates.com and to https://youtube.com/@DoomDebates

Get full access to Doom Debates at lironshapira.substack.com/subscribe