Tyler makes the case that, despite what you may have heard, we *can* make rational judgments about what is best for society as a whole. He argues:
1. Our top moral priority should be preserving and improving humanity's long-term future.
2. The way to do that is to maximise the rate of sustainable economic growth.
3. We should respect human rights and follow general principles while doing so.
We discuss why Tyler believes all these things, and I push back where I disagree. In particular: is higher economic growth actually an effective way to safeguard humanity's future, or should our focus really be elsewhere?
In the process we touch on many of moral philosophy's most pressing questions: Should we discount the future? How should we aggregate welfare across people? Should we follow rules or evaluate every situation individually? How should we deal with the massive uncertainty about the effects of our actions? And should we trust common sense morality or follow structured theories?
Links to learn more, summary and full transcript.
After covering the book, the conversation ranges far and wide. Will we leave the galaxy, and is it a tragedy if we don't? Is a multi-polar world less stable? Will humanity ever help wild animals? Why do we both agree that Kant and Rawls are overrated?
Today's interview is released on both the 80,000 Hours Podcast and Tyler's own show: Conversations with Tyler.
Tyler may have had more influence on me than any other writer, but this conversation is richer for our remaining disagreements. If the above isn't enough to tempt you to listen, we also look at:
* Why couldn’t future technology make human life a hundred or a thousand times better than it is for people today?
* Why focus on increasing the rate of economic growth rather than making sure that it doesn’t go to zero?
* Why shouldn’t we dedicate substantial time to the successful introduction of genetic engineering?
* Why should we completely abstain from alcohol and make it a social norm?
* Why is Tyler so pessimistic about space? Is it likely that humans will go extinct before we manage to leave the galaxy?
* Is improving coordination and international cooperation a major priority?
* Why does Tyler think institutions are keeping up with technology?
* Given that our actions seem to have very large and morally significant effects in the long run, are our moral obligations very onerous?
* Can art be intrinsically valuable?
* What does Tyler think Derek Parfit was most wrong about, and what was he most right about that’s unappreciated today?
Get this episode by subscribing: type 80,000 Hours into your podcasting app.
The 80,000 Hours Podcast is produced by Keiran Harris.