
An Introduction to Nick Bostrom

Team Radio Bostrom

Available Episodes (5 of 13)
  • Letter from Utopia (2020)
    By Nick Bostrom.
    Abstract: The good life: just how good could it be? A vision of the future from the future.
    Read the full paper: https://nickbostrom.com/utopia
    More episodes at: https://radiobostrom.com/
    Duration: 19:46
  • Base Camp for Mount Ethics (2022)
    By Nick Bostrom. Draft version 0.9.
    Abstract: New theoretical ideas for a big expedition in metaethics.
    Read the full paper: https://nickbostrom.com/papers/mountethics.pdf
    More episodes at: https://radiobostrom.com/
    Outline:
      (00:17) Metametaethics/preamble
      (02:48) Genealogy
      (09:41) Metaethics
      (21:30) Value representors
      (26:56) Moral motivation
      (30:02) The weak
      (33:25) Hedonism
      (41:38) Hierarchical norm structure and higher morality
      (55:30) Questions for future research
    Duration: 58:35
  • 11. Crucial Considerations and Wise Philanthropy (2014)
    By Nick Bostrom.
    Abstract: Within a utilitarian context, one can perhaps try to explicate [crucial considerations] as follows: a crucial consideration is a consideration that radically changes the expected value of pursuing some high-level subgoal. The idea here is that you have some evaluation standard that is fixed, and you form some overall plan to achieve some high-level subgoal. This is your idea of how to maximize this evaluation standard. A crucial consideration, then, would be a consideration that radically changes the expected value of achieving this subgoal, and we will see some examples of this. Now if you stop limiting your view to some utilitarian context, then you might want to retreat to these earlier more informal formulations, because one of the things that could be questioned is utilitarianism itself. But for most of this talk we will be thinking about that component.
    Read the full paper: https://www.effectivealtruism.org/articles/crucial-considerations-and-wise-philanthropy-nick-bostrom
    More episodes at: https://radiobostrom.com/
    Outline:
      (00:14) What is a crucial consideration?
      (04:27) Should I vote in the national election?
      (08:18) Should we favor more funding for x-risk tech research?
      (14:32) Crucial considerations and utilitarianism
      (18:52) Evaluation Functions
      (19:03) Some tentative signposts
      (20:35) (Text resumes)
      (27:28) Possible areas with additional crucial considerations
      (30:03) Some partial remedies
    Duration: 35:05
  • 10. Are You Living In A Computer Simulation? (2003)
    By Nick Bostrom.
    Abstract: This paper argues that at least one of the following propositions is true: (1) the human species is very likely to go extinct before reaching a “posthuman” stage; (2) any posthuman civilization is extremely unlikely to run a significant number of simulations of their evolutionary history (or variations thereof); (3) we are almost certainly living in a computer simulation. It follows that the belief that there is a significant chance that we will one day become posthumans who run ancestor-simulations is false, unless we are currently living in a simulation. A number of other consequences of this result are also discussed.
    Read the full paper: https://www.simulation-argument.com/simulation.pdf
    More episodes at: https://radiobostrom.com/
    Outline:
      (00:19) Abstract
      (01:11) Section 1. Introduction
      (04:08) Section 2. The Assumption of Substrate-independence
      (06:32) Section 3. The Technological Limits of Computation
      (15:53) Section 4. The Core of the Simulation Argument
      (16:58) Section 5. A Bland Indifference Principle
      (22:57) Section 6. Interpretation
      (35:22) Section 7. Conclusion
      (36:53) Acknowledgements
    Duration: 37:55
  • The Evolutionary Optimality Challenge (2021)
    By Nick Bostrom, Anders Sandberg, and Matthew van der Merwe. This is an updated version of The Wisdom of Nature, first published in the book Human Enhancement (Oxford University Press, 2009).
    Abstract: Human beings are a marvel of evolved complexity. When we try to enhance poorly-understood complex evolved systems, our interventions often fail or backfire. It can appear as if there is a “wisdom of nature” which we ignore at our peril. A recognition of this reality can manifest as a vaguely normative intuition, to the effect that it is “hubristic” to try to improve on nature, or that biomedical therapy is ok while enhancement is morally suspect. We suggest that one root of these moral intuitions may be fundamentally prudential rather than ethical. More importantly, we develop a practical heuristic, the “evolutionary optimality challenge”, for evaluating the plausibility that specific candidate biomedical interventions would be safe and effective. This heuristic recognizes the grain of truth contained in “nature knows best” attitudes while providing criteria for identifying the special cases where it may be feasible, with present or near-future technology, to enhance human nature.
    Read the full paper: https://www.nickbostrom.com/evolutionary-optimality.pdf
    More episodes at: https://radiobostrom.com/
    Outline:
      (00:31) Abstract
      (01:58) Introduction
      (07:22) The Evolutionary Optimality Challenge
      (11:13) Altered tradeoffs
      (12:18) Evolutionary incapacity
      (13:33) Value discordance
      (14:47) Altered tradeoffs
      (17:50) Changes in resources
      (23:24) Changes in demands
      (28:44) Evolutionary incapacity
      (30:54) Fundamental inability
      (32:54) Local optima
      (34:17) Example: the appendix
      (36:37) Example: the ε4 allele
      (37:52) Example: the sickle-cell allele
      (42:33) Lags
      (45:51) Marker 17
      (46:26) Example: lactase persistence
      (47:18) Value discordance
      (49:05) Example: contraceptives
      (50:55) Good for the individual
      (55:22) Example: happiness
      (56:40) Good for society
      (58:18) Example: compassion
      (01:00:03) The heuristic
      (01:00:30) Current ignorance prevents us from forming any plausible idea about the evolutionary factors at play
      (01:01:43) We come up with a plausible idea about the relevant evolutionary factors, and they suggest that the intervention would be harmful
      (01:02:31) We come up with several different plausible ideas about the relevant evolutionary factors
      (01:03:26) We develop a plausible idea about the relevant evolutionary factors, and they imply we wouldn’t have evolved the enhanced capacity even if it were beneficial
      (01:08:23) Conclusion
      (01:09:11) References
      (01:09:18) Thanks to
    Duration: 1:10:10


About An Introduction to Nick Bostrom

A deep dive into Nick Bostrom's ideas, including: existential risk, wise philanthropy, the ethics of AI, transhumanism, and the case for speeding up some investments in technology while slowing down others.

v7.6.0 | © 2007-2025 radio.de GmbH