Computers are causal insulators.
Within a computer, we can create a causally closed world that functions independently from the dynamics of the physical universe, i.e., it works the same regardless of what is happening in the physical world, as long as the computer remains intact.
Humans are computers.
No, it's not a metaphor.
https://medium.com/the-spike/yes-the-brain-is-a-computer-11f630cad736
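A minimal sketch of that "causally closed world" point (my own toy example, not anything from the linked article): a tiny simulated universe in Python whose entire history is fixed by its update rule and initial state, so any intact machine that runs it, anywhere, under any physical conditions, prints the identical output.

    # Elementary cellular automaton: the whole "universe" is a row of bits
    # plus an update rule. Nothing outside the program can influence it.
    def step(cells, rule=110):
        n = len(cells)
        return [
            (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
            for i in range(n)
        ]

    world = [0] * 31
    world[15] = 1  # a single "seed" cell in the middle

    for _ in range(10):
        print("".join("#" if c else "." for c in world))
        world = step(world)
    # The printed history is the same on every run and every machine,
    # because the dynamics are closed over (rule, initial state) alone.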
Interesting article. It does feel a bit like getting fixated on definitions, but maybe you'd say that the people who insist that a brain is not a computer are also getting hung up on definitions. (I can't resist pointing to my favourite "discussion" of definitions, from Karl Popper: https://www.youtube.com/watch?v=qfbRAN2OxM8. It's a sobering reminder of what could be done with philosophy, or indeed any debate, but rarely is.)
I found the stuff about recurrent neural networks being "super Turing" especially interesting.
Thanks Mark. Always a pleasure.
Nice post, I'll have to check out that Penrose book.
The "algorithms can't do intuition" argument against "strong AI" (as he calls it) feels like it's been side-stepped by the neural-network approach -- instead of making decisions by an algorithm, you use a kind of blind probabilistic mimicry, where it's much easier to imagine decisions that look like intuition. Anyway, it's a hell of an excuse to tour through a vast chunk of theoretical physics.
You say the notion of free will is antiquated (or "agency", which sounds like another word for the same thing), but it's still most people's default assumption that it's real (and that they have it!), and of course people keep writing these books about it.
Most philosophers retain a belief in free will, but their concept of free will is often drastically watered down in an attempt to enable its compatibility with determinism.
Similar ideas in physics are easy to understand (does a breeze blowing on an object exert zero force, or more than zero?), but when the mind examines itself it malfunctions (in predictable ways). Interestingly, education can amplify the malfunctioning.