SelfAwarePatterns<p><strong>Mind uploading and continuity</strong></p><p>As a computational functionalist, I think the mind is a system that exists in this universe and operates according to the laws of physics. Which means that, in principle, there shouldn’t be any reason why the information and dispositions that make up a mind can’t be recorded and copied into another substrate someday, such as a digital environment.</p><p>To be clear, I think this is unlikely to happen anytime soon. I’m not in the technological singularity camp that sees us all getting uploaded into the cloud in a decade or two, the infamous “rapture of the nerds”. We need to understand the brain far better than we currently do, and that seems several decades to centuries away. Of course, if it is possible to do it anytime soon, it won’t be accomplished by anyone who’s already decided it’s impossible, so I enthusiastically cheer efforts in this area, as long as it’s real science.</p><p>There have always been a number of objections to the idea of uploading. Many people just reflexively assume it’s categorically impossible. Certainly we don’t have the technology today, but short of assuming the mind is at least partially non-physical, it’s hard to see what the ultimate obstacle might be. Even with that assumption, who can say that a copied mind wouldn’t have those non-physical properties? David Chalmers, a property dualist, sees those non-physical properties as corresponding with the right functionality, so for him AI consciousness and mind copying remain a possibility.</p><p>One objection that I often hear is the break in continuity. Most people, including <a href="https://survey2020.philpeople.org/survey/results/5094" rel="nofollow noopener noreferrer" target="_blank">most philosophers</a>, feel like a copied mind just wouldn’t be them. Often they’ll acknowledge that we have breaks in continuity every night. 
Or the <a href="https://en.wikipedia.org/wiki/Ship_of_Theseus" rel="nofollow noopener noreferrer" target="_blank">Ship of Theseus</a> issue, that the matter that makes up our brain is constantly being recycled and refreshed, so that the matter we were composed of years ago is not the matter we’re made up of today. But uploading seems like an abrupt shift from one set of matter to another, a very different kind of break.</p><p>Interestingly, this is an issue for the original mind, not the copied one, who should be able to remember being the original. But to the original, a new being seems to have been created that acts like them, but isn’t them. (Assuming the copying process doesn’t result in the destruction of the original, a possibility that itself might make people reluctant to volunteer, at least before they’re on their deathbed.) </p><p>I was reminded of these issues when listening to David Eagleman’s interview of Max Hodak. Hodak is the founder of a company that works on neural prosthetics, with a current focus on helping people with vision issues. His ideas on how to make progress in this area are fascinating. They involve growing new neurons to interface with the brain, rather than directly attaching technology to the brain, with all the immune issues that typically result. (You don’t have to watch this video to follow the rest of the post; I’m just embedding it for reference.)</p><p><a href="https://www.youtube.com/watch?v=Vtp-6GOGfMA" rel="nofollow noopener noreferrer" target="_blank">https://www.youtube.com/watch?v=Vtp-6GOGfMA</a></p><p>Toward the end of the discussion, Hodak discusses the longer-range implications, such as substrate independence, aka mind uploading. He notes that the technology to link two brains together should be possible in the near future, and sees that as possibly getting around the continuity issue. He seems to envisage it working sort of like a Vulcan mind meld. 
</p><p>I’m personally not sure that’s what would happen, but it highlights some possible workarounds for the continuity concern. It raises the possibility that in a mind upload scenario, the original biological mind could be linked with the copy, possibly being able to experience both sides of the divide.</p><p>Another possibility often covered in science fiction is the ability of various copies of a mind to share memories with each other. It’s much easier to think of those other copies as you if you can remember being them. But there are a couple of obstacles to making this work.</p><p>One is that a biological brain isn’t like a commercial computer. It doesn’t have a data port or addressable memory. So simply copying things in, as we see in the Matrix movies, isn’t really feasible. The brain’s neural network is constantly changing, but there’s no mechanism for those changes except experience in the normal fashion. This means there’s no way for the biological version of me to simply receive my digital twin’s memories.</p><p>The second issue is that, even between digital copies, it’s not clear how to copy memories from one version of the mind to another with fidelity. Consider that learning and forming memories in a neural network means changing the synapses (connection weights), often in a distributed fashion throughout the network. And those changes are relative to the current state of that network. So two copies of a mind, once they start to have different experiences, will diverge pretty quickly, changing what a memory means for the different versions.</p><p>One possible way around both issues might be recording the signals coming in on the sensory pathways to the mind, and then allowing another copy of that mind to play those signals back. This could give the second copy <em>something like</em> the experience of the original. It would involve an enormous amount of data, although if we’re able to record minds, we’re probably able to handle it. 
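As a toy illustration of these two points (divergence from state-dependent updates, and playback of a recorded input stream), here's a quick sketch in Python; everything in it is invented for illustration and is nothing like real neural tissue:

```python
import numpy as np

rng = np.random.default_rng(0)

def hebbian_update(weights, stimulus, lr=0.1):
    """Toy 'learning' rule: nudge the weights toward the outer product of
    the network's own response with the stimulus. Crucially, the update
    depends on the CURRENT weights, so what a given experience does to a
    network is relative to the state of the network receiving it."""
    response = np.tanh(weights @ stimulus)
    return weights + lr * np.outer(response, stimulus)

# Two initially identical copies of a "mind" (here just a 4x4 weight matrix).
original = rng.normal(size=(4, 4))
copy = original.copy()

# Give the copies different experiences: they diverge.
for _ in range(20):
    original = hebbian_update(original, rng.normal(size=4))
    copy = hebbian_update(copy, rng.normal(size=4))
divergence = np.linalg.norm(original - copy)  # grows with every differing input

# Sensory playback: record the input stream one instance receives, then
# replay it into another copy that starts from the same state.
fresh = rng.normal(size=(4, 4))
replayed_copy = fresh.copy()
recorded_stream = [rng.normal(size=4) for _ in range(20)]
for stimulus in recorded_stream:
    fresh = hebbian_update(fresh, stimulus)
    replayed_copy = hebbian_update(replayed_copy, stimulus)

print(divergence)                         # nonzero: different inputs, diverged minds
print(np.allclose(fresh, replayed_copy))  # same start, same inputs: identical
```

The point of the sketch: because the updates depend on the current weights, shared inputs only guarantee identical outcomes when the starting states are identical, which is why transplanting memories between already-diverged copies is hard, while full sensory playback from a common starting point could work. 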
</p><p>However, it seems like it would be far more time-consuming than the sci-fi versions of memory swapping, since the receiving mind would have to take the time to undergo the experience. It might also be a very strange experience, since the receiving mind would be experiencing all the sensory effects of any actions taken by the sending mind during the original experience, without going through the same volitional thought processes. </p><p>Still, with implants similar to the type Hodak is discussing, we might imagine the original biological mind being able to share in the experience of being the copy, either through a link with the copy, or with sensory playback, which might address the continuity concerns. </p><p>Unless, of course, I’m missing something. What do you think? Are there conceptual difficulties I’m overlooking? Or other possibilities that might alleviate the continuity concern?</p><p><a rel="nofollow noopener noreferrer" class="hashtag u-tag u-category" href="https://selfawarepatterns.com/tag/brain/" target="_blank">#Brain</a> <a rel="nofollow noopener noreferrer" class="hashtag u-tag u-category" href="https://selfawarepatterns.com/tag/consciousness/" target="_blank">#Consciousness</a> <a rel="nofollow noopener noreferrer" class="hashtag u-tag u-category" href="https://selfawarepatterns.com/tag/mind-uploading/" target="_blank">#MindUploading</a> <a rel="nofollow noopener noreferrer" class="hashtag u-tag u-category" href="https://selfawarepatterns.com/tag/neuroscience/" target="_blank">#Neuroscience</a> <a rel="nofollow noopener noreferrer" class="hashtag u-tag u-category" href="https://selfawarepatterns.com/tag/philosophy/" target="_blank">#Philosophy</a> <a rel="nofollow noopener noreferrer" class="hashtag u-tag u-category" href="https://selfawarepatterns.com/tag/philosophy-of-mind/" target="_blank">#PhilosophyOfMind</a> <a rel="nofollow noopener noreferrer" class="hashtag u-tag u-category" href="https://selfawarepatterns.com/tag/science/" 
target="_blank">#Science</a></p>