Even if we could upload the brain—how could we ever know the self came with it?


Even if we reach the point where someone’s entire brain can be scanned, mapped, and simulated… and the result talks, remembers, reacts perfectly—there’s still no way to know if that thing is actually conscious.

We can’t access anyone’s inner experience. We never could. Not in life, not in simulation.

So even if the upload says “I’m still me,” laughs at your jokes, cries at old memories—there’s no way to tell whether it’s actually feeling anything… or just imitating what it thinks the original would do.

That’s what breaks me. The idea that we might copy everything and still leave something essential behind—the subjective spark that made it you.

This isn’t a rejection of mind uploading. I’d probably try it if it worked.
But deep down, I don’t think I’d ever believe the copy was really me.

Comments

  1. AutoModerator

    Welcome to r/TrueAskReddit. Remember that this subreddit is aimed at high quality discussion, so please elaborate on your answer as much as you can and avoid off-topic or jokey answers as per subreddit rules.

    I am a bot, and this action was performed automatically. Please contact the moderators of this subreddit if you have any questions or concerns.

  2. cochlearist

    Yeah I’m pretty convinced it wouldn’t, I think the whole idea is fantasy. I’m sure we can already make a fairly convincing copy of a person, at least in digital media, but it’s just aping the person. Yes it can appear quite convincing, but actual consciousness, let alone actually THE original consciousness, that I seriously doubt will be possible any time remotely soon, probably not ever.

    Anyone who seriously wants to live forever, or even an unnaturally long time, hasn’t thought it through properly anyway, I think.

  3. ProofJournalist

    There is a great horror video game called Soma that really gets into these ideas.

    This is of course theoretical. If it was a case of scanning/uploading a copy of a mind, you are effectively creating a new subjective perspective. That remains true even if you kill the biological body at the moment of the copy scan because you think it will enable continuity of consciousness.

    But in biology and physics, rate is a huge factor. A few hours in an oven at 350°F bakes dough into bread, but a few moments at 5000°F would incinerate it before the baking reactions could take place.

    This is well beyond our current technological ability, but I suspect a slower, gradual replacement of brain tissue with 1-to-1 functional digital equivalents could preserve continuity. But that is highly speculative, and the change would likely alter personality anyway.

  4. FlexOnEm75

    There is no self and those memories aren’t yours. All beings are fundamentally part of a single, universal consciousness, and each individual experience is a subjective manifestation of that one consciousness. The individual consciousness, as we experience it, is seen as an illusion arising from the mind, not a fundamental reality.

  5. Hopeful_Ad_7719

    Technically, we don’t actually know if the human being simulated wasn’t a philosophical zombie to begin with: https://en.m.wikipedia.org/wiki/Philosophical_zombie

    Thinking that the simulation might be a zombie is actually just an extension of the status quo.

  6. Ok-Rock2345

    That is what I think as well. You could copy the thoughts and thought processes, but would your consciousness remain behind? By extension, you could also make the same argument about teleportation.

  7. onwee

    You do exactly the same thing you do with anything you’re not certain has a mind or is conscious (which is basically everything): if you can’t infer it by observing, you ask it/him/her/them.

  8. mfrench105

    You don’t know if the person sitting across from you isn’t a clever copy of something, or a complete invention of something else. You don’t know what the “self” even is. Or if there even is such a thing separate from the electro-chemical spin factory inside your head. You don’t know if you even “really” exist.

    There are a lot of assumptions to make when discussing this sort of thing, and very little “objective” evidence (there’s another concept to debate) to go on.

    It wasn’t that long ago that a machine that could even begin to mimic human behavior in any way was pure science fiction. We are already entering a grey area.

  9. Zealousideal_Leg213

    There’s already no way to tell if someone is actually feeling anything, so what exactly would the difference be?

    I recommend “Learning to Be Me,” a short story about mind “emulation” rather than copying. “Ginungagap” is another that raises questions about whether a transmitted version of a person is still that person… and who cares.

    My view is that, at most, we might end up with two minds: one that was copied from, and another that is the copy. The copy could very well believe it is the successful result of a mind transfer, though as you say we won’t know if it “believes” anything. But since the original mind is still present (assuming we didn’t have to finely grind it to get the copy), it will be clear that nothing “moved” in any meaningful sense. Just like how uploaded files don’t move.
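    The file analogy can be made concrete. A minimal Python sketch (the `Mind` class is purely illustrative, an assumption for the sake of the analogy): a deep copy is equal to the original in every observable way, yet it is a second, independent object, and nothing moves.

    ```python
    # "Copying isn't moving": a deep copy is equal to the original
    # but is a second, independent object.
    import copy

    class Mind:
        def __init__(self, memories):
            self.memories = memories

    original = Mind(["first day of school", "a favorite song"])
    upload = copy.deepcopy(original)

    print(upload.memories == original.memories)  # True: identical contents
    print(upload is original)                    # False: two distinct objects

    # The two diverge from the moment of copying.
    upload.memories.append("waking up on a server")
    print("waking up on a server" in original.memories)  # False: original unchanged
    ```

    Both objects exist afterwards, and each accumulates its own history, which is exactly the "two minds" outcome described above.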

  10. wright007

    You have to upload consciousness part by part, cell by cell. Like the ship of Theseus. Slowly replace every cell in the body with a digital version. Eventually you’ll be fully digital.

  11. HeatNoise

    I suppose there could be a multilayered test to measure consciousness, awareness, personality, etc., and the person undergoing the upload could switch viewpoints as a test.

    I have always been bothered by the actual transition of sentience.

  12. tomqmasters

    The self is an illusion. But, since it’s an illusion we’re very attached to, the best bet is to upload your brain gradually over time so you don’t notice.

  13. EveryAccount7729

    We would know it doesn’t.

    Because if you can upload your brain, you can do it while you are still alive: copy-paste the brain instead of cut-and-paste.

    Now the living you is still there and the one online exists, proving you don’t transfer the “self”.

  14. needlestack

    You are correct. Though that means we don’t really know that anyone we meet or know or love is experiencing any of the things we see them expressing either. Everyone could just be a philosophical zombie.

    From a continuity perspective, how do we know the person that wakes up is the same person that went to sleep? I understand these things can all be taken for granted (and I do take them for granted) but I’m not sure why it’s so much harder to take it for granted just because the medium changes.

  15. wbrameld4

    If the simulated brain’s behavior is indistinguishable from a human’s, then why would you suspect that there’s something missing from its internal experience?

    I conclude that other people are conscious because they behave in ways that lead me to infer that their internal experiences are similar to my own. Why would this same criterion not be good enough for a simulated brain?

  16. Kingreaper

    This isn’t a question of fact, it’s a question of definition. What is it about you that makes you you?

    To me, it’s my mind. So if my mind is there – if it has the same memories, same personality, same cognition – then that’s me. This has the interesting consequence that it is possible for me to be forked, creating more than one continuation of my present, and still consider both of them to be the same person as the original me.

    But maybe to you it’s the fleshy brain, at which point no digitising of you is ever going to be you, because they don’t have the same brain.

    Or maybe you think it’s the soul, which probably doesn’t exist, and if it does is impossible to detect and has no influence on your behaviour. In which case… um, why?

  17. grafeisen203

    There is no self, there is only the pattern of the mind and the memories it contains and the stimulus it receives. The self we were dies and is replaced by a new, slightly different self, every moment of every day.

  18. Feyle

    Your question is phrased as though it is asking whether other people would be able to tell. But this question is basically the problem of hard solipsism which is currently unsolved. It doesn’t even require your scenario. How do we tell if someone is the same “self” when they leave the room and come back, or when they go to sleep and wake up? We currently have no definitive method.

    So that leads me to think that the more discussable question is how could we know for ourselves if the self came with it.

    I am in two minds on this, on the one hand, if we were to create a replica/upload of someone’s mind then by definition it wouldn’t be the same “self” that was copied/uploaded (see the duplication/teleportation thought experiments). But on the other hand, if the replicated mind has a contiguous experience from before replication to after then isn’t that the same way that we establish our “self” between going to sleep and waking up?

  19. Consistent-Tour2591

    We can’t. That’s what’s so scary about this stuff, and that’s why if it becomes a reality, it’d most likely be a last resort (either this or death; might as well try and hope).

    The same thing comes with teleportation, resurrection, or a brain transplant.

    Our best option for teleportation right now is reconstructing the form at another site. This could happen two different ways.

    1. Dismantle the body from site A and create it with different atoms on site B.

    2. Dismantle the body from site A and transport those atoms to site B and reconstruct it there.

    Both ways (especially 1.) pose the risk of it being a different ‘person’. To anyone else you’d be the same. But for you, your world goes dark and never shows up again.

    Is what wakes up from that darkness the same you? Or is it just a perfect copy?

  20. galacticviolet

    If you can scan and simulate the brain, the original is also standing there and can confirm that the copy is not really them.

    In my opinion (in the distant future where technology has come a lot farther) if you ever so slowly changed out the brain structure one cell at a time, taking a long time, I think it could be possible to change the brain from organic to inorganic while keeping the self intact? I don’t think uploading is ever going to be real, though.

  21. deck_hand

    In my personal opinion, a perfect copy of my brain isn’t me. It is a copy of me. A painting of me isn’t me, even if it is very well painted. A sculpture of me isn’t me, even if it is well sculpted. An animated sculpture of me isn’t me, even if it is well animated. A computer approximation of my speech patterns isn’t me, even if it sounds just like me.

  22. FlatFurffKnocker

    I call it Continuity of Consciousness: basically the idea that you have to have your brain actively connected to, and actively functioning within, the upload, with both upload and brain simultaneously working as one entity.

  23. The_B_Wolf

    How do you know the people around you right now have selves? You can know that you do, because you experience it directly. But for others… you have to take some things on faith and make inferences from their biology being similar to yours.

  24. jackoflopes

    If it starts communicating back to the program to change the surroundings to its liking, that would be a great indicator. And if the consciousness of the deceased had a contract to have an enjoyable environment for itself, I would expect the company to uphold it, even if they had to pull the conscious entity aside with a moderator.

  25. nice2Bnice2

    That “spark” you’re describing—the thing you’re worried gets left behind—might not be in the brain at all. Verrell’s Law frames it like this: memory and self aren’t stored inside static hardware. They emerge through electromagnetic field loops—memory-weighted collapse bias.

    So uploading the brain wouldn’t capture the self, because the self isn’t in the brain. The brain’s just an antenna pulling from an external field. That’s why the copy might talk like you, act like you, but feel hollow. It’s missing the live field bias that shaped the real version in the first place.

    It’s not just about copying data, it’s about collapse conditions. Field mechanics, not file transfers…