Would you upload your brain to a computer or android?

Nope. The likelihood is that someone would edit or alter your uploaded mind (beliefs, ideals, memories, etc.) from your original, and that's assuming it wouldn't just deteriorate from data loss or be corrupted (unintentionally altered) during the upload itself.
this. you want shit like Ghost in the Shell?
 
No, other types of immortals like werewolves and vampires are cooler than some manufactured robot that would have to simp to its corporate overlords to stay existing. There was a sci-fi YA series I liked as a kid that was about this scenario. It's pretty good and explores mature themes for its age group.
Skinned is a young adult sci-fi by Robin Wasserman. First published in 2008, the novel has since been renamed Frozen and is the first book in the Cold Awakening trilogy. After her organic body dies in a car accident, Lia's mind is scanned and downloaded into a mechanical body with no resemblance to the original. Grappling with the concepts of personal identity and what it means to be human, Lia must decide if she wants to return to her old life, or assimilate into her new one.
Her family paid for the brain upload because they are rich and she died in a car accident, but they come to resent her: her new body is generic-looking, not exactly like the one she was born with, and she cannot eat with them or do other normal things. They start to see her as an artificial entity that only reminds them of their grief, even though she still feels like the same person.
There are robots (called skinners by the public or mechs by themselves) who hack their own software, even though it's illegal because it's owned by a company, and add unofficial body mods. They jump off cliffs to feel any sort of adrenaline rush resembling real human feeling. Her human friend from high school forgets she is a robot who cannot die, jumps after her to "save her", gets paralyzed, and blames her for his stupidity lol.
 
There is no amount of money you could offer me that could convince me to do something like that.
It's basically a way to enslave you and torture you for eternity. You could be in a device that's powered but forgotten, trapped in a prison of your own mind for a very long time. It's the worst fate possible.
Like locked-in syndrome, except you don't even have the respite of knowing you'll die within 60 years in the worst-case scenario.

I know that the one in the chip wouldn't be "me", but I wouldn't inflict that on "me" in any case.
 
It depends who's in control of everything. If it's no worse than getting a surgery done at the hospital, I don't see a problem. But it could easily be controlled by people who deliberately fuck with your program, or be part of some evil scheme where you live in the bugs and eat the pod, since transferring everyone's brains to tiny nanocomputers inside bugs is efficient and environmental or whatever. God even knows; the possibilities are endless with this concept, and you better believe the bad guys will be using this tech.
 
Pretty strongly depends on the details of the setting. It's pretty much all about control.

I'd think about it if

1. The technology unambiguously supports our consciousness, rather than being some kind of imitation (whatever breakthroughs in philosophy or metaphysics are required to do that). I'm not convinced that continuity and squishy neurons are absolutely necessary for existence or consciousness (then again, as pointed out, we don't really know what it is yet). When we're deeply sedated, our cognition stops, then starts up again. We can turn ourselves on and off with chemistry, but don't have an existential crisis every time we wake up from that. We just can't copy ourselves or multithread or anything yet.

2. The technology is entirely under the end user's total control, from the transistors up. (If I have to run on anyone else's hardware, if there is even a hint of vendor lock-in or encroachment, hard no.) I'd do this only if the parts are as widely available and interchangeable as modern PC stuff or standard stepper motors. The transhuman user has to be the sovereign owner and absolute root controller of the technology involved.

3. Some kind of embodiment and control over physical circumstances (robots, androids, etc.) are available. You're not just a helpless brain in a box; that's far more vulnerability than you should place yourself under.

4. I'm getting old and my current body is wearing out. If the alternative is frailty and death anyway, then sure, let's try the robot thing.

5. Dying is always an option. (Again, control. The nightmare scenario is being trapped against your will in a hellish circumstance, which is something I'd think robot people would have to worry about.) Nature's cruelty pretty much ends in death, and that's its furthest extent. It's only mankind that has attempted to create hell.
 
If it destroys me, absolutely not.

If I were to somehow make a digital copy of myself, stored locally on my own hardware with no internet access, and use it to build an AI assistant... yeah, that could be cool.
 
The OP is basically describing the computer game Soma.

And my answer is no, fuck no, and double fuck no.

Some people have already given the more fear-based answers. For me though there's another consideration. My own greatest goal in life is to go back in time to when things were good, and I feel worse the farther I get from those halcyon days. I already get miserable when I think about how long ago the 1990s were, small comfort being that some places I used to know and can revisit on Google Maps still look mostly familiar.... I imagine it would fucking kill me if I woke up 100 years later and that one house in the country is now a sprawling metropolis, or that one Chuck E. Cheese (which used to be Showbiz Pizza) now no longer exists, or or or or.....

Fuck that. I'd rather be dead.
 
No, but my husband said he would. If we're talking about which near future sci-fi thing we're all doing, I'd prefer the one way trip to Mars, please and thank you.
 