Single Point of Failure: The (Fictional) Day Google Forgot To Check Passwords - https://www.youtube.com/watch?v=y4GB_NDU43Q - is on a different topic, but still worth watching.
As someone who had to wait five years for a baby, and who only just barely got one thanks to IVF: someone please build an artificial womb facility. It would bring nothing but joy.
That video is fantastic. I hope it comes.
Oh jeez, I see why people were upset. The video makes a stupid moral argument about genetic engineering. Maybe that will be an issue someday, but the tech isn’t there; we just want babies.
Reminded me of the Amazon show called Upload. Sort of a lighter, more humorous take than Black Mirror, but the darker philosophical complexities still surface regularly.
How do the machines know what Tasty Wheat tasted like? Maybe they got it wrong. Maybe what I think Tasty Wheat tasted like actually tasted like oatmeal, or tuna fish. That makes you wonder about a lot of things. You take chicken, for example: maybe they couldn't figure out what to make chicken taste like, which is why chicken tastes like everything.
This reminds me of “Upload”, a comedy series on Amazon Prime, where your digital consciousness is uploaded to a commercial cloud when you die, to live on in a digital utopia.
A utopia with IAPs, low-data cap slums, etc.
Edit: in fact, “Upload” almost seems like it came directly from this short.
I really struggled to get into this, even though several people have recommended it. How far do you think I should read before deciding it’s not for me?
I just quickly paged through it to try to answer this. Peak engagement for me, when I couldn't wait to find out what happened next, was around 60% of the way in, but I was fully engaged from the beginning; I read it over two days.
I'd second qup's comment: there are too many good books in the world to trudge your way through one that doesn't click :)
I think that’s fair! I am currently in a phase where I’m struggling to find something to pull my interest. I often reevaluate things I’ve put down before if the right commentary pulls me back in. I might give it another shot to see if I can hit that mark. Thanks for the insight!
It's an interesting idea, but is it scary? Any such "consciousness" existing in a digital realm is a copy, so the real, organic you would never experience these things.
In order for me to still feel like "me" afterward, there would have to be a linking stage during which I could get used to this digital extension of my mind, like an artificial limb for me to get used to as part of my body. Then, as the biological mind withered and died, like pouring a liquid from one container to another, the digital brain would come to represent a larger and larger part of "me", finally containing all of my consciousness.
You could compare the process to how the cells of your body (in particular your brain) eventually all get replaced, yet the stream of consciousness flows uninterrupted through a number of these replacements.
The ethical conundrum is that at some point, and afterwards, there will be "two" of you - basically, a digital fork of your physical consciousness.
Now, the physical consciousness may wish to live a separate life instead of fully "passing over" or ceasing to exist eventually (i.e. to die). But who will be considered "you" from that point? The physical "you" or the virtual "you"? Can the virtual "you" force the physical "you" to die?
I've got that one in a pretty-good short story collection called Beyond Flesh. Decent book to track down for anyone who wants a bunch of stories about that broad category of stuff.
Your continuum is interrupted all the time, e.g. by sleep or loss of consciousness. Is that subjectively different from death? A pretty scary thought. Will waking up as a different being feel no different?
Continuum was probably a bad word choice. What I meant is that if you visualize the brain as a 3D object travelling through time, there would be a 'bridge' over time from the biological brain to the digital brain. This bridge would have to encompass everything related to consciousness, and it would somehow transfer the conscious experience between the two over time. So the biological and digital brains would be causally linked.
I'm not a physicist, as you can guess, just a layman pondering.
> Will waking up as a different being feel no different?
I expect it would. Much of my experience has to do with the "hardware" (so to speak) of my body. It's difficult to imagine what it would be like if my brain were mechanical (physics-based) instead of biological (chemistry-based), for example. Abstract thought seems like the only thing that even could be the same.
Even if it's not "me" (whatever that means), I do fear for whatever or whoever that consciousness is. It's the same fear I have for the next generation: I don't want to leave a horrible world for them.
PS: My god, I have always known writing about this stuff is hard because language is inadequate for recursive thoughts, but I never thought I would see the day when it would practically affect my daily discourse. These were discussions I usually had under the influence of specific substances.
Right now you are already forced to pick 'no'. It's a button that lets things go the way so many people claim is necessary and proper. They can own up to deliberately choosing our current reality if they think it is so much better.
In a world where immortality is a given, being able to off yourself is one of the most important rights an individual should have.
Is it a no? Religions say otherwise. The problem is, we don't know. If we could upload into a computer but couldn't communicate with those uploaded, we wouldn't know anything about their continued existence either.
There’s a lot of philosophical discussion about what would happen with uploading consciousness, grappling with the question of “do you get a ‘transfer’ where your consciousness wakes up in the digital realm” and so forth.
I’ve always thought the obvious result is that both the physical and the digital consciousness would wake up. Physical you would have some convincing personal experience of “I did not experience a transfer”, meanwhile digital you would have some equally convincing personal experience of “I did experience a transfer”.
This is, to me, the most foundational and profound realization that a person can have. Entire religions and social structures have been invented to avoid thinking about it.
'You' is indeed a pretty complex concept. It's one of those things that works as long as you don't squint too hard at it, but if you do start to think through some of the cracks in the idea you quickly reach a point where the only reasonable response is existential horror.
Would your digital self, though? It's a simulation, so maybe digital you wouldn't be conscious of perceiving, dreaming, imagining, etc. If you take the philosophical position that computation is an abstraction from individual first-person experiences, then your digital self wouldn't be conscious, since the functional leaves out the experiential. Your digital self would be a p-zombie.
Actually, you die every night: I just take the copy of your mind I made right after you fell asleep and load it onto a copy of your body. Your old, dead body gets incinerated. You've been reborn every day and have never noticed the difference.
Or, how much of your body is required in order for you to feel like you? Plenty of people live without any limbs, and presumably they still feel like a "you". But what if you were just a head? Or just a brain?
They feel like a "you", but many of those who go through amputation surgery do indeed experience "phantom limbs" and feel a deep sense of loss for long periods after the surgery.
In contrast, animals don't seem to go through the same "mourning" process (in a way that we would recognize).
So the sense of "you" persists, yes, but it's a foundational change to the sense of "self" and it's definitely affected by amputation.
Suppose you do it ship-of-theseus style, one neuron at a time (or heck, one atom at a time if you want). In that case I'd find it hard to argue for your position.
I haven't watched the video yet, so take this with a grain of salt.
dd for a human, amirite? Maybe one atom at a time, but you'd have to make sure you get all the bacteria that help control your personality as well, right?[1] Then you need to simulate the changes in that system based on how you interact with the rest of a virtual world, including relationships with other people[2], so you'd have to have an accurate copy of that too. What do you do if your virtual self needs to interact with a still-living, breathing human? How do you simulate the changes to your microbiome that occur there? How do you simulate what your copied human "eats" and how it processes that food?[3] Having a copy of all your atoms is a start, but you also need to simulate all of those processes accurately if you want it to be truly "you."
Then what you end up with is -- assuming we know everything that makes up a human being and how it behaves -- a perfect copy of you along with a perfect copy of your world.
You don't have to replace yourself perfectly in order for the proposal to make sense. All that matters is that you agree the two versions are connected by conscious experience. If you take steroids that massively change your personality and mood, you still perceive it as happening to you.
If you have an illness and lose a good chunk of your microbiome did you just die? No. You're still you, just a version of you slightly further along the timeline after some event that modified 'you'. Any copy of you that fails to simulate the mites on your left pinkie toenail is also 'you', just a version slightly further along the timeline after some event that modified 'you'.
There is no 'you' - just an unbroken chain of chemical interactions stretching back to the first self-replicating molecule. 'You' is a concept that the pattern replicating event predictor engine that evolved to live in your skull is desperately clinging to. It has no real physical basis.
Nah, you don't just die, but you are changed. If you fail to account for a person's microbiome (among all the other things we might not even know about yet), then you haven't completely copied who they are.
I probably should have just said that IMO we humans don't know the beginning of what consciousness is.
It would have an interesting impact on the capitalist system, which bases some of its incentives on leaving an inheritance to your offspring, which gets taxed. If your assets could be used in the afterlife, the dynamic changes. All your earnings could still be taxed, but how many people would leave any inheritance for descendants if they'll need it for their next consciousness? Would people value their alter ego having a good forever-life, or their offspring having a good real life? The societal changes in behavior would be interesting.
After seeing the current wave of AI, the "restricted mental activities" makes me think such a program would prohibit sexual fantasies and other NSFW thoughts.
The Artificial Intelligence That Deleted A Century https://www.youtube.com/watch?v=-JlxuQ7tPgQ
It is a great intersection of copyright maximalism and the dangers of making general purpose AIs without proper care in developing their goal function.