Solar wrote: Consciousness is a result of a certain neuro-chemical setup. You create the same setup, you have the same consciousness. It doesn't matter if you create that setup from matter directly, this energy, or that energy. That's the point.
So, two people normally have different consciousnesses, but if two of them happen to be identical in every aspect, they suddenly have the same consciousness, a shared one, and yet it behaves as if it is not shared at all (stick a pin in one, and the other feels nothing). You have two people who are copies of each other, and each one has its own independent consciousness.
Merely hinting at standard-issue biology studies. Not "dead people", but dead frogs, for example. How can you argue the finer points of a field whose basic research you are unaware of?
If they're brain dead, there's no way for them to be conscious of your attempted torture. You can apply electric currents and get them to twitch, but you can do the same with an electric motor.
No, not similar to the concept of the soul, that's the very point of my argument: That, from a scientific standpoint, there is no such thing as a "soul" or other "magic thing that is 'me'", beyond what persons perceive as such.
I'm talking about something utterly minimal - merely sufficient to do such things as feel pain. If you rule that out, there can be no such thing as pain and we are just machines.
Here's a question to help you focus your mind. I'm going to put you into one of these machines, then you'll come out of it not knowing whether you're the original or the copy, though I will know. I will then kill one of you, and you can choose which one I will kill in advance of the copy being made. Would you prefer me to kill the original or the copy?
It doesn't matter. One "me" will die, one "me" will live, either way.
It seems that science fiction is more dangerous than I'd ever imagined - it turns out that it can mess with people's minds to the point where they don't care if they suddenly cease to exist so long as they're replaced by someone identical, somehow imagining that they will continue to exist as the copy.
Edit: Actually, there is already an error in the way the question is asked: "I" can choose? Which of the two identical "I"s you are facing is being asked the question?
The beginning of this thought experiment was that the copy would be perfect. If it's perfect, both the original and the copy are a perfect "me". If the copy process is a loud flash and a bang, both "me's" will remember stepping inside the machine, experiencing the flash-and-bang, and stepping out of the other side.
There's no error in the question - it states very clearly that you're asked to choose before the copy is made. After the copy is made, the original and the copy don't know which is which, but I do.
Edit: Both would be certain they were the one "me" who entered the machine. Neither would "feel" like a copy, and as the experiment is about a perfect copy, there would be no justification to call one "original" and the other "copy", because both are humans who were, up until very recently, identical.
Of course there's a justification for calling one the original and the other a copy - one of them is the original and the other is a copy. You're inventing new physics where whenever you make a copy so good that it's identical to the original, the original is no longer the original.
The fact that you don't seem to have any objections to killing the copy simply "because it's a copy" raises some not-so-nice questions about your morality. What makes the copy any less human than the original, or the killing any less murder? The fact that you "made" it? That implies you would also feel justified in killing your own child if the mother agrees, because the two of you "made" it. Would you?
Do you understand how thought experiments work? Or is this a wind up?
You seem to be describing a machine with no room for consciousness in it.
Right, and wrong. When you are talking about "consciousness", you seem to have some "magical spark" in mind that is non-copyable by a mere physical copy. When I am talking about "consciousness", I understand it to be a result of neuro-chemical complexities, nothing more - but that makes "you" and "me" no less valuable (though, for the sake of our thought experiment, copyable).
Your neuro-chemical complexities can't feel pain. If pain isn't real, there's no real you or me in the machine.
Known physics is completely sufficient to explain consciousness, with just a touch of imagination applied where we lack the detail in observation and the complexity of understanding, and thus I refuse to accept anything "metaphysical" in the equation. Occam's razor.
There's a hell of a lot more magic involved in your model than there is in mine.
That "pain" is a neuro-chemical reaction to stimulus, in a very deterministic way (which is merely a bit too complex for us to understand fully at this point), does not make it any less "painful" or less real, nor does it make the application of pain to a being any less immoral, even if there is no "magical spark" in there.
You pretend that you understand something which science cannot explain - consciousness is a major problem and I'm not the one here claiming to understand it. I've shown you where the problem is, but you're determined to ignore it and pretend it isn't there. I don't think we're going to make any further progress here, so I'll declare myself out.
___________________________________________________________________________
If a mod decides the thread should be split, the following information may make the job easier: the 8th post on the second page (one by Bonch which ends by asking "Do you think computers could ever be conscious?") should be the first post of the new thread, but the last post on that page (one of mine) and the 2nd, 4th and 6th posts on the third page (one of mine and two of gerryg400's) would then need to be extracted and put back into the original thread. [That all assumes that 15 posts appear on each page.] After splitting the thread, the four posts to extract and move back to the original thread will have become posts 8, 10, 12 and 14 of the new thread.
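For anyone checking the post arithmetic above, here is a minimal sketch (my own illustration, not part of the thread) that reproduces the renumbering under the stated assumption of 15 posts per page. The function name `overall_index` is hypothetical, introduced just for this example.

```python
# Assumption from the post above: 15 posts per page, 1-based numbering.
POSTS_PER_PAGE = 15

def overall_index(page, post_on_page):
    """Overall post number of the Nth post on a given page."""
    return (page - 1) * POSTS_PER_PAGE + post_on_page

# First post of the new thread: the 8th post on the second page.
split_at = overall_index(2, 8)          # post #23 overall

# Posts to extract and move back to the original thread:
to_extract = [overall_index(2, 15),     # last post on page 2: #30
              overall_index(3, 2),      # #32
              overall_index(3, 4),      # #34
              overall_index(3, 6)]      # #36

# Their positions within the new thread after the split.
new_positions = [p - split_at + 1 for p in to_extract]
print(new_positions)  # [8, 10, 12, 14]
```

This matches the claim that the four extracted posts become posts 8, 10, 12 and 14 of the new thread.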
It's become more complex now: post 6 on page 7 belongs in the original thread too.