where are the 1's and 0's?

All off topic discussions go here. Everything from the funny thing your cat did to your favorite tv shows. Non-programming computer questions are ok too.
User avatar
Solar
Member
Posts: 7615
Joined: Thu Nov 16, 2006 12:01 pm
Location: Germany
Contact:

Re: where are the 1's and 0's?

Post by Solar »

DavidCooper wrote:The physics part of it was replicating the idea of the killer-teleport: the state of one atom is transferred to another at a distance. The philosophical part of it is then going on to assert that the atom to which the state has been transferred has become the original.
No, that was NOT what we were talking about.

We did not talk about whether the copy "becomes" the original. We were talking about how the copy would be indistinguishable from the original.

Sorry, but if you are that fuzzy around the edges, this discussion indeed is worthless.
Every good solution is obvious once you've found it.
User avatar
DavidCooper
Member
Posts: 1150
Joined: Wed Oct 27, 2010 4:53 pm
Location: Scotland

Re: where are the 1's and 0's?

Post by DavidCooper »

Solar wrote:Listen, we're really running in circles here.
Indeed - that's precisely why a divorce is in order.
You simply don't acknowledge anything as understood,
Do I have to say "affirmative" in reply to every statement I agree with?
you don't make your position clear,
I make my position very clear - you appear to skim read everything and misunderstand half of it.
you don't ask when you don't understand,
Provide an example.
you're only giving the same pseudo-philosophical / linguistic flak to every other statement people make over and over.
If you can't understand a simple point, there are no other options.
This discussion is going nowhere, and I'm tiring of it.
Which is why I said I wanted out. You can't push people beyond their ability to understand.
The idea of the thought experiment was a perfect copy being made.
A copy is a copy - it doesn't become the original by becoming perfectly identical. Look up a dictionary and see what "copy" means!
Please elaborate how a perfect copy of me would be any different from me (i.e., not perfect).
Every electron in the universe is so far as we can tell identical. Does that mean there is only one electron in the universe?
Your neuro-chemical complexities can't feel pain.
Please prove it. The experimental evidence from the undergraduate Biology studies I attended is against you.
A neuro-chemical complexity is a kind of complexity. Complexities can't feel pain. Here's a complexity of wires. If I hit it, the wires don't feel pain, but the complexity does? I don't think so, somehow! Now repeat the experiment with any other kind of complexity. This isn't a language game - it's applied logical thinking.
Any part of an organism can be "triggered" in a way that is equivalent to what happens when the organism as a whole "experiences pain". You can stimulate a pain receptor. You can make a nerve trigger without a pain receptor attached. You can trigger the pain center in the brain without a nervous system attached. You can make an adrenal gland produce adrenalin without a brain or blood stream attached. Every single part of the organism can be observed and studied in isolation, and the results are deterministic (to a point). The only thing modern Biology and Psychology lacks is a detailed understanding of how it works all together, because the involved networks, feedback loops etc. are of enormous complexity. But it is understood what "feeling pain" is for each part of the organism.
It is possible that pain is felt at each step, but it is not reported by the organism as pain unless it's in the brain. A lightbulb might be in pain while it's lit, but it has no way of reporting that. If you stand on a drawing pin, your foot has no way of reporting pain so there is no way of knowing if it is in pain or not, but it does send a damage-detected signal to the brain, and it's only in the brain that pain is claimed to be experienced. A person whose leg has been amputated can feel pain that appears to be in a foot that no longer exists, but that's because the pain that feels as if it is in the foot is actually experienced in the brain. I don't know how much more detail I need to go to without it sounding like an insult to your intelligence - I expect you to be able to tie all this together yourself and to get the point. When you put a current through a dead frog's leg and it twitches, pain may be felt by it, but even when it was part of a live frog there was no way for the frog to know if its leg was in pain - from the point of view of a live frog it could feel pain that felt as if it was in its leg, but it would actually be experiencing that pain in its brain.
And the part of you that "feels" pain and "thinks" about it is the grey matter in your skull, and the "feeling" and "thinking" is a complex exchange of neuron pulses and releases of messenger chemicals. No more, no less.
If it's that simple, you can easily make a conscious computer. You could even make a conscious liquid that enables pain to be experienced in a test tube when certain chemicals are dropped into it. But the key question remains, and you can't see it. What feels the pain? Are children in chemistry labs in schools all round the world accidentally torturing liquids in test tubes?
(Somewhat like weather prediction. The mechanics are well understood, but we lack the processing power to fully harness the complexities involved.)
Stop trying to hide pain in complexities and try to isolate something that could actually be capable of experiencing pain. If none of the components of your complex system feel pain, what makes you imagine that pain can be experienced at all?
I've shown you where the problem is, but you're determined to ignore it and pretend it isn't there.
No, you failed to show where the problem is, and I'm trying to nail it down.
It's difficult to make it any clearer than that! If none of the components feels pain, there is no s***ing pain! The "pain" becomes a fiction: a fake phenomenon, and it renders us as nothing more than machines which produce data which documents an entirely fake phenomenon.
Do you understand the concept of hypothesis, antithesis, synthesis? It's a process of explaining, asking, questioning, understanding.
It only works if you put in the effort to think logically.
That thought experiment of yours with the copy machine was a good first step, but you never fully formed your hypothesis, and when I formulated my antithesis to what I understood your hypothesis to be, you didn't elaborate on your hypothesis or argue the finer points, but started to weasel your way around the topic.
I set things out in sufficient detail for a person of normal intelligence to be able to get the point, assuming they weren't half asleep. Where's the weaselling? The problem seems to be that you don't understand the concept of "copying".
I still do not understand what you are actually trying to say.
Because you don't read anything carefully.
You repeatedly refuted the idea that a human is basically a neuro-chemical machine;
If you'd actually read the whole thread properly you'd have noticed that I was accused of being an extreme reductionist earlier when I set out the case for us being nothing more than machines.
you claim that consciousness is something beyond that.
I claim that if consciousness is real it must be something beyond current science.
At the same time you refute the suggestion that what you're implying is a metaphysical concept of "soul" or "consciousness" beyond physics, chemistry, and biology.
No - I made it quite clear that it was equivalent to a soul, though a minimal one with no magical baggage involving memory. I suggested that conscious feeling might be a property of matter (or energy, given that they are the same stuff), changing perhaps with quantum state.
You talk about perfect copies, yet still insist that the copy, however perfect, is inferior and may be killed with no moral dilemma.
Can you point to any place where I said a copy was inferior? You may think you can, but you'll be wrong in every case - you seem to read all sorts of things that aren't there into what I say. What's this rubbish about me thinking copies can be killed with no moral dilemma? You need to improve your interpretation skills. The context is always important. If I say "a dog is a bird" and "a bird is a fish", then you can correctly state that if those two sentences are true, it is also true that a dog is a fish, regardless of the fact that a dog is not a fish. If I set out a thought experiment in which I kill the copy of someone, that's just part of the rules of the thought experiment - the morality of me killing someone within the thought experiment is completely irrelevant and doesn't negate the point of the thought experiment.

Also, when I talked about killing copies of people rather than teleporting them back to the space ship, that was within the context of people gaily jumping into teleports at the drop of a hat and killing themselves to be replaced with copies - I was pointing out that in such an insane society, life is so worthless that it would make more sense to keep the original and destroy the copy at the end of the mission rather than destroying the original when the copy is made. It's a pragmatic way of guarding against losing a useful member of the crew where the copy may be killed during a mission and the original no longer exists. I didn't state that last bit before, but there are some things you really ought to be able to think out for yourself, particularly given your expertise in exploring all these science fiction dilemmas.
You aren't even consistent with yourself, and that makes the discussion somewhat... pointless (pun intended).
There are many deliberate inconsistencies involving the use of the word "I" which no one appears to have picked up. I don't know which inconsistencies you imagine you've found: anything that's important to the discussion will be something where you've misunderstood my position due to your sloppy reading of this thread.
Help the people of Laos by liking - https://www.facebook.com/TheSBInitiative/?ref=py_c

MSB-OS: http://www.magicschoolbook.com/computing/os-project - direct machine code programming

Re: where are the 1's and 0's?

Post by DavidCooper »

Solar wrote:
DavidCooper wrote:The physics part of it was replicating the idea of the killer-teleport: the state of one atom is transferred to another at a distance. The philosophical part of it is then going on to assert that the atom to which the state has been transferred has become the original.
No, that was NOT what we were talking about.

We did not talk about whether the copy "becomes" the original. We were talking about how the copy would be indistinguishable from the original.

Sorry, but if you are that fuzzy around the edges, this discussion indeed is worthless.
You're the one making it worthless. If you mix up the copy and the original in such a way that no one knows which is which, then no one knows which is which, but one of them will still be the original and the other will be a copy.

Re: where are the 1's and 0's?

Post by DavidCooper »

bonch wrote:I have to disagree with you DavidCooper, it seems to me rather obvious that a copied twin would have his own consciousness and feel his own pain.
How do you imagine that you're disagreeing with me when that was precisely my point!
I do know this does not really address your point (which I'm finding hard to pin down) :p
Place Bonch A in machine. Press button. Remove Bonch A from the machine and remove Bonch B, the copy, from the other side of the machine. Is Bonch A the same person as Bonch B? They clearly have a lot in common as their memories are almost completely identical, but they are not the same person. If I stick a pin in Bonch B, Bonch A does not feel it. If I stick a pin in Bonch A, Bonch B doesn't feel it. If Solar shoots Bonch B dead with a ray gun (how could you be so nasty, Solar! I'm going to take you to task for this later!), then has Bonch A's life come to an end? No. But if Solar had killed Bonch A instead, then Bonch A's life would be over - he would not live on as Bonch B. That last phrase is the critical one - the point which you're finding hard to pin down. Solar seems to think that he lives on as his copy if the original version of him is killed by a teleport.
I think if the idea of consciousness could be appropriated under "known science" there would be no reason for debate, it would just be a fact. Consciousness = xyz and is made because zyx.
Indeed, and we would then have to ask if a liquid in a test tube can feel pain.
You could say what you're saying about every open problem in science - "the answer isn't clear yet, so it's beyond science!". You could also go back 1000 years and use the same logic to defend the proposition that the earth is flat. There are a lot of open questions. But you would be a brave man to declare them beyond the purview of scientific inquiry.
I haven't suggested for a moment that consciousness, if it isn't just a fake phenomenon, is beyond scientific inquiry - if pain is felt by something, it is absolutely the job of science to work out the mechanism behind that.

Re: where are the 1's and 0's?

Post by DavidCooper »

SDS wrote:
DavidCooper wrote:So, two people normally have different consciousnesses, but if two of them happen to be identical in every aspect they suddenly have the same consciousness, a shared one, and yet it behaves as if it is not shared at all (stick pin in one, the other feels nothing). You have two people who are copies of each other, and each one has its own independent consciousness.
Now you are just playing linguistic games. I'm not sure if you are being intentionally perverse, but anyway, you are conflating two meanings of the word same. Two perfect copies have the same consciousness, as in they are identical, rather than that they are 'shared' in some metaphysical manner.
I haven't played a single linguistic game here. My point is, and has been from the start, that the copy is not the original and the original is not the copy - they are identical but not the same individual.

(I hope that no one's going to misinterpret "You have two people who are copies of each other" next - one of the two is not in fact a copy at all, but the original from which the copy was copied, but referring to the original and the copy collectively as "copies" is a standard usage of the word, even though it is not fully logical - natural languages are highly defective in this regard.)
DavidCooper wrote:
No, not similar to the concept of the soul...
I'm talking about something utterly minimal - merely sufficient to do such things as feel pain. If you rule that out, there can be no such thing as pain and we are just machines.
Unless you state what you mean by 'something utterly minimal', then you are being incoherent here. Either a metaphysical soul/consciousness exists, or it does not. Within a scientific explanation, you have to assume the latter unless given very good evidence of the former - beyond simply saying that you don't like a partially-incomplete explanation based on simpler principles.
Something utterly minimal, as in something that can actually feel real pain and other qualia - that is clear enough from the bit you've quoted, so I don't know what your problem is. If science cannot pin down anything capable of feeling real pain (as opposed to just reading a value in a register which represents the idea of pain and claiming that that is pain), then pain is either not real or beyond known science.
DavidCooper wrote:It seems that science fiction is more dangerous than I'd ever imagined - it turns out that it can mess with people's minds to the point where they don't care if they suddenly cease to exist so long as they're replaced by someone identical, somehow imagining that they will continue to exist as the copy.
Hardly. Both the 'original' and the 'clone' would not want to cease to exist. If I knew which of those two I would be, then I would choose to stay alive. However, I would be unable to express a preference between two identical "me's" without further information. That hardly implies that I wouldn't mind ceasing to exist.
You're ignoring the context. If people gaily walk into teleports to be destroyed while a remote copy is made using different material, they (the originals) are being genuinely killed - that tells me that either they're stupid or they simply don't mind ceasing to exist.
Your neuro-chemical complexities can't feel pain. If pain isn't real, there's no real you or me in the machine.
This is a statement which needs a lot of substantiation. If it is possible for the neuro-chemical processes to support thought, and therefore consciousness, it is possible for pain to be felt. It is not necessarily felt by the neuro-chemical system, but by the thoughts that are supported by it.
Pain felt by thoughts? Do you know what a thought is?
Consciousness isn't my field, and I don't get the impression that it is Solar's. I don't think either of us would claim to fully understand consciousness - and I don't think any scientist would.
Solar appears to have claimed that consciousness is entirely within science, with lots of gaps of course, but that the gaps are just complexities which have yet to be untangled. (You appear to believe the same thing.) I have pointed out that this is not the case because science simply cannot account for real pain being felt by anything.
However, we are able to sufficiently nail the domain of the problem down to the level where we can see that it is comprehensible.
If you think that, then it's clearly all gone right over your head.
We understand enough of the neuro-chemical machinery of the brain to see that it is able to support thought, as well as just deterministic stimuli-response patterns, and by extension consciousness.
That extension is pure delusion - it's the point where you make a serious error.
Even if the full details of these processes (and associated complexity) are not that well understood.
If you really understood it to that point, your conclusion would be that pain is not real and that we are just machines which generate data to document a fake phenomenon. I set out that possibility right at the start. You have not got anywhere close to being able to demonstrate that pain is an illusion, or how if it is not an illusion it is in any way different from a computer flashing up the word "ouch" while merely pretending to be in pain.
A neuro-chemical understanding of thought and consciousness is consistent with modern science, even if not complete. The additional *spark* of consciousness, especially with the possibilities of 'shared' consciousness as you talk, are not consistent with modern science. The conclusion of this is not to throw out modern science.
Which means you are obliged to conclude that we are just machines, but you keep trying to have it both ways by having consciousness emerge out of complexity so that magic can step in to feel the pain.
SDS
Member
Posts: 64
Joined: Fri Oct 23, 2009 8:45 am
Location: Cambridge, UK

Re: where are the 1's and 0's?

Post by SDS »

DavidCooper wrote:I make my position very clear - you appear to skim read everything and misunderstand half of it.
DavidCooper wrote:If you can't understand a simple point, there are no other options.
DavidCooper wrote:It only works if you put in the effort to think logically.
DavidCooper wrote:I set things out in sufficient detail for a person of normal intelligence to be able to get the point, assuming they weren't half asleep.
DavidCooper wrote:Because you don't read anything carefully.
DavidCooper wrote:anything that's important to the discussion will be something where you've misunderstood my position due to your sloppy reading of this thread.
I can accept that your point of view may appear to be very clear in your head. However, you are not the judge of whether you are communicating effectively - that is in the eye of the beholder.

From my perspective you come across as significantly aggressive, not particularly persuasive and somewhat incoherent. It is not clear what point you are trying to make - indeed at various points you appear to have been trying to push opposing points of view - in particular of being very reductionist, saying that was all that was needed, then saying that some spark was needed, then saying that this was outside of science. I'm really not sure what to make of it.

I fear that you have defined yourself into a position where science is unable to help. You have decided that the physical structures that exist are all that can be made of the brain and associated machinery, and as these structures cannot feel pain then some non-physical *spark* is required to make consciousness work. This by definition cannot be physical, and so cannot be explained by the physical sciences.

I am afraid that the assertion that a *spark* is required requires substantive evidence, and does not follow from your perception that the brain could not feel pain by your understanding of what the physical sciences are. If the brain is able to provide the machinery to support thoughts, then the brain is able to provide the machinery to support the thought 'I am in pain'. This is what the brain does. The precise mechanism is not entirely clear, but a significant amount is known. It certainly cannot be said that modern science rules out the brain functioning in this manner.

Re: where are the 1's and 0's?

Post by Solar »

DavidCooper wrote:
The idea of the thought experiment was a perfect copy being made.
A copy is a copy - it doesn't become the original by becoming perfectly identical. Look up a dictionary and see what "copy" means!
I never challenged the copy being a copy. All I say is that a perfect copy is identical to the original, so that it being a copy makes no difference for all practical purposes (including the decision about which one I'd like to continue living as). Turn your back for a second, and then the only difference it ever made - your memory of which one is the copy and which one the original - is gone.

Actually, I start to believe this is the language barrier at work. Could it be you're getting wound up on the word "identical"? Would it help you if I wrote "100% similar"? If that is the case, you have been insulting people over linguistics, when a bit of Socratic method would have served you much better than the insults you've been spewing. I certainly don't feel like digging out my Biology diplomas and/or IQ tests to win a battle of one-upmanship with you.

Re: where are the 1's and 0's?

Post by DavidCooper »

SDS wrote:I can accept that your point of view may appear to be very clear in your head. However, you are not the judge of whether you are communicating effectively - that is in the eye of the beholder.
It's perfectly clear in this thread if you read it with proper attention to detail.
From my perspective you come across as significantly aggressive,
You get back what you give out. I have not been rude here to anyone in advance of them being rude to me.
not particularly persuasive and somewhat incoherent.
I am not to blame for other people's inability to follow a clearly stated and fully rational argument.
It is not clear what point you are trying to make - indeed at various points you appear to have been trying to push opposing points of view - in particular of being very reductionist, saying that was all that was needed, then saying that some spark was needed, then saying that this was outside of science. I'm really not sure what to make of it.
I have made it abundantly clear what point I'm trying to make. There are two possible realities which are at the core of this discussion and I am open to either one of them being the actual reality. One of those is that we are essentially no different from machines, but that our brains generate data which claims the existence of a fake phenomenon which we call consciousness, and the other that consciousness is a real phenomenon in which qualia such as pain are absolutely real. I discuss both sides, and if you notice the word "if" appearing here and there in front of a claim, it's usually because I'm framing a statement within the context of one of those two models being correct and the other wrong. I am not pushing either model as being the correct one, but rather pointing to the place where the models divide and attempting to stop you trying to have it both ways. You cannot have real pain in one model, and yet you want to have it in the model where it does not fit. You can only have pain in that model by injecting it through a piece of magic which you then hide under the rug of "complexity".
I fear that you have defined yourself into a position where science is unable to help. You have decided that the physical structures that exist are all that can be made of the brain and associated machinery, and as these structures cannot feel pain then some non-physical *spark* is required to make consciousness work. This by definition cannot be physical, and so cannot be explained by the physical sciences.
I have pointed out that science as it stands cannot account for genuine pain - this is central to the entire consciousness problem and is precisely why it is a problem for science. A piece of paper with "pain" written on it simply isn't pain. A 255 value in a register isn't pain.
I am afraid that the assertion that a *spark* is required requires substantive evidence, and does not follow from your perception that the brain could not feel pain by your understanding of what the physical sciences are. If the brain is able to provide the machinery to support thoughts, then the brain is able to provide the machinery to support the thought 'I am in pain'. This is what the brain does. The precise mechanism is not entirely clear, but a significant amount is known. It certainly cannot be said that modern science rules out the brain functioning in this manner.
"Spark" is your term. All I have said is that if pain is genuinely felt, something real has to be able to feel it, and that thing has the ability to experience qualia as one of its properties. If pain is not real, then nothing needs to be added to science to account for it. Note the "if" in that this time and see if you can understand why I went to the trouble of framing the statement with an "if" in it. You can program a machine to have a thought in which it generates data which states that it feels pain, but that pain is 100% fictitious. It will be the same in your model for a human claiming to feel pain because you don't allow anything to feel pain other than abstract things of no substance such as patterns. Your position is a complete nonsense.

Re: where are the 1's and 0's?

Post by DavidCooper »

Solar wrote:I never challenged the copy being a copy. All I say is that a perfect copy is identical to the original, so that it being a copy makes no difference for all practical purposes (including the decision about which one I'd like to continue living as). Turn your back for a second, and then the only difference it ever made - your memory of which one is the copy and which one the original - is gone.
There's a big difference. Here's how I'd decide it. You tell me that you're going to make a copy of me and that I won't know when I come out of the machine whether I'm the original or the copy. You ask me now to decide which one of the two you are to kill (and don't get hung up on the immorality aspect - it isn't relevant to the thought experiment, though I hope you've already got that point). I tell you to kill the copy. Why? Well, I know that if you were to change the rules and tell the copy that he is the copy and give him the choice of which one to kill, he would choose himself - I know this because he would do the thing that is morally right, not wanting in effect to steal someone else's life.
Actually, I start to believe this is the language barrier at work. Could it be you're getting wound up on the word "identical"? Would it help you if I wrote "100% similar"? If that is the case, you have been insulting people over linguistics, when a bit of Socratic method would have served you much better than the insults you've been spewing. I certainly don't feel like digging out my Biology diplomas and/or IQ tests to win a battle of one-upmanship with you.
Changing the word "identical" into "100% similar" doesn't do it. The important point is that they are not the same person. When you kill the original, the original is dead. The copy is a new person who has been given the life history of the original without actually having lived it.

And if you fling insults at people (in the form of accusations of incompetence), don't complain when you get the same back.
You simply don't acknowledge anything as understood, you don't make your position clear, you don't ask when you don't understand, you're only giving the same pseudo-philosophical / linguistic flak to every other statement people make over and over. This discussion is going nowhere, and I'm tiring of it.

Re: where are the 1's and 0's?

Post by DavidCooper »

For those who want to examine the way I initially set out the ground for this argument, here is a copy of my original post on consciousness in this thread:-
Are humans really conscious? Take a pain response to something sharp as an example: you feel something sharp and it hurts, so you are triggered into trying to eliminate the cause of the pain. A machine could be programmed to pretend to do the same - a sensor detects damage being done and sends the value 255 (representing "mega-ouch") to the CPU by some means or other. The CPU runs a routine to handle this data with the result that another routine is run to deal with the problem, maybe just sending the word "OUCH!" to the screen, but nothing felt any actual pain at any point of the process. In the human version of this, something either feels pain or generates data that claims it felt pain, but most of us feel that the pain is real and not just an illusion. This is quite important, because if the pain is just an illusion, there can be no real harm done by torturing someone and that would mean there was no genuine role for morality: you can't torture a computer, and if all the unpleasant sensations of being tortured are just an illusion, you can't really torture a person either.

Let's assume for now that pain is real, because if consciousness is all an illusion the whole question becomes uninteresting (other than why data about a fake phenomenon should need to be generated by the brain). How can we know pain is real? Well, we just feel it. But there's a serious problem with this. Imagine a computer with a magic box in it where pain can be felt by something. The 255 input from the pain sensor is sent into the magic box where it is felt as pain, and then the magic box sends out a signal in the form of another value to say that it felt pain. The program running in the computer takes this output from the magic box and uses it to determine that the magic box felt pain, therefore something must have hurt, and then it sends the numbers 79 12 85 12 67 12 72 12 33 12 to the screen at B8000, and yet it didn't ever feel any pain itself - it just assumes that the pain is real purely on the basis that the magic box is supposed to output a certain value whenever it feels pain. The magic box itself does nothing other than feel pain when it receives a "pain" input and send out a "pain" output signal when it feels the pain - it isn't capable of expressing any actual knowledge of that pain to the outside. If you ask the machine if it genuinely felt pain, all it can do is tell you that the magic box sent out a pain signal, so it's fully possible that the magic box is just faking it, or it may even be feeling pleasure while reporting that it feels pain.

Are people any different from this magic box example? If I ask you if you felt pain when you sat on a drawing pin, you would tell me in no uncertain terms that you did, but the computations done in your head to understand my question and to formulate your answer are all done by mechanisms that aren't feeling the pain - they look up the data and find it recorded somewhere that you are feeling pain (or they read the output from the magic box again), and then they generate a reply to say "it hurts", but they can't know that it hurt - they just trust the data to be true.
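To make the "mega-ouch" machine described above concrete, here is a minimal C sketch of the scenario from the first two paragraphs (it is not code from the original post): a damage sensor delivers the value 255, a hypothetical magic_box routine reports that pain was "felt", and the program writes "OUCH!" as character/attribute pairs. A local array stands in for the text buffer at B8000, since writing to that address directly only works on bare metal, and all names are purely illustrative.

#include <stdio.h>
#include <stdint.h>

/* Hypothetical "magic box": takes the damage value from the sensor and
   reports whether pain was "felt". Nothing in here feels anything; it
   only maps an input number to an output number. */
static uint8_t magic_box(uint8_t damage)
{
    return damage == 255;          /* 1 means "I felt pain" */
}

int main(void)
{
    /* Stand-in for the text buffer at B8000: character/attribute pairs. */
    uint8_t video[10] = {0};
    uint8_t sensor = 255;          /* "mega-ouch" from the damage sensor */

    if (magic_box(sensor)) {       /* the program only trusts the box's output */
        const char *msg = "OUCH!";
        for (int i = 0; i < 5; i++) {
            video[2 * i]     = (uint8_t)msg[i];  /* character byte */
            video[2 * i + 1] = 12;               /* attribute byte */
        }
    }

    /* Prints 79 12 85 12 67 12 72 12 33 12 - the byte sequence from the post. */
    for (int i = 0; i < 10; i++)
        printf("%d ", video[i]);
    printf("\n");
    return 0;
}

The sketch makes the same point the post does: every part of the program, including the routine that reports the pain, only shuffles numbers around, so the program as a whole merely trusts the report without anything having felt anything.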
It is very clear in its wording and I have every reason to get angry when people with slapdash interpretation skills accuse me of not setting things out in precise language when I have gone to some trouble to do exactly that. Note in particular the words at the start of the second paragraph: "Let's assume for now". This sets a frame for discussing one case in which pain is real. This is how arguments are supposed to be put across: clear statements setting out a position (or two rival positions), setting a frame for each so that you always know which model is under discussion.

Re: where are the 1's and 0's?

Post by SDS »

DavidCooper wrote:I am not to blame for other people's inability to follow a clearly stated and fully rational argument.
*sigh*
DavidCooper wrote:...There are two possible realities which are at the core of this discussion and I am open to either one of them being the actual reality. One of those is that we are essentially no different from machines, but that our brains generate data which claims the existence of a fake phenomenon which we call consciousness, and the other that consciousness is a real phenomenon in which qualia such as pain are absolutely real...
You present a false dichotomy. Consciousness is able to both be real, and mechanistic.

I suggest some reading:
The Neural Basis of Consciousness, edited by Naoyuki Osaka

Consciousness Explained by Daniel Dennett if you like reductionist, very physicalist approaches (summarised here: http://en.wikipedia.org/wiki/Multiple_Drafts_Model)

For discussion of how consciousness can be considered an emergent property:
Libet B 1993 Neurophysiology of Consciousness: Selected Papers and New Essays. Birkhauser, Boston

It is entirely plausible to say, however, that consciousness is only directly self-perceived. From outside, only the effects of consciousness are observed. As such consciousness could be said to be 'simulated'. You might argue that this makes it fake. I would not.
DavidCooper wrote:I have pointed out that science as it stands cannot account for genuine pain - this is central to the entire consciousness problem and is precisely why it is a problem for science. A piece of paper with "pain" written on it simply isn't pain. A 255 value in a register isn't pain.
Scientists working in the area would disagree with your assertion. As such you require some quite substantive evidence to back it up.
DavidCooper wrote:Well, I know that if you were to change the rules and tell the copy that he is the copy and give him the choice of which one to kill, he would choose himself - I know this because he would do the thing that is morally right, not wanting in effect to steal someone else's life.
However, in doing so you would make the copies indistinguishable. Note that (in)distinguishability in this context is a technical term. You could obtain the same effect by telling the original that he was the copy. You have also played on the morality of the copy, whilst saying that morality is not relevant. In a thought experiment about consciousness, you cannot necessarily assume that the people being copied would behave nicely - I certainly would fight tooth and nail against being killed whether I were the copy or the original.
DavidCooper wrote:Pain felt by thoughts? Do you know what a thought is?
DavidCooper wrote:
SDS wrote:We understand enough of the neuro-chemical machinery of the brain to see that it is able to support thought, as well as just deterministic stimuli-response patterns, and by extension consciousness.
That extension is pure delusion - it's the point where you make a serious error.
I think that you are out of your depth here. I refer you to "Jackendoff R 1987 Consciousness and the Computational Mind. MIT Press, Cambridge, Massachusetts", who discusses the inseparability of consciousness and thoughts in a significant amount of detail.

Re: where are the 1's and 0's?

Post by SDS »

DavidCooper wrote:It is very clear in its wording and I have every reason to get angry when people with slapdash interpretation skills accuse me of not setting things out in precise language when I have gone to some trouble to do exactly that. Note in particular the words at the start of the second paragraph: "Let's assume for now". This sets a frame for discussing one case in which pain is real. This is how arguments are supposed to be put across: clear statements setting out a position (or two rival positions), setting a frame for each so that you always know which model is under discussion.
It is fairly clear in its wording. If you read my first response to it I did two things:
  • Noted that your initial characterisation of what it was to be conscious was incomplete (not wrong).
  • Noted that the phrase (in your next post) "How can anything be more than the sum of its parts (and the geometrical arrangement of those parts)?" was overly simplistic
Neither of those is confrontational, and my post left a great deal of room for continued discussion. Which ensued. I have however been confused as to how you have become less clear, and more confrontational as time has passed. It has appeared at times that you have been willing to fight against anything which has been said, irrespective of whether it was in the same direction, or opposed, to your earlier statements.

Re: where are the 1's and 0's?

Post by SDS »

I just want to add, additionally;

I am only involved in debates such as these because I genuinely enjoy talking about and debating science, scientific ideas and a bit of philosophy. I enjoy this even if (possibly particularly if) the discussion gets somewhat heated and robust.

If you are no longer enjoying said discussion, then I am quite happy to let it die quietly. There is no point if it isn't fun, right :twisted:.
User avatar
Brynet-Inc
Member
Posts: 2426
Joined: Tue Oct 17, 2006 9:29 pm
Libera.chat IRC: brynet
Location: Canada
Contact:

Re: where are the 1's and 0's?

Post by Brynet-Inc »

The human brain reacts to various stimuli. Pain is a mechanism for self-preservation; it is neither mystical nor unexplainable.
Twitter: @canadianbryan. Award by smcerm, I stole it. Original was larger.

Re: where are the 1's and 0's?

Post by DavidCooper »

SDS wrote:You present a false dichotomy. Consciousness is able to both be real, and mechanistic.
Only when you stick magic into it to feel the pain!
I suggest some reading:
If the books you list tackle the issue, why not state how they deal with it directly? The answer is that they don't.
Consciousness Explained by Daniel Dennett if you like reductionalist, very physicalist approaches (summarised here: http://en.wikipedia.org/wiki/Multiple_Drafts_Model)
Infamous for failing to live up to its title.
For discussion of how consciousness can be considered an emergent property:
Libet B 1993 Neurophysiology of Consciousness: Selected Papers and New Essays. Birkhauser, Boston
Belief in magic.
It is entirely plausible to say, however, that consciousness is only directly self-perceived. From outside, only the effects of consciousness are observed. As such consciousness could be said to be 'simulated'. You might argue that this makes it fake. I would not.
What is the mechanism by which knowledge of qualia can be turned into data that can speak of them?
DavidCooper wrote:I have pointed out that science as it stands cannot account for genuine pain - this is central to the entire consciousness problem and is precisely why it is a problem for science. A piece of paper with "pain" written on it simply isn't pain. A 255 value in a register isn't pain.
Scientists working in the area would disagree with your assertion. As such you require some quite substantive evidence to back it up.
Their failure to answer the core questions is all that's required. They have nothing there capable of feeling pain. They delude themselves into thinking they understand something which they manifestly do not.
DavidCooper wrote:Well, I know that if you were to change the rules and tell the copy that he is the copy and give him the choice of which one to kill, he would choose himself - I know this because he would do the thing that is morally right, not wanting in effect to steal someone else's life.
However, in doing so you would make the copies indistinguishable. Note that (in)distinguishability in this context is a technical term. You could obtain the same effect by telling the original that he was the copy. You have also played on the morality of the copy, whilst saying that morality is not relevant.
Wake up, SDS! This is morality being applied within the thought experiment - totally different from objecting to the thought experiment on the basis that someone is allowed to do something immoral within it!
In a thought experiment about consciousness, you cannot necessarily assume that the people being copied would behave nicely - I certainly would fight tooth and nail against being killed whether I were the copy or the original.
I know that if I was in that position and discovered that I was the copy, I would give way to the original. You can play the game your own way, in which case you'd better choose in advance for the copy to be killed and hope the rules don't change half way through.
DavidCooper wrote:
SDS wrote:We understand enough of the neuro-chemical machinery of the brain to see that it is able to support thought, as well as just deterministic stimuli-response patterns, and by extension consciousness.
That extension is pure delusion - it's the point where you make a serious error.
I think that you are out of your depth here. I refer you to "Jackendoff R 1987 Consciousness and the Computational Mind. MIT Press, Cambridge, Massachusetts", who discusses the inseperability of consciousness and thoughts in a significant amount of detail.
Somehow I don't think I'm out of my depth at all. I know how to build systems that think thoughts, but I can't find any way to inject consciousness into them other than by faking it.