Argency wrote: Ok, well, if you want to define consciousness that way then I'll agree to change my terminology to suit. But the most important point (the one which directly relates to the thread) still stands - consciousness is no more than a very complex computational process aimed at enhancing fitness. In this case, fitness is the ability to survive on planet earth. So to answer your question, yes, any system not aimed at enhancing fitness is by definition not conscious. Similarly, any system which is not computational cannot, by definition, be conscious.
Imagine a machine that possesses, to a great degree, all of the abilities on my list of consciousness requirements, to the point that you could converse with it in a very natural and convincingly human way. Yet it does not possess a survival drive: it was never given one, it has no predators, teams of technicians take care of its maintenance, and it cannot be shut off. It has no impetus to enhance its ability to survive because its survival is assured.
Is this system conscious? I would say so, as it has all the abilities I would associate with high-level consciousness. But if it is not "enhancing fitness for survival" then it is not conscious by your definition.
What would you say of this machine?
Argency wrote: So, sure, if you think that all conscious things must have the particular attributes that you listed, that's fine. As long as you remember that those criteria are arbitrary, and that a barely conscious organism and a barely unconscious organism (say, cleverbot and a walkman) are much more similar to one another than a barely conscious organism is to a massively conscious one (say, cleverbot and a human). In other words, as long as you remember that your definition is a fuzzy-edged label designed to make conversation easier, I'm happy to use it as such.
Sure, the qualities I listed are arbitrary. But they're no more arbitrary than your requirement of "fitness aimed computation". There's already a word for "fitness aimed computation": optimization. But like "fitness aimed computation", "optimization" doesn't mean anything without an optimization context. Since your fitness computation is aimed at survival, your definition might be better termed "survival optimization". I'm going to use "survival optimization" from now on to refer to your definition, to make things clearer. I hope that's ok with you.
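To make that terminological point concrete, here's a rough sketch (Python, with made-up names and numbers) of what I mean: the optimization machinery itself is generic, and it only becomes "survival optimization" once you plug a survival objective into it.

```python
# Illustrative sketch only: "optimization" is contentless until an objective is supplied.
# All names and numbers here are made up for the example.

from typing import Callable, Iterable, TypeVar

T = TypeVar("T")

def optimize(candidates: Iterable[T], objective: Callable[[T], float]) -> T:
    # Generic optimization: pick whichever candidate scores highest on the given objective.
    return max(candidates, key=objective)

def survival_fitness(behaviour: dict) -> float:
    # Hypothetical survival objective; only this choice makes it "survival optimization".
    return behaviour["expected_surviving_offspring"]

behaviours = [
    {"name": "flee predator", "expected_surviving_offspring": 2.1},
    {"name": "ignore predator", "expected_surviving_offspring": 0.3},
]
print(optimize(behaviours, survival_fitness)["name"])  # -> "flee predator"
```

Swap in any other objective and the exact same machinery optimizes for something else entirely, which is why the objective, not the computation, is doing the defining work.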
Argency wrote: But the reason I was using my definition (and yes, it is a much less discerning one, allowing for some pretty trivial cases to be technically termed consciousness) is because my definition is based on the causal mechanism of consciousness. It explains how consciousness arises. If you agree that a machine could be conscious, then you surely must agree on how consciousness comes about. And if you agree with that, you must also agree with me that what you call consciousness is just a very complex example of the sort of process that takes place in any computational device, and that there's nothing extra added in to create consciousness.
I think herein lies a major problem I have with your definition: it attempts to muddle mechanism with result. Surely when we talk about consciousness we care about systems that appear "conscious" regardless of the mechanism that produces the consciousness. I'll agree that survival optimization has probably been a major causal mechanism for consciousness as we know it, but I think it's absurdly premature to say that survival optimization is the only mechanism of consciousness, or that survival optimization equals consciousness. I also agree that computation alone is sufficient to carry out all the processes of consciousness. I can't agree with you about how consciousness comes about, because I have no idea how it comes about. I agree that survival optimization seems to be a potential avenue for consciousness creation; I'm just not convinced it's the only one, as you seem to be.
Argency wrote: That's the only real reason I was making the point, and if you were operating on a different terminology then I guess I'm happy to adopt yours for the purposes of the argument. Remember that my original post was replying to your argument against somebody who was presumably going by the same definition as me - I guess I assumed that you would too once the case had been made for it.
Sorry if I was unclear. When I say a "conscious" system I mean a system that is conscious according to my definition. I'm not sure what definition he was going by, as he never offered a detailed explanation of his views.
Argency wrote: You pointed out that I haven't offered any evidence for my definition - that's because definitions aren't proven, they're justified. Your definition is conversationally justified, mine's analytically justified.
I'm not asking you to prove your definition. That would be absurd. I'm asking for exactly what you said: justification for why you believe what you believe. What leads you to believe that consciousness can only exist in systems aimed at survival optimization?
Argency wrote: Both are useful in different ways. Surely you can see why it's useful to base your definition on the causes of the phenomenon you're investigating, from a scientific/engineering standpoint?
No, and I think that's a major weakness of this definition. Conflating mechanism and result makes your definition weaker, not stronger. Consciousness (by my definition, and probably by most interpretations) is an observed property of systems. Keeping the observed property separate from the mechanism is vital. This is especially true in science: lots of similar observations can have very different mechanisms, and vice versa. If you load the definition of "consciousness" with a particular mechanism, then you've needlessly restricted the observed phenomenon you're talking about.
Argency wrote: On the point of heaps of sand - I think the reason we disagree on that one is the same reason we disagree on this topic as a whole. My solution to the paradox is just to say that any grouping of sand grains is nominally a heap, even if very small groupings can more usefully be referred to by their individual grains.
The problem with this is that it is not a solution, because it leaves you in an incorrect state at the end. Not all groupings of sand grains are "nominally heaps", because they do not all satisfy the definition of a heap. There is an alternative formulation of the paradox which might make this more apparent:
Is 1 grain of sand a heap of sand? No, by the definition of a heap.
If we add one, are 2 grains of sand a heap? No.
If we add one, are 3 grains of sand a heap? No.
.... (ad infinitum)
Therefore no matter how many grains of sand we add, we will never have a heap. Therefore, heaps don't exist.
Your "solution" is functionally equivalent to agreeing that "heaps don't exist". But of course heaps exist so it is a non-solution. I understand what you're saying. But you're basically sacrificing correctness for convenience.
Argency wrote: Yours seems to be to define some arbitrary point at which a heap ceases to be a heap. You could presumably define a number of water molecules below which a droplet was no longer said to have a temperature. In all three cases, your definition makes conversation easier, and mine makes causal analysis easier.
The analogy to water molecules is completely bogus, because the predicates "is a heap" and "has a temperature" are fundamentally different. Temperature is an exact predicate with a definite answer for any number of water molecules, even if that answer isn't "useful". "Is it a heap?", on the other hand, has an element of subjectivity which makes placing it on a continuum with any authority impossible. It's apples and oranges.
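For what it's worth, one conventional way to see the "definite answer for any number of molecules" point is the kinetic-temperature estimator used in molecular dynamics; the speeds below are made up, but the formula returns a number even for a single molecule (whether that number is thermodynamically useful for tiny samples is a separate matter).

```python
# Kinetic temperature T = 2 <E_kin> / (3 k_B): defined for any sample size >= 1.
# The molecular speeds here are invented purely for illustration.

K_B = 1.380649e-23          # Boltzmann constant, J/K
M_WATER = 2.99e-26          # approximate mass of one water molecule, kg

def kinetic_temperature(speeds_m_per_s: list[float]) -> float:
    n = len(speeds_m_per_s)
    mean_kinetic_energy = sum(0.5 * M_WATER * v**2 for v in speeds_m_per_s) / n
    return 2.0 * mean_kinetic_energy / (3.0 * K_B)

print(kinetic_temperature([640.0]))                # one molecule: still a definite number (~296 K)
print(kinetic_temperature([590.0, 640.0, 700.0]))  # three molecules: still a definite number
```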
I think you're right that survival optimization has played a big part in consciousness as we know it. I just don't see it as either necessary or sufficient to describe consciousness in general.