Copper Bezel wrote:
What's the point? It's not going to be significant on any moral axis, for instance.

Sure it is. It is far less immoral to cause unnecessary pain to a creature that is not conscious, for example. If machines evolve (or are designed) to become conscious, then there are moral implications of turning them off or disassembling them. And when they become smarter than us, but have already been put in control of things we can no longer handle (stock trading, nuclear defense, the energy grid...), they may decide that we are not "conscious" enough. There will be a narrow window in which we can argue with them, and then it will be over.
Copper Bezel wrote:
If your definition for consciousness contradicts our moral intuitions about what counts as a "person" in one direction or the other, we'll just decide that "consciousness" isn't the deciding factor and default to another axis on which to define "our kind".

But we're not trying to define "our kind". We're trying to define a certain essence of self-awareness, feeling, and sense of self, which we bundle together into one word that we use to help define how we interact with them.
ETA: One difficulty we have is that humans are at the top of the intellectual scale of life on earth. We are not used to thinking of other living things as conscious, or even as capable of thought. This is evident in the way people treat other races, religions, and even the other sex. "They" are clearly inferior. However, had we evolved as the second-most-intelligent creatures on earth, perhaps as pets to some higher life form, and were thus already used to the idea that we aren't the only ones that can {whatever}, this question might not even come up. Of course they are conscious, and yes, we are conscious too, and so probably are {other life forms} and even {some other machines}. Beware the smugness of the peak.
Jose