Cog without the nition

Though I am very interested in AI, I think the focus on consciousness is probably misguided.

I used to believe the direct opposite, by the way, but I’m no longer sure that consciousness is even important, much less necessary, for creating human-level AI.

Cognition without consciousness. Intelligence without same. It seems outlandish, but only because we get to define “consciousness.”

But consciousness itself just might be an adaptive trick to boost limited mental capacity to something actually useful in the world. More useful than not possessing it, that is. Not necessary for anything less limited. Or it might be something else altogether. But I’m no longer convinced it’s necessary. In fact it might even be harmful.

Of course one can argue that if you’ve simulated consciousness, then that is no different than consciousness.

I don’t have a strong rebuttal to that contention.

But.

It’s only contingently the case that the withdrawal reflex is adaptive. It’s completely unconscious. I don’t see any reason why other, more complex actions that in humans are “conscious” could not be arranged just like the withdrawal reflex.

I know, it sounds ridiculous to talk this way, since our entire mental architecture is set against it, but I can imagine an “empathy reflex” or a “navigation reflex” with no gestalt, with no “thinker” needed or present.

It seems to me that what we think of as higher-level thought is just an accident of adaptation, and that it could exist just as well without the architecture of consciousness in a different environment.

Meaning without a “meaner.”

These ideas could be completely wrong. I don’t know. But I’d rather have big questions out there than spend my life in boring certitude.