I was learning a new framework, knew what I wanted to achieve, but had no idea how. It was easier to have a conversation with GPT than to go through the documentation. Although it sometimes gave examples from an older version :)
This is what I think is going to hold us back. The worry, perhaps subconsciously, of discovering that consciousness is much less complex and exceptional than we might want it to be.
I have this inkling because I see a pattern: people, even incredibly smart people, explaining it away with a “but that would diminish the exceptionality of our own minds!” type of argument.
I’m struggling for the right word. Specialness, maybe? People have this need to believe it’s special and not so easily replicated. And if it turns out to be rather simple, then surely it must not be the same thing, because that wouldn’t be special.
The idea you’re talking about is called “human exceptionalism”. It’s the idea that humans are in some way unique when it comes to being, thinking, and feeling.
It's not replicating it that's impressive. That part is easy. Of course consciousness wouldn't be that hard. Of course it would be mostly an emergent property of various building blocks. Of course homo sapiens would eventually be able to do the same thing.
The impressive part is replicating it and then teaching it real Love. Love is a choice. Producing life that chooses to make that choice and defend that choice is impressive.
I think this is the same problem. How would you measure when a human is taught love? How would you differentiate that from measuring when one of these programs is taught love?
It feels like it always boils down to: 1) you can’t. Or 2) “trust me, it’s different.”
It's simple: observe what the candidate does when they have to show true love in order to move on to the next level. If they fail, they don't move on.