> In effect, what the AI is missing is the "overseer" part, which is perhaps what we would identify as our own consciousness (the ability to think about the process of thinking itself).
The term is "metacognition," and that's an interesting concept.
It's essentially two parts, right? I feel like we now have something I would call cognition. They can probably add math and physics knowledge as dedicated models. Now you "just" need the part that drives it toward some goals, and then you have essentially a bug, or at least an animal. It could also have scary goals that living creatures can't, like trying to grow exponentially bigger.
In retrospect I realized that it's missing not just the higher part of human consciousness but the lower one too (survival, approach/flee), which is what you mentioned. One wonders whether it might actually be better if AIs never have those circuits in the first place.
> It can also have scary goals that living creatures can’t - like try to grow exponentially bigger.
This is funny, given that this is exactly what humans have done. I don't think you can quantify it, but going from flying an airplane to flying to the moon within a single lifespan is definitely steeper progress than anything that has come before. Population growth only recently stopped being (truly) exponential too.