I think it is like qualia in the sense that it can be something known experientially, but spatial and logical reasoning are things that transcend subjectivity.
No doubt LLMs do not have qualia as humans do, but I also think that the diversity of qualia present in humans is largely unexamined and underestimated at present.
I don’t know if that is true, unless your definition of qualia is so loose that it includes (or emerges from) basic data representations, such as vectors composed of weights and values, in which case LLMs could perhaps be argued to have a rudimentary form of qualia.
There is often an attempt to draw a bright line between humans and animals, or humans and AI, on the basis of a single feature or characteristic, but I think that is a mistake: it tries to simplify the argument through reduction.