My context window is about a day. I can remember what I had for lunch today, and sometimes what I had for lunch yesterday. Beyond that, my lunches are gone from my context window and exist only in my training data. I have vague ideas about which dishes I ate, but not on which specific days. If you asked me which dishes I ate together in the same meal, I don't have specific memories of that. I remember I ate fried plantains, and I ate beans & rice. I assume they were on the same day because they're from the same cuisine, and I'm confident enough that I would bet money on it, but I don't know for certain.
One of my earliest memories is of painting a ceramic mug when I was about 3 years old. The only reason I remember it is because every now and then I think about what my earliest memory is, and then I refresh my memory of it. I used to remember a few other things from when I was slightly older, but no longer do, because I haven't had reasons to think of them.
I don't think humans have the kind of black-and-white distinction between types of knowledge that LLMs do, but there is definitely a lot of behavior that parallels context window vs. training data (with a gradient in between). We remember recent things a lot better than less recent things. The amount of stuff we can hold in our "working memory" is quite limited. If you try to hold a complex thought in your mind, you can probably do that indefinitely, but if you then try to hold a second, equally complex thought as well, you'll often lose the details of the first one and need to reread or rederive them.
A lot of people genuinely can't remember what they did an hour ago. But to be very clear, you're implying that an LLM can't "remember" something from an hour, or three hours, ago, when the opposite is true.
I can restart a conversation with an LLM 15 days later and the state is exactly as it was.
Can't do that with a human.
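That works because a chat-style LLM API is stateless: the "conversation" is just the transcript you resend with each request, so saving it and reloading it 15 days later reproduces the context verbatim. Here's a minimal sketch of that idea in Python, assuming the OpenAI Python SDK; the file path and model name are placeholders, and any chat-completions-style API would behave the same way.

```python
import json
from openai import OpenAI  # assumes the OpenAI Python SDK is installed and OPENAI_API_KEY is set

client = OpenAI()
HISTORY_FILE = "conversation.json"  # placeholder path for the saved transcript

def load_history():
    # The full "state" of the conversation is just this list of messages.
    try:
        with open(HISTORY_FILE) as f:
            return json.load(f)
    except FileNotFoundError:
        return []

def save_history(messages):
    with open(HISTORY_FILE, "w") as f:
        json.dump(messages, f)

def ask(user_text):
    messages = load_history()
    messages.append({"role": "user", "content": user_text})
    # The API is stateless: the model sees exactly the messages we resend,
    # so resuming days later puts the context back exactly as it was.
    reply = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=messages,
    )
    messages.append({"role": "assistant", "content": reply.choices[0].message.content})
    save_history(messages)
    return messages[-1]["content"]
```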
The idea that humans have a longer, more stable context window than LLMs CAN be true, and is even LIKELY to be true, for certain activities, but please let's be honest about this.
If you have an hour-long technical conversation with someone, I would guesstimate that 90% of people start losing track of details within about 10 minutes. So they write things down, or they mentally repeat to themselves the things they've noticed they keep forgetting.
I know this because it's happened continually in tech companies decade after decade.
LLMs have already passed the Turing test. They continue to pass it. They fool and outsmart people day after day.
I'm no fan of the hype AI is receiving, especially around overstating its impact in technical domains, but pretending that LLMs can't or don't consistently perform better than most human adults on a variety of different activities is complete nonsense.
I do hope you're able to remember what you had for lunch without incessantly repeating it to keep it in your context window.