As a long-time programmer myself, I've anecdotally noticed that junior developers seem more prone to writing dense, clever code, while more experienced devs prefer clear, easy-to-maintain code.
They're probably being taught by academics who grew up with the old-school, low-resource style of programming and have been out of industry long enough that they never adapted to modern best practices. Assignments also tend to reward cleverness over readability and long-term maintainability, because who cares if it's write-only code when the assignment only spans a few weeks and then never matters again; you just need to impress the professor (or the TA who does the actual grading) and move on.
The university system does sometimes leave students with strange ideas that are a bit divorced from industry best practices (and in some cases reality), because the attitudes of their teachers rub off on them. I had one professor who continually insisted we were on the verge of a major resurgence of semantic web technologies and strict XHTML. A friend who took a dev tools course (mostly focused on git, makefiles, etc.) was told that tools like Maven would never take off and that you're better off managing dependencies by hand (this was around 2017).
I don't want to be too harsh on academics or the tertiary education system in general, but I think it's a somewhat unavoidable consequence of how far removed academia is from day-to-day industry practice. Maybe "boot camp" students fare better?