That's moot with respect to the point above. Undefined behavior wasn't introduced as a new language "feature" between C89 and C23; it has existed the whole time. We're talking about specification deltas, not the entire corpus.
But, if you want an answer to your question:
You can learn to avoid undefined behavior in about 30 seconds.
If you're deliberately fiddling with undefined behavior, it's (ideally) because A) you're an advanced developer who knows exactly what you're trying to achieve (and you're inspecting the generated code), and/or B) you're using a specific compiler and don't plan on porting your code elsewhere. A sketch of case B follows.
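For example, GCC and Clang let you opt into defined two's-complement wraparound for signed overflow with -fwrapv. A minimal sketch (this is still UB per the C standard; it only holds on compilers honoring that flag):

    /* Build with: gcc -fwrapv demo.c (or clang -fwrapv demo.c) */
    #include <limits.h>
    #include <stdio.h>

    int main(void) {
        int x = INT_MAX;
        /* Well-defined under -fwrapv: wraps to INT_MIN.
           Without the flag, this is undefined behavior. */
        printf("%d\n", x + 1);
        return 0;
    }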
Before, you could assume signed arithmetic overflow would do whatever the CPU does, or that null pointer dereferences would be trapped by the OS. That's a pretty big difference from what can happen now; it moved C away from that "portable assembler" moniker, so it's very much not moot, even if it was never explicitly standardized.
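Concretely, here's the canonical example of that shift (a minimal sketch; exact output depends on compiler and flags):

    #include <stdio.h>

    /* Because signed overflow is UB, the optimizer may assume
       x + 1 > x always holds for signed int and fold this whole
       function to "return 1;" (GCC and Clang typically do at -O2).
       The old "portable assembler" expectation was that INT_MAX + 1
       simply wraps on two's-complement hardware, giving 0 here. */
    int always_greater(int x) {
        return x + 1 > x;
    }

    int main(void) {
        printf("%d\n", always_greater(2147483647)); /* INT_MAX */
        return 0;
    }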
> You can learn to avoid undefined behavior in about 30 seconds.
Source? I mean, if it's really that simple, then someone has already compiled that 30-second advice and you can simply link it here for us. Ideally including examples of how to actually do signed arithmetic safely. You can't avoid negative numbers lol.
Yes, you should avoid all of the undefined behavior you just outlined. The fact that compilers have handled that differently over time is precisely why you avoid UB.
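Since you asked for the signed-arithmetic part, here's roughly what that looks like (a sketch; safe_add is my name for it, and __builtin_add_overflow assumes GCC or Clang):

    #include <limits.h>
    #include <stdbool.h>

    /* Portable: reject the overflowing case before performing the add,
       so the undefined operation never executes. */
    bool safe_add(int a, int b, int *out) {
        if ((b > 0 && a > INT_MAX - b) ||
            (b < 0 && a < INT_MIN - b)) {
            return false; /* would overflow; *out left untouched */
        }
        *out = a + b;
        return true;
    }

    /* GCC/Clang: the builtin performs a wrapping add and reports
       whether it overflowed. */
    bool safe_add_builtin(int a, int b, int *out) {
        return !__builtin_add_overflow(a, b, out);
    }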