For several years now, in the EU at least, all new cars have been required to have automatic emergency braking. It makes it basically impossible to run into something, run someone over, etc. This kind of system reduces stress and gives me, at least, a certain peace of mind. In this regard, Tesla's self-driving is a step backwards.
You seem to be confused about what AEB is and what it does. It is generally effective at braking for static or rapidly slowing vehicles ahead. Most AEB systems are not effective "safeguards against driving into oncoming traffic and parked cars". These would involve a steering intervention, not a braking intervention.
I assume AEB deliberately ignores a lot of things; otherwise it would be braking for objects on the side of the road, and for cars in the oncoming lane that appear to be heading straight towards you on a curve.
Correct. Most AEB systems use a forward-facing radar, which often receives spurious reflections from street elements such as overhead bridges, overhead signage, and manhole covers. As a result, most radar-based implementations deliberately ignore completely stationary objects while driving at high speed.
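To make that concrete, here is a rough sketch of why a radar-only system tends to drop stationary returns at speed. This is my own illustration with made-up names and thresholds, not any manufacturer's actual logic:

    # Toy illustration of radar-based AEB target filtering.
    # Thresholds are invented for the example; real systems fuse many
    # more signals (camera confirmation, track history, etc.).

    def should_brake_for(ego_speed_mps, target_range_rate_mps):
        """Decide whether a radar return is a plausible braking target.

        target_range_rate_mps is negative when the range is closing.
        """
        # Absolute speed of the target along the road: a stationary object
        # (bridge, sign, manhole cover) closes at exactly our own speed,
        # so its absolute speed works out to ~0.
        target_speed = ego_speed_mps + target_range_rate_mps
        if abs(target_speed) < 0.5 and ego_speed_mps > 20:  # > ~72 km/h
            # On radar alone this is indistinguishable from street furniture,
            # so it is ignored to avoid phantom braking at highway speed.
            return False
        return True

The side effect of that heuristic is exactly the well-known failure mode: a genuinely stopped car in your lane also looks like "street furniture" at highway speed.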
That is a good article which summarises the state of AEB and its general capability. It's worth noting that the most effective system they tested doesn't use radar at all; it was an entirely vision-based system: Subaru EyeSight. Worth remembering this when people agonise over Tesla ditching radar sensors on the Model 3.
I chose my parents' most recent new car precisely because of this: a 2019 model-year Subaru Forester. Not only was the active safety top-notch, but its chassis tuning also means it's more capable of remaining composed after swerving to avoid an obstacle. (It was also one of the few vehicles that ticked every box for them.)
Here is the comparison I relied upon for that decision. (Yes, I am Australian.)
If you're interested, Tesla's head of AI recently gave a public presentation of their new vision-based depth system. It's worth a watch if you enjoy learning about the forefront of this technology.
A key takeaway is that they've been running this stack in shadow mode (validating its output but not letting it control the car) on everyone's Tesla for quite some time, equivalent to 1,000 years' worth of real-world driving. And from this data they've proven it is now superior to radar in all circumstances.
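For anyone unfamiliar with the term, "shadow mode" just means running the candidate stack alongside the production one and logging disagreements, without ever letting it touch the controls. A toy sketch of the idea (my own framing with hypothetical names, not Tesla's code):

    # Toy sketch of "shadow mode" evaluation. All names are hypothetical.

    def drive_tick(sensor_frame, production_stack, shadow_stack, logger):
        primary = production_stack.estimate(sensor_frame)   # acts on the car
        candidate = shadow_stack.estimate(sensor_frame)      # evaluated only

        # Disagreements are logged and uploaded for offline analysis;
        # across the whole fleet this is where the "years of driving"
        # worth of validation data comes from.
        if abs(primary.lead_distance_m - candidate.lead_distance_m) > 2.0:
            logger.record(sensor_frame, primary, candidate)

        return primary  # the vehicle only ever acts on the production output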
I'm very suspicious about that talk; the output they show doesn't match at all with the output people have extracted from running Teslas - https://twitter.com/greentheonly/status/1412597377228226562 - specifically the 'trained' heat signature running horizontally across the dash.
Well, the rest have radar, so that's not a problem.
"Stereoscopic vision works most effectively for distances up to 18 feet. Beyond this distance, your brain starts using relative size and motion to determine depth."