
https://www.caranddriver.com/features/a24511826/safety-featu... has some more info: they will stop for things stopped in front of them, but only at city speeds, I assume because randomly braking at 60 mph is bad.


That is a good article which summarises the state of AEB and its general capability. It's worth noting that the most effective system they tested doesn't use radar at all; it's an entirely vision-based system: Subaru EyeSight. Worth remembering this when people agonise over Tesla ditching radar sensors on the Model 3.

I chose my parents' most recent new car precisely because of this: a 2019 model year Subaru Forester. Not only was the active safety top notch, but its chassis tuning also means it's more capable of remaining composed after swerving to avoid an obstacle. (It was also one of the few vehicles which ticked every box for them.)

Here is the comparison I relied upon for that decision. (Yes, I am Australian.)

https://www.youtube.com/watch?v=cs7PSpIMJYk


EyeSight has stereoscopic vision, so it can compute actual distances to obstacles from the input feed instead of faking it all with machine learning.
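
For anyone curious about the geometry: it's plain triangulation on a rectified stereo pair. A minimal Python sketch follows; the focal length and camera spacing are made-up illustrative numbers, not Subaru's actual EyeSight specs.

    def depth_from_disparity(disparity_px: float,
                             focal_length_px: float,
                             baseline_m: float) -> float:
        """Depth Z = f * B / d for a rectified stereo pair.

        disparity_px     horizontal pixel offset of the same feature
                         between the left and right images
        focal_length_px  focal length expressed in pixels
        baseline_m       distance between the camera centres, in metres
        """
        if disparity_px <= 0:
            raise ValueError("feature must appear offset between the two images")
        return focal_length_px * baseline_m / disparity_px

    # Hypothetical rig: ~1000 px focal length, cameras 0.35 m apart.
    f_px, baseline = 1000.0, 0.35
    for d in (50.0, 10.0, 2.0):  # disparity in pixels
        z = depth_from_disparity(d, f_px, baseline)
        print(f"disparity {d:5.1f} px  ->  depth {z:6.1f} m")

Note that small disparities map to large depths, so a one-pixel measurement error hurts far more at 2 px than at 50 px: depth error grows roughly with the square of distance.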


If you're interested, Tesla's head of AI recently gave a public presentation on their new vision-based depth system. It's worth a watch if you enjoy learning about the forefront of the technology.

https://www.youtube.com/watch?v=g6bOwQdCJrc

A key takeaway is that they've been running this stack in shadow mode (validating its output but not controlling the car) on everyone's Tesla for quite some time, the equivalent of 1,000 years' worth of real-world driving. From this data they've concluded it is now superior to radar in all circumstances.
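
For readers unfamiliar with the term, "shadow mode" is the general pattern of running a candidate model on live inputs while only the shipping system actuates, logging frames where the two disagree for offline triage. A hypothetical Python sketch of the idea (none of these names reflect Tesla's actual stack):

    from dataclasses import dataclass, field

    @dataclass
    class ShadowRunner:
        production_model: callable   # its output drives the car
        candidate_model: callable    # evaluated silently, never actuates
        tolerance: float = 0.5       # max acceptable disagreement (metres)
        disagreements: list = field(default_factory=list)

        def step(self, frame):
            prod_out = self.production_model(frame)
            cand_out = self.candidate_model(frame)  # logged, never acted on
            if abs(prod_out - cand_out) > self.tolerance:
                # Save the frame for offline triage / retraining.
                self.disagreements.append((frame, prod_out, cand_out))
            return prod_out  # only the production output controls the car

    # Toy usage: compare a radar range reading against a vision estimate.
    runner = ShadowRunner(
        production_model=lambda f: f["radar_range_m"],
        candidate_model=lambda f: f["vision_range_m"],
    )
    for frame in ({"radar_range_m": 42.0, "vision_range_m": 41.8},
                  {"radar_range_m": 30.0, "vision_range_m": 27.0}):
        runner.step(frame)
    print(f"{len(runner.disagreements)} frame(s) flagged for review")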


I'm very suspicious about that talk; the output they show doesn't match at all the output people have extracted from running Teslas - https://twitter.com/greentheonly/status/1412597377228226562 - specifically the 'trained' heat signature running horizontally across the dash.


It’s not going to match because it’s different data from a different ML model visualised in a different way. Weird that you’d expect it to.


Well, the rest have radar, so that's not a problem.

"Stereoscopic vision works most effectively for distances up to 18 feet. Beyond this distance, your brain starts using relative size and motion to determine depth."


What does brain performance have to do with the discussion at hand? EyeSight's cameras are further apart than human eyes anyway.
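
Rough numbers, since disparity scales linearly with baseline (d = f * B / Z). The 0.35 m camera spacing below is an assumption for illustration, not a published EyeSight spec:

    f_px = 1000.0  # focal length in pixels (illustrative)
    z_m = 5.5      # ~18 feet in metres

    for label, baseline_m in (("human eyes (~6.5 cm)", 0.065),
                              ("windscreen camera pair (~35 cm)", 0.35)):
        d_px = f_px * baseline_m / z_m
        print(f"{label:32s} disparity at {z_m} m: {d_px:5.1f} px")

At any given depth the wider pair sees roughly five times the disparity, so the distance at which depth estimates become too noisy is pushed out by the same factor; the 18-foot figure for human stereopsis doesn't transfer directly to a camera rig.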


That humans use size to determine distance when driving.



