Hacker News

From what I've seen of Tesla's solution at least - even busy city centers and complex parking lots are very difficult for present-day autonomous driving technologies. The understanding level necessary just isn't there.

These things are excellent - undeniably better than humans at the boring stuff, highway driving, even major roads. They can rightfully claim massive mileage with high safety levels in those circumstances... but throw them into nastier conditions where you have to understand what objects actually are and things quickly seem to fall apart.



That is like trying to judge modern supercomputing by your experiences with a 6-year-old Dell desktop.

Waymo drove 29,944.69 miles between "disengagements" last year. That's the equivalent of an average California driver needing to touch the wheel once every 2.3 years.
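The 2.3-year figure above can be sanity-checked with a quick back-of-the-envelope calculation, assuming the average California driver covers roughly 13,000 miles per year (that annual-mileage figure is my assumption, not something stated in the thread):

```python
# Sanity check of the quoted disengagement statistic.
# ASSUMPTION: an average California driver covers ~13,000 miles/year
# (a rough estimate, not taken from this thread).
miles_between_disengagements = 29_944.69
avg_annual_miles = 13_000

years_per_disengagement = miles_between_disengagements / avg_annual_miles
print(round(years_per_disengagement, 1))  # ≈ 2.3 years between "touches"
```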

Tesla by comparison is classed as a SAE Level 2 driver assist system and isn't even required to report metrics to the state. While they sell it to consumers as self-driving, they tell the state it is basically fancy cruise control.


"disengagements" is a disingenuous statistic - that'd be like a human driver just giving up and getting out of the car.

What you want is "interventions". Additionally, look at where those miles were driven: most of them were logged in some of the simplest road-driving scenarios possible.


> That is an average California driver needing to touch the wheel once every 2.3 years

From my experience of California driving, that doesn't sound too bad. Compared to the entire Eastern seaboard, y'all have great roads and better drivers.


> Waymo drove 29,944.69 miles between "disengagements" last year.

You know better. If most of those miles were in sunny Mountain View suburbs, they don't count.


It's unclear to me why Tesla's solution is discussed so much. They are definitely not on the same playing field as Waymo or even Cruise.


There are a lot of people on here who have invested in Tesla


also a lot of people on here who have actually experienced Tesla's self-driving - certainly a lot more than have experienced any other self-driving product (at least beyond a lane-keeping system)


Are there a lot of people who have experienced tesla's self-driving?

As I understand it, if you pay for FSD, you don't actually get anything like self-driving; you just get lane changes on the highway in addition to lane-keeping. Effectively, you get lane-keeping, which you get even if you don't pay.

All the videos of "FSD driving" are from a small number of beta-testers, and there's no way to opt into the beta.

Because of that, my assumption would be that very few people on here have experienced Tesla's self-driving, whether they've paid for it or not.

On the other hand, waymo is available for the general public to use, though only in specific geographic areas.


Would you describe Tesla's tendency to crash full speed into stopped emergency vehicles during highway driving as "excellent"?

https://www.cnn.com/2021/08/16/business/tesla-autopilot-fede...


While controversial, we tolerate a great number of casualties caused by human drivers without trying to illegalise human driving.

We can (and should) hold autonomous vehicle developers to a much, much higher standard than we hold human drivers, precisely because of the excellence they are capable of.


We actually do "illegalise" casualties by human drivers.


I'm sure the grandparent poster meant banning human driving entirely in order to prevent human driving casualties.


The failure modes are going to be very strange and the technology is not strictly comparable to a human driver. It is going to fail in ways that a human never would. Not recognizing obstacles, misrecognizing things, sensors being obscured in a way humans would recognize and fix (you would never drive if you couldn't see out of your eyes!).

It is also possible that if it develops enough it will succeed in ways that a human cannot, such as extremely long, monotonous cross-country driving (think 8 hours on the highway) punctuated by a sudden need to intervene within seconds or even milliseconds. Humans are not good at this, but technology is. Autonomous cars don't get tired or fatigued. Code doesn't get angry or make otherwise arbitrary and capricious decisions. Autonomous cars can react in milliseconds, whereas humans need far longer.

There will undoubtedly be more accidents if the technology is allowed to develop (and I take no position on this).


That's Autopilot, not FSD beta, though; at this point it's probably 10 generations old


Ah yes, because "autopilot" is not autonomous.


Well yeah, it's like other autopilots:

An autopilot is a system used to control the path of an aircraft, marine craft or spacecraft without requiring constant manual control by a human operator. Autopilots do not replace human operators. Instead, the autopilot assists the operator's control of the vehicle, allowing the operator to focus on broader aspects of operations (for example, monitoring the trajectory, weather and on-board systems).


That's just devious marketing on Tesla's part. They can always excuse customer misunderstandings with the original meaning you explained, while ordinary people can safely be expected to interpret "autopilot" as full self-driving (and I'd be surprised if they hadn't actually tested this with focus groups beforehand). So it's not really lying (great for the lawsuits), but constructing misunderstanding on purpose (great for the brand image).


Except for the manual and all the warnings that pop up that say you need to pay attention.

3000 people die every day in automobile accidents, 10% of them involving drowsy drivers. Even standard Autopilot is better than a tired driver.


I would say it's better than humans' tendency to drive full speed into anything while impaired by a drug. Especially since the bug was fixed in Tesla's case, while the bug in humans' case is probably un-fixable.


Drugs (or alcohol)? There are so many other failure modes that drugs are the least of my concerns, especially drugs of unspecified type; I'm not the least bit worried about drivers hopped up on Tylenol. Humans get distracted while driving, by texting or by simple boredom, and start daydreaming. Don't forget driving while tired, or while emotionally disturbed (a divorce or a death; road rage). Human vision systems are also pretty frail and have bad failure modes, e.g. when the sun is close to the horizon and the driver is headed towards it.


Computer vision systems also have bad failure modes. The camera sensors typically used today have better light sensitivity but less dynamic range than the human eye.


They fixed driving into stationary things? That's news to me. What's your source?

It's not an easy problem to fix at high speed without false positives, and they seem to really hate false positives.



