
Just from a management perspective, even if you're super-confident in your engineers (which you shouldn't be), doing anything other than a very limited slow roll out would be negligent. This is a tectonic shift in liability. It's one thing if you sell 1,000,000 cars and 100,000 get in an accident. It's actually much worse to sell 10,000 cars and 1,000 of them get in an accident and the company is liable.


'This is a tectonic shift in liability.'

It's the only form of liability I consider acceptable. I would never use autopilot where I would be to blame if I fail to react within 50 milliseconds once the car fucks up.


I think for this reason full self driving is going to be more feasible in taxi-style vehicles first. You hail the driverless taxi, and just as with a taxi driven by another person, the passengers are not liable. It's a model we are already used to, at least. If the taxi company winds up being the car producer, then the liability is the same as if they didn't produce the car but operated it.


That's not really "full self driving" if you are referring to the SAE scale. Level 5, the top, is fully autonomous driving, everywhere. What you are referring to is level 4; fully autonomous driving in specific geofenced areas on predetermined routes. Think Delamain in Cyberpunk 2077.


Fitting username :)


You are already going to be blamed if you get in an accident for most hardware failures in a car. Why is this one so different?


Uh, no?

If the car's brakes don't work due to a defect, you are not liable at all.

And if you show reasonable due diligence, that is regular maintenance, the same applies doubly so.

Beyond that, I've seen a lot of accidents, and none of them were the car's fault.

All were literally human error, or external factors. E.g., a deer leaping onto the road, too many frogs on the road, etc.


"If the car's brakes don't work due to a defect, you are not liable at all."

Morally perhaps, but if you drive into the back of me because your brakes have a manufacturing fault, I would still expect you to pay for the damage. The manufacturer may then be liable to you, but that is your business.


You don't expect the driver to pay, you just expect to be paid. It doesn't matter to you whether it is the OP, his insurance company, or his car manufacturer. It does matter to the driver, though, which is the reason for this discussion about liability.


It goes beyond morals.

If you are driving a car and a crash occurs due to a manufacturing defect, you'll probably not be found criminally liable.

This becomes super important if someone gets injured or killed.


Yeah, and if the self driving functionality doesn't work at all, it would be the manufacturer's fault as well. The liability would work the same.


I guess because appropriate usage and maintenance makes that kind of hardware failure extremely unlikely, no?

I can be "warned" to check for issues if the car is older, creaking, seems to be "driving" differently, etc.

A mistake in a mostly uninterpretable bundle of software is definitely much sneakier. You can't prepare for it.

So I guess that's one argument.

The other aspect of the same argument is that if you're in an accident due to your car breaking in a way that is obviously the fault of the maker, they will absolutely eat the liability. I think this is how a few recall cases get kickstarted.


I don't think the GP is 100% correct anyway, it depends a great deal on what happened to cause the accident. Brakes fail because you haven't changed the pads? Your fault. Steering shaft u-joint comes loose on your 2020 Ford? You were never expected to maintain that item in that time frame so is it your fault?


Because none of the other hardware features give the driver a false sense of not having to do their job. It doesn't matter how many times the owner is told; we have already seen that people are not smart about this. Drivers shooting video of themselves in the passenger seat, people shooting porn while the car is driving, and several other stupid things people will do just because they think that's what self-driving is meant to do.


Because you can't control it. Drivers aren't liable for design flaws in their cars be they faulty self-driving, unintended acceleration, etc.


The overwhelming majority of crashes are because of human error, not hardware failure.


Because we have rigorous laws and liability managing hardware quality and maintenance.

If liability for software rests with the driver and there are no laws regulating its quality and maintenance, what is the basis for believing it will be anywhere near as reliable?


Legally, they’re all liable.

If your product is unreasonably dangerous and defective, and that defect is the proximate cause of injury to another, you’re responsible.

I know nothing of German law, but from a US legal perspective, it appears that MB is just promising what they already have to do - compensate those harmed by defects in their products.


No, the competition explicitly tells the driver to be ready to take over at any second; they promise full self driving only in the ads, not in the contract.

So when an accident happens, the "driver" can be blamed for not reacting.

Mercedes offers full self driving including legal responsibility. You are not obliged to watch the road or the car while driving (within the current tight limits).


The competition doesn't get to override laws. Tesla may put as many illegal clauses in their contract as they want, if they crash because of autopilot in Europe, they're liable. That's why Autopilot is barely enabled here at all, because they know their software is unreliable.


There are accidents which are not due to defects or unreasonable dangers in the product.

Shit happens, and responsibility is not necessarily due to defects/malfeasance/unreasonable risks. There are honest-to-God accidents. And you are still liable.


I looked up the number of cars sold. Mercedes sells about half as many as Ford and about 22% as many as Toyota.

Obviously the accident statistics for a new car are unknown.


Mercedes will obviously outsource the liability to an insurance company. That’s what insurance companies are for.


Insurance isn't a magic get-out-of-risk-free card; if self driving cars have loads of accidents and payouts, the cost to insure will correct itself.


Not obvious at all. Companies the size of Mercedes usually self insure for routine risks.


Oh man, I didn't expect to read a comment saying "it's OK if 100 times more cars have accidents as long as a company isn't liable".


I read the parent as meaning to say that because Mercedes will be liable, even 1,000 accidents will be a real problem for them. This is a good thing because it shows that Mercedes expects a very low number of accidents while the system is enabled.

Mercedes accepting the risk like this is a massive step forward for these reasons. It sets a precedent that hopefully others will follow. They wouldn't transfer the risk if they didn't think it would profit them.


I guess, maybe I misunderstood. I was just surprised to read "it's better to have a ton of deaths, rather than a few deaths we're on the hook for".


From the company's point of view this is correct reasoning. The sooner people realize companies do not have any responsibility to be moral, the better.


They absolutely do have responsibility. There's no reason we should allow investors limited liability if they are going to be assholes about it.

Corporations should only exist because they are net beneficial to the public!

I do agree that enough companies are unethical that it is reasonable to expect it.


This is just naive. Every company that makes any kind of product has made some kind of trade like this.

Costco sells you knives cheaply because they will not be liable if you murder people with them. If the Costco investors were liable for murder every time one of their knives was used to kill someone, you can bet they would just not sell them entirely.

Just because a company thinks about liability doesn’t mean it’s immoral. Individuals avoid liability as much as possible too (see insurance).

The world is dangerous and “fault” is everywhere.


I'm not sure what your point is. My comment is a statement that corporations do have a responsibility to act in the interests of society, not an analysis of the particular ethics of selling knives or avoiding liability.


So do people, who are the ones that run companies. What's your point?


It's actually a human thing. When bad things happen, we strongly prefer that they don't happen as a result of something that we did.


On the other hand, if incautiously switching from the former to the latter drives your company bankrupt, the end result doesn't benefit anyone.


I feel like almost any summary of the form "so you're saying it's ok that..." is almost without exception not something the other person would agree with.


That's kind of the whole point. If a decision or policy has predictable consequences that aren't being addressed, either the decision-maker is unaware of those consequences or is accepting of those consequences. Asking the question removes ignorance as a possibility, and lets the conversation continue.

Sometimes the answer is "No, I was unaware, and I will adjust my decision." Sometimes the answer is "Yes, here are the consequences of the alternatives, and here's why I believe this set of consequences to be best." Sometimes the answer is "Yes, I don't care about those people." By asking the question, these three types of answers respectively give the other person an opportunity to improve themselves, give you an opportunity to learn from the other person, or give the audience an opportunity to learn not to trust the other person.


You missed one option; "You're falling prey to the is-ought fallacy." That is, saying that something is true is not the same as saying that something should be true. The original claim was that from the perspective of management at a company, 1,000 accidents the company is legally liable for is worse than 100,000 it isn't. Which is true! From that limited perspective! The reply "so you're saying it's ok that..." implies that the comment agreed with that perspective, which isn't necessarily the case. It could simply be pointing out a failure state of current management practices and corporate law. But further than that, that phrase is usually a particularly uncharitable one, and I find this usage of it to be more common than any other. I think "implying the speaker believes that the unfortunate condition they pointed out is right and just" is the normal use case for that phrase, rather than trying to bring attention to the consequences of a policy.


> You missed one option; "You're falling prey to the is-ought fallacy." That is, saying that something is true is not the same as saying that something should be true.

I think I'd put that as a subcategory of the second case, that the options were considered and this one was considered the best. That may mean that it is the least worst of several bad options, or that there are restricted options to choose from.

> Which is true! From that limited perspective! ... It could simply be pointing out a failure state of current management practices and corporate law.

I definitely agree, this is a fantastic example of options having been considered and rejected. In this case, the alternative would be "A self-driving car company accepts more liability than they can handle, and go bankrupt. This saves lives in the short-term, but costs lives in the long-term." It can then be the start of an entirely different conversation of how to avoid that failure state, and what would need to change in order to still get the benefits of that decision.

> The reply "so you're saying it's ok that..." implies that the comment agreed with that perspective, which isn't necessarily the case.

I'd make a distinction between a comment agreeing with a perspective and a commenter agreeing with a perspective. One is based solely on the text as it is written, and the other is a human's internal belief. It's not necessarily a statement that the person is wrong, but that the words they have spoken may have unintended consequences. The difference between "So you're saying $IDEA." and "So you believe $IDEA."

> I think "implying the speaker believes that the unfortunate condition they pointed out is right and just" is the normal use case for that phrase, rather than trying to bring attention to the consequences of a policy.

Good point. In situations where there are no long-term social relationships to be maintained, and where there isn't a good chance for a reply, the message given to the audience is the only one remaining. This is a major issue I have with any social group beyond a few dozen people, and one that I don't have any good solutions for.


> Sometimes the answer is "Yes, I don't care about those people."

Frequently true but rarely admitted


From a corporate perspective, of course it is. It's the Ford Pinto study all over again.

This is why corporate influence on legislation is bad, as their "best interests" often come at odds with morality-based ones.


Fight Club summarized the Pinto thing nicely:

https://www.quotes.net/mquote/31826

without providing the illusion that such cost benefit analyses are a thing of the past.

(Pintos had a problem with the gas tank, not the differential, but it's pretty clear what they were referring to.)


Most car accidents are 100% operator error, it'd be really far fetched to try to blame those on the manufacturer.

Autopilot not so much, the point stands.


A launch control option added by car manufacturers is, I'd say, 100% the manufacturer's fault: they thought of it, installed it, and promoted it, yet it's pointless and dangerous.


It clearly has a point, because they successfully market and sell the feature. And everything is dangerous to some degree.

Clearly it is not 100% their fault because the feature can certainly be used responsibly.

Is there a more nuanced and substantive form of your argument against developing and selling a launch control feature?


> launch control

How many accidents happen from a standstill? I'd love to see some stats, but I highly doubt it would be a high number.


That's not really what the OP is saying though, is it?


True, it was more "the company would rather have 100k accidents it's not liable for than 1k accidents it is". Doesn't make it much better for me.



