> The driver had previously reported to Tesla 7 to 10 times
I understand this is not a popular opinion on news forums anywhere, nor do I write this to absolve Tesla of what I view as their poor response or their poor choice in naming the system 'Autopilot' in the first place.
But at what point does corporate or government responsibility end and personal responsibility start? If I knew something didn't work, and that something could kill me, and I had reported the issue 7 to 10 times, I would be watching like a hawk for the issue to recur, or more likely just not using it at all.
People have this insane idea that just because something isn't their fault they don't have to take corrective action in order to avoid the consequences. It is better to be alive than to be correct.
In the US, we have laws that require products to work as advertised. If they don't, strict product liability generally applies to the manufacturer, seller, and all middlemen (in this case, they're all Tesla), regardless of whether the customer-victim was misusing the product.
In this case, there is no evidence that the driver was misusing the car. But there is a carload of evidence that Autopilot failed.
At this point, the only thing Tesla is accomplishing with these public statements is adding more zeros to the eventual settlement/damage award.
> In the US, we have laws that require products to work as advertised.
And if they don't, what's the first thing you, as the customer, do? You stop using the product. If autopilot isn't working the way you think it should, you stop using autopilot. This driver did not do that. What kind of sense does that make?
That's a very naive comment. What if there's no emergency lane? Should it just crash into whatever is on the right side of the car? :)
Tesla's Autopilot already slows down and then stops if the driver doesn't touch the steering wheel for a period of time, but it gives plenty of warning to the driver before that happens, as it should.
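In pseudocode terms, that behavior is just a timer-driven escalation. Here's a minimal sketch of the idea; the thresholds, stage names, and structure are made up for illustration, since Tesla's actual logic isn't public:

    # Hypothetical hands-off escalation. All thresholds and stage names
    # are illustrative assumptions, not Tesla's real values.
    WARN_AFTER_S = 15    # visual nag after this many seconds hands-off
    ALERT_AFTER_S = 30   # audible alert
    STOP_AFTER_S = 45    # hazards on, slow to a stop in lane

    def monitor_step(seconds_hands_off: float) -> str:
        """Map time since last detected steering torque to a response stage."""
        if seconds_hands_off < WARN_AFTER_S:
            return "normal"
        if seconds_hands_off < ALERT_AFTER_S:
            return "visual_warning"
        if seconds_hands_off < STOP_AFTER_S:
            return "audible_alert"
        return "slow_and_stop"

    for t in (5, 20, 35, 60):
        print(t, monitor_step(t))  # normal, visual_warning, audible_alert, slow_and_stop

The point of the escalation design is that the driver gets multiple chances to re-engage before the car takes the most disruptive action.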
Inconveniencing the driver and making it a learning experience vs crashing into concrete and killing him
Well. Maybe you are the naive one. Machinery shouldn't rely on warnings to keep people safe. We already learned that lesson in industrial automation settings.
The "just user error, bro" mentality common on Hacker News is what scares me the most. That is not how you build safe machines, far from it. At this point, my only hope of not being killed by an autonomous car in the next 30 years is for the regulatory hammer to come crashing down full force on the wanton engineers who treat loss of life as user error.
Teslas already detect when drivers don't have their hands on the wheel, for starters. Then you can fairly easily see where the eyes are looking, and I guess with some more effort whether the eyes are engaged or wandering.
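To make that concrete, here's a toy sketch of how such signals might be fused into an attention check. The sensor fields and thresholds are invented for illustration, not any real driver-monitoring API:

    from dataclasses import dataclass

    @dataclass
    class DriverState:
        hands_on_wheel: bool   # inferred from a steering-torque sensor
        gaze_on_road: bool     # from an in-cabin camera's gaze estimate
        gaze_steady_s: float   # seconds the gaze has stayed on the road

    def driver_attentive(s: DriverState, min_steady_s: float = 2.0) -> bool:
        """Crude check: hands on the wheel AND eyes on the road long enough."""
        return s.hands_on_wheel and s.gaze_on_road and s.gaze_steady_s >= min_steady_s

    print(driver_attentive(DriverState(True, True, 3.0)))   # True
    print(driver_attentive(DriverState(True, False, 0.0)))  # False

Requiring the gaze to be steady for a minimum duration is one crude way to separate engaged eyes from wandering ones.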
I agree completely with you and also with grandparent. I don't think these perspectives are in conflict at all. You have to take responsibility for your own safety. But at the same time, Tesla's response makes me lose respect for them.