general1726 9 hours ago

I don't believe that at all. The problem is the same as replacing managers with AI: no accountability. A computer can't make decisions that may or may not kill people, because when that happens, who is responsible for the injury? The driver who is not driving? The developer of such a system? The company that made the car? The company that created the self-driving system?

madamelic 9 hours ago

> Developer of such system? Company who made the car? Company who created the self driving system?

Those would in many cases be the same company. And yes, the entity responsible for injury would be the car manufacturer, not the driver or riders. Taking on legal liability for the car is part of the 'qualifications' for L3.

The specifics of which part of the system failed can be litigated if defendants (companies) need to be added to the case, but in no case will the people inside a vehicle designated higher than SAE L3 be held at fault for the accident.

  • general1726 9 hours ago

    And that's why I think we will never see an L4 or L5 system. These systems would constantly be "broken" (due to, e.g., dirty or miscalibrated sensors) to prevent possible lawsuits, and so often that they would become essentially useless.

    • Rexxar 8 hours ago

      The company that operates the car is responsible for this. A broken system is not a way to avoid a lawsuit but a sure way to be convicted.

      • general1726 7 hours ago

        If the system is broken, it is not working. If it is not working, then it can't drive you around.

joe463369 8 hours ago

If today I buy a brand new car, drive off the lot, and the brakes fail, causing me to plough into a pedestrian and kill them, who is to blame?

  • scuol 8 hours ago

    The manufacturer, obviously, but they can sell the car in the first place because the defect risk is quantifiable for their liability insurance provider, who can evaluate how risky said car company's manufacturing is, how likely it is they'll need to pay out a claim, etc.

    For self-driving, that evaluation is almost impossible. Sure, it can look good statistically, but things like brake lines, brake pad material, and brake boosters are governed by the laws of physics, which are far more understandable than any self-driving algorithm.

    • joe463369 7 hours ago

      I think with Waymo we're probably at the point where an insurer could have a decent stab at what their liability would be if asked to cover AI-related accidents. In fact, given that these cars are on the road and have reportedly been in accidents, I would imagine this is past being a hypothetical concern and well into "solved problem" territory.

  • arnsholt 8 hours ago

    In my jurisdiction, damages from car crashes are strict liability, so you would in fact be legally liable: capped at ~10 million USD for damage to property, with no limit for injury to persons. Of course the manufacturing defect would give you a credible claim against the manufacturer, but that's a separate matter. This is why automotive liability insurance is mandatory.

    • joe463369 8 hours ago

      Doesn't this answer the question then? Dodgy brakes or dodgy AI, it's on you if your car cleans out someone crossing the road.

  • general1726 8 hours ago

    You are, for failing to check that your car is in a drivable state before setting off on your journey. This is actual law in my country.