Self-driving cars are often marketed as safer than human drivers, but new data suggests that may not always be the case.

Citing data from the National Highway Traffic Safety Administration (NHTSA), Electrek reports that Tesla disclosed five new crashes involving its robotaxi fleet in Austin. The new data raises concerns about how safe Tesla’s systems really are compared to the average driver.

The incidents included a collision with a fixed object at 17 miles per hour, a crash with a bus while the Tesla vehicle was stopped, a crash with a truck at four miles per hour, and two cases where Tesla vehicles backed into fixed objects at low speeds.

  • HarneyToker@lemmy.world · 2 days ago

    Got this saved for the next time someone tells me that a robot can drive better than a human. They almost had me there, but data doesn’t lie.

    • greygore@lemmy.world · edited · 1 day ago

      This is more specific to Tesla than to self-driving in general, as Musk decided that additional sensors (like the LiDAR and RADAR on other self-driving vehicles) are a problem. Publicly he’s said it’s because of sensor contention: if the RADAR and the cameras disagree, the car gets confused.

      Of course, that raises the opposite problem: when the camera or image recognition is wrong, there’s nothing to tell the car otherwise, like the Tesla drivers decapitated by trailers the cameras didn’t register. Additionally, I assume Teslas have accelerometers, so either the self-driving model is ignoring potential collisions or it’s still doing sensor fusion anyway.
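      An illustrative sketch of the tradeoff being described (all names, numbers, and the fallback policy are hypothetical, not anyone’s actual code): with two independent sensors a planner can detect contention and fall back conservatively, while a camera-only stack has nothing to cross-check a missed detection against.

      ```python
      # Hypothetical sketch of "sensor contention" vs. camera-only perception.
      # With two sensors, disagreement can trigger a conservative fallback;
      # with one sensor, a missed detection goes unchallenged.

      def fuse_obstacle_distance(camera_m, radar_m, tolerance_m=5.0):
          """Return (distance, confident) from two independent estimates.

          camera_m / radar_m: estimated distance to an obstacle in metres,
          or None if that sensor detected nothing.
          """
          if camera_m is None and radar_m is None:
              return None, True                      # both agree: road is clear
          if camera_m is None or radar_m is None:
              # One sensor sees something the other missed: treat the
              # detection as real, but flag low confidence.
              return camera_m if camera_m is not None else radar_m, False
          if abs(camera_m - radar_m) <= tolerance_m:
              return (camera_m + radar_m) / 2, True  # estimates agree: average
          # Genuine contention: be conservative, assume the nearer obstacle.
          return min(camera_m, radar_m), False
      ```

      For example, if the camera reports nothing (`None`) but the radar reports a trailer at 30 m, the fused result is `(30.0, False)` and the planner can slow down. Drop the radar, and as far as the car is concerned that trailer simply isn’t there.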

      Not to mention that we humans use multiple senses when driving; this is one reason steering wheels still mostly use mechanical linkages: we can “feel” the road, detect when the wheels lose traction, and feel the inertia as we go around a corner too fast. On a related tangent, the Tesla Cybertruck uses steer-by-wire instead of a mechanical linkage.

      This is why many (myself included) believe Tesla has a much worse safety record than Waymo. I’ve seen enough drunk and distracted drivers to doubt that humans will always drive better than a robot. Don’t get me wrong, I still have concerns about the technology, but Musk and Tesla have a history of ignoring safety concerns; see the number of deaths related to his insistence on non-mechanical door handles with the mechanical backup hidden away.

    • Buddahriffic@lemmy.world · 2 days ago

      A robot can theoretically drive better than a human, because emotions and boredom don’t have to be involved. But we aren’t there yet, and Tesla is trying to solve the hard mode: pure vision, without range finding.

      Also, I suspect the ones we have are set up as pure neural networks where everything is determined by the training, which likely means there’s some random-ass behaviour in rare edge cases where the model “thinks” slamming on the accelerator is as good an option as anything else. And since it’s a black box no one really understands, there’s no way to tell until someone ends up in that situation.

      The tech still belongs in universities, not on public roads as a commercial product/service. Certainly not by the type of people who would at any point say, “fuck it, good enough, ship it like that”, which seems to be most of the tech industry these days.

  • Bazoogle@lemmy.world · 2 days ago

    “a crash with a bus while the Tesla vehicle was stopped”

    Okay, idk why we would blame this one on the self driving car…

    “a collision with a heavy truck at 4 mph, and two separate incidents where the Tesla backed into objects, one into a pole or tree at 1 mph and another into a fixed object at 2 mph.”

    original source

    The difference is that a lot of these never get reported when a human driver does them. I very much doubt the real rate is 4x the human rate. I’m not saying the self-driving cars are good; I’m just saying human drivers are really bad.

  • Paranoidfactoid@lemmy.world · 2 days ago

    Clearly, AI isn’t just challenging human performance, it’s exceeding it. Four times the crash rate is just the beginning. Just imagine the crash rate when super intelligence comes!

    🚘💥🚗

    • slevinkelevra@sh.itjust.works · 3 days ago

      Yeah, that’s well known by now. However, safety through additional radar sensors costs money, and they can’t have that.

  • dogslayeggs@lemmy.world · 2 days ago

    It’s important to draw a line between what Tesla is trying to do and what Waymo is actually doing. Tesla’s crash rate is 4x the average human driver’s; Waymo’s is lower than the human average.

    • merc@sh.itjust.works · 2 days ago

      Not just lower, a tiny fraction of the human rate of accidents:

      https://waymo.com/safety/impact/

      Also, AFAIK this includes cases where the Waymo car isn’t even slightly at fault. There have been two deaths involving a Waymo car. In one case a motorcyclist hit the car from behind, flipped over it, then was hit by another car and killed. In the other case, ironically, the car actually at fault was a Tesla driven by a human who claims he experienced “sudden unintended acceleration”. It was doing 98 miles per hour in downtown SF when it hit a group of cars stopped at a red light, then spun into oncoming traffic and killed a man and his dog who were in another car.

      Whether or not self-driving cars are a good thing is up for debate. But, it must suck to work at Waymo and to be making safety a major focus, only to have Tesla ruin the market by making people associate self-driving cars with major safety issues.

      • ramenshaman@lemmy.world · 3 days ago

        The best time to add lidar would have been years ago; the second best time is right now. I don’t think he’d have to update the old cars, it could just be part of the hardware V5 package. He’s obviously comfortable with customers beta testing production vehicles, so he can either start building a lidar-equipped fleet now or keep failing to make reliable self-driving cars.

        • matlag@sh.itjust.works · 2 days ago

          Agreed, but since he has stated multiple times that every car built in the last xxx years is hardware-capable of L5 self-driving “next year” (no need to specify the year; the statement is repeated every year), adding LIDAR now would open the way to a major class action. So he has painted himself into a corner, and like all gigantic-ego idiots, he doubles down every time he’s asked.