Driverless cars worse at detecting children and darker-skinned pedestrians, say scientists: Researchers call for tighter regulations following major age- and race-based discrepancies in AI autonomous systems.

  • reddig33@lemmy.world · +35 · 2 years ago

    LiDAR doesn’t see skin color or age. Radar doesn’t either. Infra-red doesn’t either.

    • quirk_eclair78@lemmy.world · +9/−1 · 2 years ago

      That’s a fair observation! LiDAR, radar, and infra-red systems might not directly detect skin color or age, but the point being made in the article is that there are challenges when it comes to accurately detecting darker-skinned pedestrians and children. It seems that the bias could stem from the data used to train these AI systems, which may not have enough diverse representation.
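As a purely illustrative sketch (the numbers below are invented, not from the article), "bias from training data" usually surfaces as a higher false negative rate, i.e. miss rate, for the underrepresented group:

```python
# Hypothetical detection outcomes (invented numbers, not real study data):
# 1 = pedestrian detected, 0 = missed.
results = {
    "well_represented_group":  [1, 1, 1, 1, 1, 1, 1, 1, 1, 0],  # 1 miss in 10
    "under_represented_group": [1, 1, 1, 1, 1, 1, 1, 0, 0, 0],  # 3 misses in 10
}

def fnr(detections):
    # False negative rate: fraction of real pedestrians the system missed.
    return detections.count(0) / len(detections)

for group, detections in results.items():
    print(f"{group}: miss rate = {fnr(detections):.0%}")
```

Reporting the miss rate per group, rather than one overall number, is how studies like the one in the article make this kind of gap visible.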

      • bassomitron@lemmy.world · +11 · 2 years ago

        The main issue, as someone else pointed out as well, is with image detection systems only, which is what this article is primarily discussing. Lidar does have its own drawbacks, however; I wouldn’t be surprised if those systems still didn’t detect children as reliably. Skin color definitely wouldn’t be a consideration for it, though, as that’s not really how that tech works.

  • Eager Eagle@lemmy.world · +16/−2 · edited · 2 years ago

    I hate all this bias bullshit because it makes the problem bigger than it actually is and passes the wrong idea to the general public.

    A pedestrian detection system shouldn’t have as its goal to detect skin tones and different pedestrian sizes equally. There’s no benefit in that. It should do the best it can to reduce the false negative rates of pedestrian detection regardless, and hopefully do better than human drivers in the majority of scenarios. The error rates will be different due to the very nature of the task, and that’s ok.

    This is what actually happens in research for the most part, but the media loves to stir up polarization and the public gives it their clicks. Pushing for a “reduced-bias model” can actually be detrimental to overall performance, because it incentivizes development of models that perform worse in scenarios where they could have an edge, just to serve an artificial demand for reduced bias.
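The tradeoff this commenter is describing can be sketched with made-up numbers (the models, rates, and exposure weights below are all hypothetical): a model with unequal per-group miss rates can still miss fewer pedestrians overall than a model tuned to equalize the rates.

```python
# Fraction of pedestrian encounters falling in each (hypothetical) group.
exposure = {"group_1": 0.7, "group_2": 0.3}

# Per-group false negative (miss) rates for two hypothetical models.
model_a = {"group_1": 0.02, "group_2": 0.05}  # unequal rates, better overall
model_b = {"group_1": 0.06, "group_2": 0.06}  # equalized rates, worse overall

def overall_fnr(per_group_fnr):
    # Overall miss rate, weighted by how often each group is encountered.
    return sum(exposure[g] * per_group_fnr[g] for g in exposure)

print(f"Model A overall miss rate: {overall_fnr(model_a):.3f}")  # 0.029
print(f"Model B overall miss rate: {overall_fnr(model_b):.3f}")  # 0.060
```

Under these invented numbers the "equalized" model B misses roughly twice as many pedestrians in total, which is the kind of regression the comment is warning about; whether that tradeoff actually occurs depends on the real models and data.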

  • 666dollarfootlong@lemmy.world · +11/−1 · 2 years ago

    Wouldn’t good driverless cars use radars or lidars or whatever? Seems like the biggest issue here is that darker skin tones are harder for cameras to see

    • mint_tamas@lemmy.world · +1 · 2 years ago

      I think many driverless car companies insist on only using cameras. I guess lidars/radars are expensive.

      • skyspydude1@lemmy.world · +5/−1 · 2 years ago

        They’re basically the only one. Even MobilEye, who is objectively the best in the ADAS/AV space for computer vision, uses other sensors in their fleet. They have demonstrated camera-only autonomy, but realize it’s not worth saving $1000 in sensors if it risks killing people.

      • rDrDr@lemmy.world · +2 · 2 years ago

        Even Comma.AI, which is vision-only internally, still implicitly relies on the car’s built-in radar for collision detection and blind spot monitoring. It’s just Tesla that goes without.

        • DoomBot5@lemmy.world · +1 · 2 years ago

          To be fair, that’s because most cars aren’t equipped with cameras for blind spot detection.

          • rDrDr@lemmy.world · +1 · 2 years ago

            That’s because cameras aren’t good for blind spot detection. Moreover, even on cars that have cameras on the side, the Comma doesn’t use them. AFAIK, in my car with 360 cameras, the OEM system doesn’t use the cameras for blind spot detection either.

  • AllonzeeLV@lemmy.world · +16/−8 · edited · 2 years ago

    Worse than humans?!

    I find that very hard to believe.

    We consider it the cost of doing business, but self-driving cars have an obscenely low bar to clear to surpass us in terms of safety. The biggest hurdle they have to climb is accounting for irrational human drivers, and other irrational humans diving into traffic, whom even the rare decent human driver can’t always account for.

    American human drivers kill more than ten 9/11s’ worth of people every year. I’d rather modernizing and automating our roadways were a moonshot national endeavor, but we don’t do that here anymore, so we complain when the incompetent, narcissistic asshole who claimed the project for private profit turned out to be an incompetent, narcissistic asshole.

    The tech is inevitable; there are no physics or computational-power limitations standing in our way. We just lack the will to act as a society (that means funding things together through taxation) and do it.

    Let’s just trust another billionaire to do it for us and act in the best interests of society, though; that’s been working just gangbusters, hasn’t it?

      • Lucidlethargy@sh.itjust.works · +4/−1 · 2 years ago

        You’re proving their point. That’s Tesla, the one run by an edgy, narcissistic billionaire asshole, not the companies with better tech under (and above, in this case) the hood.

        • pendingdeletion@lemmy.world · +2 · 2 years ago

          All I am saying is that the article doesn’t attempt any comparison between humans’ and AI’s ability to detect dark-skinned people or children… the “worse” in the poorly worded (misleading) headline was comparing the detection rates of AI systems only.

    • zephyreks@programming.dev · +2/−1 · 2 years ago

      Ah yes, because all humans are equally bad drivers.

      A self-driving car shouldn’t compete with the average human because the average human is a fucking idiot. A self-driving car should drive better than a good driver, or else you’re just putting more idiots on the road.

      • duffman@lemmy.world · +2 · 2 years ago

        Replacing bad drivers with OK drivers is a net win. Let’s not let perfection be the enemy of progress.

  • ChromeSkull@lemmy.world · +3 · 2 years ago

    A single FLIR camera would help massively. They don’t care about colour or height, only temperature.

    • UsernameIsTooLon@lemmy.world · +1 · 2 years ago

      I could make a warm water balloon in the shape of a human and it would stop the car then. Maybe a combination of various types of technologies? You’d still have to train the model on all kinds of humans, though.
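The "combination of technologies" idea boils down to fusing independent detections so no single spoofable sensor decides alone. A minimal sketch, with entirely hypothetical sensor names, confidence values, and thresholds:

```python
# Toy sensor-fusion rule (hypothetical sensors and thresholds, for illustration):
# flag a pedestrian if any one modality is highly confident, or if at least
# two modalities agree with moderate confidence.
def fuse(detections):
    """detections: dict mapping sensor name -> confidence in [0, 1]."""
    high_conf = [s for s, c in detections.items() if c >= 0.9]
    moderate  = [s for s, c in detections.items() if c >= 0.5]
    return bool(high_conf) or len(moderate) >= 2

# Camera is unsure at night, but lidar and FLIR agree: pedestrian flagged.
print(fuse({"camera": 0.4, "lidar": 0.6, "flir": 0.7}))  # True
# One weak reading alone (e.g. a warm balloon fooling only FLIR) is not enough.
print(fuse({"camera": 0.4, "lidar": 0.2, "flir": 0.1}))  # False
```

Real systems use far more sophisticated fusion, but the principle is the same: a decoy that fools one modality still has to fool the others.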

    • FooBarrington@lemmy.world · +4/−1 · 2 years ago

      What? No. They’d need to recognise them better - otherwise how can they swerve to make sure they hit them?

  • Fedizen@lemmy.world · +3 · 2 years ago

    Cars should be tested for safety in collisions with children, and it should affect their safety rating and taxes. Driverless equipment shouldn’t be allowed on the road until these sorts of issues are resolved.

  • thantik@lemmy.world · +4/−5 · edited · 2 years ago

    It’s almost like lower contrast against a black road or smaller targets is computationally harder to detect or something! Weird! How about instead of acknowledging this pretty clear fact, we get outraged and claim it’s racism or something! Yeah!!