The U.S. government’s road safety agency is again investigating Tesla’s “Full Self-Driving” system, this time after getting reports of crashes in low-visibility conditions, including one that killed a pedestrian.

The National Highway Traffic Safety Administration says in documents that it opened the probe on Thursday after the company reported four crashes in which Teslas entered areas of low visibility, including sun glare, fog and airborne dust.

In addition to the pedestrian’s death, another crash involved an injury, the agency said.

Investigators will look into the ability of “Full Self-Driving” to “detect and respond appropriately to reduced roadway visibility conditions, and if so, the contributing circumstances for these crashes.”

  • rsuri@lemmy.world · 1 month ago

    Musk has said that humans drive with only eyesight, so cars should be able to drive with just cameras.

    This of course assumes 1) that cameras are just as good as eyes (they’re not) and 2) that the processing of visual data that the human brain does can be replicated by a machine, which seems highly dubious given that we only partially understand how humans process visual data to make decisions.

    Finally, it assumes that the current rate of human-caused crashes is acceptable. Which it isn’t. We tolerate those crashes because we can’t improve human drivers without unrealistic expense. In an automated system, if a bit of additional hardware can significantly reduce crashes, it’s irrational not to add it.
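
    A back-of-envelope expected-value check makes that point concrete. Every number below is assumed purely for illustration (none come from the article or from Tesla):

    ```python
    # All figures are assumptions for illustration, not real Tesla or NHTSA data.
    crash_cost = 300_000        # assumed societal cost of one serious crash, USD
    crash_risk = 0.02           # assumed chance a given vehicle has a serious crash over its life
    risk_reduction = 0.25       # assumed fraction of those crashes extra sensors would prevent
    hardware_cost = 1_000       # assumed per-vehicle cost of the extra sensor

    expected_saving = crash_cost * crash_risk * risk_reduction
    print(expected_saving)                  # 1500.0 USD expected saving per vehicle
    print(expected_saving > hardware_cost)  # True -> skipping the hardware is the irrational choice
    ```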

    • blady_blah@lemmy.world · 1 month ago

      This is a direct result of Elon’s edict that Tesla cars don’t use lidar. If you aren’t aware, Elon set that as a requirement at the beginning of Tesla’s self-driving project because he didn’t want to spend the money on lidar for all Tesla cars.

      His “first principles” logic is that humans don’t use lidar, therefore self-driving should be achievable without (expensive) enhanced vision tools. While that statement has some modicum of truth, it obviously trades off safety in situations where vision is compromised. Think fog, or sunlight shining into your cameras / eyes, or a person running across the street at night wearing all black. There are obvious scenarios where lidar is a massive safety advantage, but Elon made a decision for $$ to not have that. This sounds like a direct and obvious outcome of that edict.
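
      To make the trade-off concrete, here is a toy sketch (nothing like Tesla’s actual stack; every name and threshold below is invented) of why an independent ranging sensor matters when the camera is blinded:

      ```python
      # Toy illustration only: invented names and thresholds, not Tesla's real pipeline.
      from dataclasses import dataclass
      from typing import Optional

      @dataclass
      class CameraFrame:
          confidence: float                     # 0..1, drops with glare, fog, darkness
          nearest_obstacle_m: Optional[float]   # None if vision detects nothing

      @dataclass
      class LidarScan:
          nearest_return_m: Optional[float]     # direct range measurement, independent of lighting

      def assumed_obstacle_distance(cam: CameraFrame, lidar: Optional[LidarScan]) -> float:
          """Distance (m) the planner should assume to the nearest obstacle."""
          CAMERA_TRUST_THRESHOLD = 0.6          # illustrative cutoff
          if cam.confidence >= CAMERA_TRUST_THRESHOLD and cam.nearest_obstacle_m is not None:
              return cam.nearest_obstacle_m     # vision is healthy: use it
          if lidar is not None and lidar.nearest_return_m is not None:
              return lidar.nearest_return_m     # vision degraded: lidar still measures range
          return 0.0                            # camera-only and blinded: assume the worst

      # Sun glare washes out the camera while a pedestrian stands 25 m ahead.
      glare = CameraFrame(confidence=0.2, nearest_obstacle_m=None)
      print(assumed_obstacle_distance(glare, LidarScan(nearest_return_m=25.0)))  # 25.0
      print(assumed_obstacle_distance(glare, None))                              # 0.0 -> brake blind
      ```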

        • Echo Dot@feddit.uk · 1 month ago

          You need slightly more advanced lidar for cars because you need to be able to see farther ahead than 10 ft, and you need to be able to see in adverse weather conditions (rain, fog, snow), which I assume you don’t experience indoors. That said, it really isn’t as expensive as he is making it out to be.

    • TheKMAP@lemmynsfw.com · 1 month ago

      If the camera system + software is even 1% safer than a human driver, and a given human can’t afford the lidar version, society is still better off with that human using the camera-based FSD than driving manually (rough numbers below). Elon being a piece of shit doesn’t detract from this fact.

      But yes, there are a lot of “ifs” in there, and obviously he did this to cut costs or simplify the supply chain or whatever.
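
      To put the “1% safer” framing in raw numbers, here’s a toy fleet-level calculation. Every figure below is assumed for illustration (none come from the article or NHTSA):

      ```python
      # Illustrative only: assumed figures, not data from the article or NHTSA.
      fatal_crashes_per_100m_miles = 1.3        # assumed human-driver fatality rate per 100M vehicle miles
      fleet_miles_per_year = 100_000_000_000    # assumed annual miles driven by the hypothetical fleet
      improvement = 0.01                        # the "1% safer" figure from the comment

      human_fatalities = fatal_crashes_per_100m_miles * fleet_miles_per_year / 100_000_000
      fsd_fatalities = human_fatalities * (1 - improvement)
      print(human_fatalities - fsd_fatalities)  # ~13 fewer fatal crashes per year, even at only 1%
      ```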

      Lidar or other tech will become more relevant once we’ve raised the floor (everyone getting the added safety over manual driving) and other FSD systems become more mainstream (competition).