A fan of Tesla might think that the automaker just can’t catch a break when it comes to its autonomous driving tech. It’s already subject to several federal investigations over its marketing and deployment of technologies like Autopilot and Full Self-Driving (FSD), and as of last week, we can add another to the list involving around 2.4 million Tesla vehicles. This time, regulators are assessing the cars’ performance in low-visibility conditions after four documented accidents, one of which resulted in a fatality.

The National Highway Traffic Safety Administration (NHTSA) says the new probe examines crashes in which FSD was engaged in reduced-visibility conditions, such as fog, airborne dust, or sun glare blinding the car’s cameras.

What the car can “see” is the big issue here. It’s also what Tesla bet its future on.

  • snooggums@lemmy.world · 14 days ago

    First of all, Elon isn’t that smart.

    Second, it would cost more money to put multiple types of sensors on the car. Spending money bad!

    Personal speculation based on Elon’s past behavior follows:

    Plus, he likely wanted to focus on visual recognition because it offers multiple possible income streams, compared to a sensor that is just good at keeping a car from running into things. Focusing on the visible light spectrum opens up possibilities like facial recognition and data collection by a fleet of Teslas taking pictures, including the ones people bought.

    Basically he wanted to focus on the one thing that seemed more profitable and didn’t want to spend money on that stupid thing that just kept the car from crashing.

    • skyspydude1@lemmy.world · 13 days ago

      I can tell you the real reason: cameras are cheap for the amount of stuff you can kinda-sorta manage with them. That’s literally it. There’s no other 4D-chess game of data collection or anything else. They’re cheap to add and integrate, and adequate for object detection in typical scenarios. No need to worry about the shape of the bumper or the paint affecting the radar, no need to have a bunch of individual ultrasonics integrated into the bumper and the associated wiring/labor costs. (A toy sketch of this camera-versus-sensor-fusion trade-off follows this comment.)

      I worked with them, and there were numerous times when they came to us asking for new sensors because their cameras were too shitty for what they wanted to do; then, once they got a quote, they miraculously didn’t need them and figured it out. It happened with the corner radars on the Y, it happened with them removing the front radars on everything, and it happened with the ultrasonics.

      They bet it all on cameras to sell consumers and investors on the lie that their cheap shit-boxes would be income-generating Robotaxis. Even worse, their own engineers had hard data showing that removing the radar would directly result in pedestrian and motorcyclist deaths, but they had to keep those bullshit production numbers going, so they took the radar out, and that has directly resulted in dozens of likely preventable deaths.

      Anyone who’s ever worked with Tesla directly knows they’re an absolute fucking nightmare, and even compared to the shitshow of GM or Stellantis, the absolute blatant disregard for human life at that company is disgusting.
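To make the trade-off described in the thread concrete, here is a minimal, purely illustrative sketch in Python. It is not Tesla’s software or any real perception stack; every class name, threshold, and function here is invented for the example. It only shows why a camera-only policy has no fallback when glare or fog crushes the detector’s confidence, while even a crude radar cross-check can cover that gap.

```python
# Illustrative sketch only: a toy "camera-only vs. fused" braking check.
# All names and thresholds are invented; this is not any vendor's actual logic.
from dataclasses import dataclass
from typing import Optional


@dataclass
class CameraDetection:
    confidence: float   # 0.0-1.0 detector score for "obstacle ahead"
    visibility: float   # 0.0-1.0 image-quality estimate (fog/glare lowers this)


@dataclass
class RadarReturn:
    range_m: float            # distance to the strongest return, in meters
    closing_speed_mps: float  # how fast the object is approaching


def should_brake_camera_only(cam: CameraDetection, threshold: float = 0.6) -> bool:
    """Camera-only policy: act only on the detector score.

    In glare or fog the score drops along with visibility, so a real obstacle
    can fall below the threshold and the system never reacts.
    """
    return cam.confidence >= threshold


def should_brake_fused(cam: CameraDetection,
                       radar: Optional[RadarReturn],
                       threshold: float = 0.6) -> bool:
    """Toy fusion policy: trust the camera when it can see, but let a close,
    fast-closing radar return override it when visibility is poor."""
    if should_brake_camera_only(cam, threshold):
        return True
    degraded = cam.visibility < 0.4
    radar_says_obstacle = (
        radar is not None
        and radar.range_m < 40.0
        and radar.closing_speed_mps > 5.0
    )
    return degraded and radar_says_obstacle


if __name__ == "__main__":
    # Sun glare: the camera score collapses, but the radar still sees the car ahead.
    glare_cam = CameraDetection(confidence=0.2, visibility=0.15)
    radar_hit = RadarReturn(range_m=25.0, closing_speed_mps=12.0)

    print(should_brake_camera_only(glare_cam))       # False: camera alone misses it
    print(should_brake_fused(glare_cam, radar_hit))  # True: radar covers the gap
```

Real systems fuse tracked objects over time rather than single frames and tune their thresholds against validation data; the point of the sketch is only that a second, weather-robust modality gives the planner something to act on when the cameras are blinded.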