A fan of Tesla might think that the automaker just can’t catch a break when it comes to its autonomous driving tech. It’s already subject to several federal investigations over its marketing and deployment of technologies like Autopilot and Full Self-Driving (FSD), and as of last week, we can add another to the list involving around 2.4 million Tesla vehicles. This time, regulators are assessing the cars’ performance in low-visibility conditions after four documented accidents, one of which resulted in a fatality.
The National Highway Traffic Safety Administration (NHTSA) says this new probe is looking at incidents where FSD was engaged in fog, airborne dust, or sun glare that blinded the car’s cameras, leading to a problem.
What the car can “see” is the big issue here. It’s also what Tesla bet its future on.
Aren’t cameras the only sensors we have that can recognize lane markings? This article is bunk for making it seem like that’s not the industry standard. RADAR can’t see paint on the road, and my understanding is that LiDAR can’t pick out lane markings reliably enough for real-time use at highway speeds either.
It’s not only about seeing the markings. It’s also about recognizing objects on a potential collision course in less-than-ideal conditions.
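To make the commenters’ point concrete, here is a minimal sketch of classic camera-based lane-marking detection using OpenCV (Canny edges plus a probabilistic Hough transform). This is not Tesla’s pipeline; the thresholds, region of interest, and the test image name `road.jpg` are illustrative assumptions. It shows why lane finding is fundamentally a vision problem: the whole approach depends on contrast and edges in the image, which is exactly what fog, dust, and sun glare wash out.

```python
# Illustrative sketch only -- NOT Tesla's actual perception stack.
import cv2
import numpy as np


def detect_lane_lines(frame_bgr: np.ndarray) -> np.ndarray:
    """Return detected line segments as (x1, y1, x2, y2) rows, or an empty array."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)

    # Edge detection: glare or fog reduces contrast, so few edges survive this step.
    edges = cv2.Canny(blurred, 50, 150)

    # Keep only a trapezoidal region roughly in front of the vehicle.
    h, w = edges.shape
    mask = np.zeros_like(edges)
    roi = np.array([[(0, h), (w // 2 - 50, h // 2),
                     (w // 2 + 50, h // 2), (w, h)]], dtype=np.int32)
    cv2.fillPoly(mask, roi, 255)
    masked = cv2.bitwise_and(edges, mask)

    # Fit straight line segments to the remaining edge pixels.
    lines = cv2.HoughLinesP(masked, rho=1, theta=np.pi / 180, threshold=50,
                            minLineLength=40, maxLineGap=25)
    return lines if lines is not None else np.empty((0, 1, 4), dtype=np.int32)


if __name__ == "__main__":
    frame = cv2.imread("road.jpg")  # hypothetical dashcam still
    if frame is not None:
        print(f"Detected {len(detect_lane_lines(frame))} candidate lane segments")
```

The sketch also illustrates the second comment’s concern: a pipeline like this only finds painted lines, not obstacles, and anything that degrades the camera image degrades both tasks at once.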