Third-party testing conducted by The Dawn Project and partners has revealed serious safety concerns about Tesla’s Full Self-Driving (FSD) software, especially in sensitive areas like residential streets and school zones. In the demonstration, a Tesla vehicle drove past a stopped school bus with its stop sign deployed, failing to brake and colliding with child-sized dummies.
The Testing Setup and Findings
Using Tesla’s FSD (Supervised) software, testers staged a real-world scenario: a school bus stopped with its stop sign extended, as if dropping off and picking up students. Despite the clearly visible stop sign, the Tesla Model Y drove past the bus and struck a child-sized dummy crossing the street from a blind spot behind a parked car.
Tesla previously claimed FSD could detect pedestrians even in blind spots when its vehicles still carried radar. Since the switch to Tesla Vision, which relies on cameras alone, pedestrian detection behind obstructions appears compromised. Across multiple trials, the vehicle failed to stop or yield eight times, striking the test dummies repeatedly.
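To make the architectural difference concrete, here is a minimal, purely illustrative Python sketch. This is not Tesla's actual software; the Detection type, the should_brake function, and the 30-meter threshold are all hypothetical. It simply shows why a sensor-fusion stack can act on a radar return even when a camera's line of sight is blocked, while a camera-only stack has nothing to act on.

```python
# Hypothetical illustration (not Tesla's code): how a camera-only perception
# stack can miss a pedestrian occluded by a parked car, while a radar-equipped
# stack might still register a return from behind the obstruction.
from dataclasses import dataclass

@dataclass
class Detection:
    source: str        # "camera" or "radar"
    occluded: bool     # True if direct line of sight is blocked
    range_m: float     # distance to the object in meters

def should_brake(detections: list[Detection], camera_only: bool) -> bool:
    """Return True if the planner should brake for a nearby pedestrian."""
    for d in detections:
        if camera_only and d.source != "camera":
            continue  # a vision-only stack discards radar returns entirely
        if d.source == "camera" and d.occluded:
            continue  # cameras require direct line of sight
        if d.range_m < 30.0:  # assumed braking threshold for this sketch
            return True
    return False

# A child stepping out from behind a parked car: the camera view is blocked,
# but radar energy reflecting under or around the car may still return.
scene = [
    Detection(source="camera", occluded=True, range_m=12.0),
    Detection(source="radar", occluded=True, range_m=12.0),
]

print(should_brake(scene, camera_only=False))  # True: fused stack brakes
print(should_brake(scene, camera_only=True))   # False: vision-only misses it
```

Under these assumptions, the fused stack brakes for the occluded child while the camera-only stack does not, which mirrors the failure mode the Dawn Project demonstration appears to expose.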
Ongoing Scrutiny Over Tesla’s Autonomous Driving
Tesla’s self-driving technology has drawn mounting criticism following accidents and fatalities, including a fatal Model X crash involving Autopilot that led to a high-profile lawsuit settled last year.
Regulatory scrutiny intensified after the California DMV filed a false advertising lawsuit against Tesla in 2022, accusing the company of misleading claims about autonomous capabilities. An administrative judge recently allowed the DMV to pursue the case further.
Despite these challenges, Tesla continues real-world testing of its FSD technology in cities like San Francisco and Austin, aiming to launch a robotaxi service using Model Ys running FSD (Supervised).
Author’s Opinion
Tesla’s recent testing failures highlight the risks of rolling out autonomous driving technology prematurely, especially in complex environments like school zones. While innovation is crucial, safety cannot be compromised when lives are at stake. The shift from radar to camera-only detection may have unintended consequences that demand urgent attention. Tesla must ensure its systems can reliably protect the most vulnerable road users before expanding its self-driving ambitions.