Tesla Full Self-Driving Tests Reveal Dangers: Speeds Past Stopped School Bus, Strikes Dummy Kids

Third-party testing conducted by The Dawn Project and partners has revealed serious safety concerns about Tesla's Full Self-Driving (FSD) software, especially in sensitive areas like residential streets and school zones. In the demonstration, a Tesla vehicle ignored the deployed stop sign of a stopped school bus, failed to stop, and collided with child-sized dummies.

The Testing Setup and Findings

Using Tesla's FSD (Supervised) software, testers staged a real-world scenario with a school bus stopped as if dropping off and picking up students. Despite the bus's clearly deployed stop sign, the Tesla Model Y sped past and struck a child-sized dummy crossing the street from a blind spot behind a parked car.

Tesla previously claimed that FSD could detect pedestrians even in blind spots when its vehicles used radar. Since switching to Tesla Vision, which relies solely on cameras, however, pedestrian detection behind obstructions appears compromised. Across multiple trials, the vehicle failed to stop or yield eight times, striking the test dummies repeatedly.

Ongoing Scrutiny Over Tesla’s Autonomous Driving

Tesla’s self-driving technology has faced increasing criticism after accidents and fatalities, including a high-profile lawsuit settled last year related to a fatal Model X crash involving Autopilot.

Regulatory scrutiny intensified after the California DMV filed a false advertising lawsuit against Tesla in 2022, accusing the company of misleading claims about autonomous capabilities. An administrative judge recently allowed the DMV to pursue the case further.

Despite these challenges, Tesla continues real-world testing of its FSD technology in cities like San Francisco and Austin, aiming to launch a robotaxi service with Model Ys equipped with FSD Supervised features.

Author’s Opinion

Tesla's recent testing failures highlight the risks of rolling out autonomous driving technology prematurely, especially in complex environments like school zones. While innovation is crucial, safety cannot be compromised when lives are at stake. The shift from radar to camera-only detection may have unintended consequences that demand urgent attention. Tesla must ensure its systems can reliably protect the most vulnerable road users before expanding its self-driving ambitions.
