A Tesla Model Y hums along under the control of Full Self-Driving (FSD) software on an Austin street, where a school bus sits parked, its red lights flashing, stop signs extended like a warning hand. Nearby, a child-sized mannequin darts across the road. The scene is a staged safety test by activists from The Dawn Project and Tesla Takedown, groups determined to expose cracks in Tesla's semi-autonomous tech.
Yet the story isn't as simple as a rogue car failing a test. FSD, branded as "Supervised," demands a human driver's constant attention, with a hand ready to intervene. In this Austin setup, the activists' footage shows the Tesla hitting the mannequin, but the real dispute lies in expectation versus reality. An alert, engaged human driver would likely have braked for the bus's flashing lights, overriding FSD's decision-making. The test's design seems to lean on FSD's autonomy while sidelining the human supervision Tesla emphasizes. Critics also point out that Austin's Robotaxi fleet will run a newer, unreleased FSD version, potentially more robust, with a "4x parameters" upgrade looming.
Digging deeper, the test’s context reveals a clash of agendas. The Dawn Project and Tesla Takedown aren’t neutral observers; they’re vocal Tesla adversaries. Their collaboration here feels like a calculated jab, aiming to amplify doubts about FSD’s safety as Elon Musk’s Robotaxi vision nears its debut. They’ve even pushed to boot Tesla from Electrify Expo, calling the company a “financial engine” for Musk’s broader ambitions. Meanwhile, Tesla’s defenders argue FSD’s vision-based system, sans radar or LiDAR, isn’t inherently flawed but requires drivers to stay vigilant. The Austin test, they claim, exaggerates FSD’s role by simulating a scenario where human oversight should’ve kicked in.
Data from past incidents adds weight to both sides. Tesla’s FSD has stumbled before—crashing into a tree in Arizona, racking up fines in China, even striking a chicken without flinching. Yet, it’s also navigated dirt paths in China and dodged geese in other tests, showing flashes of competence. The National Highway Traffic Safety Administration is probing FSD after crashes in low-visibility conditions, hinting at limits in its camera-only approach. Rivals using LiDAR, like Waymo, boast better obstacle detection in fog or glare, but Tesla’s bet on vision mirrors human driving, for better or worse.
As Robotaxi’s launch looms, this Austin test feels like a flare shot into a stormy sky. Activists want you to see a reckless machine; Tesla wants you to see a tool misused. The truth likely lies in the gray—FSD is neither a flawless savior nor a runaway disaster.