This Video Claims a Tesla Repeatedly Ran Over a Mannequin During Autopilot Test


This video, which appears to show a Tesla making very flawed decisions while in self-driving mode, could demonstrate that the technology simply isn’t ready.

The test shown in the video was conducted on Tesla’s Full Self-Driving (FSD) system and appears to show the car striking a child-sized mannequin with sufficient force to prove fatal in real life.

Research from safety and security group The Dawn Project – whose founder has invested in ad campaigns attacking Tesla’s FSD Beta program – put a 2019 Tesla Model 3 to the test. The car, placed in full self-driving mode, drove at 40mph down a lane marked by traffic cones.

At the end of the lane was a child-sized mannequin, and the group claims in its report that the conditions were optimal:

“To isolate the situation, all variables were removed from the situation except for the vehicle, the child, and the road itself. This made the testing environment more favorable to FSD (full self drive), since a real-world scenario may include distracting elements such as other vehicles in motion, weather, signage, parked cars, shadows, etc.”

In all three tests documented in the video and report, the Tesla failed to avoid the mannequin, striking it at an average speed of 25mph.

“This is the worst commercial software I’ve ever seen,” said founder Dan O’Dowd in a statement. “We need regulations that prohibit self-driving cars from driving on our roads until the manufacturer proves they will not mow down children in crosswalks.”

O’Dowd clearly feels strongly that Tesla’s system – and other self-driving vehicles – are simply not ready for the real world.

“Elon Musk says Tesla’s Full Self-Driving software is ‘amazing.’ It’s not. It’s a lethal threat to all Americans. Over 100,000 Tesla drivers are already using the car’s Full Self-Driving mode on public roads, putting children at great risk in communities across the country.”
