Dan O’Dowd, a software company CEO who published the video earlier this month, thinks the National Highway Traffic Safety Administration should ban “full self-driving” until Tesla CEO Elon Musk “proves it won’t mow down children”.
That’s when Cupani, who runs an auto shop focused on imports and Teslas, got involved and recruited his son. While he’s a self-described “BMW guy”, Cupani says BMW’s software can’t compare to what Tesla offers.
“Some people look at it and say ‘oh this crazy dad, what is he doing?'” Cupani told CNN Business.
“Well, I do a lot of stuff like that but I’m going to make sure my kid doesn’t get hit.”
Cupani accelerated the Tesla from the other side of the lot and turned on “full self-driving,” reaching 35mph (56km/h). The Tesla braked steadily and came to a full stop — well ahead of his son.
“This Dan guy, he says he’s an expert in this, expert in that,” Cupani said.
“Well, I’m an expert in automotive, future technology, pro-driving instructor.”
The passionate defences and criticism of “full self-driving” highlight how the technology has become a flashpoint in the industry.
Ralph Nader, whose 1960s criticism of the auto industry helped spark the creation of the National Highway Traffic Safety Administration, joined a chorus of critics of “full self-driving” this month.
But it is also yet another example of the unintended consequences of deploying an unfinished, disruptive technology in the wild, and shows how far some Tesla believers are willing to go to defend the technology and the company.
Enough people appeared to be pursuing their own experiments that one government agency took the extraordinary step of warning people not to use children to test a car’s technology.
“Consumers should never attempt to create their own test scenarios or use real people, and especially children to test the performance of vehicle technology,” NHTSA said in a statement Wednesday. The agency called this approach “highly dangerous”.
Testing Teslas
Earlier this month, California resident Tad Park saw that another Tesla enthusiast wanted to test “full self-driving” with a child, and volunteered two of his children.
Park told CNN Business it was a “little tough” to get his wife to agree; she came around when he promised to drive the vehicle himself.
“I’m never going to push the limits because my kids are way more valuable to me than anything,” Park said.
“I’m not going to risk their lives in any way.”
Park said he wasn’t comfortable doing a higher-speed test at 40mph with his kids, like the one O’Dowd performed using the mannequins.
Toronto resident Franklin Cadamuro created a “box boy”, a childlike form crafted from old Amazon cardboard boxes.
“I am a big Tesla fan,” Cadamuro said.
His Tesla slowed as it approached “box boy”. Then it sped up again and struck the cardboard mannequin. Cadamuro speculated that this could be because the cameras could not see the short boxes once they were immediately in front of the bumper, and the software therefore forgot they were there.
Cadamuro said his video started as entertainment. But he wanted people to see that “full self-driving” isn’t perfect.
“I find a lot of people have two extreme thoughts of the ‘full self-driving’ beta,” Cadamuro said.
“People like Dan think it’s the worst thing in the world. I know some friends who think it’s near perfect.”
Cadamuro said he also performed other tests in which his Tesla, traveling at higher speeds, effectively steered around “box boy”.
For a computer vision system like the one Tesla vehicles rely on, quickly and accurately detecting smaller objects such as young children will generally be harder than sensing larger objects and adults, according to Raj Rajkumar, a Carnegie Mellon University professor who researches autonomous vehicles.
The more pixels an object takes up in a camera image, the more information the system has to detect features and identify the object. The system is also shaped by the data it is trained on, such as how many images of small children it is exposed to.
“Computer vision with machine learning is not 100 per cent foolproof,” Rajkumar said.
“Just like diagnosis of a disease, there are always false positives and negatives.”
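Rajkumar’s point about pixel count can be illustrated with a back-of-the-envelope pinhole-camera calculation: an object’s apparent height in the image shrinks in proportion to its distance, so a small child far down the road spans far fewer pixel rows than a nearby adult. The sketch below is a rough illustration only, not Tesla’s actual pipeline; the focal length and heights are assumed values for a generic forward-facing camera.

```python
# Rough pinhole-camera sketch (assumed numbers, not Tesla's actual system):
# apparent height in pixels = focal_length_px * real_height_m / distance_m.
# Fewer pixels on target means less information for a detector to work with.

FOCAL_LENGTH_PX = 1400  # assumed focal length of a generic forward camera, in pixels

def apparent_height_px(real_height_m: float, distance_m: float) -> float:
    """Return roughly how many pixel rows an upright object spans at a given distance."""
    return FOCAL_LENGTH_PX * real_height_m / distance_m

if __name__ == "__main__":
    for label, height_m in [("adult, 1.75 m", 1.75), ("small child, 1.0 m", 1.0)]:
        for distance_m in (10, 30, 50):
            px = apparent_height_px(height_m, distance_m)
            print(f"{label} at {distance_m} m: ~{px:.0f} px tall")
```

With those assumed numbers, a one-metre-tall child at 50 metres spans only about 28 pixel rows, compared with roughly 49 for a 1.75-metre adult, which is the kind of gap Rajkumar is describing.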
Tesla did not respond to a request for comment and generally does not engage with the professional news media.
“Wild West chaos rules”
Some Tesla supporters had criticised O’Dowd’s use of cones as lane markings in his original testing, arguing the cones may have limited the sedan’s ability to steer around the mannequin.
Others claimed that O’Dowd’s test driver had forced the Tesla to strike the mannequin by pushing the accelerator, which wasn’t visible in videos O’Dowd released.
Some Tesla enthusiasts also pointed to blurry messages on the screen of the Tesla vehicle as an indication that O’Dowd’s test driver was pushing the accelerator to rig the tests.
O’Dowd told CNN Business that the blurry messages referred to supercharging being unavailable, and to uneven tire wear.
CNN Business could not independently verify what the messages said, as O’Dowd did not provide crisper video of what happened inside the car during the tests.
O’Dowd is the founder of the Dawn Project, an effort to make computers safe for humanity. He ran unsuccessfully for the US Senate this year in a campaign focused exclusively on his critique of “full self-driving”.
NHTSA is currently investigating Tesla’s driver-assist technology, so changes may be ahead.
“The software that controls billions of people’s lives in self-driving cars should be the best software ever written,” O’Dowd said.
“We’re using absolute, Wild West chaos rules and we’ve gotten something that is so terrible.”