In the piece — titled “Can You Fool a Self Driving Car?” — Rober found that a Tesla car on Autopilot was fooled by a Wile E. Coyote-style wall painted to look like the road ahead of it, with the electric vehicle plowing right through it instead of stopping.
The footage was damning enough, with slow-motion clips showing the car crashing through not only the styrofoam wall but also a mannequin of a child. The Tesla was also fooled by simulated rain and fog.
I hope some of you actually skimmed the article and got to the “disengaging” part.
That's like writing "Musk made an awkward, confused gesture at a time and place a few people might call questionable."
That’s so wrong holy shit
Don't get me wrong, Autopilot turning itself off right before a crash is sus, and I wouldn't put it past Tesla to do something like that (I mean, come on, why don't they use lidar?). But maybe it's so the car doesn't keep powering the wheels after impact, which could make the crash worse.
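Purely speculating, but the benign version of that logic would look something like this sketch. Every name and threshold here is hypothetical, just to show the idea, not anything from actual Tesla firmware:

```python
# Hypothetical "cut propulsion just before impact" logic.
# All names and numbers are made up for illustration.

BRAKE_TTC_S = 1.5      # below this time-to-collision, brake
DISENGAGE_TTC_S = 0.3  # just before impact, stop driving the wheels

def control_step(ttc_s: float, autopilot_on: bool) -> str:
    """Return what the driver-assist system does this control tick."""
    if not autopilot_on:
        return "manual driving"
    if ttc_s < DISENGAGE_TTC_S:
        # Cut motor torque so the drivetrain isn't still pushing the
        # car through (or after) the impact; friction brakes and the
        # driver take over from here.
        return "disengage and cut drive torque"
    if ttc_s < BRAKE_TTC_S:
        return "apply brakes"
    return "cruise"
```

The cynical reading is the same code with a different motive: disengaging milliseconds before impact also means "Autopilot was not active at the time of the crash."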
On the other hand, they're POS cars, and Autopilot probably just shuts off because of the poor assembly, standards, and design that come from cutting corners.
I see your point, and it makes sense, but I would be very surprised if Tesla did this. I think the best option would be to turn the features off once an impact is detected. It shutting off beforehand feels like a cheap ploy to avoid guilt.
that’s exactly what it is.
Normal cars do whatever is in their power to stop while staying upright. In a wreck, the safest state for a car is not moving.
Wouldn’t it make more sense for autopilot to brake and try to stop the car instead of just turning off and letting the car roll? If it’s certain enough that there will be an accident, just applying the brakes until there’s user override would make much more sense…
False positives. Most likely it detected that something was off (a parking sensor tripped, for example) but didn't have high confidence that it wasn't an erroneous sensor reading. You don't want the car slamming on the brakes at highway speed for no reason and causing a multi-car pileup.
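That tradeoff in rough pseudocode (the thresholds and names are invented; nobody outside Tesla knows the real values):

```python
# Hypothetical confidence gate for emergency braking.
# Thresholds are illustrative only, not from any real system.

HARD_BRAKE_CONF = 0.99  # near-certain obstacle -> full emergency braking
HANDOFF_CONF = 0.60     # plausible obstacle -> warn and hand off

def react(obstacle_confidence: float) -> str:
    if obstacle_confidence >= HARD_BRAKE_CONF:
        return "full emergency braking"
    if obstacle_confidence >= HANDOFF_CONF:
        # Can't rule out a bad sensor reading; hard braking here risks
        # a rear-end pileup at highway speed, so alert and disengage.
        return "alert driver and disengage"
    return "keep cruising"
```

A painted wall that looks exactly like open road is the worst case for that middle branch: the cameras say "road," one sensor says "something there," and the tiebreaker is to hand the wheel back to the human.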
Rober seems to think so; he says in the video that it's likely disengaging because the parking sensors register the object right in front as though the car were parked, and that shuts off the cruise control.
It's always that way: fuck the consumer, it's all about making a buck.
Yeah, but that's milliseconds. Ergo, the crash was already going to happen.
In any case, the problem with Tesla Autopilot is that it doesn't have radar. It has no independent way to measure distance to objects, and there have been many instances where a Tesla crashed into a large, clearly visible object.
I've heard that too, and I don't doubt it, but watching Mark Rober's video, it looks like he's death-gripping the wheel pretty hard before the impact, which seems the more likely cause of the disengagement. Each time, you can see the wheel tug slightly to the left, but his death grip pulls it back to the right.