New Footage Shows Tesla On Autopilot Crashing Into Police Car After Alerting Driver 150 Times::Six officers who were injured in the crash are suing Tesla despite the fact that the driver was allegedly impaired
You know what might work: program the car so that after the second unanswered “alert”, the autopilot pulls the car over, or reduces speed and turns on the hazards. On the third violation, autopilot is disabled for that car for a period of time.
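Something like this, roughly (a minimal sketch of the idea; every method name here is invented for illustration, not a real vehicle API):

```python
# Hypothetical escalation policy for unanswered attention alerts.
# All methods on `car` are made up -- this is just the shape of the idea.

def handle_alert_timeout(strikes: int, car) -> None:
    """Escalate each time an attention alert goes unanswered."""
    if strikes == 1:
        car.sound_warning()              # first strike: loud audible/visual nag
    elif strikes == 2:
        car.turn_on_hazards()
        car.slow_and_pull_over()         # second strike: get off the road
    else:
        car.disable_autopilot(hours=24)  # third strike: feature lockout
```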
They didn’t say he didn’t respond to the alerts. If you don’t respond, autopilot turns off.
This is literally exactly how it works already. The driver must have been pulling on the steering wheel right before it gave him a strike. The system will warn you to pay attention for a few seconds before shutting down. Here’s a video: https://youtu.be/oBIKikBmdN8
Ah, so it’s just people defeating the system.
The thing with cars is that you don’t distract the driver from driving; having a system that takes over driving is exactly that distraction, so the idea of the system is flawed to begin with.
Screenshotting this because it’s so well put.
I have to say this is extremely inaccurate imo. Self driving takes over the menial tasks of keeping the car in the lane, watching the speed, etc. and allows an attentive driver to focus on more high level tasks like looking at the road ahead, watching the sides of the road for potential hazards, and keeping more aware of their blind spots.
Just because the feature can be abused does not inherently make it unsafe. A drunk driver can use cruise control to more accurately control the vehicle’s speed and avoid a ticket, does that make it a bad feature? I wouldn’t say so.
Autopilot and other driver assist systems are good when used responsibly and cautiously. It’s frustrating to see people cause an accident after misusing the system and blame the technology instead. This is why we can’t have nice things.
It’s frustrating to see
This is why we can’t have nice things
It is also frustrating to see people whining about the technology when they should rather be thinking about dead policemen and rescuers.
You should get your priorities straight if you ever hope to be taken seriously
The system will warn you to pay attention
… and if we have learned anything from that incident, it is that the warnings have been worthless.
The system can be tricked even by the worst drunkards!
for a few seconds before shutting down.
A few seconds are not enough. The crash was already unavoidable by then.
You’re misinterpreting what I said and conflating two separate scenarios in your 2nd statement. I didn’t say anything about the system warning “for a few seconds before shutting down” in the event of an imminent collision. It warns the driver before shutting down if the driver fails to hold the steering wheel during normal driving conditions.
The warnings were worthless because the driver kept responding to them just before they timed out and shut autopilot down. It would be even worse if the car immediately pulled off the road and stopped in traffic without warning the driver first.
They aren’t subtle either; after failing to touch the wheel for about 5-10 seconds, it starts beeping loudly and flashing an icon on the screen.
This is not a case of autopilot causing an accident, this is a case of an impaired driver operating a vehicle when they should not have been. If the driver was using standard cruise control, would we be blaming the vehicle because their foot wasn’t touching the accelerator when the accident happened? No, we wouldn’t.
This is not a case of autopilot causing an accident, this is a case of an impaired driver
It is both, of course. The drunkard and the autopilot have both added their share to creating a danger that ended this badly.
Driving drunk is already forbidden.
What Tesla has brought onto the road here should be forbidden as well: lane assist combined with adaptive cruise control AND such a bunch of blind sensors.
The driver was on Autopilot. Autopilot is cruise control plus lane assist; it’s not FSD. Tesla didn’t bring that “to the road”. The driver was drunk, and as with most Autopilot or FSD accidents, it’s user error.
I’m still unaware of any proven FSD accident.
Here is an alternative Piped link(s): https://piped.video/oBIKikBmdN8
Piped is a privacy-respecting open-source alternative frontend to YouTube.
I drive a Ford Maverick that is equipped with adaptive cruise control, and if I were to get 3 “keep your hands on the wheel” notifications, it would deactivate adaptive cruise until the vehicle is completely turned off and on again. It blew my mind to learn that Tesla doesn’t do something similar.
It does and did… He kept driving anyway. Drink drivers FTW.
I presume AEB kicked in, but all that can do is reduce the speed of impact… if you’re determined to kill yourself, there’s not much the car can do.
This is preventable and Tesla and other auto manufacturers should respond to make it so. No consumer vehicle should under any circumstance choose to drive full speed into a barrier or allow a driver to do so. It’s the reason we have driver assistance: improved safety.
The problem with this is what if the car thinks there’s a barrier in front of you but there isn’t? People are arguing that these systems are too intrusive while also arguing that they don’t go far enough to take control away from drivers.
This situation happened because a drunk driver ran into police cars, something that has been happening for as long as cars have existed.
That’s the issue with current “self driving” systems in a nutshell. We’re in this terrible middle ground right now where these features let careless drivers take their attention away, but not actually be able to control the vehicle safely. We should ban all that crap until actual self driving is viable.
How does it become viable if you ban the technology? What we have now is advanced cruise control that protects drivers in some circumstances while having zero effect in others. Drivers were equally dumb and careless long before this technology existed. This new tech doesn’t make that aspect any worse. Banning it now just means more people will crash and more people will be injured.
Here’s an article referencing a UK white paper that talks about the issues with Level 2 and 3 autonomous vehicles.
https://www.tu-auto.com/adas-level-2-3-avs-are-hazards-experts-warn/
*“With adaptive cruise control (ACC) for instance, it takes twice the amount of time to respond to a sudden braking event than it does when you are manually driving. Drivers may believe that ACC is safer but actually taking your foot off the accelerator pedal and letting the car make the decisions leads to lower workload and can mean drivers are unprepared for an unexpected event.”*

*University of Sussex object recognition researcher Dr Graham Hole was also questioned for the study and dubs Levels 2 and 3 “the worst of all worlds”. He says: “Human beings are rubbish at being vigilant – vigilance declines after about 20 minutes. With semi-autonomous you are reducing the driver to monitoring the system on the off-chance something goes wrong. Most of the time nothing goes wrong, leading the driver to have massive faith in the system in all conditions, which of course isn’t always the case.”*
Poor ~~drunk~~ impaired driver falling victim to autonomous driving… Hopefully that driver lost their license. That still doesn’t solve the problem of the autopilot not making the right choices, though. What if the driver wasn’t drunk, but had a heart attack? What if someone put a roofie in their drink? What if the driver was diabetic or hypoglycemic and suffered a sudden drop in blood glucose? What if they had a stroke?
Furthermore, what if the driver got drunk BECAUSE the car’s AI was advertised as being able to drive for you? Think of the false advertising.
If your AI can’t handle one simple case of a driver being unresponsive, that’s negligence on the company’s part.
How could the company be negligent if someone gets drunk or has a heart attack and crashes their car? No company has a Level 5 autonomous vehicle where no human intervention is needed. Tesla is only Level 2. Mercedes has a Level 3 option (in extremely limited conditions). Waymo claims Level 4 but is geofenced.
removed by mod
You’re completely right, and I’ve never seen this for traffic stops in Europe; they’ll make you park somewhere safe, or at the very worst in the emergency lane, though even that is rare for traffic stops. The only time I see lanes blocked is when there’s been an accident or breakdown, and then the first thing they do is bring massive light panels well ahead of the spot to make everyone clear the lane.
Data from the Autopilot system shows that it recognized the stopped car 37 yards or 2.5 seconds before the crash.
Is the video slowed down? In the video, if you pause 2.5 s before the crash, the stopped police car already seems very close. An (awake) human driver would’ve recognized the stopped police car from much further away than that.
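Quick sanity check on those two figures (my own arithmetic, not from the article): 37 yards in 2.5 seconds averages out to only about 30 mph of closing speed, well below highway speed, so either the car had already slowed or the two numbers don’t quite fit together.

```python
# Back-of-the-envelope check of the article's figures (37 yd, 2.5 s).
distance_m = 37 * 0.9144        # 37 yards ≈ 33.8 m
time_s = 2.5
closing = distance_m / time_s   # ≈ 13.5 m/s
print(f"{closing:.1f} m/s ≈ {closing * 3.6:.0f} km/h ≈ {closing * 2.237:.0f} mph")
# 13.5 m/s ≈ 49 km/h ≈ 30 mph average closing speed
```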
I find that it can be hard to tell when a car ahead is stopped; maybe the visual system on the Tesla has similar limitations. I think Autopilot is controlled by the cameras alone, but I’m not super up to date on Tesla stuff. I would assume even a basic radar setup could tell something was stationary from quite far away.
Teslas are not safe.
A Tesla on Autopilot/FSD is almost 4 times less likely to be involved in a crash than a human-driven Tesla, which even then is half as likely to end up in an accident compared to the average car. You not liking Musk fortunately doesn’t change these facts.
In the 2nd quarter, we recorded one crash for every 4.41 million miles driven in which drivers were using Autopilot technology (Autosteer and active safety features). For drivers who were not using Autopilot technology (no Autosteer and active safety features), we recorded one crash for every 1.2 million miles driven. By comparison, NHTSA’s most recent data shows that in the United States there is an automobile crash every 484,000 miles.
almost 4 times less likely to be involved in a crash than a human-driven
Not relevant at all here, when we are discussing occurrences that seem so easily and obviously avoidable.
(But it’s nice to see that the Fanboi team is awake now)
We’re talking about overall safety here. Even with a 99.99% safety rate you’d still get 33,000 accidents a year in the US alone. There are always going to be individual incidents to talk about.
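For scale, here’s where a figure like that comes from, assuming the base is roughly the US population of ~330 million (my assumption; the comment doesn’t state its base):

```python
# Sanity check of the 33,000 figure; the 330M base is an assumption.
population = 330_000_000
safety_rate = 0.9999
print(round(population * (1 - safety_rate)))  # 33000 incidents per year
```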
We’re talking about overall
No.
Teslas are not safe
Yes.
Tesla fails at basic safety in the most obvious and simple accidents (like this one or the car pileup at San Francisco tunnel).
So Tesla says. There is no independent verification of this data. It could all be bullshit.
Perhaps. I’m sure you’ll provide me with the independent data you’re basing that “Teslas are not safe” claim on
So you take Tesla’s word and believe it, but ask for proof for the contrary?
You’re just a hypocrite.
The Tesla Model Y scored the highest possible score on the IIHS crash test, as well as 5 stars on Euro NCAP.
Their other models have similar results. I believe Model X is the safest SUV ever made.
EDIT:
“More than just resulting in a 5-star rating, the data from NHTSA’s testing shows that Model X has the lowest probability of injury of any SUV it has ever tested,” Tesla said in a statement. “In fact, of all the cars NHTSA has ever tested, Model X’s overall probability of injury was second only to Model S.”
EDIT2: Imagine downvoting the guy providing hard evidence and upvoting the fanatic making baseless claims backed by nothing
It’s not hard to game benchmarks.
Or maybe you’re so blinded by the hatred towards Musk that you can’t even think straight and no evidence in the world could convince you otherwise?
You really should’ve checked the last link.
You made the first comment: “Teslas aren’t safe”, without providing proof.
And now you’re calling someone a hypocrite because he asks for data of exactly what you claimed, while you’re redefining your first argument as “the contrary”.
So, do you have proof that Teslas aren’t safe in comparison to other cars, or is it just your opinion?
We’re literally having this discussion under a video where automatic braking should have kicked in, but didn’t.
But you can’t base a fact on one accident. Or even multiple. What if newspapers like to write especially about Tesla accidents to generate clicks?
Teslas seemingly have a lot of accidents, but without checking the statistics and comparing them to those of other manufacturers, you wouldn’t really know if the perceived truth is a fact or not.
deleted by creator
There is a bias here in the numbers. Teslas are expensive and not everyone is buying them. The lower accident rate can be explained by the different demographic driving the vehicle rather than by Teslas being better. For example, younger people might be more likely to cause accidents because of various factors, and they are also less likely to buy a Tesla because Teslas are so expensive. I don’t have the numbers for this, but we should all be careful with Tesla’s claims on safety when they compare themselves to the global average.
Sure. There are always multiple factors in play. However I’d still be willing to bet that there’s nothing in Teslas that makes them inherently unsafe compared to other cars.
cough cough one pedal driving cough cough
You mean the feature that every single EV has?
The biggest bias is that the data comes from Tesla. Do you think they are going to release something that makes them look bad?
It’s also misleading that Tesla uses the word Autopilot for what is basically adaptive cruise control and lane assist.
deleted by creator
It’s a douche bag trifecta:

- Tesla owner, driving drunk
- Cops, being cops
- Tesla, overselling their shitty car
I just hope that innocent bystander gets something from all three of them
I’m not so sure disengaging autopilot because the driver’s hands are off the wheel on a highway is the best option. Engage the hazard lights, remain in lane (or, if able, move to the slowest lane), and come to a stop. Surely that’s the better way?
Just disengaging the autopilot seems like such a cop-out to me. Also, the fact that it disengaged right at the end (“The driver was in control at the moment of the crash”) again feels like bad “self” driving, especially when the so-called self-driving is able to come to a stop as part of its software in other situations.
Also, if it cannot recognize an emergency vehicle (I wonder if this was a combination of the haze and the bright emergency lights saturating the image it was trying to analyse), that’s again a sign you shouldn’t be releasing this to the public. It’s clearly just not ready.
Not taking any responsibility away from the human driver here. I just don’t think the behaviour was good enough for software controlling a car used by the public.
Not to mention, of course, the reason for suing Tesla isn’t because they think they’re more liable. It’s because they can actually get some money from them.
The video is very thorough and goes into the hazy video caused by the flashing lights being one of the issues.
That’s not the main problem. It is more like an excuse. The main problem has been explained in the video right before that:
Their radar is bad at recognizing immobile cars on the road. This means all objects. All obstacles on your road!
The emergency vehicles just happen to be your most frequent kind of obstacles.
The fallback to the camera is a bad excuse anyway, because the radar is needed first to detect any obstacles; the camera will usually pick them up later (i.e., at closer distance) than the radar.
The even better solution (trigger warning: nerdy stuff incoming) is to always mix the results of all kinds of sensors at an early stage in the processing software. That’s what European car makers have done right from the beginning, but Tesla is way behind with their engineering. Their sensors still work independently, and each does its own processing, so every shortcoming of one sensor creates a faulty detection result that has to be corrected later (read: seconds later, not milliseconds) by other kinds of sensors.
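To make that contrast concrete, here’s a toy sketch (the numbers and the naive evidence-pooling rule are invented; real fusion stacks use Kalman filters or learned models):

```python
# Toy contrast: late fusion (each sensor decides alone, verdicts merged
# afterwards) vs. early fusion (raw evidence pooled before deciding).

def late_fusion(radar_conf: float, camera_conf: float, threshold=0.5) -> bool:
    # Each pipeline decides independently; if both sensors are
    # individually unsure, the obstacle is missed.
    return radar_conf > threshold or camera_conf > threshold

def early_fusion(radar_conf: float, camera_conf: float, threshold=0.5) -> bool:
    # Evidence is pooled *before* the decision, so two weak detections
    # can still add up to a confident one (naive independence assumption).
    combined = 1 - (1 - radar_conf) * (1 - camera_conf)
    return combined > threshold

# A hazy, stopped car that each sensor is only 40% sure about:
print(late_fusion(0.4, 0.4))   # False -- both pipelines shrug it off
print(early_fusion(0.4, 0.4))  # True  -- pooled evidence is 0.64
```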
Their radar is bad at recognizing immobile cars on the road. This means all objects. All obstacles on your road!
Teslas don’t use radar, just cameras. That’s why Teslas crash at way higher rates than real self driving cars like Waymo.
Their radar is bad at recognizing immobile cars on the road. This means all objects. All obstacles on your road!
I feel like this is bad tech understanding in journalism (which is hardly new). There’s no reason radar couldn’t see stationary vehicles. In fact, very specifically, they’re NOT stationary relative to the radar transceiver. Radar would see them no problem.
My actual suspicion here is that Tesla actively ignores stationary vehicles that are not in front of the vehicle (it can know they’re stationary by adding its own known speed to the relative speed). Now, on normal streets this makes sense (or at least for those on the non-driver’s side). Do you pay attention to every car parked by the side of the road when driving? You’re maybe looking for signs of movement, or lights on, etc., but you’re not tracking them all, and neither will the autopilot. However, on a highway, if you have more than one vehicle on the shoulder every now and then, it should make you wonder what else is ahead (and I’d argue a single car on the shoulder is a risk to keep watch on). A long line of them should definitely make you slow down.
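The “adding its known speed” part is trivial arithmetic, by the way; the point is that it classifies those cars as parked, and parked cars may then be filtered out. A toy example (all values invented):

```python
# Radar measures speed relative to our car; adding our own speed back
# gives the target's ground speed. Values are invented for illustration.

def target_ground_speed(ego_speed: float, relative_speed: float) -> float:
    """relative_speed is negative when the target closes on us."""
    return ego_speed + relative_speed

ego = 30.0       # us: 30 m/s (~108 km/h)
closing = -30.0  # radar: target approaching at 30 m/s

print(target_ground_speed(ego, closing))  # 0.0 -> stationary in the world frame
# A car parked on the shoulder and a stopped car in our lane both read
# 0.0 here -- which is exactly why naively filtering "parked" objects
# is dangerous on a highway.
```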
I think human drivers would slow down like that, and I think an autopilot should consider what kind of road it is on and whether it should treat these scenarios differently.
I also have another suspicion, but it’s just a thought. If this Tesla really was using radar as well as cameras, haze or not, it should have seen that stationary vehicle further ahead than it did. Since newer Tesla cars don’t have radar, and coming from a software development background, I can actually see a logical (in terms of corporate thinking) reason to remove the radar code: they simply won’t want to maintain it if they have no plans to return to radar. Think of it like this: after a few versions of augmenting the camera detection logic, it is unlikely to still work with the existing radar logic. Do they spend the time to make them work together for the older vehicles, or only allow camera-based AI on newer software versions? I suspect the latter would be the business decision.
The question here is: could you see there was a reason to stop the car significantly (more than 3 seconds) before the autopilot did? If we can recognize it through the haze, the autopilot must too.
Moreover, it now needs to be extra good at spotting vehicles in bad lighting conditions, because the other sensors have been removed on newer Teslas. It only has cameras to go on.
The driver is definitely the one ultimately at fault here, but how is it that the Tesla doesn’t perform an emergency stop in this situation, and instead just barrels into an obstacle?
Even my relatively ‘dumb’ car with adaptive cruise control handles this type of situation better than Tesla?!
I believe this was caused by the fog combined with the flashing lights and the upward/curved road. The Tesla autopilot system is super impressive in almost all situations, but you can clearly see its limits in extreme situations. Here, the drunk driver is definitely at fault. I don’t understand why they’d sue Tesla.
I don’t understand why they’d sue Tesla.
Money. Tesla has much more money than the drunk.
Because of Tesla’s exaggerated advertising.
Even my relatively ‘dumb’ car […] handles […] better than Tesla?!
Not going to be the last time when you experience that :-)
Your relatively ‘dumb’ car probably doesn’t try to gauge distance exclusively by interpreting visual data from cameras.
Wait, the Model X doesn’t have RADAR/LIDAR to supplement the cameras?
No, they were too expensive for Musk
Holy shit… This is worse than I thought
Wanna know what’s even worse? My M3 is equipped with radar, but the functionality has been patched away because they don’t want to develop for it since all their new cars only have cameras… So even though I have what’s in practice a way better sensor suite equipped, my lane assist (I won’t call it autopilot) is still 100% dependent on the fucking cameras…
Calling a Tesla an “M3” is extremely confusing.
Nope. For whatever reason, Musk decided to just use cameras.

A few years ago, they were shipping radar (most other car makers already had it then).

Then they abandoned it, even though everyone in the world thought they needed it badly.
Now we see one of the results.
“Whatever reason” is obviously just trying to cut corners and improve the bottom line with no regard for the consequences.
removed by mod
IIRC Tesla disabled all of their dedicated hardware sensors in an OTA update, due to these sensors being excluded from new vehicles manufactured during the shortage. The autopilot system is vision-only now, despite the engineers’ best efforts to keep hardware sensors in the vehicles.
https://www.tesla.com/support/transitioning-tesla-vision
https://carbuzz.com/news/tesla-scrapping-radar-plans-and-upgrading-older-cars-to-tesla-vision (Sensationalised but contains additional details missing from Tesla’s site)
The changes are being rolled out with the latest over-the-air (OTA) Tesla update 2022.20.9 confirmed for Model X, Model S, and Model Y. The update is adding Tesla Vision to the older vehicles equipped with radar
So if the guy behind the wheel had died and couldn’t react to the alerts, the car couldn’t have decided to just stop instead of crashing into a police car?
deleted by creator
He was reacting to the alerts, complying with them by simply touching the steering wheel. He did that 150 times during that 45-minute trip (not all of the trip was on Autopilot).
So if the guy had died, the car would have disengaged autopilot (I’m not sure how this works).
You can check the video in the article. It’s quite informative.

Edit:

I saw another video, and it takes ~60 seconds after taking your hands off the steering wheel for the car to safely come to a full stop.
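For scale (my own arithmetic, and the ~110 km/h starting speed is an assumption; the video doesn’t say), a 60-second stop is an extremely gentle roll-out, nothing like emergency braking:

```python
# Average deceleration implied by a ~60 s stop from highway speed.
v0 = 110 / 3.6                        # assumed initial speed ≈ 30.6 m/s
stop_time = 60.0
print(f"{v0 / stop_time:.2f} m/s^2")  # ≈ 0.51 m/s^2 -- a gentle coast-down
```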
So the headline should be “drunk driver hits police car.”
Was he drunk? The article seems to use the fact that the car nagged him 150 times as evidence that he was impaired.
TBH, if you’re not used to it, the steering wheel check can warn frequently. It’s checking for a small amount of torque on the wheel rather than you actually holding it (as there are no pressure sensors), and that catches people out, even though the prompt says to put your hands on the wheel… I could believe 150 times on a long journey.
removed by mod
You don’t answer the question.
removed by mod
Isn’t that on purpose, though? Like, “hey, if we’re not sure we can brake in time, just disengage so it’s not our responsibility anymore”?
If we want to get really technical, the NHTSA is requiring all new cars to have emergency braking, so in this situation the car should slam on the brakes. Even if it can’t slow down fast enough to prevent a crash, it should slow down enough to minimize it.
Is this particular Tesla covered by said rule? Probably not. But I think we can see why this tactic is infinitely safer and more ethical than saying “good luck, control this car on your own or enjoy this 100 km/h crash otherwise”.
Tesla has AEB, but by the time something like that triggers, you’re reducing the severity of the crash, not eliminating it.
It’s likely the car braked from 100 km/h but was still doing 50 when it hit… at those speeds it’s fatal whatever happens.
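Those two numbers are at least physically consistent. Assuming hard braking at about 8 m/s² starting at the article’s 37-yard detection distance (both assumptions of mine, not data from the crash):

```python
import math

# v^2 = v0^2 - 2*a*d: impact speed after braking over distance d.
v0 = 100 / 3.6       # 100 km/h ≈ 27.8 m/s
a = 8.0              # assumed hard braking, m/s^2
d = 37 * 0.9144      # ≈ 33.8 m, the article's detection distance
v_impact = math.sqrt(max(v0**2 - 2 * a * d, 0.0))
print(f"≈ {v_impact * 3.6:.0f} km/h at impact")  # ≈ 55 km/h
```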
Setting aside the driver issue, isn’t this another case that could’ve been prevented with LIDAR?
It’s what you get when you design places that require cars for everything
wtf I love Tesla now
Tesla is the new MAGA hat.
I’m not sure if the MAGA (and similar) crowd likes electric cars :-P
lol
Don’t see how that’s a Tesla problem… Drunk/high driver operating their car incorrectly.
By driving it
Tesla wasn’t driving it, the drunk/high owner was.
Right
It was on autopilot, so technically the drunk wasn’t driving it. But he is the one responsible.
Autopilot doesn’t work that way, the drunk should have known that when he wasn’t drunk and not tried to use it that way.
It’s like the old shaggy dog story about the guy driving a camper, setting the cruise control, then going into the back to make lunch.
That’s not the fault of the cruise control.
Thanks for the ELI5 that I wasn’t aware of needing.