New Footage Shows Tesla On Autopilot Crashing Into Police Car After Alerting Driver 150 Times

Six officers who were injured in the crash are suing Tesla despite the fact that the driver was allegedly impaired

  • @Md1501@lemmy.world • 37 points • 2 years ago

    You know what might work: program the car so that after the second unanswered “alert”, the autopilot pulls the car over, or reduces speed and turns on the hazards. On the third violation, autopilot is disabled for that car for a period of time.
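
    Something like this strike logic, as a rough sketch (purely illustrative; the thresholds and actions are invented, not any manufacturer’s actual behavior):

    ```python
    from enum import Enum, auto

    class Response(Enum):
        WARN = auto()
        SLOW_WITH_HAZARDS = auto()  # reduce speed, hazard lights on
        LOCKOUT = auto()            # autopilot disabled for a cooldown period

    def escalate(unanswered_alerts: int) -> Response:
        # 1st strike: warn. 2nd strike: slow down (or pull over) with hazards.
        # 3rd strike: autopilot is locked out for a period of time.
        if unanswered_alerts <= 1:
            return Response.WARN
        if unanswered_alerts == 2:
            return Response.SLOW_WITH_HAZARDS
        return Response.LOCKOUT

    for strikes in range(1, 4):
        print(strikes, escalate(strikes).name)
    ```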

    • @Technoguyfication@lemmy.ml • 8 points • 2 years ago

      This is literally exactly how it works already. The driver must have been pulling on the steering wheel right before it gave him a strike. The system will warn you to pay attention for a few seconds before shutting down. Here’s a video: https://youtu.be/oBIKikBmdN8

        • @stealin@lemmy.world • 5 points • 2 years ago

          The thing with cars is that you don’t distract the driver from driving. Having a system that takes over driving is exactly that, so the idea of the system is flawed to begin with.

          • Concetta • -1 points • 2 years ago

            Screenshotting this because it’s so well put.

          • @Technoguyfication@lemmy.ml • 1 point • 2 years ago (edited)

            I have to say this is extremely inaccurate imo. Self-driving takes over the menial tasks of keeping the car in the lane, watching the speed, etc., and allows an attentive driver to focus on higher-level tasks like looking at the road ahead, watching the sides of the road for potential hazards, and keeping more aware of their blind spots.

            Just because the feature can be abused does not inherently make it unsafe. A drunk driver can use cruise control to more accurately control the vehicle’s speed and avoid a ticket, does that make it a bad feature? I wouldn’t say so.

            Autopilot and other driver assist systems are good when used responsibly and cautiously. It’s frustrating to see people cause an accident after misusing the system and blame the technology instead. This is why we can’t have nice things.

            • @NeoNachtwaechter@lemmy.world • 0 points • 2 years ago

              It’s frustrating to see

              This is why we can’t have nice things

              It is also frustrating to see people whining for technology when they should rather think about dead policemen and rescuers.

              You should get your priorities straight if you ever hope to be taken seriously.

      • @NeoNachtwaechter@lemmy.world • 3 points • 2 years ago (edited)

        The system will warn you to pay attention

        … and if we have learned anything from that incident, it is that the warnings have been worthless.

        The system can be tricked even by the worst drunkards!

        for a few seconds before shutting down.

        A few seconds are not enough. The crash was already unavoidable.

        • @Technoguyfication@lemmy.ml • 1 point • 2 years ago

          You’re misinterpreting what I said and conflating two separate scenarios in your 2nd statement. I didn’t say anything about the system warning “for a few seconds before shutting down” in the event of an imminent collision. It warns the driver before shutting down if the driver fails to hold the steering wheel during normal driving conditions.

          The warnings were worthless because the driver kept responding to them just before they timed out and shut autopilot down. It would be even worse if the car immediately pulled off the road and stopped in traffic without warning the driver first.

          They aren’t subtle either: after failing to touch the wheel for about 5-10 seconds, it starts beeping loudly and flashing an icon on the screen.

          This is not a case of autopilot causing an accident, this is a case of an impaired driver operating a vehicle when they should not have been. If the driver was using standard cruise control, would we be blaming the vehicle because their foot wasn’t touching the accelerator when the accident happened? No, we wouldn’t.

            • @NeoNachtwaechter@lemmy.world • 1 point • 2 years ago

            This is not a case of autopilot causing an accident, this is a case of an impaired driver

              It is both, of course. The drunkard and the autopilot have each added their share to create a danger that ended deadly.

            Driving drunk is already forbidden.

              What Tesla has brought onto the road here should be forbidden as well: lane assist combined with adaptive cruise control AND such a bunch of blind sensors.

              • @Iheardyoubutsowhat@lemmy.world • -1 points • 2 years ago

                The driver was on Autopilot. Autopilot is cruise control plus lane assist; it’s not FSD. Tesla didn’t bring that “to the road”. The driver was drunk, and as with most Autopilot or FSD accidents… it’s user error.

                I’m still unaware of a proven FSD accident.

    • @HalcyonReverb@midwest.social • 14 points • 2 years ago

      I drive a Ford Maverick that is equipped with adaptive cruise control, and if I were to get 3 “keep your hands on the wheel” notifications, it deactivates adaptive cruise until the vehicle is completely turned off and on again. It blew my mind to learn that Tesla doesn’t do something similar.

      • @tony@lemmy.hoyle.me.uk • 2 points • 2 years ago

        It does and did… He kept driving anyway. Drink drivers FTW.

        I presume AEB kicked in, but all that can do is reduce the speed of impact… if you’re determined to kill yourself, there’s not much the car can do.

        • @Bookmeat@lemmy.world • 3 points • 2 years ago (edited)

          This is preventable and Tesla and other auto manufacturers should respond to make it so. No consumer vehicle should under any circumstance choose to drive full speed into a barrier or allow a driver to do so. It’s the reason we have driver assistance: improved safety.

          • @CmdrShepard@lemmy.one • 0 points • 2 years ago

            The problem with this is what if the car thinks there’s a barrier in front of you but there isn’t? People are arguing that these systems are too intrusive while also arguing that they don’t go far enough to take control away from drivers.

            This situation happened because a drunk driver ran into police cars, something that has been happening for as long as cars have existed.

            • @Obi@sopuli.xyz • 0 points • 2 years ago

              That’s the issue with current “self driving” systems in a nutshell. We’re in this terrible middle ground right now where these features let careless drivers take their attention away, but not actually be able to control the vehicle safely. We should ban all that crap until actual self driving is viable.

              • @CmdrShepard@lemmy.one • -1 points • 2 years ago

                How does it become viable if you ban the technology? What we have now is advanced cruise control that protects drivers in some circumstances while having zero effect in others. Drivers were equally dumb and careless long before this technology existed. This new tech doesn’t make that aspect any worse. Banning it now just means more people will crash and more people will be injured.

                • @Obi@sopuli.xyz • 1 point • 2 years ago (edited)

                  Here’s an article referencing a UK white paper that talks about the issues with Level 2 and 3 autonomous vehicles.

                  https://www.tu-auto.com/adas-level-2-3-avs-are-hazards-experts-warn/

                  “With adaptive cruise control (ACC) for instance, it takes twice the amount of time to respond to a sudden braking event than it does when you are manually driving. Drivers may believe that ACC is safer but actually taking your foot off the accelerator pedal and letting the car make the decisions leads to lower workload and can mean drivers are unprepared for an unexpected event.”

                  University of Sussex object recognition researcher Dr Graham Hole was also questioned for the study and dubs Levels 2 and 3 “the worst of all worlds”. He says: “Human beings are rubbish at being vigilant – vigilance declines after about 20 minutes. With semi-autonomous you are reducing the driver to monitoring the system on the off-chance something goes wrong. Most of the time nothing goes wrong, leading the driver to have massive faith in the system in all conditions, which of course isn’t always the case.”

  • N3Cr0 • 29 points • 2 years ago

    Poor drunk impaired driver falling victim to autonomous driving… Hopefully that driver lost their license.

    • Cyber Yuki • 4 points • 2 years ago

      That doesn’t solve the problem of autopilot not making the right choices. What if the driver wasn’t drunk, but had a heart attack? What if someone put a roofie in their drink? What if the driver was diabetic or hypoglycemic and suffered a sudden drop in blood glucose? What if they had a stroke?

      Furthermore, what if the driver got drunk BECAUSE the car’s AI was advertised as being able to drive for you? Think of the false publicity.

      If your AI can’t handle one simple case of a driver being unresponsive, that’s negligence on the company’s part.

      • @CmdrShepard@lemmy.one • 1 point • 2 years ago

        How could the company be negligent if someone gets drunk or has a heart attack and crashes their car? No company has a Level 5 autonomous vehicle where no human intervention is needed. Tesla is only Level 2. Mercedes has a Level 3 option (in extremely limited conditions). Waymo claims Level 4 but is geofenced.

    • @Obi@sopuli.xyz • 5 points • 2 years ago

      You’re completely right, and I’ve never seen this for traffic stops in Europe: they’ll make you park somewhere safe, or at the very worst in the emergency lane, but even that is rare for traffic stops. The only time I see lanes blocked is when there’s been an accident or breakdown, and then the first thing they do is bring massive light panels well ahead of the spot to make everyone clear the lane.

  • @redcalcium@lemmy.institute • 27 points • 2 years ago (edited)

    Data from the Autopilot system shows that it recognized the stopped car 37 yards or 2.5 seconds before the crash.

    Is the video slowed down? In the video, if you pause 2.5 s before the crash, the stopped police car already seems very close. An (awake) human driver would’ve recognized the stopped police car from much farther away than that.
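
    A quick back-of-the-envelope check, assuming the article’s figures are accurate, points the same way:

    ```python
    # 37 yards covered in 2.5 s implies an average speed of:
    distance_m = 37 * 0.9144      # ~33.8 m
    time_s = 2.5
    speed = distance_m / time_s   # ~13.5 m/s
    print(f"{speed:.1f} m/s = {speed * 3.6:.0f} km/h = {speed * 2.237:.0f} mph")
    # ~30 mph: oddly slow for a highway, so either the car had already
    # slowed considerably or the clip isn't playing in real time.
    ```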

    • @Thetimefarm@lemm.ee • 11 points • 2 years ago

      I find that it can be hard to tell when a car ahead is stopped; maybe the visual system on the Tesla has similar limitations. I think Autopilot is controlled by the cameras alone, but I’m not super up to date on Tesla stuff. I would assume even a basic radar setup could tell something was stationary from quite far away.

    • @Thorny_Thicket@sopuli.xyz • 0 points • 2 years ago (edited)

      A Tesla on Autopilot/FSD is almost 4 times less likely to be involved in a crash than a human-driven Tesla, which even then is half as likely to end up in an accident compared to the average car. You not liking Musk fortunately doesn’t change these facts.

      In the 2nd quarter, we recorded one crash for every 4.41 million miles driven in which drivers were using Autopilot technology (Autosteer and active safety features). For drivers who were not using Autopilot technology (no Autosteer and active safety features), we recorded one crash for every 1.2 million miles driven. By comparison, NHTSA’s most recent data shows that in the United States there is an automobile crash every 484,000 miles.

      Source

      • @NeoNachtwaechter@lemmy.world • 2 points • 2 years ago (edited)

        almost 4 times less likely to be involved in a crash than a human driven

        Not relevant at all here, when we are discussing occurrences that seem so easily and obviously avoidable.

        (But it’s nice to see that the Fanboi team is awake now)

        • @Thorny_Thicket@sopuli.xyz • -6 points • 2 years ago

          Perhaps. I’m sure you’ll provide me with the independent data you’re basing that “Teslas are not safe” claim on.

            • @Thorny_Thicket@sopuli.xyz • -2 points • 2 years ago (edited)

              The Tesla Model Y scored the highest possible rating on the IIHS crash test, as well as 5 stars on Euro NCAP.

              Their other models have similar results. I believe Model X is the safest SUV ever made.

              EDIT:

              “More than just resulting in a 5-star rating, the data from NHTSA’s testing shows that Model X has the lowest probability of injury of any SUV it has ever tested,” Tesla said in a statement. “In fact, of all the cars NHTSA has ever tested, Model X’s overall probability of injury was second only to Model S.”

              Source

              Also might want to check this

              EDIT2: Imagine downvoting the guy providing hard evidence and upvoting the fanatic making baseless claims backed by nothing

            • @narp@feddit.de • 2 points • 2 years ago

              You made the first comment: “Teslas aren’t safe”, without providing proof.

              And now you’re calling someone a hypocrite because he asks for data of exactly what you claimed, while you’re redefining your first argument as “the contrary”.

              So, do you have proof that Teslas aren’t safe in comparison to other cars, or is it just your opinion?

                • @narp@feddit.de • 1 point • 2 years ago

                  But you can’t base a fact on one accident. Or even multiple. What if newspapers like to write especially about Tesla accidents to generate clicks?

                  Teslas seemingly have a lot of accidents, but without checking the statistics and comparing them to other manufacturers, you wouldn’t really know whether the perceived truth is a fact or not.

      • @tiny_electron@sh.itjust.works • 18 points • 2 years ago

        There is a bias here in the numbers. Teslas are expensive and not everyone is buying them. The lower accident rate can be explained by the different demographic driving the vehicle rather than by Teslas being better. For example, younger people might be more likely to cause accidents because of various factors, and they are also less likely to buy a Tesla because they are so expensive. I don’t have the numbers for this, but we should all be careful with Tesla’s claims on safety when they compare themselves to the global average.

  • Pablo • 11 points • 2 years ago

    It’s also so misleading that Tesla uses the word Autopilot for what is basically adaptive cruise control and lane assist.

  • @ExclamatoryProdundity@lemmy.world • -6 points • 2 years ago

    It’s a douchebag trifecta:

    Tesla owner, driving drunk
    Cops, being cops
    Tesla, overselling their shitty car

    I just hope that innocent bystander gets something from all three of them.

  • r00ty • 18 points • 2 years ago

    I’m not so sure disengaging autopilot because the driver’s hands were not on the wheel while on a highway is the best option. Engage the hazard lights, remain in lane (or, if able, move to the slowest lane), and come to a stop. Surely that’s the better way?

    Just disengaging the autopilot seems like such a copout to me. Also, the fact that it disengaged right at the end, so that “the driver was in control at the moment of the crash”, again feels like bad “self” driving. Especially when the so-called self-driving is able to come to a stop as part of its software in other situations.

    Also, if it cannot recognize an emergency vehicle (I wonder if this was a combination of the haze and the usually bright emergency lights saturating the image it was trying to analyse), it’s again a sign you shouldn’t be releasing this to the public. It’s clearly just not ready.

    Not taking any responsibility away from the human driver here. I just don’t think the behaviour was good enough for software controlling a car used by the public.

    Not to mention, of course, the reason for suing Tesla isn’t because they think they’re more liable. It’s because they can actually get some money from them.

      • @NeoNachtwaechter@lemmy.world • 5 points • 2 years ago

        That’s not the main problem. It is more like an excuse. The main problem has been explained in the video right before that:

        Their radar is bad at recognizing immobile cars on the road. This means all objects. All obstacles on your road!

        The emergency vehicles just happen to be your most frequent kind of obstacles.

        The fallback to the camera is a bad excuse anyway, because the radar is needed first to detect any obstacles. The camera will usually detect them later (i.e. at closer distance) than the radar.

        The even better solution (trigger warning: nerdy stuff incoming) is to always mix the results of all kinds of sensors at an early stage in the processing software. That’s what European car makers have done right from the beginning, but Tesla is way behind with their engineering. Their sensors still work independently, and each does its own processing. So every shortcoming of one sensor creates a faulty detection result that has to be covered later (read: seconds later, not milliseconds) by other kinds of sensors.
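
        As a toy sketch of the difference (purely illustrative, with made-up names and numbers, not any manufacturer’s actual pipeline), fusing weak evidence before thresholding can keep an obstacle that independent per-sensor processing would throw away:

        ```python
        from dataclasses import dataclass

        @dataclass
        class Cue:
            distance_m: float
            confidence: float  # 0..1

        def independent_pipelines(radar: Cue, camera: Cue) -> Cue | None:
            # Each sensor thresholds on its own; weak cues are discarded outright.
            kept = [c for c in (radar, camera) if c.confidence > 0.5]
            return min(kept, key=lambda c: c.distance_m) if kept else None

        def early_fusion(radar: Cue, camera: Cue) -> Cue | None:
            # Evidence is combined before the decision, so two weak cues
            # can still add up to one confident detection.
            combined = radar.confidence + camera.confidence
            if combined > 0.5:
                return Cue(min(radar.distance_m, camera.distance_m), min(combined, 1.0))
            return None

        # A stationary car in haze: each sensor alone is unsure.
        radar, camera = Cue(80.0, 0.3), Cue(80.0, 0.4)
        print(independent_pipelines(radar, camera))  # None -- both cues dropped
        print(early_fusion(radar, camera))           # detected at 80 m
        ```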

        • @Blaidd@lemmy.world • 4 points • 2 years ago

          Their radar is bad at recognizing immobile cars on the road. This means all objects. All obstacles on your road!

          Teslas don’t use radar, just cameras. That’s why Teslas crash at way higher rates than real self-driving cars like Waymo.

        • r00ty • 2 points • 2 years ago

          Their radar is bad at recognizing immobile cars on the road. This means all objects. All obstacles on your road!

          I feel like this is bad tech understanding in journalism (which is hardly new). There’s no reason radar couldn’t see stationary vehicles. In fact, very specifically, they’re NOT stationary relative to the radar transceiver. Radar would see them no problem.

          My actual suspicion here is that Tesla actively ignores stationary vehicles that are not in front of the vehicle (it can know they’re stationary by adding its own speed to the relative speed). Now, on normal streets this makes sense (or at least for those on the non-driver’s side). Do you pay attention to every car parked by the side of the road when driving? You’re maybe looking for signs of movement, or lights on, etc. But you’re not tracking them all, and neither will the autopilot. However, on a highway, if you see more than one vehicle on the shoulder every now and then, it should be making you wonder what else is ahead (and I’d argue a single car on the shoulder is a risk to keep watch on). A long line of them should definitely make you slow down.
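
          Roughly the kind of filtering I mean, as a generic sketch (not Tesla’s actual code; early ACC radars really did discard stationary returns to avoid phantom braking on bridges and overhead signs):

          ```python
          def ground_speed(relative_speed: float, ego_speed: float) -> float:
              # Radar measures speed relative to us; adding our own speed
              # recovers the target's speed over the ground.
              return relative_speed + ego_speed

          ego = 30.0  # m/s, our own speed from odometry
          # A car ahead doing 25 m/s closes at -5; a parked car closes at -30.
          for rel in (-5.0, -30.0):
              v = ground_speed(rel, ego)
              label = "stationary -> often filtered as clutter" if abs(v) < 1.0 else "moving"
              print(f"relative {rel:+.0f} m/s -> ground {v:+.0f} m/s ({label})")
          ```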

          I think human drivers would do this, and I think an autopilot should be considering what kind of road it is on and whether it should treat scenarios differently.

          I also have another suspicion, but it’s just a thought. If this Tesla was really using radar as well as cameras, haze or not, it should have seen that stationary vehicle further ahead than it did. Since newer Tesla cars don’t have radar, and coming from a software development background, I can actually see a logical (in terms of corporate thinking) reason to remove the code for radar. They would do this simply because they will not want to maintain it if they have no plans to return to radar. Think of it like this. After a few versions of augmenting the camera detection logic, it is unlikely to work with the existing radar logic. Do they spend the time to make them work together for the older vehicles, or only allow camera based AI on newer software versions? I would suspect the latter would be the business decision.

      • r00ty • 2 points • 2 years ago

        The question here is: could you see there was a reason to stop the car significantly (more than 3 seconds) before the autopilot did? If we can recognize it through the haze, the autopilot must too.

        Moreover, it now needs to be extra good at spotting vehicles in bad lighting conditions, because other sensors have been removed on newer Teslas. It only has cameras to go on.

  • @thatKamGuy@sh.itjust.works • 53 points • 2 years ago

    The driver is definitely the one ultimately at fault here, but how is it that the Tesla doesn’t perform an emergency stop in this situation and instead just barrels into an obstacle?

    Even my relatively ‘dumb’ car with adaptive cruise control handles this type of situation better than Tesla?!

    • @RushingSquirrel@lemm.ee • 3 points • 2 years ago

      I believe this is caused by the fog combined with flashing lights and upward/curved road. The Tesla autopilot system is super impressive in almost all situations but you can clearly see the limits in extreme situations. Here, the drunk driver is definitely at fault, I don’t understand why they’d sue Tesla.

    • @NeoNachtwaechter@lemmy.world • 3 points • 2 years ago

      Even my relatively ‘dumb’ car […] handles […] better than Tesla?!

      Not going to be the last time when you experience that :-)

    • daikiki • 49 points • 2 years ago

      Your relatively ‘dumb’ car probably doesn’t try to gauge distance exclusively by interpreting visual data from cameras.

  • Jeena • 62 points • 2 years ago

    So if the guy behind the wheel had died and couldn’t react to the alerts, the car couldn’t decide to just stop instead of crashing into a police car?

    • @pec@sh.itjust.works • 13 points • 2 years ago (edited)

      He was reacting to the alerts, complying with them by simply touching the steering wheel. He did that 150 times during that 45-minute trip (not all of the trip was on Autopilot).

      So if the guy had died, the car would have disengaged Autopilot (I’m not sure how this works).

      You can check the video in the article. It’s quite informative.

      Edit

      I saw another video, and it takes ~60 seconds after taking your hand off the steering wheel for the car to safely come to a full stop.

        • @Landmammals@lemmy.world • 3 points • 2 years ago

          Was he drunk? The article seems to use the fact that the car nagged him 150 times as evidence that he was impaired.

      • @tony@lemmy.hoyle.me.uk • 2 points • 2 years ago

        TBH if you’re not used to it, the steering wheel check can warn frequently. It’s checking for a small amount of torque on the wheel rather than you actually holding it (as there are no pressure sensors), and that catches people out, but the prompt says to put your hands on the wheel… I could believe 150 times on a long journey.
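
        A minimal sketch of what that check amounts to (illustrative only; the threshold is invented and the real logic is more involved):

        ```python
        TORQUE_THRESHOLD_NM = 0.5  # made-up value for illustration

        def hands_detected(steering_torque_nm: float) -> bool:
            # "Hands on wheel" is inferred from twisting force on the wheel,
            # so hands resting there without applying torque can still trigger nags.
            return abs(steering_torque_nm) >= TORQUE_THRESHOLD_NM

        print(hands_detected(0.0))  # False: resting grip, the nag fires anyway
        print(hands_detected(0.8))  # True: a slight tug satisfies the check
        ```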

      • @Wats0ns@programming.dev • 8 points • 2 years ago

        Isn’t that on purpose though? Like, “hey, if we’re not sure we’ll be able to brake in time, just disengage so it’s not our responsibility anymore”?

        • iWidji • 4 points • 2 years ago (edited)

          If we want to get really technical, NHTSA is requiring all new cars to have automatic emergency braking, so in this situation the car should slam on the brakes. Even if it can’t slow down fast enough to prevent a crash, it should slow down enough to minimize it.

          Is this particular Tesla under said rule? Probably not. But I think we can see why this tactic is infinitely safer and more ethical than saying “good luck, control this car on your own or enjoy this 100 km/h crash otherwise.”

          • @tony@lemmy.hoyle.me.uk • 2 points • 2 years ago

            Tesla has AEB, but by the time something like that triggers, you’re reducing the severity of the crash, not eliminating it.

            It’s likely the car braked from 100 km/h but was still doing 50 when it hit… at those speeds it’s fatal whatever happens.
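
            That “still doing 50” figure is roughly consistent with the article’s numbers, assuming hard braking (around 0.8 g) began at the 37-yard detection distance:

            ```python
            import math

            v0 = 100 / 3.6    # initial speed: 100 km/h ~ 27.8 m/s
            a = 0.8 * 9.81    # assumed braking deceleration ~ 7.8 m/s^2
            d = 37 * 0.9144   # detection distance ~ 33.8 m

            v_impact = math.sqrt(max(v0**2 - 2 * a * d, 0.0))
            print(f"impact speed = {v_impact * 3.6:.0f} km/h")  # ~56 km/h
            # A full stop from 100 km/h at 0.8 g needs ~49 m, more room than
            # the ~34 m available, so a hard impact was unavoidable by then.
            ```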

  • @hark@lemmy.world • 38 points • 2 years ago

    Setting aside the driver issue, isn’t this another case that could’ve been prevented with LIDAR?

  • Jordan Lund • 28 points • 2 years ago

    Don’t see how that’s a Tesla problem… Drunk/high driver operating their car incorrectly.

          • Jordan Lund • 3 points • 2 years ago

            Autopilot doesn’t work that way; the drunk should have known that when he wasn’t drunk, and not tried to use it that way.

            It’s like the old shaggy dog story about the guy driving a camper, setting the cruise control, then going into the back to make lunch.

            That’s not the fault of the cruise control.