In the piece — titled “Can You Fool a Self Driving Car?” — Rober found that a Tesla car on Autopilot was fooled by a Wile E. Coyote-style wall painted to look like the road ahead of it, with the electric vehicle plowing right through it instead of stopping.

The footage was damning enough, with slow-motion clips showing the car not only crashing through the styrofoam wall but also plowing into a mannequin of a child. The Tesla was also fooled by simulated rain and fog.

  • @TommySoda@lemmy.world
    359 points · 1 month ago

    Notice how they’re mad at the video and not the car, manufacturer, or the CEO. It’s a huge safety issue yet they’d rather defend a brand that obviously doesn’t even care about their safety. Like, nobody is gonna give you a medal for being loyal to a brand.

    • @can@sh.itjust.works
      131 points · 1 month ago

      These people haven’t found any individual self-identity.

      An attack on the brand is an attack on them. Reminds me of the people who made Star Wars their meaning and crumbled when a certain trilogy didn’t hold up.

      • Billiam
        71 points · 1 month ago

        An attack on the brand is an attack on them.

        Thus it ever is with Conservatives. They make $whatever their whole identity, and so take any critique of $whatever as a personal attack against themselves.

        I blame evangelical religions’ need for martyrdom for this.

        • @CarbonatedPastaSauce@lemmy.world
          19 points · 1 month ago

          You pretty much hit the nail on the head. These people have no identity or ability to think for themselves because they never needed either one. The church will do all your thinking for you, and anything it doesn’t cover will be handled by Fox News. Be like everyone else and fit in, otherwise… you have to start thinking for yourself. THE HORROR.

        • @mojofrododojo@lemmy.world
          3 points · 1 month ago

          “Mark my word, if and when these preachers get control of the [Republican] party, and they’re sure trying to do so, it’s going to be a terrible damn problem. Frankly, these people frighten me. Politics and governing demand compromise. But these Christians believe they are acting in the name of God, so they can’t and won’t compromise. I know, I’ve tried to deal with them.” ― Barry Goldwater

      • @jumperalex@lemmy.world
        17 points · 1 month ago

        So literally every single above average sports fan?

        The pathological need to be part of a group so bad it overwhelms all reason is a feature I have yet to understand. And I say that as someone who can recognize in myself those moments when I feel the pull to be part of an in-group.

        • @DogWater@lemmy.world
          3 points · 1 month ago

          It’s evolutionary. Humans are social pack animals. The need for inclusion was evolved into us over however many years.

          • @jumperalex@lemmy.world
            1 point · 30 days ago

            Oh yes I’m aware. I’m clearly missing, or at least have a reduced expression of, that gene. I play soccer multiple times a week, meanwhile I couldn’t possibly care less about watching soccer; or any sport for that matter.

        • @mic_check_one_two@lemmy.dbzer0.com
          2 points · 1 month ago

          That’s just tribalism in general. Humans are tribal by nature as a survival mechanism. In modern culture, that manifests as behaviors like being a rabid sports fan.

          • @jumperalex@lemmy.world
            1 point · 30 days ago

            Oh yes I’m aware. I’m clearly missing, or at least have a reduced expression of, that gene. I play soccer multiple times a week, meanwhile I couldn’t possibly care less about watching soccer; or any sport for that matter.

    • @rtxn@lemmy.world
      41 points · 1 month ago

      The styrofoam wall had a pre-cut hole to weaken it, and some people are using it as a gotcha proving the video was faked. It would be funny if it wasn’t so pathetic.

      • @TommySoda@lemmy.world
        44 points · 1 month ago

        Yeah, but it’s styrofoam. You could literally run through it. And I’m sure they did that more as a safety measure so that it was guaranteed to collapse so nobody would be injured.

        But at the same time it still drove through a fucking wall. The integrity doesn’t mean shit because it drove through a literal fucking wall.

        • @rtxn@lemmy.world
          11 points · edited · 1 month ago

          Hopefully with a Mythbusters-style remote control setup in case it explodes. And the trunk filled with ANFO to make sure it does.

      • @samus12345@lemm.ee
        15 points · 1 month ago

        Yeah, because he knew that thing probably wasn’t gonna stop. Why destroy the car when you don’t have to? Concrete wouldn’t have changed the outcome.

      • @booly@sh.itjust.works
        4 points · 1 month ago

        Kinda depends on the fact, right? Plenty of factual things piss me off, but I’d argue I’m correct to be pissed off about them.

        • @OnASnowyEvening@lemmy.world
          1 point · edited · 1 month ago

          Right. Just because sometimes we have to accept something, doesn’t mean we have to like it.

          (Though the other commenter implied people are commonly or always angered by facts; if that’s the case, then we have nothing to talk about.)

    • @merc@sh.itjust.works
      8 points · 1 month ago

      To be fair, and ugh, I hate to have to stand up for these assholes, but…

      Their claim is that the video was a lie and that the results were manufactured. They believe that Teslas are actually safe and that Rober was doing some kind of Elon Musk takedown, trying to profit off the shares getting tanked and to promote a rival company.

      They actually do have a little bit of evidence for those claims:

      1. The wall changes between different camera angles. In some angles the wall is simply something painted on canvas. In other angles it’s a solid styrofoam wall.
      2. The inside the car view in the YouTube video doesn’t make it clear that autopilot mode is engaged.
      3. Mark Rober chose to use Autopilot mode rather than so-called Full Self Driving.

      But, he was interviewed about this, and he provided additional footage to clear up what happened.

      1. They did the experiment twice, once with a canvas wall, then a few weeks later with a styrofoam wall. The car smashed right into the wall the first time, but it wasn’t very dramatic because the canvas just blew out of the way. They wanted a more dramatic video for YouTube, so they did it again with a styrofoam wall so you could see the wall getting smashed. This included pre-weakening the wall so that when the car hit it, it smashed a dramatic Looney-Tunes looking hole in the wall. When they made the final video, they included various cuts from both the first and second attempts. The car hit the wall both times, but it wasn’t just one single hit like it was shown in the video.

      2. There’s apparently a “rainbow” path shown when the car is in Autopilot mode. [RAinbows1?!? DEI!?!?!?!] In the cut they posted to YouTube, you couldn’t see this rainbow path. But, Rober posted a longer cut of the car hitting the wall where it was visible. So, it wasn’t that autopilot was off, but in the original YouTube video you couldn’t tell.

      3. He used Autopilot mode because from his understanding (as a Tesla owner (this was his personal vehicle being tested)), Full Self Driving requires you to enter a destination address. He just wanted to drive down a closed highway at high speed, so he used Autopilot instead. In his understanding as a Tesla owner and engineer, there would be no difference in how the car dealt with obstacles in autopilot mode vs. full self driving, but he admitted that he hadn’t tested it, so it’s possible that so-called Full Self-Driving would have handled things differently.

      Anyhow, these rabid MAGA Elon fanboys did pick up on some minor inconsistencies in his original video. Rober apparently didn’t realize what a firestorm he was wading into. His intention was to make a video about how cool LIDAR is, with a cool scene of a car smashing through a wall as the hook. He’d apparently been planning and filming the video for half a year, and he claims it just happened to be released right at the height of the period when Teslas were getting firebombed.

  • FuglyDuck
    237 points · 1 month ago

    As Electrek points out, Autopilot has a well-documented tendency to disengage right before a crash. Regulators have previously found that the advanced driver assistance software shuts off a fraction of a second before making impact.

    This has been known.

    They do it so they can evade liability for the crash.

    • @bazzzzzzz@lemm.ee
      36 points · 1 month ago

      Not sure how that helps in evading liability.

      Every Tesla driver would need superhuman reaction speeds to respond in 17 frames, i.e. 680 ms (I didn’t check the recording framerate, but 25 fps is the slowest reasonable), less than a second.
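
The frame math above can be sanity-checked directly; a minimal sketch (the 17-frame count and 25 fps figure come from the comment, the function name is mine):

```python
def frames_to_ms(frames: int, fps: float) -> float:
    """Convert a frame count at a given frame rate to milliseconds."""
    return frames / fps * 1000

# 17 frames at 25 fps, the slowest rate the commenter considers reasonable:
print(frames_to_ms(17, 25))  # 680.0
```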

      • FuglyDuck
        53 points · 1 month ago

        It’s not likely to work, but them swapping to human control after it determined a crash is going to happen isn’t accidental.

        Anything they can do to mire the proceedings they will do. It’s like how corporations file stupid junk motions to force plaintiffs to give up.

      • @orcrist@lemm.ee
        41 points · 1 month ago

        They’re talking about avoiding legal liability, not about actually doing the right thing. And of course you can see how it would help them avoid legal liability. The lawyers will walk into court and honestly say that at the time of the accident the human driver was in control of the vehicle.

        And then that creates a discussion about how much time the human driver has to have in order to actually solve the problem, or gray areas about who exactly controls what when, and it complicates the situation enough where maybe Tesla can pay less money for the deaths that they are obviously responsible for.

        • @jimbolauski@lemm.ee
          8 points · 1 month ago

          They’re talking about avoiding legal liability, not about actually doing the right thing. And of course you can see how it would help them avoid legal liability. The lawyers will walk into court and honestly say that at the time of the accident the human driver was in control of the vehicle.

          The plaintiff’s lawyers would say the autopilot was engaged, made the decision to run into the wall, and turned off 0.1 seconds before impact. Liability is not going to disappear when there were 4.9 seconds of making dangerous decisions and peacing out in the last 0.1.

          • @michaelmrose@lemmy.world
            11 points · 1 month ago

            They can also claim with a straight face, in public, in ads, etc., that Autopilot has an artificially lowered crash rate, without it being technically a lie.

          • @FauxLiving@lemmy.world
            7 points · 1 month ago

            Defense lawyers can make a lot of hay with details like that. Nothing that gets the lawsuit dismissed but turning the question into “how much is each party responsible” when it was previously “Tesla drove me into a wall” can help reduce settlement amounts (as these things rarely go to trial).

          • FuglyDuck
            5 points · 1 month ago

            The plaintiff’s lawyers would say the autopilot was engaged, made the decision to run into the wall, and turned off 0.1 seconds before impact. Liability is not going to disappear when there were 4.9 seconds of making dangerous decisions and peacing out in the last 0.1.

            These strategies aren’t about actually winning the argument; they’re about making it excessively expensive to have the argument in the first place. Every motion requires a response by the counterparty, which requires billable time from the counterparty’s lawyers, and delays the trial. It’s just another variation on “defend, depose, deny”.

    • @fibojoly@sh.itjust.works
      24 points · edited · 1 month ago

      That makes so little sense… It detects it’s about to crash, then gives up and lets you sort it out?
      That’s the opposite of my Audi, which detects I’m about to hit something and gives me either a warning or just actively hits the brakes if I don’t have time to react.
      If this is true, this is so fucking evil it’s kinda amazing it could have reached anywhere near prod.

      • @Red_October@lemmy.world
        20 points · 1 month ago

        The point is that they can say “Autopilot wasn’t active during the crash.” They can leave out that autopilot was active right up until the moment before, or that autopilot directly contributed to it. They’re just purely leaning into the technical truth that it wasn’t on during the crash. Whether it’s a courtroom defense or their own next published set of data, “Autopilot was not active during any recorded Tesla crashes.”

      • FuglyDuck
        3 points · edited · 1 month ago

        even your Audi is going to dump to human control if it can’t figure out what the appropriate response is. Granted, your Audi is probably smart enough to be like “yeah, don’t hit the fucking wall,” but eh… it was put together by people who actually know what they’re doing, and care about safety.

        Tesla isn’t doing this for safety or because it’s the best response. The cars are doing this because they don’t want to pay out for wrongful death lawsuits.

        If this is true, this is so fucking evil it’s kinda amazing it could have reached anywhere near prod.

        It’s Musk. He’s fucking vile, and this isn’t even close to the worst thing he’s doing. Or has done.

    • @Simulation6@sopuli.xyz
      12 points · 1 month ago

      If the disengage-to-avoid-legal-consequences feature does exist, then you would think there would be some false-positive incidents where it turns off for no apparent reason. I found some with a search, which are attributed to bad software. Owners are discussing new patches fixing some problems and introducing new ones. None of the incidents caused an accident, so maybe the owners never hit the malicious code.

      • @Dultas@lemmy.world
        5 points · 1 month ago

        I think Mark (who made the OG video) speculated it might be the ultrasonic parking sensors detecting something and disengaging.

      • FuglyDuck
        3 points · 1 month ago

        if it randomly turns off for no apparent reason, people are going to be like “oh, that’s weird” and leave it at that. Tesla certainly isn’t going to admit that their code is malicious like that. At least not until the FBI is digging through their memos to show it was. And maybe not even then.

        • @AA5B@lemmy.world
          1 point · 1 month ago

          When I tried it, the only unexpected disengagement was on the highway, but it just slowed and stayed in lane giving me lots of time to take over.

          Thinking about it afterwards, possible reasons include

          • I had cars on both sides, blocking me in. Perhaps it decided that was risky or it occluded vision, or perhaps one moved toward me and there was no room to avoid
          • it was a little over a mile from my exit. Perhaps it decided it had no way to switch lanes while being blocked in
      • @AA5B@lemmy.world
        2 points · edited · 1 month ago

        The given reason is simply that it will return control to the driver if it can’t figure out what to do, and all the evidence is consistent with that. All self-driving cars have some variation of this. However, yes, it’s suspicious when it disengages right when you need it most. I also don’t know of data to support whether this is a pattern or just a feature of certain well-publicized cases.

        Even in those false positives, it’s entirely consistent with the AI being confused, especially since many of these scenarios get addressed by software updates. I’m not trying to deny it, just saying the evidence is not as clear as people here are claiming.

    • @NotMyOldRedditName@lemmy.world
      11 points · edited · 1 month ago

      Any crash within 10s of a disengagement counts as it being on so you can’t just do this.

      Edit: added the time unit.

      Edit2: it’s actually 30s not 10s. See below.

      • FuglyDuck
        3 points · 1 month ago

        Where are you seeing that?

        There’s nothing I’m seeing as a matter of law or regulation.

        In any case, liability (especially civil liability) is an absolute bitch. It’s incredibly messy and likely will not ever be so cut and dried.

        • @NotMyOldRedditName@lemmy.world
          4 points · edited · 1 month ago

          Well it’s not that it was a crash caused by a level 2 system, but that they’ll investigate it.

          So you can’t hide the crash by disengaging it just before.

          Looks like it’s actually 30 seconds, not 10s. Or maybe it was 10s once upon a time and they changed it to 30?

          The General Order requires that reporting entities file incident reports for crashes involving ADS-equipped vehicles that occur on publicly accessible roads in the United States and its territories. Crashes involving an ADS-equipped vehicle are reportable if the ADS was in use at any time within 30 seconds of the crash and the crash resulted in property damage or injury

          https://www.nhtsa.gov/sites/nhtsa.gov/files/2022-06/ADAS-L2-SGO-Report-June-2022.pdf
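
The reporting rule quoted above reduces to a simple predicate; a sketch under the NHTSA wording (function and parameter names are mine):

```python
ADS_WINDOW_S = 30  # "in use at any time within 30 seconds of the crash"

def is_reportable(seconds_since_ads_disengaged: float,
                  property_damage_or_injury: bool) -> bool:
    """True when a crash must be reported under the General Order: the ADS
    was in use within 30 seconds AND the crash caused damage or injury."""
    return (seconds_since_ads_disengaged <= ADS_WINDOW_S
            and property_damage_or_injury)

# Autopilot dropping out a fraction of a second before impact still counts:
print(is_reportable(0.1, True))   # True
# Disengaged a full minute before the crash: outside the window.
print(is_reportable(60.0, True))  # False
```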

          • FuglyDuck
            4 points · 1 month ago

            Thanks for that.

            The thing is, though, the NHTSA generally doesn’t make a determination on criminal or civil liability. They’ll make the report about what happened, keep it to the facts, and let the courts sort out who’s at fault. They might not even actually investigate a crash unless it comes to it. It’s just saying “when your car crashes, you need to tell us about it,” and they kinda assume everyone complies.

            Which Tesla doesn’t want to do, and is one of the reasons Musk/DOGE is going after them.

            • @NotMyOldRedditName@lemmy.world
              3 points · edited · 1 month ago

              I knew they wouldn’t necessarily investigate it, that’s always their discretion, but I had no idea there was no actual bite to the rule if they didn’t comply. That’s stupid.

              • @AA5B@lemmy.world
                1 point · 1 month ago

                Generally things like that are meant more to identify a pattern. It may not be useful to an individual, but very useful to determine a recall or support a class action

          • @oatscoop@midwest.social
            1 point · 1 month ago

            I get the impression it disengages so that Tesla can legally say “self driving wasn’t active when it crashed” to the media.

      • @NotMyOldRedditName@lemmy.world
        3 points · edited · 1 month ago

        AEB (automatic emergency braking) was originally designed not to prevent a crash, but to slow the car when an unavoidable crash was detected.

        It’s since gotten better and can also prevent crashes now, but slowing the speed of the crash was the original important piece. It’s a lot easier to predict an unavoidable crash than to detect a potential crash and stop in time.

        Insurance companies offer a discount for having any type of AEB as even just slowing will reduce damages and their cost out of pocket.

        Not all AEB systems are created equal though.

        Maybe disengaging AP if an unavoidable crash is detected is what triggers the AEB system? Like maybe for AEB (which should always be running) to take over, AP has to be off?
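
The value of slowing even an unavoidable crash follows from kinetic energy growing with the square of speed; a toy illustration (the speeds are mine, purely illustrative):

```python
def remaining_energy_fraction(impact_speed: float, initial_speed: float) -> float:
    """Fraction of the original kinetic energy left at the reduced impact
    speed (kinetic energy is proportional to speed squared)."""
    return (impact_speed / initial_speed) ** 2

# AEB scrubbing speed from 100 km/h down to 70 km/h before impact:
print(round(remaining_energy_fraction(70, 100), 2))  # 0.49, about half the energy
```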

      • FuglyDuck
        2 points · 1 month ago

        So, as others have said, it takes time to brake. But also, generally speaking autonomous cars are programmed to dump control back to the human if there’s a situation it can’t see an ‘appropriate’ response to.

        what’s happening here is the “oh shit, there’s no action that can stop the crash” moment, because braking takes time (hell, even coming to that decision takes time; activating the whoseitwhatsits that activate the brakes takes time). The normal thought is, if there’s something it can’t figure out on its own, it’s best to let the human take over. It’s supposed to make that decision well before, though.

        However, as for why tesla is doing that when there’s not enough time to actually take control?

        It’s because liability is a bitch. Given how many teslas are on the road, even a single ruling of “yup it was tesla’s fault” is going to start creating precedent, and that gets very expensive, very fast. especially for something that can’t really be fixed.

        for some technical perspective, I pulled up the frame rates on the camera system (I’m not seeing the frame rate on the cabin camera specifically, but it seems to be either 36 in older models or 24 in newer).

        14 frames @ 24 fps is about 0.6 seconds; @ 36 fps, it’s about 0.4 seconds. For comparison, the average human reaction time just to see a change and click a mouse is about 0.3 seconds. If you add in needing to assess the situation… that’s going to be significantly more time.
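
Those timing figures are easy to verify; a quick sketch (frame counts and rates are from the comment, as is the ~0.3 s reaction figure):

```python
def frames_to_seconds(frames: int, fps: float) -> float:
    """Duration of a given number of frames at a given frame rate."""
    return frames / fps

print(round(frames_to_seconds(14, 24), 2))  # 0.58 s on the newer 24 fps cameras
print(round(frames_to_seconds(14, 36), 2))  # 0.39 s on the older 36 fps cameras
# Either window is barely longer than a ~0.3 s simple see-and-click reaction,
# before any time spent actually assessing the situation.
```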

  • comfy
    115 points · 1 month ago

    I hope some of you actually skimmed the article and got to the “disengaging” part.

    As Electrek points out, Autopilot has a well-documented tendency to disengage right before a crash. Regulators have previously found that the advanced driver assistance software shuts off a fraction of a second before making impact.

    It’s a highly questionable approach that has raised concerns over Tesla trying to evade guilt by automatically turning off any possibly incriminating driver assistance features before a crash.

    • @endeavor@sopuli.xyz
      31 points · edited · 1 month ago

      It’s a highly questionable approach that has raised concerns over Tesla trying to evade guilt by automatically turning off any possibly incriminating driver assistance features before a crash.

      That is like writing that Musk made an “awkward, confused gesture” at what a few people might call a questionable time and place.

    • @LemmyFeed@lemmy.dbzer0.com
      23 points · 1 month ago

      Don’t get me wrong, autopilot turning itself off right before a crash is sus and I wouldn’t put it past Tesla to do something like that (I mean come on, why don’t they use lidar) but maybe it’s so the car doesn’t try to power the wheels or something after impact which could potentially worsen the event.

      On the other hand, they’re POS cars and the autopilot probably just shuts off cause of poor assembly, standards, and design resulting from cutting corners.

      • 74 183.84
        13 points · 1 month ago

        I see your point, and it makes sense, but I would be very surprised if Tesla did this. I think the best option would be to turn off the features once an impact is detected. It shutting off beforehand feels like a cheap ploy to avoid guilt.

        • FuglyDuck
          7 points · 1 month ago

          … It shutting off beforehand feels like a cheap ploy to avoid guilt

          that’s exactly what it is.

      • @skuzz@discuss.tchncs.de
        12 points · 1 month ago

        Normal cars do whatever is in their power to cease movement while facing upright. In a wreck, the safest state for a car is to cease moving.

      • Krzd
        9 points · 1 month ago

        Wouldn’t it make more sense for autopilot to brake and try to stop the car instead of just turning off and letting the car roll? If it’s certain enough that there will be an accident, just applying the brakes until there’s user override would make much more sense…

        • @Dultas@lemmy.world
          1 point · 1 month ago

          False positives. Most likely it detected something was off (a parking sensor detected something, for example) but didn’t have high confidence it wasn’t an erroneous sensor reading. You don’t want the car slamming on the brakes at highway speed for no reason and causing a multi-car pileup.

      • @T156@lemmy.world
        6 points · 1 month ago

        Rober seems to think so, since he says in the video that it’s likely disengaging because the parking sensors detect the object in front and conclude the car is parked, which shuts off the cruise control.

    • 74 183.84
      7 points · edited · 1 month ago

      It always is that way; fuck the consumer, it’s all about making a buck.

    • @PersnickityPenguin@lemm.ee
      6 points · 1 month ago

      Yeah but that’s milliseconds. Ergo, the crash was already going to happen.

      In any case, the problem with Tesla autopilot is that it doesn’t have radar. It can’t see objects and there have been many instances where a Tesla crashed into a large visible object.

    • NιƙƙιDιɱҽʂ
      -2 points · 1 month ago

      I’ve heard that too, and I don’t doubt it, but watching Mark Rober’s video, it seems like he’s deathgripping the wheel pretty hard before the impact, which seems more likely to be what’s disengaging it. Each time, you can see the wheel tug slightly to the left, but his deathgrip pulls it back to the right.

  • Eugene V. Debs' Ghost
    102 points · 1 month ago

    “Dipshit Nazis mad at facts bursting their bubble of unreality” is another way of reading this headline.

  • @ZeroGravitas@lemm.ee
    76 points · 1 month ago

    Painted wall? That’s high tech shit.

    I got a Tesla from my work before Elon went full Reich 3, and it would:

    • brake on bridge shadows on the highway
    • start wipers on shadows, but not on rain
    • brake on cars parked on the roadside if there’s a bend in the road
    • disengage autopilot and brake when driving towards the sun
    • change set speed at highway crossings because fuck the guy behind me, right?
    • engage emergency braking if a bike waits to cross at the side of the road

    To which I’ll add:

    • moldy frunk (short for fucking trunk, I guess?), no ventilation whatsoever, water comes in, water stays in
    • pay attention noises for fuck-all reasons masking my podcasts and forcing me to rewind
    • the fucking cabin camera nanny - which I admittedly disabled with some chewing gum
    • the worst mp3 player known to man, the original Winamp was light years ahead - won’t index, won’t search, will reload USB and lose its place with almost every car start
    • bonkers UI with no integration with Android or Apple - I’m playing podcasts via low rate Bluetooth codecs, at least it doesn’t matter much for voice
    • unusable airco in auto mode, insists on blowing cold air in your face

    Say what you want about European cars, at least they got usability and integration right. As did most of the auto industry. Fuck Tesla, never again. Bunch of Steve Jobs wannabes.

    • @OwlHamster@lemm.ee
      4 points · 1 month ago

      Frunk is short for front trunk. The mp3 issues mostly go away if you pay for LTE on the car. The rest of the issues I can attest to. Especially randomly changing the cruise control speed on a highway because Google Maps says so, I guess? Just hard braking at high speeds for no fucking reason.

      • @limelight79@lemm.ee
        1 point · 1 month ago

        Our Mazda 3’s adaptive cruise thought a car that was exiting was in our lane and hit the brakes, right in front of a car I had just passed. Sorry, dude, I made the mistake of trusting the machine.

        Incidents like that made me realize how far we have to go before self driving is a thing. Before we got that car, I thought it was just around the corner, but now I see all the situations that car identifies incorrectly, and it’s like, yeah, we’re going to be driving ourselves for a long time.

      • @ZeroGravitas@lemm.ee
        1 point · 1 month ago

        I know what frunk stands for: funky trunk, given the smell. And I had premium connectivity included; it doesn’t do squat unless you use Spotify, which no thanks (for different reasons). I have carefully curated “car music” on a USB drive, but noo. Search will only return Spotify results.

      • dantheclamman
        1 point · 1 month ago

        Had a situation driving behind a Tesla on a freeway in clear conditions with no other cars nearby, where they suddenly braked, strongly. I actually had to swerve to go around him. He looked over at me sheepishly. I was a skeptic about the FSD concerns before that happened. Now I try to be cautious, but there are so many Teslas on the road now, I can’t double check that all of them won’t suddenly freak out

  • @wabafee@lemmy.world
    50 points · edited · 1 month ago

    I bet the real reason he doesn’t want LiDAR in the car is that it looks ugly aesthetically.

    • @AngryCommieKender@lemmy.world
      72 points · edited · 1 month ago

      It costs too much. It’s also why you have to worry about panels falling off the swastitruck if you park next to them. They also apparently lack any sort of rollover frame.

      He doesn’t want to pay for anything, including NHTSA crash tests.

      It’s literally what Drumpf would have created if he owned a car company. Cut all costs, disregard all regulations, and make the public the alpha testers.

      • @Ledericas@lemm.ee
        23 points · 1 month ago

        It did cost too much at the time, but currently he doesn’t want to do it because he would have to admit he’s wrong.

      • @werefreeatlast@lemmy.world
        link
        fedilink
        English
        21
        1 month ago

        The guy bankrupted a casino, not by playing against it and being super lucky, but by owning it. Virtually everything he has ever touched in business has turned to shit. How do you ever in the living fuck screw up steaks at Costco? My cousin with one good eye and a working elbow could do it.

        And now it’s the country’s second try. This time unhinged, with all the training wheels off. The guy is stepping on the pedal while stripping the car for parts and giving away the fuel. The guy doesn’t even drive; he just fired the chauffeur and is dismantling the car from the inside with a shotgun…full steam ahead onto a nice brick wall and an infinity cliff, ready to take us all with him. And Canada and Mexico and Gina. Three and three quarters of a year more of daily atrocities and lawbreaking. At least Hitler boy brought back the astronauts.

        • @AngryCommieKender@lemmy.world
          link
          fedilink
          English
          12
          edit-2
          1 month ago

          I was mostly lambasting fElon, not Drumpf. You’re correct on Drumpf though. I was discussing the swastitruck, after all. Drumpf showed that he’s scared to drive any of the swasticars when he pretended to know how to sell anything, much less an EV.

          Oh, and Drumpf bankrupted 3-4 casinos in the late '80s to early '90s in Atlantic City, NJ. Literally the golden age of AC casinos.

      • @mcz@lemmy.world
        link
        fedilink
        English
        7
        1 month ago

        Sorry but I don’t get it. You can get a robot vacuum with lidar for $150. I understand automotive lidars need more reliability, range, etc., but I don’t understand how it’s not even an option on a $30k car.

        • @gamer@lemm.ee
          link
          fedilink
          English
          7
          1 month ago

          This is Elon we’re talking about. Why pay a few hundred bucks to improve safety when it’s cheaper and easier to fight the lawsuits when people die?

        • @NotMyOldRedditName@lemmy.world
          link
          fedilink
          English
          4
          edit-2
          1 month ago

          They were much more expensive years ago, when the decision was made not to use it. Costs have come down a lot. And cars may need more than one if you’re going to use it. That also means more compute, so a stronger computer and more power draw, meaning less mileage, which means a bigger battery for the same mileage. It all adds up.

          Edit: might even impact aerodynamics, which again means more battery, which is more expensive.

          • @faultyproboscus@sh.itjust.works
            link
            fedilink
            English
            4
            edit-2
            1 month ago

            The power draw to process the LIDAR data is negligible compared to the energy used to move the car. 250-300 Watt hours per mile is what it takes to move an electric sedan on average. You might lose a mile of range over an hour of driving, and that’s if you add the LIDAR system without reducing the optical processing load.

            LIDAR sensor housing can be made aerodynamic.

            While it’s true that LIDAR was more expensive when they started work on self-driving, it doesn’t make sense for them to continue down this path now. It’s all sunk cost fallacy and pride at this point.

            • @NotMyOldRedditName@lemmy.world
              link
              fedilink
              English
              2
              1 month ago

              A mile per hour is probably about right, but that’s probably per lidar. Waymo has 4, for example, so on a 300-mile vehicle that could be 17 miles at 70 mph.

              Even if you can make it aerodynamic it’s still not going to be as aerodynamic as it not being there.

              Sunk cost fallacy makes sense, but I’d say it’s also the fear of the massive lawsuit/upgrade cost if he’s proven wrong, given his statements.
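The back-of-the-envelope math in this subthread can be sketched out; the figures used (≈250 Wh/mi consumption, ≈250 W of extra draw per lidar unit, four units, a 300-mile trip at 70 mph) are the thread's rough assumptions, not measured values:

```python
# Rough range-loss estimate for extra sensor/compute draw, using the
# thread's assumed figures (not measured values).

def range_loss_miles(trip_miles, speed_mph, extra_watts, wh_per_mile):
    """Energy the extra hardware burns over a trip, expressed as miles of range."""
    hours = trip_miles / speed_mph      # time spent driving
    extra_wh = extra_watts * hours      # energy consumed by the sensors
    return extra_wh / wh_per_mile       # converted back into miles of range

# Four lidar units at ~250 W each, on a 300-mile trip at 70 mph,
# in a sedan consuming ~250 Wh per mile:
loss = range_loss_miles(300, 70, 4 * 250, 250)
print(round(loss, 1))  # ~17 miles, in line with the figure quoted above
```

Swap in your own numbers; the point is just that ~1 kW of continuous draw is small next to the roughly 17.5 kW (250 Wh/mi × 70 mph) the drivetrain itself uses at highway speed.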

              • @faultyproboscus@sh.itjust.works
                link
                fedilink
                English
                3
                1 month ago

                I tried to look up how much power these self driving systems are pulling, but it looks like that will require a deeper dive. The only results I got from a quick search were from 2017-2018, and the systems were pulling around 2 kW. I’m sure that’s come down in the 7-8 years since, but I don’t know how much.

                I think you’re right on the lawsuit/upgrade cost. They are on the hook to supply Full Self Driving to all the buyers who bought the option. It’s clear they’re not going to be able to provide it. It looks like there are several class-action lawsuits currently underway.

                • @NotMyOldRedditName@lemmy.world
                  link
                  fedilink
                  English
                  2
                  edit-2
                  1 month ago

                  I think the older Tesla system (HW3) was around 300 W, but I think the newer system draws more now as they beefed up the compute; I haven’t seen a number on that. The old system is pretty much maxed out, though, with no room to grow other than making things more efficient vs just using more raw power.

                  A lot of the older hardware back then wasn’t purpose-built for driving and was more repurposed general graphical compute, so it was less efficient, hence the 2 kW you were seeing. Tesla built ASICs for the driving computer to bring costs and power usage down.

                  With the newer purpose-built Nvidia stuff, I’m sure the power draw has come down a lot, likely relatively close (better or worse, I don’t know) to Tesla’s performance per watt.

                  edit: clarity

        • @Wispy2891@lemmy.world
          link
          fedilink
          English
          3
          1 month ago

          he does not want to pay $1 for rain sensors and $2 for ultrasonic parking sensors; any price for lidar must be unacceptable

        • @yonder@sh.itjust.works
          link
          fedilink
          English
          3
          1 month ago

          IIRC robot vacuums usually use a single Time of Flight (ToF) sensor that rotates, giving the robot a 2D scan of its surroundings. This is sufficient for a vacuum, which only needs to operate on a flat surface, but self-driving vehicles need a better understanding of their surroundings than just a thin slice.

          That’s why cars might use over 30 distinct ToF sensors, each at a different vertical angle, all placed in the rotating module, giving the system a full 3D scan of its surroundings. I would assume those modules are much more expensive, though still insignificant compared to the cost of a car sold on the idea of self-driving.
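The difference described above can be sketched with a little geometry; the function and angle values below are illustrative, not any particular sensor's spec:

```python
import math

def polar_to_xyz(range_m, azimuth_deg, elevation_deg):
    """Convert one ToF return (range plus two angles) into a 3D point.
    A robot vacuum's single rotating beam always has elevation 0, so its
    scan is a flat 2D slice; stacking beams at different vertical angles
    is what turns the rotating module into a full 3D scanner."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    return (range_m * math.cos(el) * math.cos(az),   # x: forward
            range_m * math.cos(el) * math.sin(az),   # y: left
            range_m * math.sin(el))                  # z: up

# Vacuum-style single channel: every return lands at z == 0.
flat_point = polar_to_xyz(5.0, 30.0, 0.0)

# One azimuth column of a hypothetical 16-channel unit: same heading,
# different vertical angles, i.e. a vertical slice of the scene.
column = [polar_to_xyz(5.0, 30.0, el) for el in range(-15, 16, 2)]
```

Sweeping that column through all azimuths as the module spins is what produces the point cloud an automotive lidar works with.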

    • @nova_ad_vitum@lemmy.ca
      link
      fedilink
      English
      3
      1 month ago

      You don’t necessarily need to implement lidar the way Waymo does, with the spinning sensor. iPad Pros have them. They could have at least put a few of these on the front without significantly affecting aesthetics.

      • @Rob1992@lemmy.world
        link
        fedilink
        English
        4
        1 month ago

        Everyone and their dog uses radar for distance sensing for adaptive cruise control. You take the same high-speed sensor and use it for wall detection. It’s how the emergency stop functions work, where it detects a car in front of you slamming on the brakes.

  • @buddascrayon@lemmy.world
    link
    fedilink
    English
    37
    1 month ago

    It’s a highly questionable approach that has raised concerns over Tesla trying to evade guilt by automatically turning off any possibly incriminating driver assistance features before a crash.

    So, who’s the YouTuber that’s gonna test this out? Since Elmo has pushed his way into the government in order to quash any investigation into it.

  • Banana
    link
    fedilink
    English
    30
    1 month ago

    And the president is driving one of these?

    Maybe we should be purchasing lots of paint and cement blockades…

    • @Chewget@lemm.ee
      link
      fedilink
      English
      16
      1 month ago

      The president can’t drive by law unless on the grounds of the White House and maybe Camp David. At least while in office. They might be allowed to drive after leaving office…

      • FuglyDuck
        link
        fedilink
        English
        6
        edit-2
        1 month ago

        This isn’t true at all. I can’t tell if you’re being serious or incredibly sarcastic, though.

        The reason presidents (and generally ex presidents, too) don’t drive themselves is because the kind of driving to escape an assassination attempt is a higher level of driving and training than what the vast majority of people ever have. There’s no law saying presidents are forbidden from driving.

        In any case, I would be perfectly happy if they let him drive a CT and it caught fire. I’d do a little jig, and I wouldn’t care who sees it.

          • FuglyDuck
            link
            fedilink
            English
            5
            1 month ago

            you’re gonna have to drop a source for that.

            because, no, they’re not. the Secret Service provides a driver specially trained for the risks a president might face, and very strongly insists, but they’re not “prohibited” from driving simply because they’re presidents.

            to be clear, the secret service cannot prohibit the president from doing anything they really want to do. Even if it’s totally stupid for them to do that. (This includes, for example, Trump’s routine weekend round of golf at Turd-o-Lardo)

            • @alcoholicorn@lemmy.ml
              link
              fedilink
              English
              2
              edit-2
              1 month ago

              to be clear, the secret service cannot prohibit the president from doing anything they really want to do

              Was Trump lying when he said the SS wouldn’t take him back to the Capitol on Jan 6?

              I could definitely see him lying about that so he doesn’t look like he abandoned his supporters during the coup, but I could also see the driver being like “I can’t endanger you, mr president” and ignoring his requests.

              • FuglyDuck
                link
                fedilink
                English
                4
                1 month ago

                Was Trump lying when he said the SS wouldn’t take him back to the Capitol on Jan 6?

                Definitely not. There is no way in hell the secret service would have taken the president to that shit show. Doesn’t mean that they would have physically arrested him if he insisted going on his own, however.

        • @Chewget@lemm.ee
          link
          fedilink
          English
          1
          1 month ago

          Mostly sarcastic, but there is a Secret Service rule that the president is not allowed to drive on public roads. The rest is all debatable because it hasn’t been litigated. They answer to the president, but a president cannot refuse Secret Service protection.

          The current understanding is that they can strongly suggest things, but ultimately, they have to figure it out if the president doesn’t follow.

    • @LeninOnAPrayer@lemm.ee
      link
      fedilink
      English
      16
      edit-2
      1 month ago

      When he was in the Tesla asking if he should go for a ride I was screaming “Yes! Yes Mr. President! Please! Elon, show him full self driving on the interstate! Show him full self driving mode!”

  • /home/pineapplelover
    link
    fedilink
    English
    25
    edit-2
    1 month ago

    To be fair, if you were to construct a wall and paint it exactly like the road, people would run into it as well. That being said, Tesla shouldn’t rely on cameras.

    Edit: having just watched the video, that was a very obvious fake wall. You can see the outlines of it pretty well. I’m also surprised it failed other tests when not on Autopilot; seems pretty fucking dangerous.

    • FuglyDuck
      link
      fedilink
      English
      22
      edit-2
      1 month ago

      To be fair, if you were to construct a wall and paint it exactly like the road, people will run into it as well.

      this isn’t being fair. It’s being compared to the other, better autopilot systems that use both LIDAR and radar in addition to daylight and infrared optical to sense the world around them.

      Teslas only use daylight and infrared. LIDAR and radar systems both would not have been deceived.

      • @AA5B@lemmy.world
        link
        fedilink
        English
        -3
        1 month ago

        I don’t see how this test demonstrates anything is better. It is a gimmick designed for a specific sensor to get an intended result. Show me a realistic test if they want to be useful, or go full Looney Tunes if they want to be entertaining.

        • @oatscoop@midwest.social
          link
          fedilink
          English
          3
          1 month ago

          The cartoon wall is just an exaggerated illustration of why a vision-only system sucks compared to other systems.

          If you watch the rest of the video, you’d see Tesla’s vision only system is inferior to cars using radar/lidar in other, more realistic situations like heavy rain and fog.

    • comfy
      link
      fedilink
      English
      12
      edit-2
      1 month ago

      The video does bring up human ability too with the fog test (“Optically, with my own eyes, I can no longer see there’s a kid through this fog. The lidar has no issue.”) But, as they show, this wall is extremely obvious to the driver.

        • @T156@lemmy.world
          link
          fedilink
          English
          6
          edit-2
          1 month ago

          They already have trouble enough with trucks carrying traffic lights, or with speed limit stickers on them.

          • FuglyDuck
            link
            fedilink
            English
            5
            1 month ago

            and fire trucks. and little kids. and, uh, lots of things really.

    • @utopiah@lemmy.world
      link
      fedilink
      English
      8
      edit-2
      1 month ago

      I’d take that bet. I imagine at least some drivers would notice something sus’ (due to depth perception, which should be striking as you get close, or lack of ANY movement or some kind of reflection) and either

      • slow down
      • use a trick, e.g. flicking lights or driving a bit to the sides and back, to try to see what’s off

      or probably both, but anyway as other already said, it’s being compared to other autopilot systems, not human drivers.

    • @TorJansen@sh.itjust.works
      link
      fedilink
      English
      6
      1 month ago

      Yeah, the Roadrunner could easily skip by such barriers, frustrating the Coyote to no end. Tesla is not a Roadrunner.

      • FuglyDuck
        link
        fedilink
        English
        4
        1 month ago

        in the world of coyotes, be a roadrunner.

        MEEP MEEP.

    • FuglyDuck
      link
      fedilink
      English
      3
      1 month ago

      I don’t think he’s actually driving it. Might even have been a rental.

      • @Manalith@midwest.social
        link
        fedilink
        English
        2
        1 month ago

        He said it was his own car when he was interviewing with Phillip DeFranco. He also said he’s still planning on getting another Tesla when an updated model comes out.

        • FuglyDuck
          link
          fedilink
          English
          1
          1 month ago

          sometimes, engineers can be brilliant without being the most practical of people.

          This is practically what rentals are made for! (ahwell. explains why they used foam with break-away cuts. lol)

    • @Rob1992@lemmy.world
      link
      fedilink
      English
      2
      1 month ago

      They sort of do with the Superchargers already. If someone has a car that’s deemed too damaged, it’s disconnected.

  • @ABetterTomorrow@lemm.ee
    link
    fedilink
    English
    18
    1 month ago

    I can’t wait for all this brand loyalty and fan-people culture to end. Why is this even a thing? Like talking about box office results, companies’ financials, and stocks… If you’re not one of their investors, just stop. It sounds like you’re working for them for free.

    • @Tattorack@lemmy.world
      link
      fedilink
      English
      3
      1 month ago

      Brand loyalty is for suckers. But it’s what brands prey on. They foster brand loyalty. And unfortunately there are a lot of suckers.

    • @ameancow@lemmy.world
      link
      fedilink
      English
      3
      1 month ago

      I can’t wait for all this brand loyalty and fan people culture to end.

      My blackest pill in my adult life was the realization that we’ve leveled off as a species. This is as good as it gets.

      Our brains made monumental leaps in development over the last half-million years, with the strongest changes coming during the last ice age: times when resources were scarce, survival was extremely difficult, and humanity was caught up in many wars and fights with other humans and animals and weather alike. Our brains were shaped to do a couple of things better than others: invent stories to explain feelings, and join communities. These adaptations worked amazingly; they allowed us to band together and pool resources, to defend each other and spot signs of danger. They allowed us to develop language and agriculture and formed our whole society, but let’s not forget what they are at heart: brains invent stories to explain feelings, and we all want social identity and an in-group. Deeply. This shit is hardwired into us.

      Nearly every major societal problem we have today can be traced back to this response system: the average human brain invents a story to explain a discomfort, and that discomfort is often just the simple desire to have a group identity.

      Our world will get more complicated, but our brains aren’t moving. We can only push brains so far. They’re not designed to know how to form words and do calculus, we trained our brains to do those things, but our systems are far more complicated than language and calculus. Complex problems produce results like lack of necessities, which create negative feelings, which the brain invents stories to explain (or are provided stories by the ruling class.)

      So this is it. Nobody is coming. Nothing is changing.

      We MIGHT be able to rein in our worst responses over enough time, we MIGHT be able to form large enough groups with commonalities that we achieve tenuous peace. But we will never be a global species, we will never form a galactic empire, we will never rise above war and hate and starvation and greed. Not in our current forms at least. There’s no magic combination of political strategies and social messages that will make everyone put down their clubs and knives.

      This is it, a cursed, stupid primate on a fleck of dust spinning around a spark in a cloud of sparks, just looking at every problem like it’s either a rival tribe or a sabertooth-cat hiding in the bushes. Maybe if we don’t destroy ourselves someday our AI descendants will go out into the larger universe, but it certainly won’t be us.

      • @ABetterTomorrow@lemm.ee
        link
        fedilink
        English
        2
        1 month ago

        Well said. Thank you for sharing. This is a nice piece to help people self-reflect once in a while; it feels… grounding. Curious what the positive sequel would be…

        • @ameancow@lemmy.world
          link
          fedilink
          English
          2
          edit-2
          1 month ago

          Also, I realized recently that because all of our current stories, all of our current narratives of “forces of good versus evil” and all the political drama and inexplicable human decisions we see being made in the highest levels of power, are actually really dumb stories of people just saying shit and trying to be liked… this isn’t even new, these people doing all this stupid, absurd bullshit are genetically identical to the creatures who ruled empires in the past, who led armies, who had songs written about them going down thousands of years… so that tells me that all our great epics are probably mostly bullshit, and the reality was a lot more stupid.

          The idea that most of history was stupid people doing stupid things and then writing fancy stories about it later, that is also strangely reassuring. These are just people, just idiots like the rest of us. Everyone is just improvising as we go and trying to make the best of it.

        • @ameancow@lemmy.world
          link
          fedilink
          English
          2
          edit-2
          1 month ago

          Curious what the positive sequel would be…

          On good days, I remember that the sheer finality and certainty of the state of our world, our universe, the idea that we may not even have free will at all and this is all just an inexplicable, passing moment of a universe becoming aware of itself, the grandeur of it does more for me than any religious ideas or poems or songs or inspirational messages. It’s wholly absurd and beautiful, and we exist in the intersection of scales so immense they cannot be fathomed by our primitive minds… these ideas make all the struggles, pains and hardships I experience feel a lot less tangible.

          I am aware that a lot of this sensation is simple disassociation from depression, but it’s not necessarily a bad thing, disassociation is either a survival trick to keep us alive when our minds get the best of us, or it’s a consequence of something actual and special inside us, a type of awareness that seems to reside just outside of the things we can quantify and explain, a sum greater than its parts. Disassociation is like standing just outside yourself watching the story unfold, and it used to terrify me, now I realize it can’t be helped, we’re all on tracks and it’s just a ride. But if you look around, it can be a beautiful ride, even the shitty parts exist as a strange kind of reminder that the universe doesn’t owe us anything, take it or leave it, it’s all just experiences.