• @deafboy@lemmy.world
    78 · 1 year ago

    And they managed to do it without us obsessing about their CEO several times a day? I refuse to believe that!

  • @cAUzapNEAGLb@lemmy.world
    67 · 1 year ago (edited)

    As of April 11, there were 65 Mercedes autonomous vehicles available for sale in California, Fortune has learned through an open records request submitted to the state’s DMV. One of those has since been sold, which marks the first sale of an autonomous Mercedes in California, according to the DMV. Mercedes would not confirm sales numbers. Select Mercedes dealerships in Nevada are also offering the cars with the new technology, known as “level 3” autonomous driving.

    Drivers can activate Mercedes’s technology, called Drive Pilot, when certain conditions are met, including in heavy traffic jams, during the daytime, on specific California and Nevada freeways, and when the car is traveling less than 40 mph. Drivers can focus on other activities until the vehicle alerts them to resume control. The technology does not work on roads that haven’t been pre-approved by Mercedes, including on freeways in other states.

    U.S. customers can buy a yearly subscription of Drive Pilot in 2024 EQS sedans and S-Class car models for $2,500.

    Mercedes is also working on developing level 4 capabilities. The automaker’s chief technology officer Markus Schäfer expects that level 4 autonomous technology will be available to consumers by 2030, Automotive News reported.

    • @Ilovethebomb@lemm.ee
      41 · 1 year ago

      Hmm, so only on a very small number of predetermined routes, and at very slow speeds for those roads.

      Still impressive, but not as impressive as the headline makes out.

      • VindictiveJudge
        35 · 1 year ago

        And definitely not worth the $2500 a year they’re asking for the feature.

      • Turun
        6 · 1 year ago

        Yes, but it’s actually level 3.

        Not the Tesla “full self driving - no wait we actually lied to you, you need to be alert at all times” bullshit.

    • @jballs@sh.itjust.works
      21 · 1 year ago

      I’ve seen this headline a few times and the details are laughably bad. The only reason this can be getting any press is because the headline is good clickbait. But a 40 mph top speed, on approved roads in two states, only if a car is in front of you, and only in the daytime, is entirely useless. I guess it’s a good first step maybe? But writing headlines that make this sound like big news is sad.

      • Turun
        7 · 1 year ago

        The reason this gets attention is because it’s the first level 3 sold to consumers.

        The tech is hard, of course it’s gonna start out with laughably limited capabilities. But it’s the first step towards more automation.

      • @conciselyverbose@sh.itjust.works
        4 · 1 year ago (edited)

        It’s starting in California where there are a meaningful number of high earners who are spending hours per day in 4 lane bumper to bumper traffic.

        Having actual autonomy during those hours is still shit. But it’s a hell of a lot less shit than the tedium of the high attention requirements of sitting in traffic at a crawl.

  • @eee@lemm.ee
    56 · 1 year ago

    U.S. customers can buy a yearly subscription of Drive Pilot in 2024 EQS sedans and S-Class car models for $2,500

    yeah, fuck that.

      • @MeatsOfRage@lemmy.world
        21 · 1 year ago

        They’re also accepting full liability if anything happens while using this feature, so it’s actually a type of insurance.

        • @Corkyskog@sh.itjust.works
          4 · 1 year ago

          I wonder how much cheaper it will make auto insurance. I also wonder if this will open up transportation options for those who have lost their license.

          • @conciselyverbose@sh.itjust.works
            6 · 1 year ago

            Not this. It’s limited to specific scenarios on specific roads. So you’re going to need a licensed driver.

            Eventually with actually full self driving? I’d hope so, though it’s going to take legislation first.

        • @explodicle@sh.itjust.works
          2 · 1 year ago

          I kinda like that system because eventually people will put their own OSes on the car, which the manufacturer obviously can’t cover. Having separate insurance/service eliminates having to pay for it if you’re accepting the liability yourself.

        • @jkrtn@lemmy.ml
          2 · 1 year ago

          Ok, then I’ll do it if I don’t have to pay for other insurance on the car.

        • @jj4211@lemmy.world
          1 · 1 year ago

          The conditions for the system to work are such that if you could find a policy to cover only those conditions, it’d probably just be like a couple dollars a month. Even behaving “badly” you would be unlikely to have an accident, and even if you caused one, it’s probably just going to be a couple thousand in property damage with no medical implications.

    • @AA5B@lemmy.world
      13 · 1 year ago

      Have you seen Tesla’s price for full self driving? And they don’t take liability

    • Richard
      10 · 1 year ago

      I think you can afford that if you own an EQS

    • Cosmic Cleric
      3 · 1 year ago

      Paywalled.

      On a different subject, why would someone downvote a one-word comment that accurately describes that the content is behind a paywall?

      • @stoly@lemmy.world
        4 · 1 year ago

        There are people who are pathologically contrarian. I’ve had to end a friendship over it—the endless need to say something negative about literally everything that ever happens and an unwillingness to be charitable to others.

      • A Wild Mimic appears!
        1 · 1 year ago

        I have a theory that archive.is, waybackmachine and 12ft.io are no secret anymore, and that just posting “paywalled” comes across as too lazy to copy/paste or (a lot easier) to use this addon to reduce the work to a click. I don’t mind, but I can understand why others might see it that way.

        • Cosmic Cleric
          -2 · 1 year ago (edited)

          and that just posting “paywalled” comes across as too lazy to copy/paste

          Blaming the victim, and justifying paywalls.

          or (a lot easier) to use this addon to reduce the work to a click.

          My phone browser doesn’t use add-ons.

          i dont mind

          And yet, you took the time out to reply, to chastise me for saying it.

          Anti Commercial-AI license (CC BY-NC-SA 4.0)

          • A Wild Mimic appears!
            0 · 1 year ago (edited)

            Sheesh, you are quite aggressive; I did not want to offend. And as I said, I don’t mind it, I even posted the archive link, for which you thanked me. Check your target before firing, mate :-)

            (Also, there’s always Firefox mobile. Can Apple users use it with addons/the Firefox browser engine now? I don’t follow Apple development actively.)

  • @merthyr1831@lemmy.world
    35 · 1 year ago

    Love how companies can decide who has to supervise their car’s automated driving and not an actual safety authority. Absolutely nuts.

        • @DreamlandLividity@lemmy.world
          -2 · 1 year ago

          You can’t have a babysitter following every human to make sure they don’t do something dangerous. Except for high risk areas, liability is the most practical option.

            • @DreamlandLividity@lemmy.world
            -5 · 1 year ago (edited)

              So you want to read a 50-page regulation about how to boil water in your home because boiling water can hurt people?

              And how do you regulate AI when you have no idea how it works or what could go wrong? It’s not as if politicians are AI experts. Driving itself is already heavily regulated; the AI has to follow traffic rules just like anyone else, if that is what you are thinking.

              • @DrinkMonkey@lemmy.ca
              4 · 1 year ago

                Why do you believe that judges (or even juries made of lay people) can make sense of the very things that you’re so confident legislators or regulators cannot?

                I’m not saying regulation is perfect, and as a result, certainly there is a role for judicial review. But come on, man…lots of non sequiturs and straw men in your argument.

                • @DreamlandLividity@lemmy.world
                1 · 1 year ago

                  Quite often, juries don’t have to rule on technical matters. Juries will have available the company’s internal communications, testimony from the engineers working on the project, etc. If safety concerns were being ignored, you can usually find enough witnesses and documents proving so.

                  On the other hand, how do you even begin to regulate something that is only in the process of being invented? What would the regulation look like?

    • @Trollception@lemmy.world
      13 · 1 year ago

      Who said there was no safety authority involved? I thought it was part of the 4 level system the government decided on for assisted driving.

  • @nucleative@lemmy.world
    28 · 1 year ago

    Wonder how this works with car insurance. Is there a future where the driver doesn’t need to be insured? Can the vehicle software still be “at fault”, and how will the actuaries deal with assessing this new risk?

      • HobbitFoot
        20 · 1 year ago

        Which is how it should be. The company creating the software takes on the liability of faults with said software.

      • @Rinox@feddit.it
        15 · 1 year ago

        Will it pull a Tesla and switch off the autopilot seconds before an accident?

          • @T156@lemmy.world
            -2 · 1 year ago (edited)

            If memory serves, that’s not an intentional feature, but more a coincidence, since if the driver thinks the cruise control is about to crash the car, they’ll pop the brakes. Touching the brakes disengages the cruise control by design, so you end up with it shutting down before a crash happens.

            • @nucleative@lemmy.world
              6 · 1 year ago

              That makes perfect sense. If the driver looks up, notices that he’s in a dangerous, unfixable situation, and slams the brakes, disconnecting the autopilot (which had been responsible for letting the situation develop), hopefully the automaker can’t simply say “not our fault, the system wasn’t even engaged at the time of the collision”.

      • @Sizzler@slrpnk.net
        12 · 1 year ago

        And this is how they will push everyone into driverless: through insurance costs. Who would insure 1 human driver vs. 100 bots (once the systems have a few billion miles on them)?

        • dream_weasel
          8 · 1 year ago

          And that will probably be safer for everyone, honestly. Better or worse will vary by individual perspective.

          • @Sizzler@slrpnk.net
            3 · 1 year ago (edited)

            It’ll be interesting to see how it pans out, with local city traffic essentially reduced to all taxis, and only the countryside 4x4s and farm vehicles being the last holdout of human control because of hilly terrain. Once the lorries go fully self-controlled (note: modern lorries already have a lot of driver support aids), it’ll only be a matter of time.

            Totally agree that car incidents will go down dramatically; some police forces will see their entire income disappear. So many changes are coming that we can’t even imagine.

            • dream_weasel
              2 · 1 year ago

              Good points. I bet local towns are the biggest holdout just because of dependence on ticket revenue.

              • @Sizzler@slrpnk.net
                2 · 1 year ago

                I included that line thinking of America; it also vastly reduces the chance of police interactions, which gives me more to think about.

              • @Sizzler@slrpnk.net
                1 · 1 year ago

                I did think about that when I included farm vehicles, but I meant support vehicles rather than harvesters.

                I wonder if any lessons have been taken from the farm industry’s automation, which works well when applied to a specific area as opposed to general driving.

                From what I’m aware, it’s very GPS-driven, with the accurate-measuring GPS units costing thousands of pounds, which obviously restricts their use in the consumer market.

        • @nucleative@lemmy.world
          5 · 1 year ago (edited)

          You’re probably right. Another decade or two and human driver controlled cars might be prohibitively expensive to insure for some or even not allowed in certain areas.

          I can imagine an awesome world where that’s a great thing but also imagine a dystopian world like wall-e as well. I guess we’ll know then which one we chose.

      • @Hacksaw@lemmy.ca
        -5 · 1 year ago

        No. I don’t think this is a good solution. Companies will put a price on your life and focus on monetary damage reduction. If you’re about to cause more property damage than your life is worth (to Mercedes) they’ll be incentivized to crash the car and kill you rather than crash into the expensive structure.

        Your car should be your property, and you should be liable for the damage it causes. The car should prioritise your life over monetary damage. If there is some software problem causing the cars to crash, you need to be able to sue Mercedes through a class action lawsuit to recover your losses.

        • femtech
          1 · 1 year ago

          Wrongful death and bodily injury are a lot more expensive.

    • @Hugin@lemmy.world
      21 · 1 year ago

      Berkshire Hathaway owns the car insurance company Geico. In one of his annual letters, Buffett said that autonomous cars are going to be great for humanity and bad for insurance companies.

      “If [self-driving cars] prove successful and reduce accidents dramatically, it will be very good for society and very bad for auto insurers.”

      Actuaries are by definition bad at assessing new risk, but as data get collected they quickly adjust. There are a lot of cars, so if driverless cars become even a few percent of cars on the road, insurers will quickly be able to build good actuarial tables.
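
      For a rough idea of how that adjustment works, here’s a toy credibility-weighting sketch in Python (the constant k and the numbers are made up for illustration, not industry figures): with little exposure data the estimate stays near the prior claim frequency, and as autonomous-mode exposure accumulates it moves toward what’s actually observed.

      def credibility_frequency(observed_claims, exposure_years,
                                prior_frequency, k=5000.0):
          """Blend the observed claim frequency with a prior; more exposure, more weight."""
          observed_frequency = observed_claims / exposure_years
          z = exposure_years / (exposure_years + k)  # credibility weight in [0, 1)
          return z * observed_frequency + (1 - z) * prior_frequency

      # Barely moves off the 5% prior with only 100 exposure-years of data...
      print(credibility_frequency(2, 100, prior_frequency=0.05))
      # ...but moves most of the way toward the observed ~0.6% rate with heavy exposure.
      print(credibility_frequency(300, 50_000, prior_frequency=0.05))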

  • daikiki
    18 · 1 year ago

    According to who? Did the NTSB clear this? Are they even allowed to clear this? If this thing fucks up and kills somebody, will the judge let the driver off the hook 'cuz the manufacturer told them everything’s cool?

    • @Trollception@lemmy.world
      -6 · 1 year ago (edited)

      You do realize humans kill hundreds of other humans a day in cars, right? Is it possible that autonomous vehicles may actually be safer than a human driver?

      • @KredeSeraf@lemmy.world
        23 · 1 year ago

        Sure. But no system is 100% effective, and all of their questions are legit and important to answer. If I got hit by one of these tomorrow, I’d want to know that the processes for fault, compensation, and improvement are all already settled, not something my accident is going to set the precedent for.

        But that being said, I was a licensing examiner for 2 years and quit because they kept making it easier to pass and I was forced to pass so many people who should not be on the road.

        I think this idea is sound, but that doesn’t mean there aren’t things to address around it.

        • @Trollception@lemmy.world
          -5 · 1 year ago

          Honestly I’m sure there will be a lot of unfortunate mistakes until computers and self driving systems can be relied upon. However there needs to be an entry point for manufacturers and this is it. Technology will get better over time, it always has. Eventually self driving autos will be the norm.

          • @KredeSeraf@lemmy.world
            8 · 1 year ago

            That still doesn’t address all the issues surrounding it. I am unsure if you are just young and not aware how these things work or terribly naive. But companies will always cut corners to keep profits. Regulation forces a certain level of quality control (ideally). Just letting them do their thing because “it’ll eventually get better” is a gateway to absurd amounts of damage. Also, not all technology always gets better. Plenty just get abandoned.

            But to circle back: if I get hit by a car tomorrow and all these things you think are unimportant are unanswered, does that mean I might not get legal justice or compensation? If there isn’t clearly codified law, I might not, and you might be callous enough to say you don’t care about me. But what about you? What if you got hit by an unmonitored self-driving car tomorrow and were then told you’d have to go through a long, expensive court battle to determine fault because no one had done it yet? So you’re in and out of a hospital recovering and draining all of your money on bills, both legal and medical, to eventually, hopefully, get compensated for something that wasn’t your fault.

            That is why people here are asking these questions. Few people actually oppose progress. They just need to know that reasonable precautions are taken for predictable failures.

            • @Trollception@lemmy.world
              0 · 1 year ago (edited)

              To be clear, I never said that I didn’t care about an individual’s safety; you inferred that somehow from my post, and quite frankly it’s disrespectful. I simply stated that autonomous vehicles are here to stay and that the technology will improve more with time.

              The legal implications of self-driving cars are still being determined, as this is literally one of the first approved technologies of its kind. Tesla doesn’t count, as it’s not an SAE level 3 autonomous driving vehicle. There are some references in the liability section of the wiki.

              https://en.m.wikipedia.org/wiki/Regulation_of_self-driving_cars

            • @Llewellyn@lemm.ee
              0 · 1 year ago (edited)

              But then it’s good that the manufacturer states the driver isn’t obliged to watch the road, because it shifts responsibility towards the manufacturer and is thus a great incentive to make the technology as safe as possible.

      • @stoly@lemmy.world
        -1 · 1 year ago

        Only on closed courses. The best AI lacks the basic heuristics of a child and you simply can’t account for all possible outcomes.

  • @Ultragigagigantic@lemmy.world
    15 · 1 year ago

    If it can drive a car, why wouldn’t it be able to drive a truck?

    I’m surprised companies don’t just build their own special highway for automated trucking and use people for last mile stuff.

    • @ShepherdPie@midwest.social
      23 · 1 year ago

      Because there’s an extremely narrowly defined set of requirements in order to use it. It’s “approved freeways with clear markings and moderate to heavy traffic under 40 MPH during daytime hours and clear conditions”, meaning it will inch forward for you in bumper-to-bumper traffic provided you’re in an approved area, and that’s it.

      https://www.mbusa.com/en/owners/manuals/drive-pilot
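
      Purely as an illustration, the availability gate for a system like this boils down to a handful of checks. A toy Python sketch (the condition names and the 40 mph ceiling are taken from the summary above; everything else is made up, not Mercedes’s actual logic):

      MAX_SPEED_MPH = 40  # Drive Pilot's stated ceiling

      def drive_pilot_available(speed_mph, on_approved_freeway, daytime,
                                clear_conditions, lead_vehicle_detected):
          """Return True only when every published precondition is met."""
          return (
              on_approved_freeway        # geofenced CA/NV freeways only
              and daytime
              and clear_conditions       # no adverse weather
              and lead_vehicle_detected  # moderate-to-heavy traffic
              and speed_mph < MAX_SPEED_MPH
          )

      print(drive_pilot_available(25, True, True, True, True))   # True
      print(drive_pilot_available(55, True, True, True, True))   # False: too fast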

      • @melpomenesclevage@lemm.ee
        -21 · 1 year ago

        lol, ‘manufacturer takes on responsibility’ so… I’m just fucked if one of these hits me?

        see a mercedes, shoot a mercedes. destroy it in whatever way you can.

        • @KISSmyOSFeddit@lemmy.world
          28 · 1 year ago

          No, you’re guaranteed that the Mercedes that hit you is better insured for paying out your damages than pretty much anyone else on the road that could hit you.

          • @melpomenesclevage@lemm.ee
            -6 · 1 year ago

            lol corporations don’t have responsibility though. that’s the whole point of them. they’re machines for avoiding responsibility.

            • @explodicle@sh.itjust.works
              -2 · 1 year ago

              In this case the responsibility to pay will ultimately fall on everyone, not just on the pedestrian getting hit. Still not good, but you won’t be SOL.

              • @Fedizen@lemmy.world
                -1 · 1 year ago

              If these have lidar (unlike Teslas), then they might be better at detecting obstructions, but I feel like real-world road conditions are not kind to cameras and sensors.

                • @QuaternionsRock@lemmy.world
                  2 · 1 year ago

                  Fixed lidar sensors are not as reliable as they’re made out to be, unfortunately. Dome lidar systems like those found on Waymo vehicles are pretty good, but way more advanced (and expensive) than anything you’d find in consumer vehicles at the moment. The shadows of trees are enough to render basic lidar sensors useless, as they effectively produce an aperiodic square wave of infrared light (from the sun) that is frequently inseparable from the ToF emission signal. Sunsets are also sometimes enough to completely blind lidar sensors.

                  None of this is to say that Tesla’s previous camera-only approach was a good idea, like at all. More data is always a good thing, so long as the system doesn’t rely on the data more than the data’s reliability permits. After all, cameras can be blinded by sunlight too. IMO radar is the best economical complementary sensor to cameras at the moment: despite its comparatively low accuracy, it is very reliable in adverse conditions.
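
                  To make the interference point concrete, here’s a toy Python sketch (illustrative numbers, not from any real sensor): a ToF lidar turns round-trip time into distance, and a return pulse only counts if it stands out from the ambient infrared the receiver also collects.

                  C = 299_792_458.0  # speed of light, m/s

                  def tof_distance(round_trip_s):
                      """Distance implied by a measured round-trip time."""
                      return C * round_trip_s / 2.0

                  def return_detected(signal_photons, ambient_photons, required_snr=3.0):
                      """Toy test: the return pulse must stand out from ambient IR."""
                      noise = max(ambient_photons, 1e-9) ** 0.5  # shot-noise-like term
                      return signal_photons / noise >= required_snr

                  print(tof_distance(200e-9))              # ~30 m for a 200 ns round trip
                  print(return_detected(100.0, 400.0))     # True: modest ambient light
                  print(return_detected(100.0, 40_000.0))  # False: sun-level ambient swamps it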

          • @Tankton@lemm.ee
            -10 · 1 year ago

            The sad part of this is somehow thinking that payment solves any problem. Like, idk what they would pay me, just bring back my dead wife/child/father whatever. You can’t fix everything with money.

            • @QuaternionsRock@lemmy.world
              7 · 1 year ago

              It only works on a small handful of freeways (read: no pedestrians) in California/Nevada, and only under 40 MPH. The odds of a crash within those parameters resulting in a fatality are quite low.

            • @Llewellyn@lemm.ee
              6 · 1 year ago

              Human drivers are far more dangerous on the road, and you should be applauding assisted driving development.

              • @jj4211@lemmy.world
                1 · 1 year ago (edited)

                This presumes the options are only:

                • Human and no autonomous system watching
                • Autonomous system, with no meaningful human attention

                Key word is ‘assisted’ driving. ADAS should roughly be a nice add, so long as human attention is policed. Ultimately, the ADAS systems are better able to react to some situations, but may utterly make some stupid calls in exceptional scenarios.

                Here, the bar of “no human paying attention at all” is one I’m not entirely excited about celebrating. Of course, the conditions are “daytime traffic jam only”, where risk is pretty small: you might have a fender bender, pedestrians are almost certainly not a possibility, and the conditions are supremely monotonous, which is a great area for ADAS but not a great one for bored humans.

  • @elrik@lemmy.world
    8 · 1 year ago

    How is this different from the capabilities of Tesla’s FSD, which is considered level 2? It seems like Mercedes just decided they’ll take on liability to classify an equivalent level 2 system as level 3.

    • @rsuri@lemmy.world
      10 · 1 year ago (edited)

      According to the Mercedes website, the cars have radar and lidar sensors. FSD has radar only (no lidar), but Tesla apparently decided to move away from radar and towards optical only; I’m not sure if it currently has any role in FSD.

      That’s important because FSD relies on optical sensors only to tell not only where an object is, but that it exists. Based on videos I’ve seen of FSD, I suspect that if it hasn’t ingested the data to recognize, say, a plastic bucket, it won’t know that it’s not just part of the road (or at best can recognize that the road looks a little weird). If there’s a radar or lidar sensor though, those directly measure distance and can have 3-D data about the world without the ability to recognize objects. Which means they can say “hey, there’s something there I don’t recognize, time to hit the brakes and alert the driver about what to do next”.
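
      Roughly, the fallback that direct range data enables looks something like this toy Python sketch (the field names and the 2-second threshold are assumptions for illustration, not any manufacturer's logic):

      def minimum_risk_action(range_m, closing_speed_mps, object_classified,
                              min_time_gap_s=2.0):
          """Pick a fallback action from raw range data; classification is optional."""
          if closing_speed_mps <= 0:
              return "continue"  # not closing on anything
          time_to_collision = range_m / closing_speed_mps
          if time_to_collision < min_time_gap_s:
              # Brake and hand back control even without knowing what the object is.
              return "brake_and_alert_driver"
          return "continue" if object_classified else "slow_and_alert_driver"

      print(minimum_risk_action(15.0, 10.0, object_classified=False))  # brake_and_alert_driver
      print(minimum_risk_action(80.0, 10.0, object_classified=False))  # slow_and_alert_driver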

      Of course this still leaves a number of problems, like understanding at a higher level what happened after an accident for example. My guess is there will still be problems.

    • @GoodEye8@lemm.ee
      8 · 1 year ago

      You’ve inadvertently pointed out how Tesla deliberately skirts the law. Teslas are way more capable than what level 2 describes, but they choose to stay at level 2 so they don’t have to take responsibility for their public testing.

    • @Socsa@sh.itjust.works
      3 · 1 year ago

      Yeah it’s pretty much an insurance product. They came up with a set of boundary conditions someone would underwrite for their “stay between the lines” tech.

      • @skyspydude1@lemmy.world
        1 · 1 year ago

        Please tell me how software will be able to detect objects in low/no-light conditions if they, say, have cameras with poor dynamic range and no low-light sensitivity?