• Th4tGuyII · 153 points · 2 years ago

    “We’re trying to have those conversations with Elon to establish what the sensors would need to do,” Baglino added. “And they were really difficult conversations, because he kept coming back to the fact that people have just two eyes and they can drive the car.”

    Yes, and people crash cars all the time Elon…

If you want an autopilot with the failure rate of a human, then you might only need two eyes. If you want an autopilot with a near-zero failure rate, you need much better telemetry data.

    • @GamingChairModel@lemmy.world · 82 points · 2 years ago

      Our heads are loaded with sensory capabilities beyond just the two eyes. Our proprioception, balance, and mental mapping let us move our heads around, take in visual data from almost any direction at a glance, and then internally model that three-dimensional space as the universe around us. Meanwhile, our ears can process direction finding for sounds and synthesize that information with our visual processing.

      Meanwhile, the tactile feedback of the steering wheel, vibration of the actual car (felt by the body and heard by the ears), give us plenty of sensory information for understanding our speed, acceleration, and the mechanical condition of the car. The squeal of tires, the screech of brakes, and the indicators on our dash are all part of the information we use to understand how we’re driving.

      Much of it is trained through experience. But the fact is, I can tell when I have a flat tire or when I’m hydroplaning even if I can’t see the tires. I can feel inclines or declines that affect my speed or lateral movement even when there aren’t easy visual indicators, like at night.

      • @ikidd@lemmy.world · 20 points · 2 years ago (edited)

        To be fair, 98% of drivers seem to barely be able to hold a straight line and can’t see past the end of their hood, let alone do shoulder checks and be able to hear anything over the stereo turned up to 11. So I’d take my chances with the half-baked autopilot that can at least discern what a red light looks like.

        I followed one gentleman for about 10 blocks before he stopped and I could tell him that he was missing the entire tire on the rear left of his car. There were a lot of sparks and metal screeching. Not a clue.

      • @xavier666@lemm.ee · 16 points · 2 years ago

        Just adding to your point: when F1 drivers were asked to play a racing sim, they couldn't perform as they do in real life because, as they explained, no matter how good the sim is, it doesn't provide the feedback of a real car.

    • ringwraithfish · 66 points · 2 years ago

      And people turn their heads, move their eyes across their windshield, change focus to look farther ahead or closer, check their mirrors, listen for sounds (emergency vehicles, car honks, etc.), and can even do things like look through gaps and other cars' windows to adjust to partial obstructions.

      The fact that he doesn't realize you need a multitude of sensors to do even a little bit of what a human can do tells you all you need to know about Elon's so-called brilliance.

      • HarkMahlberg · 24 points · 2 years ago

        Even the social aspect of driving eludes him. You and another driver arrive at a 4-way stop at the same time, crossing paths. They wave you on to be polite. You wave back and go first. How and when does he plan to handle that kind of behavior?

    • 100_kg_90_de_belin · 28 points · 2 years ago

      Is Elon really this dense? People have two eyes and millions of years of evolution behind them.

      We tamed massive animals to use them as means of transportation, ffs.

    • @nova_ad_vitum@lemmy.ca · 19 points · 2 years ago (edited)

      Anybody else remember the now-removed Tesla blog post from 2016 arguing that FSD will require LIDAR? Idk why he (Elon) is so stubborn about it. It can see through fog and darkness. Add that data to their model and they would probably already be near deployment readiness for real FSD.

      • mosiacmango · 16 points · 2 years ago (edited)

        Automotive lidar costs around $500-1000 to add to a car.

        That’s it. That’s the whole reason.

    • @YellowBendyBoy@lemmy.world · 18 points · 2 years ago

      Well, we perform pretty well with just two eyes, but the difference is that we are a highly skilled general pattern-recognition machine that you just can't recreate in software yet. A few lines diverging with a bigger and a smaller circle under them? Guess that's a truck heading that way. Oh, the lines are changing angles? Holy shit, the truck is coming into this lane!!

    • @Mouselemming@sh.itjust.works · 14 points · 2 years ago

      A person approaching on foot or by bicycle from my right at the coincidentally perfect speed can accidentally stay within both my eyes' blind spot (behind the support pillar) as I come to a stop at a 4-way. I've learned I need to crane around a bit before proceeding, or their frightened and angry face will suddenly lurch into view too close for comfort. The robot must be designed to have zero blind spots, because humans are ridiculously good at hiding in them. Especially the little humans.