cross-posted from: https://fedia.io/m/[email protected]/t/2201156

In case you were worried about the roads being too safe, you can rest easy knowing that Teslas will be rolling out with unsupervised “Full Self-Driving” in a couple of days.

It doesn’t seem to be going great, even in supervised mode. This one couldn’t safely drive down a simple, perfectly straight road in broad daylight :( It veered off the road for no good reason. Glad nobody got badly hurt.

We analyze the onboard camera footage, and try to figure out what went wrong. Turns out, a lot. We also talk through how camera-only autonomous cars work, Tesla’s upcoming autonomous taxi rollout, and how AI hallucinations figure into everything.

  • Anomalocaris@lemm.ee · +13 / -1 · 1 day ago

    I want to go back to the early 2010s, when all this tech was so cool and promising.

    I miss that optimism.

  • princessaine@lemm.ee · +12 / -2 · 1 day ago

    No one seems to be questioning why exactly we need self-driving cars in the first place. Cars are already an absolutely horrendous way to handle personal transportation; why add self-driving on top of that?

    • KelvarIW@lemmy.blahaj.zone · +6 · 20 hours ago

      (Adam Something voice) Imagine multiple self-driving cars, moving together in a conga line, running every hour of every day, at consistent intervals. To allow for consumer freedom, these cars would operate on a “hop-on, hop-off” model, where passengers can get on or off at any one of the pre-designated stops. For simplicity’s sake, and to better manage the strain this would put on the road, these cars would have exclusive “self-driving car chain”-only roads. Since these cars would only ever require basic movement, their inner workings could be streamlined, such as by chaining the cars together and using one “engine car” in front, cutting costs and reducing production time by removing the need for each car to have a steering mechanism and combustion engine. And if this idea sounds familiar…

  • boydster@sh.itjust.works · +27 / -1 · 1 day ago

    There’s a lot wrong with Tesla’s implementation here, so I’m going to zoom in on one issue in particular. It is outright negligent to decide against using LIDAR on something like a car that you want to be autonomous. Maybe if this car had sensors to map out 3D space, that would help it move more successfully through 3D space?

    • Ulrich@feddit.org · +4 / -17 · 1 day ago

      You and I are (mostly) able to safely navigate a vehicle with 3D stereoscopic vision. It’s not a sensor issue, it’s a computation issue.
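
      For context on what stereo vision actually buys computationally, here is a minimal sketch of pinhole stereo depth, using made-up focal-length and baseline numbers (not Tesla’s camera specs). The point is that depth error grows with the square of distance, so the quality of the “computation” (pixel matching) matters enormously at highway ranges.

      ```python
      # Minimal sketch of pinhole stereo depth. Illustrative numbers only,
      # NOT any real vehicle's camera parameters.

      FOCAL_LENGTH_PX = 1000.0  # assumed focal length, in pixels
      BASELINE_M = 0.3          # assumed spacing between the two cameras, in meters

      def depth_from_disparity(disparity_px: float) -> float:
          """Classic stereo relation: Z = f * B / d."""
          return FOCAL_LENGTH_PX * BASELINE_M / disparity_px

      def depth_error_m(depth_m: float, matching_error_px: float = 1.0) -> float:
          """Depth uncertainty grows with the square of distance:
          dZ ~= Z^2 / (f * B) * dd, so a 1-pixel matching error costs far
          more accuracy at 100 m than at 10 m."""
          return depth_m ** 2 / (FOCAL_LENGTH_PX * BASELINE_M) * matching_error_px

      for z in (10, 50, 100):
          print(f"at {z:>3} m, a 1-px matching error is ~{depth_error_m(z):.1f} m of depth error")
      ```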

      • brygphilomena@lemmy.dbzer0.com · +3 · 8 hours ago

        If I eventually end up in a fully self-driving vehicle, I want it to be better than what you and I can do with our eyes.

        Is it possible to drive with just stereoscopic vision? Yeah. But why is Tesla against BEING BETTER than humans?

      • Computation NOW cannot replicate what humans do with our rather limited senses.

        “Self-driving” cars are being made NOW.

        That means it’s the NOW computation we worry about, not some hypothetical future computational capabilities. And the NOW computation cannot do the job safely with just vision.

      • TwanHE@lemmy.world · +6 · 22 hours ago

        In theory maybe, but our brains are basically a supercomputer on steroids when it comes to interpreting and improving the “video feed” our eyes give us.

        Could it be done with just cameras? Probably, some time in the future. But why the fuck wouldn’t you use a depth sensor now, and even in the future as redundancy?
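
        As a rough illustration of the redundancy point, here is a hypothetical cross-check between a camera-derived range and a LIDAR return; the function name and threshold are invented for the sketch and don’t describe any vendor’s actual stack.

        ```python
        # Hypothetical sketch of a depth sensor used as redundancy: two
        # independent range estimates for the same obstacle, cross-checked
        # before the planner acts. The tolerance is an arbitrary example.

        DISAGREEMENT_TOLERANCE_M = 2.0

        def checked_range_m(camera_range_m: float, lidar_range_m: float) -> float:
            """If the sensors roughly agree, average them; if they disagree,
            trust the closer (more conservative) reading so the planner
            brakes for an obstacle one sensor may have missed entirely."""
            if abs(camera_range_m - lidar_range_m) <= DISAGREEMENT_TOLERANCE_M:
                return (camera_range_m + lidar_range_m) / 2.0
            return min(camera_range_m, lidar_range_m)

        # Camera "sees" open road out to 120 m, LIDAR reports a return at 18 m:
        print(checked_range_m(camera_range_m=120.0, lidar_range_m=18.0))  # 18.0
        ```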

        • Ulrich@feddit.org · +1 · 17 hours ago

          our brains are basically a supercomputer on steroids

          Yeah I mean that’s what I said.

      • this_1_is_mine@lemmy.world · +5 · 22 hours ago

        I can also identify a mirror. Tesla smashed that head-on. If you can’t effectively understand the image then it’s not enough information.

        • Ulrich@feddit.org · +1 / -2 · 22 hours ago

          I can also identify a mirror.

          My point exactly.

          If you can’t effectively understand the image then it’s not enough information.

          No, it’s just not able to process the information it has.

      • arrakark@lemmy.ca · +2 · 23 hours ago

        I get what you are saying. But adding a new form of data input into the system would probably improve performance, not degrade it. I don’t think it makes sense not to add LIDAR to Teslas.

        All of this feels like Elon was asked to justify not putting a rather expensive (at the time) set of sensors into the Teslas, and he just doubled down and said they would compensate with software.
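
        A toy example of why an extra, independent input tends to improve the estimate rather than degrade it: fusing two noisy depth measurements with inverse-variance weighting (the one-dimensional Kalman update) always gives a lower variance than either sensor alone. The noise figures below are invented for illustration.

        ```python
        # Toy fusion of two noisy depth measurements (1-D Kalman-style update).
        # Noise figures are invented for illustration.

        def fuse(z_cam_m: float, var_cam: float, z_lidar_m: float, var_lidar: float):
            """Inverse-variance weighting: the fused variance is always smaller
            than either input variance, so an extra independent sensor helps."""
            w_cam = 1.0 / var_cam
            w_lidar = 1.0 / var_lidar
            z_fused = (w_cam * z_cam_m + w_lidar * z_lidar_m) / (w_cam + w_lidar)
            var_fused = 1.0 / (w_cam + w_lidar)
            return z_fused, var_fused

        # Camera estimate: 52 m with ~5 m std dev; LIDAR: 49 m with ~0.5 m std dev.
        z, var = fuse(52.0, 5.0 ** 2, 49.0, 0.5 ** 2)
        print(f"fused: {z:.1f} m, std dev {var ** 0.5:.2f} m")  # ~49.0 m, ~0.50 m
        ```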

  • MushuChupacabra@lemmy.world · +27 / -1 · 1 day ago

    The obvious reason is that the product is not capable of driving itself.

    That system has no awareness that it’s a motor vehicle.

    It has the capacity to execute instructions.

    It has no capacity to judge if the instructions are reasonable.

    It does not know, understand, or care if it gets you to your destination or if it kills you, or itself.

    No cruelty, just executing instructions.

    • markovs_gun@lemmy.world · +1 · 10 hours ago

      Look, I think Tesla’s self-driving cars are bad, but this is a crazy viewpoint. If you knew how many systems several orders of magnitude more dangerous than cars are run almost entirely by automated systems operating on extremely simple instructions (nowhere near having “awareness” of, or any ability to judge, anything), you’d be shitting yourself, because by that logic consciousness is required to make good decisions. Chemical plants have been mostly automated since the 90s, running on computers way simpler than anything in a Tesla, and they have far higher potential for disaster if something goes wrong.

      Self-driving cars have a lot of problems right now, but it’s absolutely insane to say they inherently can’t work because there’s something special about consciousness that prohibits them from working without it. That’s like saying you can’t drive a car without having a soul or some other bullshit like that.

      • brygphilomena@lemmy.dbzer0.com · +1 · 8 hours ago

        I mean, look at airplanes and their autopilot, and especially their auto-landing systems.

        I think autonomous driving is limited by the quality and maintenance of rural roads, dirt and gravel roads, and edge cases like going through drive-thrus. Or really doing anything that isn’t “drive on a road from A to B.” We use cars and trucks for all sorts of things in all sorts of places that aren’t “roads.”

  • HakFoo@lemmy.sdf.org · +3 · 1 day ago

    Read the headline as “flips off car” and was pleased to see they hit a new milestone in mimicking human drivers.

  • Ulrich@feddit.org · +2 / -4 · 1 day ago

    Tesla Full Self-Driving Driver Veers Off Road, Hits Tree, and Flips Car for No Obvious Reason

    FTFY