• CandleTiger@programming.dev
    15 hours ago

    How does the construction app know what needs to be constructed and how?

    How does the waiter app know which table ordered what, needs attention, etc?

    How does the IT app know on which port every device is connected?

    These things are all really hard to know. Having glasses that display the knowledge could be really nice, but for all these magic future apps, having a display is only part of the need.

    • Valmond@lemmy.world
      10 hours ago

      If you have all that info you could probably remove the human from the equation and automate it.

      As for the NPC-Waiter 🤢

    • Lvdwsn@lemmy.world
      14 hours ago

      As somebody who wanted Google Glass back in the day and thinks AR glasses would be really, really cool, this is ultimately where I end up on it, and with a lot of tech in general: the primary usefulness of any of this shit is accurate and relevant information, and that’s the part of the equation these big companies are definitely NOT in the business of producing. In fact, they seem to have discovered a while back that inaccurate and irrelevant information blasted in your face is the real money maker. And now with AI/ML producing so much content and filling in the gaps, I just can’t imagine it’s going to get any better.

      That being said, I think the tech is so cool. I’d love to travel to a new city and get directions to different sightseeing spots and real-time star ratings above all the restaurants instead of anxiously glancing at my phone the entire time. If we ever get to that level of goodness I’m in, but I have a lot of doubts it’ll ever be more than another attention-seeking thing attached to your body.