• Vanilla_PuddinFudge@infosec.pubOP
    13 hours ago

    I’d rather not break down a human being to the same level of social benefit as an appliance.

    Perception is one thing, but the idea that these things can manipulate and misguide people who are fully invested in whatever process they have irks me.

    I’ve been on nihilism hill. It sucks. I think people and living things garner more genuine stimulation than a bowl full of matter, or however you want to boil us down.

    Oh, people can be bad, too. There’s no doubting that, but people have identifiable motives. What does an AI “want”?

    Whatever it’s told to.

    • Uriel238 [all pronouns]@lemmy.blahaj.zone
      6 hours ago

      You’re not alone in your sentiment. The whole thought experiment of p-zombies and the notion of qualia comes from a desire to assume human beings should be given a special position, but in that case, a sentient being is whoever we decide it is, the way Sophia the Robot is a citizen of Saudi Arabia (even though she’s simpler than GPT-2, unless they’ve upgraded her and I missed the news).

      But it will raise a question when we do come across a non-human intelligence. It was a question raised in both Blade Runner movies: what happens when we create synthetic intelligence that is as bright as a human, or even brighter? If we’re still capitalist, assuredly the companies that made them will not be eager to let them have rights.

      Obviously, machines and life forms as sophisticated as we are are not merely the sum of their parts, but the same can be said about most other macro-sized life on this planet, and we’re glad to assert they are not sentient the way we are.

      What aggravates me is not that we’re just thinking meat, but that with all our brilliance we’re approaching multiple imminent great filters and seem unable to muster the collective will to try to navigate them. Even when we recognize that our behavior is going to end us, we don’t organize to change it.

      • Vanilla_PuddinFudge@infosec.pubOP
        2 hours ago

        It runs deeper than that. You can walk back the whys pretty easily to identify anyone’s motivation, whether it be personal interest, bias, money, glory, racism, misandry, greed, insecurity, etc.

        No one is buying rims for their car for no reason. No one is buying a firearm for no reason. No one donates to a food bank, or runs for president, for no reason. That sort of reasoning.

        AI is backed by the motive of a for-profit company, and unless you’re taking it with that grain of salt, you’re likely allowing yourself to be manipulated.