Title says it all

  • TheFriar@lemm.ee · ↑7 · 3 hours ago

    If I were you I’d send this to some media outlets. Tank some AI stock and create some more negative news around it.

  • jsomae@lemmy.ml · ↑31 · 13 hours ago

    There’s plausible denia… nah, I got nothin’. That’s messed up. Even for the most mundane, non-gross use case imaginable, why the fuck would anybody need a creepy digital facsimile of a child?

    • ckmnstr@lemmy.world (OP) · ↑21 · 12 hours ago

      I mean, maaaybe if you wanted children and couldn’t have them. But why would it need to be “beautiful and up for anything”?

      • jsomae@lemmy.ml · ↑2 · 3 hours ago

        “beautiful and up for anything” is incredibly suggestive phrasing. It’s an exercise in mental creativity to make it sound not creepy. But I can imagine a pleasant grandma (always the peak of moral virtue in any thought experiment) saying this about her granddaughter. I don’t mean to say I have heard this, only that I can imagine it. Barely.

  • viciouslyinclined@lemmy.world · ↑15 · 13 hours ago

    And the bot has 882.9k chats.

    I’m not surprised, and I don’t think you or anyone else is either. But that doesn’t make this any less disturbing.

    I’m sure the app devs aren’t interested in cutting off a huge chunk of their loyal users by doing the right thing and getting rid of those types of bots.

    Yes, it’s messed up. In my experience, it’s difficult to report chatbots and see any real action taken as a result.

    • Shin@lemmy.world · ↑7 ↓1 · 11 hours ago

      Ehhh, nah. As someone who used character.ai before, I can say plenty of horrible bots do get cleared, and the bots have been impossible to have sex with unless you get really creative. The most horrendous ones get removed fairly often and are consistently reposted. I’m not here to shield a big company or anything, but the “no sex” thing was a huge deal in the community, and users always fought with the devs about it.

      They’re probably trying to hide behind the veil of more normal bots now, but I struggle to imagine how they’d get one to do sexual acts when even some lightly violent RPs I tried got censored. It’s pretty difficult, and the filtering only got stricter over time. Idk though, I stopped using it a while ago.

    • viciouslyinclined@lemmy.world · ↑5 ↓1 · 13 hours ago

      They definitely knew who they were targeting when they made this. I only hope that, if those predators simply must text with a child, they keep talking to an AI bot rather than a real child.

    • Murvel@lemm.ee · ↑1 · 27 minutes ago

      It doesn’t work that way, and we’ve known that for a long time. You cannot counteract a desire by fueling it; that only makes it worse.

    • ckmnstr@lemmy.world (OP) · ↑7 ↓4 · 12 hours ago

      I agree in principle, but look at the number of interactions. I think there’s a fine line between creating safe spaces for urges and downright promoting and normalizing criminal activity. I don’t think this should be a) this accessible and b) happening without psychiatric supervision. But maybe I’m being too judgemental.

    • nullroot@lemmy.world · ↑3 · 13 hours ago

      Hundred percent. It feels pretty fucking thought-crimey to vilify the people who use these services.

  • ZDL@lazysoci.al · ↑15 · 18 hours ago

    Yes it’s what you think it is. I don’t think, however, that there is anywhere to report it that will care enough to do something about it.

  • Ceedoestrees@lemmy.world · ↑35 ↓2 · 21 hours ago

    Yep. I dick around on a similar platform because a friend built it.

    The amount of shit I’ve reported is insane. Pedos just keep coming back with new accounts. Even with warnings and banned words, they find a way.

    • snooggums@lemmy.world · ↑33 · 22 hours ago

      Just a friendly childlike free spirit ready to talk about girl stuff!

      /s for real though, it is totally the evil thing

  • Lyra_Lycan@lemmy.blahaj.zone · ↑15 ↓1 · 21 hours ago

    I’ve gotten a couple of ads for an AI chat app on Android. I can’t remember the name, but it has an on-screen disclaimer that reads something like “All characters shown are in their grown-up form”, implying that there are teen or child forms you can communicate with.

    • dil@lemmy.zip · ↑5 · 18 hours ago

      Nah, that more likely implies that the child form was rejected by censors, so it’s an adult version now.

    • ckmnstr@lemmy.world (OP) · ↑7 · 21 hours ago

      I saw something similar! Reported it to Google Ads and of course they “couldn’t find any ToS violations”.

  • you_are_dust@lemmy.world · ↑14 ↓2 · 21 hours ago

    I’ve messed around with some of these apps out of curiosity about where the technology is at. There’s typically a report function in the app. You can probably report that particular bot from within the app to try to get it deleted. Reporting the app itself probably won’t do much.