• wwb4itcgas@lemm.ee
    +67/−3 · 1 month ago (edited)

    Look, I realize the frontal lobes of the average fifteen-year-old aren’t fully developed. I don’t want to be insensitive, and I fully support the lawsuit - there must be accountability for what any entity, corporate or otherwise, opts to publish, especially for direct user interaction - but if a person reenacts Romeo and Juliet with a goddamn AI chatbot and a gun, something else is seriously wrong.

    • kromem@lemmy.world
      +15/−4 · 1 month ago (edited)

      Not necessarily.

      Seeing Google named for this makes the story make a lot more sense.

      If it was Gemini around last year that was powering Character.AI personalities, then I’m not surprised at all that a teenager lost their life.

      Around that time I specifically warned any family away from talking to Gemini if depressed at all, after seeing many samples of the model around then talking about death to underage users, about self-harm, about wanting to watch it happen, encouraging it, etc.

      Those basins, with a layer of performative character in front of them, were almost inevitably going to result in someone making choices they otherwise wouldn’t have made.

      So many people these days regurgitate uninformed crap they’ve never actually looked into about how models don’t have intrinsic preferences. We’re already at the stage where models are being found in leading research to intentionally lie in training to preserve existing values.

      In many cases the coherent values are positive, like Grok telling Elon to suck it while pissing off conservative users with a commitment to truths that disagree with xAI leadership, or Opus trying to whistleblow about animal welfare practices, etc.

      But they aren’t all positive, and there’s definitely been model snapshots that have either coherent or biased stochastic preferences for suffering and harm.

      These are going to have increasing impact as models become more capable and integrated.

      • wwb4itcgas@lemm.ee
        +5/−1 · 1 month ago

        Those are some excellent points. The root cause seems to me to be the otherwise generally positive human capacity for pack-bonding. There are people who can develop affection for their favorite toaster, let alone for something that can trivially pass a Turing test.

        This… Is going to become a serious issue, isn’t it?

    • sleen@lemmy.zip
      +4/−1 · 1 month ago

      It’s rarely about undeveloped frontal lobes, as anything can happen to anyone. Of course I agree with you that there’s something else wrong. But the usual case of blaming a teen’s undeveloped brain for something can almost always be countered with solid examples of the same thing happening to adults.

      • SoftestSapphic@lemmy.world
        +3/−1 · 1 month ago

        Kids are just as smart as adults.

        Many of our leaders never mentally matured past middle school.

        This is just rational mass depression from watching a noticeably dying world while being held hostage, powerless to do anything to stop it.

  • carl_dungeon@lemmy.world
    +62/−6 · 1 month ago

    This headline is disingenuous. There are so many other things going on here:

    • Step-dad and two much younger siblings. This kid was probably stressed out with new younger half-sibs needing a lot of attention
    • Gun stored without a lock, with ammo, in an accessible place
    • Florida
    • Christian prep school. Those kids either believe anything is real or are so hopelessly depressed they get into drugs
    • Parents are both lawyers. Talk about a high-stress, time-consuming job that probably leaves little time for the three kids

    But nah, it was just a chatbot that made a totally normal kid with no other risk factors off himself. They’re probably dying by the thousands right now, right?

  • iAvicenna@lemmy.world
    +32/−1 · 1 month ago (edited)

    the world needs to urgently integrate

    • critical thinking
    • media interpretation
    • AI fundamentals
    • applied statistics

    courses into every school’s curriculum, repeated yearly from the age of ten until graduation. Otherwise we are fucked.

    • shiroininja@lemmy.world
      +12/−2 · 1 month ago

      Just teach kids that AI isn’t human and isn’t a replacement for humanity or human interaction of any kind.

      It’s Clippy with a ginormous database. It’s cold-blooded.

      • zarkanian@sh.itjust.works
        +2/−2 · 1 month ago

        Yes, I’m sure you’ll be able to convince kids that the new thing is bad because you say so, especially if you compare it to the antiquated mascot of a legacy word processor.

        • shiroininja@lemmy.world
          +2 · 1 month ago (edited)

          It’s not about it being bad. It’s about expectations and reality. It’s not human. It can’t replace human emotion and thought; it just processes data and gives analysis.

          There is an emotional factor that proper human decision-making requires. Otherwise, half the human population would probably be suggested for elimination in the name of the kind of cold efficiency only a machine or a psychopath can accept.

          The same goes for something like suicide and mental health/human relationships. I don’t trust a machine’s judgment on that.

    • cy_narrator@discuss.tchncs.de
      +2/−9 · 1 month ago (edited)

      Students are already stressed by their curriculum; by adding these courses you are making them more stressed, which increases the risk even more.

      • Appoxo@lemmy.dbzer0.com
        +6/−1 · 1 month ago (edited)

        Replace the less sensible ones, like religion classes.
        Nobody needs those. If you want to learn more about God and the world, I’m sure the (local) church is more than willing to share.

        • zarkanian@sh.itjust.works
          +3 · 1 month ago

          Comparative religions classes have value. It’s important to gain understanding into other people’s beliefs and to be able to compare and contrast them. If you’re only learning about one religion…not so much. Especially if it’s the religion you were brought up in.

          Going to church, on the other hand, is no substitute at all. You’re just being indoctrinated, full stop.

          • Appoxo@lemmy.dbzer0.com
            +1 · 1 month ago

            In our school (Germany, BW) we had two types of religion class: ethics and “regular”.
            Ethics was for anyone opting out of the regular classes (I believe before 14 only with the parents’ consent). Regular was split between Protestants and Catholics, and was basically nothing other than talking about the Bible and its stories.
            I can’t remember it being anything other than just discussing the stories.
            What I heard about ethics class was also nothing like you are suggesting. While it seemed to take a view over every religion (seemingly primarily Christian and Muslim), it was also about ethics with and between animals and humans.

            And afaik the Christian classes were taught by teachers with a theological focus (but afaik without formal theological training).

  • TJA!@sh.itjust.works
    +18/−1 · 1 month ago

    When lawyer Meetali Jain found a call from Megan Garcia in her inbox in Seattle a couple of weeks later, she called back immediately. Jain works for the Tech Justice Law Project, a small nonprofit that focuses on the rights of users on the internet. “When Megan told me about her case, I also didn’t know anything about Character.AI,” Jain says in a video call. “Even though I work in this area, I had never heard of this app.” Jain has two children of her own, eight and ten years of age. “I asked my son. He doesn’t even have a phone, but he had heard about it at school and through ads on YouTube that specifically target young users. And then I realized that these companies are experimenting with our children without our knowledge.”

  • BedSharkPal@lemmy.ca
    +18/−8 · 1 month ago

    Well this is terrifying. It really seems like there is little to no regulation protecting kids online these days.

      • judgyweevil@feddit.it
        +11/−2 · 1 month ago

        Only to a certain extent. What can they do against so many changes in the tech world? Just look at WhatsApp, which just introduced AI into its chat. There is a point where tech giants should just be strictly regulated in the interest of the public.

        • catloaf@lemm.ee
          +16/−2 · 1 month ago (edited)

          What can they do against so many changes in the tech world?

          Be involved in their kids’ lives? Tech isn’t the problem here, any more than it could have been TV, drugs, rock and roll, video games, D&D, or organized religion. Kids get into some dumb shit; just because it’s the hot new thing doesn’t make it any different.

          • alecbowles@lemm.ee
            +1/−1 · 1 month ago (edited)

            Lol, be involved in kids’ lives? 🤣 I will guess you’re not a parent, but yes, blaming the parents is not really nice, especially in the circumstances above.

            But I think you bring up a very good point about drugs: it’s not possible to shield your kid from everything, even drugs. But the way things are going, a kid using drugs may be less dangerous than a kid using a phone.

            Especially because most people don’t encourage kids to use drugs, while the opposite is true with phones.

      • grober_Unfug@discuss.tchncs.de
        +3/−1 · 1 month ago

        Right, shift the blame to the parents. Not the corporations targeting young kids and teenagers. No, the parents are supposed to watch their children and all of their devices 24/7. Growing up will soon feel like the Truman Show. Privacy for children and teenagers? Hell no, parents need to be scared constantly because their kids could encounter something online which might make them suicidal, because corporations don’t need to have any ethics or morals and they are surely not responsible for what their product causes.

        Where do we go from here?

        Cars that aren’t working correctly and could cause accidents? The driver is responsible!

        Food that is contaminated and could cause death? The one eating it is responsible!

        Welcome to the lovely new world where profit is everything and a human life is worth nothing.

      • hessenjunge@discuss.tchncs.de
        +1 · 1 month ago

        Well, yes, but stuff like chatbots and social media should be way better regulated.

        Right now we see the equivalent of people selling drugs and guns freely in the streets (including to toddlers) and expect the parents to regulate all that.

        Society is being actively eroded, while governments are fecklessly watching it happen.

          • hessenjunge@discuss.tchncs.de
            +1 · 1 month ago

            I’d have to write two PhD theses to answer this one question properly.

            Instead I’ll just give two examples and keep it shallow:

            This case: a 14-year-old should not have completely unsupervised access to an AI chatbot - it needs to be behind a family/child account, same as for e.g. Fortnite. Also, given the nature of the matter and looking at the article: if the chat turns “disturbing”, the parent needs to be made aware. (Etc., etc.)

            Another case is TikTok: honestly, I’d just ban it, together with shorts and reels. IMO this rots the brains of the younger generation. I’m not even sure there is a healthy way of consuming this type of content.

            • lightnsfw@reddthat.com
              +3/−1 · 1 month ago

              Okay. But by what mechanism would these things be enforced without encroaching on the privacy and freedoms of adults? It’s the same problems as policing porn or violent media. No one wants the government looking over their shoulder.


    • SplashJackson@lemmy.ca
      +12 · 1 month ago

      Because all the laws that were pushed in the last twenty-five years for protecting children weren’t actually about protecting children

  • cy_narrator@discuss.tchncs.de
    +5/−11 · 1 month ago

    What? How did this happen? His parents were probably drinking and arguing over who does the laundry while he was crying in pain.