• kalkulat@lemmy.world
    link
    fedilink
    English
    arrow-up
    2
    ·
    edit-2
    16 minutes ago

    Lots of wannabe authoritarians out there in educationland.

    All those decades when the schools just -couldn’t afford- more (well-educated) teachers and smaller class sizes. Lots of cheap ways to look good.

    And then along came tech, and lo and behold, IT was going to be the savior. Let’s buy into that! We may not be able to teach them to read, write, or think, but they can learn to kneel!

  • wuffah@lemmy.world
    link
    fedilink
    English
    arrow-up
    5
    ·
    1 hour ago

    What a great way to prepare students for our AI-enabled social media and digital surveillance society. Take note, kids: trust no one!

  • A Wild Mimic appears!@lemmy.dbzer0.com
    link
    fedilink
    English
    arrow-up
    63
    ·
    4 hours ago

    Holy shit, the amount of surveillance the teens are under is ungodly, and people blame the chatbot? And there wasn’t even a human kind enough to speak with the girl before calling the fucking cops? I see a lot of blame to place here, but it’s not the chatbot that’s to blame.

    • The kids for bullying her for her tan
    • The school boards implementing the surveillance
    • The parents who allowed such surveillance in the first place
    • The person screening what was flagged for not sending the school counselor to talk with the kid
    • The person calling the cops
    • The cops for arresting an 8th-grader and DOING A STRIP SEARCH AND KEEPING HER OVERNIGHT WTF instead of handing her over to her parents

    Every one of them failed a 13-year-old girl. All of them should be ashamed.

  • Bennyboybumberchums@lemmy.world
    link
    fedilink
    English
    arrow-up
    12
    arrow-down
    1
    ·
    3 hours ago

    It’s not just schools. It’s everywhere. I was on Reddit just last week, talking about when I was 15 and fancying one of my teachers. I got banned for “soliciting sex from a minor”… And what’s worse, when I appealed, they upheld it. Some human actually read a comment in which I spoke about when I was 15, and took that to mean I was asking kids if they wanted to see some puppies or something. The insane online world of the far left and right has fucked us all.

  • Ontimp@feddit.org
    link
    fedilink
    English
    arrow-up
    49
    ·
    5 hours ago

    This is exactly what is going to happen if the EU actually enforces its fucking chat control, but for an entire continent. Fuck this shit. Privacy is a human right.

  • cdf12345@lemmy.zip
    link
    fedilink
    English
    arrow-up
    5
    ·
    3 hours ago

    “Sometimes you have to look at the trade for the greater good,” said Board of Education member Anne Costello in a July 2024 board meeting.

    No no no no.

  • AlphaOmega@lemmy.world
    link
    fedilink
    English
    arrow-up
    5
    ·
    3 hours ago

    I don’t understand why she was arrested; her “threat” was obviously a joke. I guess jokes are illegal now.

  • SugarCatDestroyer@lemmy.world
    link
    fedilink
    English
    arrow-up
    10
    ·
    6 hours ago

    It seems that Big Brother is watching you… But now it’s already a reality. Oh, and what will happen if someone commits a thoughtcrime?

  • 𞋴𝛂𝛋𝛆@lemmy.world
    link
    fedilink
    English
    arrow-up
    27
    arrow-down
    6
    ·
    edit-2
    8 hours ago

    It is not the tool, but the lazy, stupid person that created the implementation. The same stupidity is true of people that run word filtering in conventional code. AI is just an extra set of eyes. It is not absolute. Giving it any kind of unchecked authority is insane. The administrators that implemented this should be the ones everyone is upset at.

    The insane rhetoric around AI is a political and commercial campaign effort by Altman and proprietary AI companies looking to become a monopoly. It is a Kremlin-scale misinformation campaign that has been extremely successful at roping in the dopes. Don’t be a dope.

    This situation with AI tools is exactly the same as with every past scapegoated tool. I can create undetectable deepfakes in GIMP or Photoshop. If I do so with the intent to harm, or out of grossly irresponsible stupidity, that is my fault and not the tool’s. Accessibility of the tool is irrelevant. Those dumb enough to blame the tool are the convenient idiot pawns of the worst humans alive right now. Blame the idiots in leadership positions who use the tools with no morals or ethics, and don’t listen to these same people’s spurious dichotomy meant to create a monopoly. They prey on conservative ignorance rooted in tribalism and dogma, which naturally rejects all unfamiliar new things in life. This is evolutionary behavior and a required mechanism for survival in the natural world. Some will always scatter around the spectrum of possibilities, but the center majority is stupid and easily influenced in ways that enable tyrannical hegemony.

    AI is not some panacea. It is a new, useful tool. Absent-minded stupidity is leading to the same kind of dystopian indifference that led to the “free internet”, which destroyed democracy and is the direct cause of most political and social issues in the present world, having normalized digital slavery: ownership of a part of your person for sale, exploitation, and manipulation without your knowledge or consent.

    I only say this because I care about you, digital neighbor. I know it is useless to argue against dogma, but this is the fulcrum of a dark dystopian future that populist dogma is welcoming with open arms of ignorance, just like those who said the digital world was a meaningless novelty 30 years ago.

    • SugarCatDestroyer@lemmy.world
      link
      fedilink
      English
      arrow-up
      4
      ·
      6 hours ago

      In such a world, hoping for a different outcome would be just a dream. You know, people always look for the easy way out, and in the end, yes, we will live under digital surveillance, like animals in a zoo. The question is how to endure this and not break down, especially in the event of collapse and poverty. It’s better to expect the worst and be prepared than to look for a way out, try to rebel, and then get trapped.

      • 𞋴𝛂𝛋𝛆@lemmy.world
        link
        fedilink
        English
        arrow-up
        2
        arrow-down
        1
        ·
        5 hours ago

        Collapse is likely not that bad for the average person. The fundamental inputs, outputs, and needs are all the same. The clowns on top are all that really change.

        • SugarCatDestroyer@lemmy.world
          link
          fedilink
          English
          arrow-up
          2
          ·
          5 hours ago

          Well, it may happen that these clowns will use poverty to make people crave stability and voluntarily put on a collar, and if that doesn’t work, they will use force. I mean a future concentration camp. Well, that and digital currencies and all this dystopian crap.

    • verdigris@lemmy.ml
      link
      fedilink
      English
      arrow-up
      4
      arrow-down
      1
      ·
      edit-2
      6 hours ago

      You seem to be handwaving away all concerns about the actual tech, but I think the fact that “training” is literally just plagiarism, plus the absolutely bonkers energy costs of doing it, squarely positions LLMs as doing more harm than good in most cases.

      The innocent tech here is the concept of the neural net itself, but unless networks are trained on a constrained corpus of data and then used to analyze that or analogous data in a responsible and limited fashion, I think it’s somewhere on a spectrum between “irresponsible” and “actually evil”.

      • A Wild Mimic appears!@lemmy.dbzer0.com
        link
        fedilink
        English
        arrow-up
        1
        arrow-down
        1
        ·
        3 hours ago

        Scraping the web to create a dataset isn’t plagiarism, nor is training a model on said scraped data, and calculating which words should come in what order isn’t plagiarism either. I agree that datasets should be ethically sourced, but scraping the web is what allowed things like search engines to be created, which made the web a lot more useful. Was creating Google irresponsible?

      • SugarCatDestroyer@lemmy.world
        link
        fedilink
        English
        arrow-up
        4
        ·
        6 hours ago

        If the world is ruled by psychopaths who seek absolute power for the sake of even more power, then the very existence of such technologies will lead to very sad consequences and, most likely, even to slavery. Have you heard of technofeudalism?

        • verdigris@lemmy.ml
          link
          fedilink
          English
          arrow-up
          3
          arrow-down
          2
          ·
          edit-2
          5 hours ago

          Okay, sure, but in many cases the tech in question is actually useful for lots of other stuff besides repression. I don’t think that’s the case with LLMs. They have a tiny bit of actual usefulness that’s completely overshadowed by the insane skyscrapers of hype and lies that have been built up around their “capabilities”.

          With “AI” I don’t see any reason to go through such gymnastics separating bad actors from neutral tech. The value in the tech is non-existent for anyone who isn’t either a researcher dealing with impractically large and unwieldy datasets or, of course, a grifter looking to profit off bigger idiots than themselves. It has never been and will never be a useful tool for the average person, so why defend it?

          • A Wild Mimic appears!@lemmy.dbzer0.com
            link
            fedilink
            English
            arrow-up
            1
            arrow-down
            1
            ·
            3 hours ago

            I am an average person, and my GPU is running a chatbot that is currently giving me a course in regular expressions. My GPU also generates images for me from time to time when I need an image, because I am crappy at drawing. There are a lot of uses for the technology.

          • SugarCatDestroyer@lemmy.world
            link
            fedilink
            English
            arrow-up
            2
            arrow-down
            1
            ·
            5 hours ago

            There’s nothing to defend. Tell me, would you defend someone who is a threat to you and deprives you of the ability to create, making art unnecessary? No, you would go and kill him before this bastard grows up. Well, what’s the point of defending a bullet that will kill you? Are you crazy?

  • A_norny_mousse@feddit.org
    link
    fedilink
    English
    arrow-up
    109
    ·
    edit-2
    10 hours ago

    With the help of artificial intelligence, technology can dip into online conversations and immediately notify both school officials and law enforcement.

    Not sure what’s worse here: how the police overreacted or that the software immediately contacts law enforcement, without letting teachers (n.b.: they are the experts here, not the police) go through the positives first.

    But oh, that would mean having to pay somebody, at least some extra hours, in addition to the no doubt expensive software. JFC.

    • verdigris@lemmy.ml
      link
      fedilink
      English
      arrow-up
      12
      ·
      6 hours ago

      I hate how completely the conversation about surveillance was leapfrogged. It’s so disgusting that it’s just assumed that all of your communications should be read by your teachers, parents, and school administration just because you’re a minor. Kids deserve privacy too.

    • Boddhisatva@lemmy.world
      link
      fedilink
      English
      arrow-up
      49
      arrow-down
      3
      ·
      11 hours ago

      Not sure what’s worse here: how the police overreacted or that the software immediately contacts law enforcement, without letting teachers (n.b.: they are the professionals here, not the police) go through the positives first.

      The idea behind the policy is to stop school shootings. If there were a legitimate threat of violence, you would likely want the police to be notified as soon as possible. The issue here is that the authorities are letting a piece of half-assed code (read: AI) decide what is a legitimate threat and, worse still, acting on that determination without question.

      They have literally sacrificed an essential freedom for some temporary, and probably illusory, security.

        • FEIN@lemmy.world
          link
          fedilink
          English
          arrow-up
          19
          arrow-down
          3
          ·
          10 hours ago

          “no way to stop this” says the only country where this happens

          • Zephorah@discuss.online
            link
            fedilink
            English
            arrow-up
            14
            arrow-down
            1
            ·
            10 hours ago

            I didn’t realize the schools were using Run, Hide, Fight. That is the same policy for hospital staff in the event of an active shooter. Maddening.

            • Miles O'Brien@startrek.website
              link
              fedilink
              English
              arrow-up
              6
              ·
              edit-2
              7 hours ago

              Having worked in quite a few fields in the last 15 years or so, I can say it’s the same active shooter training they give everyone. Even in stores that sell guns.

              I’ll let the reader decide how fucked up it is that there’s basically a countrywide accepted “standard response”.

            • FerretyFever0@fedia.io
              link
              fedilink
              arrow-up
              9
              arrow-down
              1
              ·
              10 hours ago

              I’m sorry, in hospitals? Where a significant portion of the patients can do none of those things?

              • Zephorah@discuss.online
                link
                fedilink
                English
                arrow-up
                11
                ·
                10 hours ago

                They’re not residents; you’re thinking of nursing homes. Roughly a third of hospital patients can walk without assistance, but yes. The rationale is that staff don’t turn themselves into bullet sponges, because then who is left to remove the bullets once the shooter is dead? Either way, what can unarmed people, untrained to fight, with the body-armor equivalent of pajamas, do to stop bullets?

                The patient room doors don’t lock. Sometimes those doors are made of glass. But herding the patients who can walk into the halls is likely an opportunity for an active shooter to hit more targets. As such, everyone hunkers down, and the police take care of it. In theory, per the training modules. Police sometimes run drills with the hospital, depending on locale and interagency dealings.

                Shutting all the fire doors is likely the only defense. Those nurses can be crafty on the fly, but there are limitations.

                I can’t imagine a secondary piece of this policy isn’t hospitals avoiding liability regarding workplace injury/death lawsuits.

                I just hadn’t known until now that, in grasping for solutions, schools found the standardized hospital policy and are running with it.

            • frongt@lemmy.zip
              link
              fedilink
              English
              arrow-up
              3
              ·
              edit-2
              10 hours ago

              Why maddening? The active shooter response shouldn’t be all that different.

      • 6nk06@sh.itjust.works
        link
        fedilink
        English
        arrow-up
        15
        arrow-down
        3
        ·
        11 hours ago

        the policy is to stop school shootings

        You should try Europe once. It’s more fun than your 3rd world country.

          • Albbi@lemmy.ca
            link
            fedilink
            English
            arrow-up
            5
            arrow-down
            3
            ·
            7 hours ago

            Is this better, worse, or the same as throwing dildos at female WNBA athletes?

          • FerretyFever0@fedia.io
            link
            fedilink
            arrow-up
            5
            arrow-down
            1
            ·
            10 hours ago

            A lot of Europe seems to somehow have worse racism in some areas than the US. Ask a couple of English people what they think of travellers and Muslims.

            • ObjectivityIncarnate@lemmy.world
              link
              fedilink
              English
              arrow-up
              1
              ·
              edit-2
              4 minutes ago

              “Daily murder” is a sneaky rhetorical maneuver, considering it’s something influenced more by raw population size than by the per-capita rate. It’s easy for there to be a “daily murder” in a country of 340,000,000 people, even when the overwhelmingly vast majority of people do not murder.

              Using “few” to trivialize/minimize the racism is no better.

              Shame on you for this disingenuity.

        • Boddhisatva@lemmy.world
          link
          fedilink
          English
          arrow-up
          6
          arrow-down
          2
          ·
          11 hours ago

          Oh, I’m with you on that. I’m just pointing out the thought behind the policy, however flawed. I visited Europe many years ago. I would love to be there now, except that as an American I would be rightly ostracized.

      • ObjectivityIncarnate@lemmy.world
        link
        fedilink
        English
        arrow-up
        6
        ·
        edit-2
        10 hours ago

        The issue here is that the authorities are letting a piece of half-ass code (Read: AI) decide what is a legitimate threat and, worse still, acting on that determination without question.

        Yeah, at the very least, the software should be passing on the statement and the context surrounding it, along with its ‘judgment’, to the authorities, putting all the responsibility for making the call that X genuinely merits action on said authorities; a rough sketch of that kind of flow is below.

        Of course, that’s just one piece of the puzzle, and not a solution if law enforcement isn’t held accountable when they fuck up.
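        For what it’s worth, that human-in-the-loop flow is easy to sketch. The following is a minimal illustration only, not any real vendor’s pipeline; the `FlaggedMessage` structure, the `Decision` options, and the review rule are all hypothetical.

        ```python
        # Hypothetical sketch of a human-review queue for AI-flagged student messages.
        # Names and rules are invented for illustration; no real monitoring product is described.
        from dataclasses import dataclass
        from enum import Enum


        class Decision(Enum):
            DISMISS = "dismiss"              # false positive, no action
            COUNSELOR = "counselor"          # route to the school counselor, not the police
            LAW_ENFORCEMENT = "police"       # a named human decided the threat is credible


        @dataclass
        class FlaggedMessage:
            student_id: str
            text: str
            context: list[str]               # surrounding messages, so the reviewer sees the joke/quote
            model_score: float               # the classifier's confidence, advisory only
            reviewer_notes: str = ""


        def review(flag: FlaggedMessage, reviewer_decision: Decision) -> Decision:
            """The model never triggers action on its own; a human signs off on every escalation."""
            if reviewer_decision == Decision.LAW_ENFORCEMENT and not flag.reviewer_notes:
                # Force the reviewer to justify the most severe outcome in writing.
                raise ValueError("Escalation to police requires written justification.")
            return reviewer_decision


        if __name__ == "__main__":
            flag = FlaggedMessage(
                student_id="anon-123",
                text="lol I'm going to kill you for saying that",
                context=["(friends teasing each other about a tan)"],
                model_score=0.91,
            )
            # With the context visible, a counselor conversation is the obvious call here.
            print(review(flag, Decision.COUNSELOR))
        ```

        The point of the sketch is just the shape: the classifier only files a report with its score and the surrounding context, and a person who can be held accountable makes the actual call.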