• 58008@lemmy.world · 11 hours ago

    Would be funny if the Wikimedia Foundation developed an AI trained on all the major AI models and used it to detect AI slop.

    • brucethemoose@lemmy.world · 11 hours ago (edited)

      The models are constantly changing, but one could probably get pretty far focusing on ChatGPT (which is what most “lazy” authors use).

      And there are already efforts in this domain from the community; see the “slop” profiles in EQ-Bench:

      https://eqbench.com/creative_writing.html

      Traditional LLMs would (ironically) be better suited to fact checking, e.g. checking for citations, then following the links to see whether the sources actually match the text. They’re also much better at “checking” writing for sanity than at actually writing it out. And obviously this would just be a first pass for a person to quickly confirm.
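
      As a rough sketch of what that first pass could look like (hypothetical code, not an existing tool; names are made up, and the actual claim-vs-source comparison is left to an LLM or a human):

      import re
      import urllib.request

      def check_citations(draft: str) -> list[str]:
          """Flag missing or dead citation links in a draft. First pass only."""
          findings = []
          urls = re.findall(r"https?://[^\s\])>\"]+", draft)
          if not urls:
              findings.append("Draft contains no linked citations at all.")
          for url in urls:
              req = urllib.request.Request(
                  url, method="HEAD", headers={"User-Agent": "citation-check-sketch/0.1"}
              )
              try:
                  urllib.request.urlopen(req, timeout=10)
              except Exception as exc:  # dead link, fabricated reference, network error...
                  findings.append(f"Citation did not resolve ({exc}): {url}")
          return findings

      if __name__ == "__main__":
          draft = "The sky is blue (https://en.wikipedia.org/wiki/Diffuse_sky_radiation)."
          for finding in check_citations(draft) or ["No obvious citation problems found."]:
              print(finding)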

  • A_norny_mousse@feddit.org · 15 hours ago

    Nothing miraculous:

    One way Wikipedians are sloshing through the muck is with the “speedy deletion” of poorly written articles, as reported earlier by 404 Media. A Wikipedia reviewer who expressed support for the rule said they are “flooded non-stop with horrendous drafts.” They add that the speedy removal “would greatly help efforts to combat it and save countless hours picking up the junk AI leaves behind.” Another says the “lies and fake references” inside AI outputs take “an incredible amount of experienced editor time to clean up.”

    Typically, articles flagged for removal on Wikipedia enter a seven-day discussion period during which community members determine whether the site should delete the article. The newly adopted rule will allow Wikipedia administrators to circumvent these discussions if an article is clearly AI-generated and wasn’t reviewed by the person submitting it.

    The Wikimedia Foundation is also actively developing a non-AI-powered tool called Edit Check that’s geared toward helping new contributors fall in line with its policies and writing guidelines. Eventually, it might help ease the burden of unreviewed AI-generated submissions, too. Right now, Edit Check can remind writers to add citations if they’ve written a large amount of text without one, as well as check their tone to ensure that writers stay neutral.

    Remember that a spell checker also works completely without AI. Computers can do advanced stuff without LLMs.
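
    As a toy illustration of how far simple rules can go, here is a hypothetical, much-simplified version of a citation reminder like the one described above; it is not Wikimedia’s Edit Check code, just a word-count heuristic:

    import re

    # Wikitext citation markers: <ref ...> tags or {{cite ...}} templates.
    CITATION_PATTERN = re.compile(r"<ref[\s>]|\{\{\s*cite", re.IGNORECASE)
    WORD_LIMIT = 120  # arbitrary threshold chosen for this sketch

    def needs_citation_reminder(added_text: str, word_limit: int = WORD_LIMIT) -> bool:
        """True if the added text is long yet contains no citation markup."""
        return len(added_text.split()) >= word_limit and not CITATION_PATTERN.search(added_text)

    if __name__ == "__main__":
        paragraph = "Lorem ipsum dolor sit amet. " * 30  # 150 words, no <ref> anywhere
        if needs_citation_reminder(paragraph):
            print("Reminder: consider adding a citation for this passage.")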