“The Wikimedia Foundation has been exploring ways to make Wikipedia and other Wikimedia projects more accessible to readers globally,” a Wikimedia Foundation spokesperson told me in an email. “This two-week, opt-in experiment was focused on making complex Wikipedia articles more accessible to people with different reading levels. For the purposes of this experiment, the summaries were generated by an open-weight Aya model by Cohere. It was meant to gauge interest in a feature like this, and to help us think about the right kind of community moderation systems to ensure humans remain central to deciding what information is shown on Wikipedia.”

There are some very out-of-touch people at the Wikimedia Foundation. Fortunately the editors (the people who actually write the articles) have the sense to oppose this move en masse.

  • sculd@beehaw.org (OP) · 15 points · 4 days ago

    That is not the case here. These are not bots flagging issues; it is literally an LLM helping to write “summaries,” which is why the reaction is so different.

    • HappyFrog@lemmy.blahaj.zone · 4 points · 3 days ago

      Yeah, I was thinking that if any organization could do AI summaries right, it would be Wikipedia. But I trust the editors the most.