Inspired by a recent talk from Richard Stallman.

From Slashdot:

Speaking about AI, Stallman warned that “nowadays, people often use the term artificial intelligence for things that aren’t intelligent at all…” He makes a point of calling large language models “generators” because “They generate text and they don’t understand really what that text means.” (And they also make mistakes “without batting a virtual eyelash. So you can’t trust anything that they generate.”) Stallman says “Every time you call them AI, you are endorsing the claim that they are intelligent and they’re not. So let’s refuse to do that.”

Sometimes I think that even though we are in a “FuckAI” community, we’re still helping the “AI” companies by tacitly agreeing that their LLMs and image generators are in fact “AI” when they’re not. It’s similar to how the people saying “AI will destroy humanity” lend LLMs an outsized aura they don’t deserve.

Personally I like the term “generators” and will make an effort to use it, but I’m curious to hear everyone else’s thoughts.

  • James R Kirk@startrek.website (OP) · 10 days ago

    Here’s the citation if you really need it. I’m not trying to argue, but the process of becoming more complex and specific to a niche is the literal definition of “evolution”:

    “A gradual process in which something changes into a different and usually more complex or better form.”

    • Lumidaub@feddit.org · 9 days ago

      The way I understand these terms, more complex and more specific are different things. A more complex definition encompasses more things, whereas a more specific definition has a smaller scope.