• ozymandias117@lemmy.world
    4 days ago

    I would argue that, prior to ChatGPT’s marketing, AI did mean that.

    When talking about specific, non-general techniques, people used terms like ML, etc.

    After OpenAI co-opted “AI” to mean an LLM, people started using AGI to mean what AI used to mean.

    • brisk@aussie.zone
      3 days ago

      That would be a deeply ahistorical argument.

      https://en.wikipedia.org/wiki/AI_effect

      AI is a very old field, and has always suffered from things being excluded from popsci as soon as they are achievable and commonplace. Path finding, OCR, chess engines and decision trees are all AI applications, as are machine learning and LLMs.

      That Wikipedia article has a great line in it too

      The Bulletin of the Atomic Scientists organization views the AI effect as a worldwide strategic military threat.[4] They point out that it obscures the fact that applications of AI had already found their way into both US and Soviet militaries during the Cold War.[4]

      The discipline of Artificial Intelligence was founded in the 50s. Some of the current vibe is probably due to the “Second AI winter” of the 90s, the last time calling things AI was dangerous to your funding.

    • Ignotum@lemmy.world
      4 days ago

      To common people perhaps, but never in the field itself; much simpler and dumber systems than LLMs were still called AI.

        • Klear@lemmy.world
          3 days ago

          So? I don’t see how that’s relevant to the point that “AI” has been used for very simple decision algorithms for a long time, and it makes no sense not to use it for LLMs too.