• sunzu2
    118 days ago

    An LLM can’t tell right from wrong… How is it supposed to be AGI?!

    • @Grimy@lemmy.world
      518 days ago

      It isn’t. I’d even say that merely solving puzzles, however complex, is far from AGI.