  • lokalhorst@feddit.org
    15 days ago

    People always try to frame AGI as the logical next expansion step of LLMs, but it is not. This is not a linear process; transformer-based LLMs and the science-fiction-like goal of AGI just don’t have much to do with each other.

      • SaveTheTuaHawk@lemmy.ca
        15 days ago

        The whole concept assumes LLMs will reach some mythical enlightenment after being fed exabytes of bullshit from the internet.

        Classic case of garbage in, garbage out.