• areyouevenreal@lemm.ee · 5 days ago

    No, actually it has changed pretty fundamentally. These aren’t simply a bunch of FCNs put together. Look up what a transformer is; that was one of the major breakthroughs that made modern LLMs possible.
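
    For a concrete picture of what that breakthrough adds, here is a rough sketch of the core transformer operation, scaled dot-product self-attention, in plain NumPy (illustrative only; the shapes and names are invented for the example and are not from any particular library):

    ```python
    import numpy as np

    def softmax(x, axis=-1):
        x = x - x.max(axis=axis, keepdims=True)
        e = np.exp(x)
        return e / e.sum(axis=axis, keepdims=True)

    def self_attention(X, Wq, Wk, Wv):
        # X: (seq_len, d_model) token embeddings
        Q, K, V = X @ Wq, X @ Wk, X @ Wv         # project into queries/keys/values
        scores = Q @ K.T / np.sqrt(K.shape[-1])  # every token scores every other token
        weights = softmax(scores, axis=-1)       # data-dependent attention weights
        return weights @ V                       # weighted mix of value vectors

    # toy example: 4 tokens, 8-dim embeddings
    rng = np.random.default_rng(0)
    X = rng.normal(size=(4, 8))
    Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
    out = self_attention(X, Wq, Wk, Wv)  # (4, 8) context-mixed representations
    ```

    The point of the sketch: the mixing weights are computed from the input itself at run time, which is exactly what a plain stack of fully connected layers does not do.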

    • sudneo@lemm.ee · 5 days ago

      That is a technical detail, not a fundamental change. By fundamental mechanism I mean what the machine is designed to do. Of course techniques and implementations evolve, refine, and improve over 60 years, but the idea behind the technology (natural language processing) did not evolve much.

      • areyouevenreal@lemm.ee · 5 days ago

        Did backpropagation even exist in the 60s? That was a pretty fundamental change in what they do.
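
        For reference, this is roughly what backpropagation computes; a toy NumPy sketch of one training step for a two-layer network (the sizes and data are made up purely for illustration):

        ```python
        import numpy as np

        rng = np.random.default_rng(0)
        X = rng.normal(size=(16, 4))    # 16 samples, 4 features
        y = rng.normal(size=(16, 1))    # regression targets
        W1, W2 = rng.normal(size=(4, 8)), rng.normal(size=(8, 1))

        # forward pass
        h = np.tanh(X @ W1)             # hidden activations
        pred = h @ W2                   # network output
        loss = np.mean((pred - y) ** 2) # mean squared error

        # backward pass: chain rule, layer by layer, output to input
        d_pred = 2 * (pred - y) / len(y)    # dL/dpred
        dW2 = h.T @ d_pred                  # gradient for output weights
        d_h = d_pred @ W2.T * (1 - h ** 2)  # propagate through tanh
        dW1 = X.T @ d_h                     # gradient for input weights

        # gradient descent update
        lr = 0.01
        W1 -= lr * dW1
        W2 -= lr * dW2
        ```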

        If we are arguing about really fundamental changes, then arguably any neural network is the same thing, and a computer is no different from ChatGPT or a mouse, or even something simpler like a single-layer perceptron.