• Daxtron2 · 2 months ago

    We’re in the “computers take up entire rooms in a university to do basic calculations” stage of modern AI development. It will improve, but only if we let it develop.

    • rbesfe@lemmy.ca · 2 months ago (edited)

      Moore’s law died a long time ago, and AI models aren’t getting any more power-efficient from what I can tell.

      • Daxtron2 · 2 months ago

        Then you haven’t been paying attention. There have been huge strides in small open language models, which can run inference locally on a phone with low enough power consumption.
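
        For scale, here is a minimal sketch of loading one such small open model with the Hugging Face transformers library (the checkpoint name below is just one example of a ~1B-parameter open model; any similarly sized checkpoint works the same way):

        ```python
        # Minimal sketch, assuming `transformers` and PyTorch are installed.
        # TinyLlama is one example of a small open model (~1.1B parameters),
        # chosen only for illustration.
        from transformers import AutoModelForCausalLM, AutoTokenizer

        model_id = "TinyLlama/TinyLlama-1.1B-Chat-v1.0"
        tokenizer = AutoTokenizer.from_pretrained(model_id)
        model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto")

        prompt = "Small language models can run on a phone because"
        inputs = tokenizer(prompt, return_tensors="pt")
        outputs = model.generate(**inputs, max_new_tokens=40)
        print(tokenizer.decode(outputs[0], skip_special_tokens=True))
        ```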

        • Daxtron2 · 2 months ago

          GPT is not the end-all, be-all of LLMs.

            • Daxtron2 · 2 months ago

              GPT is not a paradigm; it’s a specific model family developed by OpenAI. You’re thinking of the transformer architecture. Check out a project like RWKV if you want to see a unique approach.
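
              A minimal sketch of trying RWKV through the Hugging Face transformers library, which includes native RWKV support (the checkpoint "RWKV/rwkv-4-169m-pile" is one small public example, picked only for illustration):

              ```python
              # Hedged sketch, assuming `transformers` and PyTorch are installed.
              # RWKV replaces attention with a recurrent formulation, so per-token
              # inference cost stays constant instead of growing with context
              # length the way a transformer's KV cache does.
              from transformers import AutoModelForCausalLM, AutoTokenizer

              model_id = "RWKV/rwkv-4-169m-pile"  # small public RWKV checkpoint
              tokenizer = AutoTokenizer.from_pretrained(model_id)
              model = AutoModelForCausalLM.from_pretrained(model_id)

              inputs = tokenizer("The RWKV architecture is", return_tensors="pt")
              out = model.generate(**inputs, max_new_tokens=30)
              print(tokenizer.decode(out[0], skip_special_tokens=True))
              ```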