• Wispy2891@lemmy.world · 4 minutes ago

    It’s obvious that Google didn’t pay the crazy AWS prices to train Gemini, seeing how many servers they have in GCP.

    Do they mean that they used creative accounting to pay themselves crazy GCP usage bills to deduct from taxes?

  • Ilovethebomb@lemm.ee · 8 hours ago

    Considering the hype and publicity GPT-4 produced, I don’t think this is actually a crazy amount of money to spend.

    • Voroxpete@sh.itjust.works · 4 hours ago

      Comparatively speaking, a lot less hype than their earlier models produced. Hardcore techies care about incremental improvements, but the average user does not. If you try to describe to the average user what is “new” about GPT-4, other than “It fucks up less”, you’ve basically got nothing.

      And it’s going to carry on like this. New models are going to get exponentially more expensive to train, while producing less and less consumer interest each time, because “Holy crap, look at this brand new technology” will always be more exciting than “In our comparative testing, version 7 is 9.6% more accurate than version 6.”

      And for all the hype, the actual revenue just isn’t there. OpenAI are bleeding around $5-10bn (yes, with a b) per year. They’re currently trying to raise around $11bn in new funding just to keep the lights on. It costs far more to operate these models (even at the steeply discounted compute costs Microsoft are giving them) than anyone is actually willing to pay to use them. Corporate clients don’t find them reliable or adaptable enough to actually replace human employees, and regular consumers think they’re cool, but in a “nice to have” kind of way. They’re not essential enough a product to pay big money for, but they can only be run profitably by charging big money.

    • oce 🐆@jlai.lu · 8 hours ago (edited)

      Yeah, I’m surprised at how low that is: a software engineer in a developed country costs about 100k USD per year, so 40M USD for training GPT-4 is the cost of 400 engineers for one year.
      They say salaries could make up to 50% of the total, which would put the whole project at the equivalent of roughly 800 engineers for one year.
      That doesn’t seem extreme.
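
      As a quick sanity check of that back-of-envelope estimate (the 40M training figure, the 100k salary, and the 50% salary share are all numbers from this thread, not anything official):

      ```python
      # Back-of-envelope: GPT-4 training cost expressed in engineer-years.
      # All inputs are assumed figures from this thread, not official numbers.
      compute_cost = 40_000_000         # USD, quoted training cost
      engineer_cost_per_year = 100_000  # USD, assumed fully loaded salary
      salary_share_of_total = 0.5       # assumption: salaries are up to half the total

      print(compute_cost / engineer_cost_per_year)             # 400.0 engineer-years
      total_cost = compute_cost / (1 - salary_share_of_total)  # 80,000,000 USD
      print(total_cost / engineer_cost_per_year)               # 800.0 engineer-years
      ```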

    • huginn@feddit.it · 8 hours ago

      The latest release, GPT-4o, costs about $600/hr per instance to run, based on the discussion I could find about it.

      If OpenAI is running 1k of those instances to service demand (and they’re certainly running more, since queries can take 30+ seconds), that’s on the order of $5bn/yr just to keep the lights on.
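
      The same back-of-envelope arithmetic, treating the $600/hr and 1,000-instance figures above purely as assumptions rather than confirmed numbers:

      ```python
      # Rough annual serving cost from the figures quoted in this comment.
      # Both inputs are assumptions from the thread, not OpenAI's actual numbers.
      cost_per_instance_hour = 600   # USD per hour, per instance
      instances = 1_000              # assumed number of instances running 24/7
      hours_per_year = 24 * 365

      annual_cost = cost_per_instance_hour * instances * hours_per_year
      print(f"${annual_cost / 1e9:.2f}bn per year")  # ~$5.26bn per year
      ```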

    • SkaveRat@discuss.tchncs.de · 3 hours ago

      I hope you complained all these years when games used “AI” for computer-controlled enemies, because otherwise your post would be super awkward.

    • Pennomi@lemmy.world · 6 hours ago (edited)

      AI is a broader term than you might realize. Historically even mundane algorithms like A* pathfinding were considered AI.
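
      For example, a textbook A* implementation is only a few dozen lines, and it’s exactly the sort of thing games have always shipped under the “AI” label. A minimal sketch (the grid and heuristic here are arbitrary, just for illustration):

      ```python
      # Minimal A* grid pathfinding -- the kind of "game AI" referred to above.
      # The grid, start, and goal below are arbitrary illustrative values.
      import heapq

      def astar(grid, start, goal):
          """Shortest path on a 0/1 grid (1 = wall), as a list of cells, or None."""
          def h(a, b):  # Manhattan-distance heuristic
              return abs(a[0] - b[0]) + abs(a[1] - b[1])

          open_heap = [(h(start, goal), 0, start, [start])]  # (f, g, node, path)
          visited = set()
          while open_heap:
              _, g, node, path = heapq.heappop(open_heap)
              if node == goal:
                  return path
              if node in visited:
                  continue
              visited.add(node)
              r, c = node
              for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
                  if 0 <= nr < len(grid) and 0 <= nc < len(grid[0]) and grid[nr][nc] == 0:
                      heapq.heappush(open_heap, (g + 1 + h((nr, nc), goal), g + 1,
                                                 (nr, nc), path + [(nr, nc)]))
          return None

      walls = [[0, 0, 0],
               [1, 1, 0],
               [0, 0, 0]]
      print(astar(walls, (0, 0), (2, 0)))  # walks around the wall row
      ```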

      Turns out people like to constantly redefine artificial intelligence to “whatever a computer can’t quite do yet.”

    • ContrarianTrail@lemm.ee · 6 hours ago

      It is AI, though. It’s not AGI, which is a subcategory of AI and what many people seemingly imagine “AI” to mean, but it is AI.