• FooBarrington@lemmy.world
    7 hours ago

    If a new driver came out that gave Nvidia 5090 performance to games on GTX 1080-equivalent hardware, would you still buy a new video card this year?

    It doesn’t make any sense to compare games and AI. Games have a well-defined upper bound for performance; even Crysis has “maximum settings” that you can’t go above. Supposedly, this doesn’t hold true for AI: scaling it up should keep improving it.
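
    For reference, the scaling laws behind that claim are empirical power laws, not step functions. A minimal sketch using the fit constants published in the Chinchilla paper (Hoffmann et al., 2022), purely as an illustration:

    ```python
    # Chinchilla scaling-law fit (Hoffmann et al., 2022): predicted training
    # loss for a model with n_params parameters trained on n_tokens tokens.
    # Constants are the paper's published fit values; treat them as illustrative.

    def chinchilla_loss(n_params: float, n_tokens: float) -> float:
        E = 1.69                 # irreducible loss (entropy of natural text)
        A, alpha = 406.4, 0.34   # parameter-count term
        B, beta = 410.7, 0.28    # training-data term
        return E + A / n_params**alpha + B / n_tokens**beta

    # Loss keeps falling as you scale; there is no "maximum settings" point,
    # only an asymptote at E that you approach but never reach.
    for n in (1e9, 1e10, 1e11, 1e12):
        print(f"{n:.0e} params: predicted loss ~{chinchilla_loss(n, 20 * n):.3f}")
    ```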

    So: yes, in your analogy, MS would still buy a new video card this year, as long as they believed further progress was possible and reasonably likely.

    • Blue_Morpho@lemmy.world
      1 hour ago

      Just as games have diminishing returns on better graphics (they’re already photorealistic; few people pay $2k for a GPU to render a few more hairs), AI has a plateau where it gives good-enough answers that people will pay for the service.

      If people are already paying you money and the general consumer doesn’t appreciate the next level of performance, why spend billions that will take longer to recoup?

      And again, data centers aren’t just used for AI.

      • FooBarrington@lemmy.world
        43 minutes ago

        It’s still not a valid comparison. We’re not talking about diminishing returns, we’re talking about an actual ceiling. There are only so many graphics options implemented in a game; once they’re maxed out, you can’t go higher.

        That’s not the situation with AI; it’s supposed to scale indefinitely.
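
        To make the distinction concrete, here’s a toy sketch (all settings and numbers made up, nothing from a real game or model): a game’s quality tops out at the last entry of a finite options list, while a scaling curve has no last entry, only an asymptote it never reaches.

        ```python
        # Toy contrast, illustrative only: a hard ceiling vs. an asymptote.

        # A game has a literal maximum: the top entry of each finite option list.
        GAME_SETTINGS = {
            "textures": ["low", "medium", "high", "ultra"],
            "shadows": ["off", "hard", "soft"],
        }
        max_settings = {name: levels[-1] for name, levels in GAME_SETTINGS.items()}
        print(max_settings)  # nothing exists above 'ultra' / 'soft'

        # A scaling curve has no top entry; more compute always helps a little.
        def quality(compute: float) -> float:
            return 1.0 - compute ** -0.3  # hypothetical power law, approaches 1.0

        for c in (1e1, 1e3, 1e6):
            print(f"compute {c:.0e}: quality {quality(c):.4f}")
        ```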

        • Blue_Morpho@lemmy.world
          19 minutes ago

          Current games have a limit. Current models have a limit. New games could scale until people don’t see a quality improvement. New models can scale until people don’t see a quality improvement.