• ☆ Yσɠƚԋσʂ ☆@lemmy.mlOP · 5 days ago

    It appears you’ve missed the point here, which is that it turns out you can use older GPUs in creative ways to get a lot more out of them than people realized. Having the latest chips isn’t the bottleneck people thought it was.

    • Euphoma@lemmy.ml · 5 days ago

      This article doesn’t talk about older GPUs though? It’s talking about using the V80 FPGA from AMD, which released in 2024 and costs 10k. Unless I’m misunderstanding something about the article? I do think it’s a good breakthrough being able to use an FPGA like this, though.

      • ☆ Yσɠƚԋσʂ ☆@lemmy.mlOP · 5 days ago

        You’re right, the chip they leveraged isn’t actually that old. The key part is that we’re seeing a lot of optimizations happening in software now that allow existing chips to be used more efficiently.

    • utopiah@lemmy.ml · 5 days ago (edited)

      turns out you can use older GPUs in creative ways to get a lot more out of them than people realized

      If that’s the point, then that’s the entire GPUs-for-mining-then-ML revolution, driven mostly by CUDA, which already happened around 2010, so that’s even older; that’d be 15 years ago.

      What I was highlighting anyway is that it’s hard to trust an article where simple facts are wrong.

        • utopiah@lemmy.ml · 5 days ago

          Well, I honestly tried (cf. history). You’re addressing neither my remark about the fact from the article nor the bigger picture. Waste of time, blocked.