• morgunkorn@discuss.tchncs.de
    17 hours ago

    trust me bro, we’re almost there, we just need another data center and a few billion more, it’s coming i promise, we are testing incredible things internally, can’t wait to show you!

      • LostXOR@fedia.io
        6 hours ago

        Around a year ago I bet a friend $100 that we won’t have AGI by 2029, and I’d make the same bet today. LLMs are nothing more than fancy predictive text and are incapable of thinking or reasoning. We burn through immense amounts of compute and terabytes of data to train them, then stick them together in a convoluted mess, only to end up with something that’s still dumber than the average human. In comparison, humans are “trained” with maybe ten thousand “tokens” and ten megajoules of energy a day for a decade or two, and use only a couple dozen watts for even the most complex thinking.
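        (For anyone who wants to sanity-check those ballpark figures, here’s a quick back-of-envelope in Python. The 2,400 kcal/day intake and ~20 W brain power are rough textbook estimates, not exact values:)

        ```python
        # Back-of-envelope check on the human energy figures above.
        # Assumptions (rough estimates): ~2,400 kcal of food per day
        # for an adult, and ~20 W average power draw for the brain.

        KCAL_TO_JOULES = 4184  # 1 food Calorie (kcal) = 4,184 J

        daily_intake_kcal = 2400
        daily_intake_mj = daily_intake_kcal * KCAL_TO_JOULES / 1e6
        print(f"Daily food energy: ~{daily_intake_mj:.0f} MJ")  # ~10 MJ, matching the comment

        brain_watts = 20  # "a couple dozen watts"
        brain_kwh_per_day = brain_watts * 24 / 1000
        print(f"Brain energy per day: ~{brain_kwh_per_day:.2f} kWh")  # ~0.48 kWh
        ```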

        • pixxelkick@lemmy.world
          6 hours ago

          Humans are “trained” with maybe ten thousand “tokens” per day

          Uhhh… you may wanna rerun those numbers.

          It’s waaaaaaaay more than that lol.

          and take only a couple dozen watts for even the most complex thinking

          Mate’s literally got smoke coming out of his ears lol.

          A single Wh is 860 calories…

          I think you either have no idea wtf you are talking about, or you just made up a bunch of extremely wrong numbers to try and look smart.

          1. Humans will encounter hundreds of thousands of tokens per day, ramping up to millions in school.

          2. A human, by my estimate, has burned about 13,000 Wh by the time they reach adulthood. Maybe more depending on activity levels.

          3. While yes, an AI costs substantially more Wh, it’s also done in weeks, so it’s obviously going to be way less energy efficient due to the exponential laws of resistance. If we grew a functional human in like 2 months it’d prolly require way WAY more than 13,000 Wh during the process for similar reasons.

          4. Once trained, a single model can be duplicated infinitely. So it’d be fairer to compare the cost of raising millions of people against the cost of training a single model. Because once trained, you can now make millions of copies of it…

          5. Operating costs keep going down and down. Diffusion-based text generation just made another huge leap forward, reporting around a twenty-times efficiency increase over traditional GPT-style LLMs. Improvements like this are coming out every month.

          • LostXOR@fedia.io
            5 hours ago

            True, my estimate for tokens may have been a bit low. Assuming a 7 hour school day where someone talks at 5 tokens/sec, you’d encounter about 120k tokens. You’re off by 3 orders of magnitude on your energy consumption though; 1 watt-hour is 0.86 food Calories (kcal).
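            (The arithmetic here checks out. A quick sketch, using the same rough assumptions as the comment: a 7 hour school day at 5 tokens/sec, and ~2,000 kcal of food per day:)

            ```python
            # Token estimate: 7 hour school day at ~5 tokens/sec of speech.
            tokens_per_day = 7 * 3600 * 5
            print(tokens_per_day)  # 126000, i.e. roughly 120k

            # Energy: 1 Wh = 0.86 food Calories (kcal), so an adult on
            # ~2,000 kcal/day burns a bit over 2 kWh per day, not 13,000 Wh
            # over an entire childhood.
            wh_per_day = 2000 / 0.86
            print(f"~{wh_per_day:.0f} Wh/day")  # ~2326 Wh/day

            wh_to_adulthood = wh_per_day * 365 * 18
            print(f"~{wh_to_adulthood / 1e6:.0f} MWh by age 18")  # ~15 MWh, ~1000x the 13,000 Wh claim
            ```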