• Blue_Morpho@lemmy.world · 1 day ago

    Cancelling new data centers because DeepSeek has shown a more efficient path isn’t proof that AI is dead, as the author claims.

    Fiber buildouts were cancelled back in 2000 because multimode made existing fiber more efficient. The Internet investment bubble popped. That didn’t mean the Internet was dead.

    • FooBarrington@lemmy.world · 1 day ago

      I’m gonna disagree - it’s not like DeepSeek uncovered some upper limit to how much compute you can throw at the problem. More efficient hardware use should be amazing for AI since it allows you to scale even further.

      This means that MS isn’t expecting these data centers to generate enough revenue to be profitable, and they’re not willing to bet on further advancements that might make them profitable. In other words, MS doesn’t have a positive outlook for AI.
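      To put very rough numbers on that intuition, here is a minimal sketch in Python; the compute figure and the 10x software gain are made up purely for illustration:

          # Back-of-envelope: a software efficiency gain multiplies what a fixed
          # hardware budget buys. All numbers here are hypothetical.

          hardware_flops = 1e21      # hypothetical raw compute a planned data center delivers
          old_efficiency = 1.0       # baseline software/utilization efficiency
          new_efficiency = 10.0      # hypothetical DeepSeek-style software gain

          effective_before = hardware_flops * old_efficiency
          effective_after = hardware_flops * new_efficiency
          print(f"Same hardware, {effective_after / effective_before:.0f}x the effective compute")

      If scaling further were still the goal, you would keep the hardware and pocket the software gain on top; walking away from the hardware only makes sense if you don’t expect the extra compute to pay for itself.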

      • unexposedhazard@discuss.tchncs.de · 1 day ago

        Exactly. If AI were to scale like the people at OpenAI hoped, they would be celebrating like crazy, because their scaling goal was literally infinity. Seriously, the plan OpenAI had a year ago was to scale their AI compute to be the biggest energy consumer in the world, with many dedicated nuclear power plants just for their data centers. That means if they don’t grab onto any and every opportunity for more energy, they have lost faith in their original plan.

      • Blue_Morpho@lemmy.world · 22 hours ago

        More efficient hardware use should be amazing for AI since it allows you to scale even further.

        If you can achieve scaling with software, you can delay current plans for expensive hardware. If a new driver came out that gave Nvidia 5090 performance to games on GTX 1080-equivalent hardware, would you still buy a new video card this year?

        When all the telcos scaled back on building fiber in 2000, that was because they didn’t have a positive outlook for the Internet?

        Or when video game companies went bankrupt in the 1980s, it was because video games were over as entertainment?

        There’s a huge leap between not spending billions on new data centers (which are used for more than just AI) and claiming that means AI is over.

        • FooBarrington@lemmy.world · 15 hours ago

          If a new driver came out that gave Nvidia 5090 performance to games on GTX 1080-equivalent hardware, would you still buy a new video card this year?

          It doesn’t make any sense to compare games and AI. Games have a well-defined upper bound for performance. Even Crysis has “maximum settings” that you can’t go above. Supposedly, this doesn’t hold true for AI: scaling it should continually improve it.

          So: yes, in your analogy, MS would still buy a new video card this year if they believed further progress was possible and reasonably likely.

          • Blue_Morpho@lemmy.world · 9 hours ago

            Just as games have diminishing returns on better graphics (they’re already photorealistic; few people pay $2k for a GPU to render more hairs), AI has a plateau where it gives answers good enough that people will pay for the service.

            If people are paying you money and the next level of performance is not appreciated by the general consumer, why spend billions that will take longer to recoup?

            And again, data centers aren’t just used for AI.

            • FooBarrington@lemmy.world · 8 hours ago

              It’s still not a valid comparison. We’re not talking about diminishing returns; we’re talking about an actual ceiling. There are only so many options implemented in games - once they’re maxed out, you can’t go higher.

              That’s not the situation we have with AI; it’s supposed to scale indefinitely.

              • Blue_Morpho@lemmy.world · 8 hours ago

                Current games have a limit. Current models have a limit. New games could scale until people don’t see a quality improvement. New models can scale until people don’t see a quality improvement.

                • FooBarrington@lemmy.world · 7 hours ago

                  I’m supposed to be able to take a model architecture from today, scale it up 100x and get an improvement. I can’t make the settings in Crysis 100x higher than they can go.

                  Games always have a limit; AI is supposed to get better with scale. Which part do you not understand?
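                  For what it’s worth, the “gets better with scale” claim is usually stated as a power law: loss keeps falling as compute grows, with no hard ceiling but with shrinking steps. A rough sketch of that shape (the constants below are illustrative, not fitted to anything):

                      # Illustrative scaling-law shape: loss ~ a * C^(-alpha) + floor.
                      # The constants are invented; only the shape of the curve matters.

                      a, alpha, floor = 10.0, 0.05, 1.0

                      def loss(compute: float) -> float:
                          return a * compute ** -alpha + floor

                      for c in (1e21, 1e22, 1e23, 1e24):
                          print(f"compute {c:.0e} -> loss {loss(c):.3f}")

                      # Every extra 10x of compute still lowers the loss (no ceiling),
                      # but each step is smaller than the last.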

                  • Blue_Morpho@lemmy.world · 6 hours ago

                    I’m supposed to be able to take a model architecture from today, scale it up 100x and get an improvement.

                    You can make Crysis run at a higher FPS. You can add polygons (remember ATI clown feet?). You can add detail to textures: https://research.nvidia.com/publication/2016-06_infinite-resolution-textures

                    But really the “game” is the model. Throwing more hardware at the same model is like throwing more hardware at the same game.

                    Which part of diminishing returns not offering as much profit did you not understand?

                    Current models give MS an extra 30% revenue. If they spend billions on a new model, will customers pay even more? How much more would you pay for a marginally better AI?
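                    As a back-of-envelope on that bet (a sketch only; every figure below is hypothetical and none of them come from Microsoft):

                        # Hypothetical payback calculation for a big new AI build-out.
                        capex = 50e9                 # assumed spend on new data centers, USD
                        extra_annual_revenue = 5e9   # assumed additional revenue from the better model
                        margin = 0.5                 # assumed gross margin on that revenue

                        payback_years = capex / (extra_annual_revenue * margin)
                        print(f"Payback: {payback_years:.0f} years")  # 20 years at these made-up numbers

                        # If customers won't pay much more for a marginally better model,
                        # extra_annual_revenue shrinks and the payback horizon blows out.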

        • Takumidesh@lemmy.world · 20 hours ago

          If buying a new video card made me money, yes.

          This doesn’t really work, because the goal when you buy a video card isn’t to have the most processing power possible, and playing video games doesn’t scale linearly, so having an additional card doesn’t add anything.

          If I were mining crypto or selling GPU compute (which is basically what AI companies are doing), and the existing card got an update that made it perform on par with new cards, I would buy out the existing cards, and when there were no more, I would buy up the newer cards; both are still generating revenue.
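          A rough sketch of that economics (all numbers invented for illustration; the point is just that a free speedup raises each card’s earnings rather than reducing how many cards you’d want):

              # Toy model: revenue scales with delivered compute, so a free software
              # speedup makes every card earn more. All figures are hypothetical.
              revenue_per_tflop_hour = 0.02   # $ earned per TFLOP-hour of compute sold
              card_tflops = 100.0             # throughput of one card
              speedup = 2.0                   # hypothetical "free" driver/software gain
              cost_per_card_hour = 1.0        # power plus amortized hardware cost

              profit_per_card_hour = revenue_per_tflop_hour * card_tflops * speedup - cost_per_card_hour
              print(f"Profit per card-hour: ${profit_per_card_hour:.2f}")

              # As long as this stays positive, each additional card adds profit,
              # so the rational move is more cards, not fewer.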

          • Blue_Morpho@lemmy.world · 19 hours ago

            If buying a new video card made me money, yes

            But the supposition here is that not buying a video card makes you the same money. You’re forecasting free performance upgrades, so there’s no need to spend money now when you can wait and upgrade the hardware once the software improvements stop.

            And that’s assuming it has anything to do with AI at all, rather than the long-term macroeconomics of Trump destroying the economy: MS may be putting off spending because businesses will be slowing down due to the tariff war.

    • contrafibularity@lemmy.world · 1 day ago

      yeah, genai as a technology and field of study may not disappear. genai as an overinflated product marketed as the be-all end-all that would solve all of humanity’s problems may. the bubble can’t burst soon enough

      • frezik@midwest.social · 8 hours ago

        Historically, the field of AI research has gone through boom and bust cycles. The first boom was during the Vietnam War with DARPA dumping money into it. As opposition to the Vietnam War grew, DARPA funding dried up, and the field went into hibernation with only minor advancement for decades. Then the tech giant monopolies saw an opportunity for a new bubble.

        It’d be nice if it could be funded at a more steady, sustainable level, but apparently capitalism can’t do that.

      • ShepherdPie@midwest.social · 1 day ago

        Exactly. It’s not as if this tech is going in the dumpster, but all of these companies basing their multi-trillion-dollar market cap on it are in for a rude awakening. Kinda like how the 2008 housing market crash didn’t mean that people no longer owned homes, but we all felt the effects of it.

    • Kazumara@discuss.tchncs.de · 1 day ago

      Fiber buildouts were cancelled back in 2000 because multimode made existing fiber more efficient.

      Sorry, but that makes no sense in multiple ways.

      • First of all, single-mode fiber provides orders of magnitude higher capacity than multi-mode.

      • Secondly, the modal patterns depend on the physics of the cable, specifically its core diameter. Single-mode fiber has a 9 micrometer core, multi-mode 50 or 62.5 micrometers. So you can’t change the light modes on existing fiber (see the quick V-number check below).

      • Thirdly, multi-mode fiber existed first, so it couldn’t be the improvement. And single-mode fiber was already becoming the way forward for long-distance transmission in 1982, and the first transatlantic cable using it was laid in 1988. So it couldn’t be the improvement of 2000 either.

      You must mean something else entirely.
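      For anyone wondering why the core diameter settles the single-mode question: the usual rule of thumb is the fiber’s V-number, V = (2πa/λ)·NA, with single-mode operation below about 2.405. A quick check in Python (the NA values are typical datasheet-ish figures, assumed here):

          import math

          # Step-index rule of thumb: single-mode when V < ~2.405,
          # where V = (2 * pi * core_radius / wavelength) * NA.
          def v_number(core_diameter_um: float, na: float, wavelength_um: float) -> float:
              radius = core_diameter_um / 2.0
              return 2.0 * math.pi * radius * na / wavelength_um

          # 9 um core, NA ~0.12, at 1550 nm: V ~2.2 -> a single guided mode
          print(f"9 um core:  V = {v_number(9.0, 0.12, 1.55):.2f}")
          # 50 um core, NA ~0.20, at 1550 nm: V ~20 -> roughly V^2/2, i.e. ~200 modes
          print(f"50 um core: V = {v_number(50.0, 0.20, 1.55):.2f}")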

    • bean@lemmy.world · 1 day ago

      Yeah, you echo my thoughts actually: that efficiency could be found in multiple areas, including DeepSeek, and that perhaps some other political things may be a bit more uncertain too.