• megopie@lemmy.blahaj.zone
    1 month ago

    The compute used to do this burnt several tons of natural gas, boiled a small lake in Minnesota, traumatized approximately 4 people in the developing world who labeled the training data, and cost a pensioner about 2% of their net worth.

    • vithigar@lemmy.ca
      1 month ago

      This is really what blows my mind the most. With all this talk about how much power LLMs and diffusion models use, companies are still constantly cramming them into places where they just run all the time, passively doing things no one asked for.

      Overall power use by these things would probably be cut by an order of magnitude just by limiting them to directed, intentional use.