It doesn’t seem like it should take that much energy and power to run an LLM or an image generator, and sure, it would take a lot more with so many users connecting across so many servers… but there’s just no way they’re not mining bitcoin. My math might not be mathing, but it seems like AI doesn’t justify the power use, and it seems like everybody’s lying.
Someone who knows more about this, please share your opinion.


Start with the premise that energy is not created or destroyed; it only changes form. Pushing processors to perform a task uses energy and creates heat just to rearrange numbers and pixels. Combing through huge numbers of data bits takes a lot of energy. If you’ve ever worked with systems larger than a desktop, you know how much they can draw.
My whole home lab uses about 700 watts for basic server duties. To power a single 3090 GPU to run some larger models, my PC needed a 1000-watt PSU. Extrapolate that out to a commercial-size org and you can see where the major power draw comes in.
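To put rough numbers on that extrapolation, here's a quick back-of-envelope sketch. The GPU count, per-GPU wattage, and overhead multiplier are all my own assumed placeholder values, not figures from any actual provider:

```python
# Back-of-envelope estimate: scale a single-GPU power draw up to a
# hypothetical commercial fleet. All numbers below are assumptions.

gpu_watts = 350        # rough draw of one 3090-class GPU under sustained load
overhead = 1.5         # multiplier for cooling, networking, and PSU losses
gpu_count = 10_000     # hypothetical fleet size for a single cluster

total_watts = gpu_watts * gpu_count * overhead
kwh_per_day = total_watts / 1000 * 24

print(f"Fleet draw: {total_watts / 1e6:.1f} MW")        # ~5.3 MW
print(f"Energy per day: {kwh_per_day:,.0f} kWh")         # ~126,000 kWh
```

Even with these made-up numbers, one modest cluster lands in the megawatt range, which is why the aggregate power bills get so large without any bitcoin mining involved.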
That makes sense.