crazygringo 16 hours ago

Is it though? When I ask an LLM research questions, it often answers in 20 seconds what it would take me an entire afternoon to figure out with traditional research.

Similarly, I've had times where it wrote me scientific simulation code that would take me 2 days, in around a minute.

Obviously I'm cherry-picking the best examples, but I would guess that overall, the energy usage my LLM queries have required is vastly less than my own biological energy usage if I did the equivalent work on my own. Plus it's not just the energy to run my body -- it's the energy to house me, heat my home, transport my groceries, and so forth. People have way more energy needs than just the kilocalories that fuel them.

If you're using AI productively, I assume it's already much more energy-efficient than the energy footprint of a human for the same amount of work.

usefulcat 15 hours ago | parent | next

> it often answers in 20 seconds what it would take me an entire afternoon to figure out with traditional research.

In that case I think it would be only fair to also count the energy required for training the LLM.

LLMs are far ahead of humans in terms of the sheer amount of knowledge they can remember, but nowhere close in terms of general intelligence.

crazygringo 14 hours ago | parent

Training energy is amortized across the lifespan of a model. For any given query to the most popular commercial models, your share of the energy used to train them is a small fraction of the energy used for inference (e.g. ~10%).
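
The amortization argument can be sketched numerically. All figures below are illustrative placeholders chosen to match the ~10% ratio mentioned above, not measured values for any real model:

```python
# Back-of-envelope: amortize a one-time training cost over every query
# a model serves, then compare that share to per-query inference energy.
# ALL numbers are assumptions for illustration only.
TRAINING_ENERGY_KWH = 10_000_000       # assumed one-time training cost (kWh)
LIFETIME_QUERIES = 100_000_000_000     # assumed queries served over the model's lifespan
INFERENCE_ENERGY_KWH = 0.001           # assumed energy per query at inference (kWh)

training_share_per_query = TRAINING_ENERGY_KWH / LIFETIME_QUERIES
ratio = training_share_per_query / INFERENCE_ENERGY_KWH

print(f"training share per query: {training_share_per_query} kWh")  # 0.0001 kWh
print(f"training vs. inference:   {ratio:.0%}")                     # 10%
```

Under these assumptions the training share works out to 10% of the inference energy; with different (and contested) real-world numbers the ratio shifts, but the structure of the argument is the same: a fixed cost divided by a very large query count.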

discreteevent 15 hours ago | parent | prev

For this kind of thinking to work in practice, you would need to kill the people that AI makes redundant. And that's apart from the fact that right now we are at a choke point where it's much more important to generate less CO2 than it is to write scientific simulation code a little faster (and most people are using AI for far more unnecessary things, like marketing).

crazygringo 14 hours ago | parent

> For this kind of thinking to work in practice you would need to kill the people that AI makes redundant.

That is certainly not a logical leap I'm making. AI doesn't make anybody redundant, any more than mechanized farming did. It just frees them up to do more productive things.

Now consider whether LLMs will ultimately speed up the technological advancements necessary to reduce CO2. It's certainly plausible.

throwaway-11-1 8 hours ago | parent

Honest question - what are artists being freed up to do that’s more important? DoorDash?

crazygringo 7 hours ago | parent

Honest answer - making more art.

Think about how much cloud computing and open source changed things, so that you could launch a startup with 3 engineers instead of 20. What happened? An explosion of startups, since there were so many more engineers to go around. The engineers weren't delivering pizzas instead.

The same thing is happening with anything that needs more art -- the potential for video games here is extraordinary. A trained artist leveraging AI is far more effective, handling 10x the output as the tools mature. Now you get 10x more video games, or 10x more complex/larger worlds, or whatever it is that the market ends up wanting.