Not necessarily. I think the point of comparison here is how much energy AI uses to, say, generate a video, compared not with the energy used by the human themselves, but with the energy used by running XYZ software on a computer with a beefy graphics card for however many hours it'd take a human to do the same work.
While that's a valid point of view, as long as the human is the bottleneck, it's not going to scale to infinity and beyond.
A human's got to exist anyway and needs to work to eat. They don't really, necessarily, existentially need to be 10x more productive with the help of AI.
But I'll be honest, that's not a solid argument, because it quickly leads to the question of why they're doing this exact job in the first place, instead of, say, farming or whatever else can be called a net positive for humanity without reservations.
Indeed. Everything taken into account, getting work done takes energy. And if an agent can do a task for less energy than a human doing it at a workstation, then it's a genuine benefit. That's the apt comparison, rather than looking at some overall datacenter energy consumption figure.
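To make the comparison concrete, here's a back-of-envelope sketch. Every number in it is an illustrative assumption (workstation wattage, hours taken, per-request inference energy, PUE), not a measurement; the point is only the shape of the calculation:

```python
# Back-of-envelope energy comparison: human-at-workstation vs. AI inference.
# ALL numbers below are assumed for illustration, not measured values.

# Human route: a workstation with a discrete GPU running video software
# under load for however many hours the task takes.
workstation_watts = 450   # assumed average draw of a GPU workstation under load
human_hours = 6           # assumed time for a human to produce the video
human_route_wh = workstation_watts * human_hours  # watt-hours

# AI route: assumed energy to serve one video-generation request,
# scaled up by datacenter overhead (PUE = power usage effectiveness).
inference_wh = 200        # assumed per-request inference energy, in Wh
pue = 1.2                 # assumed datacenter PUE (overhead multiplier)
ai_route_wh = inference_wh * pue

print(f"human route: {human_route_wh} Wh")
print(f"ai route:    {ai_route_wh} Wh")
print(f"ratio:       {human_route_wh / ai_route_wh:.1f}x")
```

Under these made-up inputs the human route comes out roughly an order of magnitude more energy-hungry, but the conclusion flips entirely with different assumptions, which is exactly why the per-task comparison matters more than aggregate datacenter totals.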