Ted Chiang: ChatGPT Is a Blurry JPEG of the Web

I want to note that the carbon footprint of a human producing worthwhile art is, uh... different than that of an AI producing shit.

But also the study was obviously slanted and kind of goofy nonsense.

But welcome to the age of misinformation, where we are always a Google search away from a dressed-up article to cite as a factual study. The doublewrong argument.
 
To be frank, the entire line of argument, that LLMs use too much energy, felt silly to me.

I think people just copy-pasted the entirely valid argument about crypto mining, but didn't consider that there's a difference between someone buying racks of GPUs and people typing in prompts.
 
To be honest, my take on the "it uses piles of power" issue isn't about the environment or anything like that: it is the expense.

Bitcoin miners were able to pay for that kind of power usage because, for a time, the Bitcoin they mined sold for enough of a profit to cover the power bill. AI, on the other hand, is a genuine product. It isn't just a scam to get quick money; it is trying to do something.
Which means it needs to pay the bills properly, and all indications so far suggest that the big-name subscription models are not charging enough to cover things like the massive power bills they certainly have.

To put it one way: AI using the same energy as a lot of homes isn't an issue to me because of the impact of that energy use; it's an issue because it isn't making the money those homes make to pay their power bills.
 
That seems like a case of a problem that belongs to the people who want it to be a viable product, though?

Also, I am curious whether it's actually power bills that make up most of the operating cost, or whether it's actually the rent/capital expense of the compute hardware.
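
Rough sketch of the comparison I mean, just to show the shape of it. Every number below is a placeholder I made up for illustration, not a real figure:

```python
# Toy comparison: does assumed subscription revenue cover the assumed power cost,
# or is the amortized hardware cost the bigger line item?
# ALL numbers below are made-up placeholders, not measured data.

WH_PER_QUERY = 0.5             # assumed energy per query, watt-hours (placeholder)
QUERIES_PER_USER_MONTH = 1500  # assumed heavy-subscriber usage (placeholder)
USD_PER_KWH = 0.08             # assumed industrial electricity price (placeholder)

GPU_COST_USD = 30000           # assumed accelerator price (placeholder)
GPU_LIFETIME_MONTHS = 36       # assumed depreciation window (placeholder)
USERS_PER_GPU = 50             # assumed sharing/concurrency factor (placeholder)

SUBSCRIPTION_USD = 20.0        # assumed monthly subscription price (placeholder)

power_cost = WH_PER_QUERY * QUERIES_PER_USER_MONTH / 1000 * USD_PER_KWH
hardware_cost = GPU_COST_USD / GPU_LIFETIME_MONTHS / USERS_PER_GPU

print(f"assumed power cost per user/month:    ${power_cost:.2f}")
print(f"assumed hardware cost per user/month: ${hardware_cost:.2f}")
print(f"assumed revenue per user/month:       ${SUBSCRIPTION_USD:.2f}")
# Which line item dominates depends entirely on which placeholders you plug in,
# which is exactly the thing I'd want real numbers for.
```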
 
... I'm saying that, among many costs, the power usage numbers imply a big baseline that they need to pay for.
My main issue with generative AI is the mess going on with the groups that are spending a lot of money on the bubble.

There are many parts to that cost, and from my understanding they are being hidden under the much more massive costs of training new models, but those models don't seem to be doing enough to let these companies just keep getting money for training new models forever.
 
I find there's a huge disconnect in my thinking where, like, I am relatively pro-AI here but see private capital possibly throwing itself down a rathole on unsound AI enterprises as a "play stupid games" situation, whereas you and maybe some others who I perceive as clearly anti-AI are framing their exposure as a problem rather than something to point and laugh at.

This seems backwards and confuses me greatly.
 
Point and laugh?
No, I am worried that there are a lot of things being built up and terrible decisions being made in pursuit of a bubble that can't do what they want it to do.

I'm not amused by the way they are failing, I'm upset and concerned.
 
That just sounds like exactly what I expect from venture capital activity in general.
 
So how wasteful something is, is just your opinion?
People are using LLMs for all sorts of things.
That is how calling things wasteful usually works, TBF. Sometimes it's somebody we agree is the main stakeholder so they have extra authority to say something isn't giving value. But if I called megayachts wasteful (and I would) that's just me privileging my opinion over that of ultrarich boat buyers.
 
At least porn sites produce an actual product that isn't just generative AI slop. Though plenty are inundated with it.
That's my point. Cryptominers don't really have an actual product they're creating. Generative AI does.

People can decry it as slop all they want, but the fact that so many people actually use it means they do find it useful for their purposes, an actual product.

Slop has value if it's still eaten.
 
A lot of what LLMs are being used for right now are things that they either fundamentally cannot actually do (but can kinda-sorta approximate for a while), or things that they cannot do at the price they are being sold for.

But there are absolutely things that LLMs can do that people are using them for and are willing to pay what it actually costs to do. As long as you recognize that nothing an LLM does can be trusted, and you're using it for fun or as a quick pass that generates untrusted output you either fix up later or just accept as maybe-bad-but-better-than-nothing, they can be okay.

Unfortunately, things that LLMs do worse than the previous method are getting LLMs shoved into them in ways that remove the previous method, and there's nothing we can do about it: internet search gets worse and you can't build your own replacement, you can't get a Windows without LLM garbage built in, etc.
 
Okay, well, you see, AI Q&A may not always be factually accurate, but at least it will pat you on the head for asking such a deep and insightful question instead of sneering at you for not already knowing, or ignoring your question and answering a different one it just made up.
 
That is how calling things wasteful usually works, TBF. Sometimes it's somebody we agree is the main stakeholder so they have extra authority to say something isn't giving value. But if I called megayachts wasteful (and I would) that's just me privileging my opinion over that of ultrarich boat buyers.
Objectively, generative AI has an actual product. You have to be willfully ignorant to deny that. Especially when you said that your criterion for when something isn't wasteful is whether it has a product.
 
Yes, but so do megayacht builders. And dropping half a sentence isn't cricket. You cannot in good faith act as if this stopped at "product".
At least porn sites produce an actual product that isn't just generative AI slop.
 

So what's the argument here?

If I should take that quote as "it's a product, but I don't like it" then there's no point in arguing that it actually is a product. If it's "it's not a product because I don't like it", one doesn't necessarily need to talk about the not liking it part to argue that it is a product. What else remains?

Not that this whole "is or isn't a product" thing makes sense to me anyway. "Not producing a product" isn't really what makes crypto mining a problem, after all.

-Morgan.
 
The argument for me is that @ninjasaid13 is making unsound arguments against the post by @ExistentialBread.

I don't need to (and don't) agree that genAI has no useful output for that. Or that the post said anything substantive, even.
 
That's my point. Cryptominers don't really have an actual product they're creating. Generative AI does.

Well, slop has value if it's eaten, but most AI slop isn't! The classic method of using AI to generate slop involves not accepting the first generated image, but either generating a whole sequence of images and cherrypicking the best ones or trying to "prompt engineer" your way to get closer to the specific ones you want. This is a little less common with AI text because it's something everyone thinks they have the skill to directly edit, even if they don't, but it does happen.

So actually you argued against yourself, well done.

But more than that, there's a pretty easy definition of wasteful, an objective definition of wasteful, that you've chosen to overlook, one directly related to the power argument. Specifically, it's wasteful because it's causing older, less efficient, more polluting power plants to be either kept online or brought back online to meet demand generated solely by this source. Plants that, prior to the additional demand, had either commercially failed or were considered surplus to requirements, that are objectively more costly to run, that are objectively worse for their surroundings.

The AI guys freely admit they are the reason why we're not going to meet the carbon targets, but say it's okay because the AI will fix it, something that LLMs have demonstrated no capacity to do.
 
Yes, but so do megayacht builders. And dropping half a sentence isn't cricket. You cannot in good faith act as if this stopped at "product".

How many megayacht builders have millions of people using their product monthly in pretty much everything?
I would call technology like ChatGPT a general-purpose technology.
 
Well, slop has value if it's eaten, but most AI slop isn't! The classic method of using AI to generate slop involves not accepting the first generated image, but either generating a whole sequence of images and cherrypicking the best ones or trying to "prompt engineer" your way to get closer to the specific ones you want. This is a little less common with AI text because it's something everyone thinks they have the skill to directly edit, even if they don't, but it does happen.

So actually you argued against yourself, well done.
Disagree. The end result is you're still using the output of the AI, the product.

I, uh, actually don't know what you're trying to argue here. That because people re-generate an image and pick the best one, it's not a product? That the final output chosen does not have value to the person choosing it? That's a baffling line of thinking, frankly.

But more than that, there's a pretty easy definition of wasteful, an objective definition of wasteful, that you've chosen to overlook, one directly related to the power argument. Specifically, it's wasteful because it's causing older, less efficient, more polluting power plants to be either kept online or brought back online to meet demand generated solely by this source. Plants that, prior to the additional demand, had either commercially failed or were considered surplus to requirements, that are objectively more costly to run, that are objectively worse for their surroundings.

The AI guys freely admit they are the reason why we're not going to meet the carbon targets, but say it's okay because the AI will fix it, something that LLMs have demonstrated no capacity to do.
I haven't looked too deeply into this, but as with everything related to AI and electricity, I'm going to need a link.
 
Here you go!
Power Demand from Data Centers Keeping Coal-Fired Plants Online
www.powermag.com
 