AI Art

I'll ask you kindly to not attempt to put words in my mouth.
Shouldn't have put them in your own mouth:
Saying that artists shouldn't be expected to know about robots.txt and other technical anti-scraper solutions and as such should be absolved of not using them is also problematic.

Taken further, this is the same train of thought as "they cannot be blamed for breaking the law because they didn't know such a law exists".
You are literally comparing not knowing what to do if you don't want to get scraped (or not knowing scraping was a risk in the first place) with not knowing a law and violating it, and I am using the word "literally" literally. The hell do you mean, "shouldn't be absolved of not using anti-scraper solutions"? Since when did not using robots.txt become a sin one would need to be absolved from? This is victim blaming at its finest.
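(For anyone following along who has never touched it: robots.txt is nothing more than a plain-text file sitting at the root of a site that asks well-behaved crawlers to stay away. A minimal sketch might look like the following, where the site and the crawler names are only examples; scrapers that don't feel like honoring it can simply ignore it.)

# https://example-gallery.com/robots.txt (hypothetical site)
# Asks compliant crawlers to skip the whole site or just the gallery.
User-agent: GPTBot
Disallow: /

User-agent: *
Disallow: /gallery/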
 
Since when did not using robots.txt become a sin one would need to be absolved from? This is victim blaming at its finest.
You're getting hung up on a spontaneous word choice I made in paraphrasing your argument.

Also, 'sin'? You're certainly happy to use suggestive language yourself, eh?

You are literally comparing not knowing what to do if you don't want to get scraped (or not knowing scraping was a risk in the first place) with not knowing a law and violating it
No, I'm saying that it's the same type of argument chain with a given example of something that will not hold up in a legal dispute.
 
You're getting hung up on a spontaneous word choice I made in paraphrasing your argument.

Also, 'sin'? You're certainly happy to use suggestive language yourself, eh?
Hm? So just which argument of mine did you paraphrase in this post, which I will quote in full again:
Saying that artists shouldn't be expected to know about robots.txt and other technical anti-scraper solutions and as such should be absolved of not using them is also problematic.

Taken further, this is the same train of thought as "they cannot be blamed for breaking the law because they didn't know such a law exists".

The argument might not be as effective in a court of law as you seem to think it is.

Especially when combined with uploading their art to websites where doing so transfers rights and/or ownership. (And I'm explicitly not talking about third-party re-uploads.)
But I'm pretty sure we've had the Terms of Service debate before.
This was your answer to @E.I.G., in case you forgot.

No, I'm saying that it's the same type of argument chain with a given example of something that will not hold up in a legal dispute.
And I am saying that it's complete nonsense. No, an argument chain that tries to absolve the perpetrator from responsibility for things they did is obviously very much not the same as an argument chain that absolves the victim from responsibility for things done to them, and also why the hell do you expect the latter to be brought up in a legal dispute to begin with?
 
My first reply to you was a simple explanation of why having a publicly viewable gallery was in fact a calculated risk. They have some or all of their artwork visible to everyone in order to encourage those who like it to pay them to make more.
The proper precautions before this whole mess were things like signatures in the body of the artwork or watermarks, things that make it harder for others to repost the artwork and claim it as their own. Those were the reasonable precautions of the time, because that was the main risk: other people reposting their work and claiming it as theirs.
As you are saying now, Pandora's box is open, and it wasn't until after that opening that we knew the cause of the issue. The concern was not there before then because the threat was not.

Adding to this, when a human takes a work from a publicly-viewable gallery to falsely claim it as their own or incorporates it into their own work without permission (tracing, etc.), it's called plagiarism. It doesn't stop being that just because the artists 'failed to take appropriate precautions' or because the art thieves can now automate and launder the process. Yes, any time artists share their work in a public venue there is a non-zero risk of it being stolen, but it's still a shitty thing to do.

Really, I've found a large degree of overlap between the typical arguments of AI Art apologists and serial plagiarists. Ultimately, it comes down to an entitled belief that the thieves who stole the work are somehow more deserving than the artists who created it in the first place.
 
Adding to this, when a human takes a work from a publicly-viewable gallery to falsely claim it as their own or incorporates it into their own work without permission (tracing, etc.), it's called plagiarism. It doesn't stop being that just because the artists 'failed to take appropriate precautions' or because the art thieves can now automate and launder the process. Yes, any time artists share their work in a public venue there is a non-zero risk of it being stolen, but it's still a shitty thing to do.

Really, I've found a large degree of overlap between the typical arguments of AI Art apologists and serial plagiarists. Ultimately, it comes down to an entitled belief that the thieves who stole the work are somehow more deserving than the artists who created it in the first place.
The idea that "well, if you didn't want me to take it you shouldn't have made it possible to take" is a Libertarian one. That you and only you are responsible for your own prosperity and that includes just how willing you are to step on others to get what you want. Obviously, if you are stupid enough to even make it remotely possible for me to take what you have and profit from it, then you deserve it, because what did you think was going to happen?

This, of course, ignores how reality doesn't work that way, no matter what The Fountainhead would tell us.
 
It really just reads like warmed over leftovers from the Napster/Limewire/Pirate Bay-era, just pure "It's not stealing because I really, really want <thing> and don't want to pay for it, and really they should be thankful I'm even listening to/watching it at all".

We all made fun of the "You wouldn't download a car!" videos at the time but it doesn't change the fact that it was still, you know, stealing.
 
It really just reads like warmed over leftovers from the Napster/Limewire/Pirate Bay-era, just pure "It's not stealing because I really, really want <thing> and don't want to pay for it, and really they should be thankful I'm even listening to/watching it at all".

We all made fun of the "You wouldn't download a car!" videos at the time but it doesn't change the fact that it was still, you know, stealing.
The other reason we made fun of it is that all of us would, in fact, 100% download a car if we could.
 
More like taking a photo of the car and building a replica in your own garage.
 
I'll note for clarity's sake that a lot of people are hung up on how the art was acquired in the first place.

Like. Just as one example?

Midjourney absolutely stole artists' works. Like. In the most "Cry about it" way: backroom dealing with Tumblr staff / shareholders, working in a clause that all content hosted on the site retroactively has permission to be fed into its training pools / data sets unless it's opted out of, and in a way that leaves an enormous legal grey area if somebody's blog opts out but somebody who never did reposts / reblogs the art.

"Our training sets were ethically sourced" is an extremely poisoned well because some of those "ethically sourced" datasets are "We retroactively updated ToS on unrelated sites as part of a deal so it's not technically theft anymore".

This is the sort of thing hanging over the subject's head, and why a lot of responses are "Fuck you" without even giving it the time of day. We can safely say SV wouldn't do it, but imagine a bizarro-world SV where Squishy held some private talks behind the scenes and then announced tomorrow that, from this point forward, everything in User Fiction was now free fodder for ChatGPT, and what it would do to SV's writing community. This is a fear hanging over pretty much every major art site / host these days.
 
Hate to tell you this, but ChatGPT has absolutely eaten the creative writing section. It can't keep authors or titles straight, but they're in the database.
 
The way humans learn has little to do with how AIs learn.
I feel like the way diffusion learns is very similar to the way unconscious humans learn, but it can't go further. It 'remembers' things as a cluster of associations, not a literal jpeg file, which is more similar to humans than to a classical computer database, but unlike humans it cannot 'wake up' and apply rational conscious thought to whether those associations make sense.

For example, when I dream about being on vacation with my cat, I eventually wake up and go "while I do go on vacations and I do have a cat, I do not actually go on vacations with my cat". Diffusion, though, will simply eternally go "Firefossil is associated with cats and vacations, therefore Firefossil on a vacation with a cat makes sense" unless you apply conscious effort via prompts and modeling to disabuse it of the false association.
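A rough toy sketch of what I mean by "cluster of associations", nothing like an actual diffusion model, just counting which caption words show up together (the captions are made up for illustration):

# Toy illustration only: associations stored as co-occurrence counts,
# not as copies of the training examples themselves.
from collections import Counter
from itertools import combinations

training_captions = [
    "firefossil with a cat",
    "firefossil on vacation",
    "a cat sleeping at home",
]

assoc = Counter()
for caption in training_captions:
    for pair in combinations(sorted(set(caption.split())), 2):
        assoc[pair] += 1

# A purely associative system rates "firefossil on vacation with a cat"
# as plausible because "firefossil" is linked to both "cat" and "vacation",
# even though that exact combination never appears in the training data.
query = sorted({"firefossil", "cat", "vacation"})
score = sum(assoc[pair] for pair in combinations(query, 2))
print(score)  # non-zero, despite no such training caption existing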

Being able to create a machine that can 'dream' is impressive, but you are never going to get AGI without lucidity, and diffusion has yet to show a hint of that. Even the most advanced ChatGPT has yet to show much sign of object permanence, learning from mistakes, or higher logic. Only flaccid mimicry of people doing such, a P-zombie going through the motions, until an obvious mistake betrays its lack of comprehension.
 
I have a question: does anyone know what kind of prompts might allow these styles of armor to be recreated?
I bet they are both AI generated. Both are from YouTube screen captures.

 
I have a question: does anyone know what kind of prompts might allow these styles of armor to be recreated?
I bet they are both AI generated. Both are from YouTube screen captures.

The first picture looks photoshopped to me. Like, the head and neck proportions look definitely off, lol.

The second image is definitely AI. The armor should not be visible through the character's hair like that.

Why is there always a blurry background with these images? It looks so nauseating.
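If I had to guess at the prompts, something in this ballpark is probably what produces that look (a rough sketch using the diffusers library; the model choice and the prompt wording are my guesses, not the recipe for those exact images). The blurry background usually comes from "bokeh" / "shallow depth of field" style tokens and the portrait photography these models were trained on:

# Illustrative only: model name and prompt are guesses, not a known recipe.
import torch
from diffusers import StableDiffusionXLPipeline

pipe = StableDiffusionXLPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0",
    torch_dtype=torch.float16,
).to("cuda")

image = pipe(
    prompt=(
        "ornate fantasy plate armor, intricate engraved filigree, "
        "glowing accents, cinematic portrait, dramatic lighting, "
        "shallow depth of field, bokeh background"
    ),
    negative_prompt="blurry face, deformed hands, extra limbs, low quality",
).images[0]
image.save("armor_concept.png")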
 
The first picture looks photoshopped to me. Like, the head and neck proportions look definitely off, lol.

The second image is definitely AI. The armor should not be visible through the character's hair like that.

Why is there always a blurry background with these images? It looks so nauseating.

I did a reverse image search on the first one and at least at the time nothing came up. It's been a few months, so it's possible something new has come up since. I was hoping to find more pictures with that weird style of armor; I wanted to see if maybe an artist had created others. It is from the thumbnail of a YouTube video that I saw on my phone and took a screenshot of.

I will say that while I don't use Photoshop myself, I have used GIMP to fix AI-created artwork. It does not have to be one or the other.
Original:
Fixed with GIMP
 
It really just reads like warmed over leftovers from the Napster/Limewire/Pirate Bay-era, just pure "It's not stealing because I really, really want <thing> and don't want to pay for it, and really they should be thankful I'm even listening to/watching it at all".

We all made fun of the "You wouldn't download a car!" videos at the time but it doesn't change the fact that it was still, you know, stealing.

The core issue with IP violation is that it isn't stealing.

If someone made an exact copy of your car in the middle of the night but it was still there in the morning when you went to drive it, you wouldn't call the police to report a stolen car.

It's an entirely separate and newly written crime to violate IP.

Similarly, AI art doesn't really even violate those IP laws, so anti-AI people are left only able to demand new legislation because what they're opposed to at its core isn't even illegal.

That does sound interesting. I tried to find out more about it but couldn't find anything on their site with a five minute search. Do you have a link?

Though as regards insiders, I mean, Boeing's guy is probably going to need a new job soon ;)

Since you seem to be unaware, I'll just point this out:

Adobe's claim to have an AI Art generator that is 100% watertight under all existing and future law is completely true. They do.

This is because Adobe has legal rights to basically everything, since basically everyone uses Adobe products like Photoshop.

In other words, implementing the demands of the anti-AI crowd would mean pretty much nothing except an overwhelming corporate monopoly on art. Open source projects would be strangled and only Adobe and maybe a few other billion-dollar companies would control all the AI art.

Actual artists would be just as affected by AI art as they are now, if not more so, because I'm sure that with unlimited money Adobe could find a way to accelerate things in whatever the worst direction is.
 

Well, we have a major studio now replacing all the voice actors they legally can with AI. It also helps that this is for markets where the CEOs don't speak the language, so they can more easily ignore the problems with computer voices.

I would be careful in taking that at face value. Let's take a look at that statement:

" The French voice actor for Hitch has apparently been told by someone at Hasbro that dubs of the show are now being done with AI voices outside the USA. "

There is no confirmation, just allegations. Allegations from someone who was told by someone else. We aren't getting info directly from the source. This is "my friend's cousin told me that..." level of accusation.
 
An actor working on the series in question isn't 'friend of a friend' level information.

edit: also here is a translation of the interview. x.com
 
An actor working on the series in question isn't 'friend of a friend' level information.

An actor saying they were told by someone at Hasbro, however, is. The actor was not told "We are using AI"; the actor was told by someone working at Hasbro that Hasbro will be using AI going forward.

But let's say we can take all of this at face value. I notice people kinda skipped over the "Hasbro is in dire financial straits" part, and guess what's expensive? Voice actors. So effectively the decision to move to AI voice actors is because Hasbro can't keep paying for all the dubs.

Which changes the context of "corporate moves to adopt AI voice actors" quite significantly. It's no longer "corporate being cheap for the sake of cheap", it's "corporate using AI dubs to save money because they're in money trouble".
 
An actor saying they were told by someone at Hasbro, however, is. The actor was not told "We are using AI"; the actor was told by someone working at Hasbro that Hasbro will be using AI going forward.

But let's say we can take all of this at face value. I notice people kinda skipped over the "Hasbro is in dire financial straits" part, and guess what's expensive? Voice actors. So effectively the decision to move to AI voice actors is because Hasbro can't keep paying for all the dubs.

Which changes the context of "corporate moves to adopt AI voice actors" quite significantly. It's no longer "corporate being cheap for the sake of cheap", it's "corporate using AI dubs to save money because they're in money trouble".
Note that "money trouble" means they still made money, they judt missed their earnings target, iirc.
 
Note that "money trouble" means they still made money, they judt missed their earnings target, iirc.

A company can have money troubles even when it makes money. If the company can't keep making more money than it is spending, it will run into trouble. "Money trouble" does not mean "made no money", it means "our expenses are outpacing our income".
 
A company can have money troubles even when it makes money. If the company can't keep making more money than it is spending, it will run into trouble. "Money trouble" does not mean "made no money", it means "our expenses are outpacing our income".
When a company's expenses outpace its income, it's called "losing money", not making it. When people say a company is "making money", they usually, if not always, mean the opposite: its income is outpacing its expenses.
 