
/ic/ - Artwork/Critique



File: 390 KB, 1202x1212, 1656999831397.png [View same] [iqdb] [saucenao] [google]
6473570 No.6473570 [Reply] [Original]

The team behind the Copilot litigation files a class-action lawsuit against Stability AI, DeviantArt, and Midjourney. Complaint here.
https://stablediffusionlitigation.com/

>> No.6473584

>Collage tool
ABSOLUTELY MOGGED

>> No.6473585

A minor setback at best.

>> No.6473587
File: 270 KB, 739x474, AIArtThief4.png [View same] [iqdb] [saucenao] [google]
6473587

>>6473570

>> No.6473606

>>6473570
Fuck yeah. Souless pajeets and other non-humans on suicide watch.

>> No.6473614
File: 92 KB, 447x628, FmTd_fXXoAoIaoW.png [View same] [iqdb] [saucenao] [google]
6473614

Don't worry, Midjourney's legal team is here to save the ai future

>> No.6473615

>>6473614
They're scared as fuck lmao.

>> No.6473631

>>6473570
Reddit is burning with seethe and rage.
It's real funny how every side casts itself as the protector of mankind against corporate dominance.
AI fags have no idea of how big the dick corpos are going to stick in them is the moment their jobs become automated.

>> No.6473636

2023 is looking good!!

>> No.6473642

>>6473570
They are fucked.

How do I throw money at this shit?

>> No.6473655
File: 107 KB, 1080x654, FmcwVTXXoAE6y2N.jpg [View same] [iqdb] [saucenao] [google]
6473655

>>6473631
>nooooo stop oppressing machines you stupid h*mans
Reddit is truly a magical place

>> No.6473662

>>6473655
ai niggers really are soulless lmao

>> No.6473663
File: 53 KB, 896x1001, 428A4C61-F335-4231-AEF3-35A89BFE997B.jpg [View same] [iqdb] [saucenao] [google]
6473663

>>6473614
>We provide the service as is, and we make no promises or guarantees about it.
Is that legally enough to cover their asses even though they put together the dataset of images that didn’t belong to them?
That’s like someone kidnapping a baby and putting it in a hydraulic press, and blaming someone else for pushing the button that transforms said baby into a unique work of art that never existed before.

>> No.6473670

>>6473655
They saw the Reddit avatar was a little robot and said to themselves, “he’s literally me.”

>> No.6473671

>>6473655
post-humanist dregs; these people have no sense of values or happiness

>> No.6473673

>>6473655
What redditurds don't get is that this doesn't hold progress back at all. All the research is already done. What happened is that some greedy pajeets came afterwards to abuse the tech to reap all the profit.

>> No.6473675

>>6473642
>How do I throw money at this shit?
Post the numbers on the bottom of your personal check and I can transfer the money for you? How much do you want me to give them?

>> No.6473676

>>6473655
he hates humanity and GOD. not sure where that leaves him.

>> No.6473678
File: 202 KB, 537x550, 1662919936007.jpg [View same] [iqdb] [saucenao] [google]
6473678

>>6473570
I hope something comes out of this, but I doubt it

>> No.6473680

>>6473673
Actually it's not so different from piracy. There's nothing wrong with torrenting itself. It's just abused by pirates.

>> No.6473684

>>6473673
It's just asking for basic regulation, like with any other innovation/discovery, anyone seething is just a pajeet trying to make a few more bucks on redbubble/fiverr before they are forced back to burger wagies.

>> No.6473687

>>6473675
I don't trust you Raj.

>> No.6473690

>>6473680
>There's nothing wrong with torrenting itself.
It's a good way to download the new, improved models for Stable Diffusion.

>> No.6473692

>>6473684
this

>> No.6473696

>>6473687
>I don't trust you Raj.
How did you know my name, sir?

>> No.6473700

>>6473696
By your smell.

>> No.6473703

I'm curious, how is the legality situation going? Did any government show concern? Are there any plans to create laws?

This is interesting. Unlike pirated content (music, books, movies, etc.), where the content itself is distributed and they can target the distribution vehicles (copyright-striking videos, forcing servers to delete the files, and so on), AI is a tool. The content generated by it is not always easy to spot, and it is very random.

Can the tool itself be accused of copyright infringement? Even if the tool was, let's say, open source? Well, I guess you have to prove you have rights to the material used to train the models...

>> No.6473707

>>6473700
Thank you.

>> No.6473713

>>6473703
At best what will happen is that fair use laws will change for the worse.

big corpo will be the only winner here.

>> No.6473714

>>6473703
You are a year behind on the discussion

>> No.6473733

>>6473703
>Is it there any plans to create laws?
Perhaps but their argument essentially boils down to SD having violated existing copyright laws in the process of training their algorithm because it stores a compressed version of a whole bunch of copyrighted images without permission and without any compensation to the copyright holders. And since I know some retard is gonna reply with "a-actually it does not store pixels checkmate lmao" to this post, PLEASE EXPLAIN YOUR DEFINITION OF THE LATENT SPACE THEN. Because strangely, no matter where you research you're gonna find definitions that describe it as "a form of data compression".

The problem is that since the latent space is not something that can be easily "visualized" by people, they think it's some sort of "machine brain" or magic (yes I actually saw a twitterfag call it "magic" today) when in reality, it's just a mathematical way of representing the training data in a very compressed form and all the "new" images created with SD are simply interpolations from the latent space.
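To make the interpolation point concrete, here is a minimal sketch (not SD's actual code) of what walking through a latent space means mechanically; the latent shape is hypothetical and the two latents are random stand-ins for encoded images:

import numpy as np

def lerp(z_a, z_b, t):
    # Linear interpolation between two latent vectors, t in [0, 1].
    return (1.0 - t) * z_a + t * z_b

rng = np.random.default_rng(0)
z_a = rng.standard_normal(4 * 64 * 64)  # stand-in for the latent code of image A
z_b = rng.standard_normal(4 * 64 * 64)  # stand-in for the latent code of image B
for t in np.linspace(0.0, 1.0, 5):
    z_mid = lerp(z_a, z_b, t)           # an "in-between" point in latent space
    print(t, z_mid[:3])                 # a real pipeline would decode z_mid into pixels here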

>Can the tool itself be accused of copyright infringement?
No but the people who made the tool can.

>Even if the tool was, let's say, open source?
Something being open source does not mean that it's above the law

>> No.6473743
File: 70 KB, 600x908, Two-Buttons.jpg [View same] [iqdb] [saucenao] [google]
6473743

>NOOO AI DOESN'T COPY IMAGES YOU DONT UNDERSTAND

if it doesn't copy images then just remove artists work from the data set

>B-BUT FAIR USE!!!111!!!1!1

>> No.6473745

corporations are salivating at the chance to automate their media products more. this lawsuit will get kneecapped somehow.

>> No.6473748

>>6473570
>collage tool
Dead on arrival.

>>6473703
>Did any government show concern?
The American military is very involved with AI development. They are in no way interested in shutting down or doing anything that slows down the progress of AI research. Same should be true for any big military powers like the UK, Israel, China and Russia.

>> No.6473751
File: 360 KB, 989x806, 1670901526519902.png [View same] [iqdb] [saucenao] [google]
6473751

>>6473748
?

>> No.6473757
File: 568 KB, 1080x1698, Screenshot_20230114-225949.png [View same] [iqdb] [saucenao] [google]
6473757

>>6473748

>> No.6473759

pajeets on reroll watch right now

>> No.6473763

>>6473751
>>6473757
lol good luck enforcing that

>> No.6473764
File: 597 KB, 750x1153, 1672520865366463.png [View same] [iqdb] [saucenao] [google]
6473764

>>6473748
?

>> No.6473766

people on /g/ are already mad coping and seething

the more you read the more obvious it becomes these people have mental illness or smth

>> No.6473768

>>6473745
Requiring the artist's permission to train AI on their work would probably be enough of a win for the big corps.
They have decades of their own art they can train on and enough cash they can just buy whatever art they need if they need to train more styles.
It's the little guys that will get fucked. And the big guys are only going to love that.
Though whether it's a good or bad thing overall depends on your perspective, I guess.
For you turd world freelancers and your cottage industry of selling furry porn to internet weirdoes it would probably be a win.
For small companies making indie games and what not, not so much.

>> No.6473774
File: 432 KB, 607x1079, 1653389300577.png [View same] [iqdb] [saucenao] [google]
6473774

Your allies.

>> No.6473780

>>6473768
>For small companies making indie games and what not, not so much.
They could always just hire artists, you know

>> No.6473784

>>6473780
While competing with big companies who don't need to do that.

>> No.6473786

>>6473774
I don't give a shit about politics, it's humans against the machines. And Ortiz is a damn good artist.

>> No.6473790

>>6473774
>implying people with different views can't work towards a common goal

>> No.6473792

>>6473614
They're so eager to throw their users under the bus, and the users are so cucked they don't give a shit.

>> No.6473793

>>6473784
Since when do indies really "compete" with big companies? For them the biggest problem will always be the market being flooded with shit that prevents their stuff from being seen.

>> No.6473797

Good thing I still got the uncensored SD files for when I upgrade my computer

>> No.6473805
File: 123 KB, 1160x770, dumb retard.png [View same] [iqdb] [saucenao] [google]
6473805

>>6473774

>> No.6473813

>>6473570
aibros im literally shidding and seething and coping rn
maybe i need to generate some more underage children

>> No.6473820

I think the artists have gone about this in the whiniest, stupidest way possible (especially the realistic pokemon loser trying to expand copyright law)
But it would be really funny if the tech guys ate shite. Unfortunately it'd probably just mean some mega corp like microsoft buys up all their stuff and carries on as was

>> No.6473825
File: 65 KB, 185x298, ButtErick.png [View same] [iqdb] [saucenao] [google]
6473825

What is stopping me or an AI from saving images and bashing them together to make a new image? Assuming that is what an AI does

>> No.6473829
File: 677 KB, 959x1900, HAMMER TIME.jpg [View same] [iqdb] [saucenao] [google]
6473829

>>6473774
Somebody post that
leftist Jew woman who's SD's propagandist in-chief.

>> No.6473830

>>6473825
if you work for a studio and you use a copyrighted image in your photobashing, your studio can be sued, that is why artists purchase copyright free reference packs

>> No.6473831

>>6473825
Nothing is stopping you, everything will stop you from profiting off that tho

>> No.6473833

>>6473825
I don't see why it wouldn't be covered by fair use as it is transformative enough and the ai is just a tool like photoshop.
Artists thinking they're going to get special treatment when they barely fund the government like massive tech companies do is laughable.

>> No.6473847

>>6473768
>you shouldn't care about the monetary interests of small time artists, but you are supposed to care about the monetary interests of indie game devs
I seem to see this train of thought a lot - "just think of all the indie developers who can generate their own assets for free, now they can get their game to market and have a higher profit potential, how wonderful!" Is it a symptom of a lot of the people jerking it to image generators being involved in tech fields, with dreams of making their own game?

>> No.6473850
File: 1.24 MB, 725x2850, 1672788384640545.png [View same] [iqdb] [saucenao] [google]
6473850

>>6473833
Photoshopping one copyrighted image over the other with 50% opacity is not "transformative" either. SD does essentially the same, except with a much larger dataset of images and a different interpolation function. It's not creating anything new, it's creating in-betweens (and in some cases lossy copies) of existing data. Also, fair use exceptions apply to humans, not machines.
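For reference, the "50% opacity" operation being compared to here is just per-pixel linear interpolation; a minimal Pillow sketch with placeholder file names:

from PIL import Image

a = Image.open("image_a.png").convert("RGB")                 # placeholder file names
b = Image.open("image_b.png").convert("RGB").resize(a.size)
blend = Image.blend(a, b, alpha=0.5)                         # (1 - alpha) * a + alpha * b per pixel
blend.save("blend_50.png")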

>> No.6473854

>>6473850
So the AI looks for patterns, and then when you prompt it, it recalls the patterns that match your prompt and creates the image?

>> No.6473860

>>6473850
Photoshop a copyrighted image with 50% opacity then show me the results.
Such a disingenuous "argument".

>> No.6473867

>>6473570
Imagine misunderstanding a new technology so fundamentally and then trying to litigate it. Literal qtard schizo tier.

>> No.6473869
File: 47 KB, 680x768, 1673681797101752.jpg [View same] [iqdb] [saucenao] [google]
6473869

Why are the Pajeets such big proponents of AI art?

>> No.6473878
File: 1.18 MB, 1470x3349, 1673735776978068.png [View same] [iqdb] [saucenao] [google]
6473878

For my Art bros

>> No.6473886

>>6473878
How can AI merge loli with big tiddies with greg's tyle? I need the original images AI is using to create these for me to use as references, does anyone where the rabbit hole for these images are?

>> No.6473887
File: 73 KB, 302x500, mikeymouse.jpg [View same] [iqdb] [saucenao] [google]
6473887

>>6473854
Yea basically, the patterns are clusters inside the latent space. The way these diffusion models work is they take some random noise, then try to "remove" that noise step by step in order to "reconstruct" an image and the resulting images will be interpolations from the latent space (which was constructed from the LAION dataset during the training process). Typing prompts basically points the algorithm to certain areas in the latent space (so if you type "cat" it will try to interpolate something from the "cat" cluster). Check out this article which explains latent space interpolation in more detail and shows how it differs from "regular" interpolation in the pixel space:
>https://hackernoon.com/latent-space-visualization-deep-learning-bits-2-bd09a46920df
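For anyone who wants the step-by-step denoising loop above in concrete form, a toy sketch (not SD's real sampler; predict_noise is a hypothetical stand-in for the trained network, which in SD would also be conditioned on the prompt):

import numpy as np

T = 50
betas = np.linspace(1e-4, 0.02, T)        # noise schedule
alphas = 1.0 - betas
alpha_bars = np.cumprod(alphas)

def predict_noise(x_t, t):
    # Stand-in for the learned noise predictor eps_theta(x_t, t).
    return x_t * np.sqrt(1.0 - alpha_bars[t])

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 64, 64))      # start from pure latent noise
for t in reversed(range(T)):
    eps = predict_noise(x, t)
    # DDPM-style update: remove the predicted noise, rescale, and add a bit of
    # fresh noise on every step except the last.
    x = (x - betas[t] / np.sqrt(1.0 - alpha_bars[t]) * eps) / np.sqrt(alphas[t])
    if t > 0:
        x = x + np.sqrt(betas[t]) * rng.standard_normal(x.shape)
print(x.shape)                            # a real pipeline would decode this latent into an image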

>>6473860
Ok check out this OC I made

>> No.6473889

>>6473878
The base argument is nonsense

>> No.6473899

>>6473570
Based. Chuds seething

>> No.6473901

>>6473825
As evidenced by the very OP itself, America is a sue-happy country and you can sue anyone for offending you. So a company using illegally obtained anything is grounds for a lawsuit.

>> No.6473906

>>6473774
And they are all more valuable than you
How does that make you feel?

>> No.6473920
File: 57 KB, 647x778, ddf.png [View same] [iqdb] [saucenao] [google]
6473920

>>6473570
WHAT? but AIsis you told me that artfags were too lazy and poor to start a lawsuite against AI, what are we going to do now???? we got too cocky aisis

>> No.6473935

>>6473901
>So a company using illegally obtained anything is grounds for lawsuit.
This is the dumbest bugman cope I have seen so far.
The guy in the OP is only co-counsel for the lawsuit; here's the website for the actual law firm.
https://www.saverilawfirm.com/about-the-firm/
A ten-year-old firm that won billions in settlements does not take on a case for no reason.

>> No.6473942
File: 499 KB, 512x512, 18365210142.png [View same] [iqdb] [saucenao] [google]
6473942

>>6473887
that's not how it works though

>> No.6473944

>>6473935
>10 year firm
>$4.5 billion dollars acquired
Shit, where's the Loomis version of law? I'm going to be a lawyer.

>> No.6473945

>>6473942
Explain how I'm wrong then

>> No.6473947
File: 918 KB, 640x832, 4098907353.png [View same] [iqdb] [saucenao] [google]
6473947

>>6473945
well the simplest thing you are wrong about is that it's not interpolation, it's called diffusion for a reason

>> No.6473954

>>6473947
Ok, so are you claiming the prompted images are NOT based on interpolations from the latent space? How does SD create them then?

>> No.6473956

>>6473906
What are your pronouns?

>> No.6473957

>>6473614
Haha, that reads like something from the Onion. Who the fuck wrote that???
They going to come to your house personally with a bat and a couple Albanian mafiosos too?

>> No.6473960

>>6473944
>loomis of law
that would be literal law school

>> No.6473961

>>6473655
>Screw human
>Screw god
The absolute level of bugman

>> No.6473962

>>6473954
https://jalammar.github.io/illustrated-stable-diffusion/
This is an ok explanation

>> No.6473963

>>6473957
Yeah when I first saw that image I thought it was fake, then looked it up. Retarded piece of copy.

>> No.6474021

>>6473962
From an outsider’s perspective, your link is just proving his point. Interesting how you try to twist the same set of facts using biased language to push your point

>> No.6474025

>>6473774
I find it hilarious how Pajeets try to turn this shit into /pol/bait

>> No.6474036

Honestly im not against ai but seeing these fuckers from /g/ try to date chatbots and nipons making 3d ai toddlers truly disgusts me.

fucking degens

>> No.6474053
File: 212 KB, 474x491, basado.jpg [View same] [iqdb] [saucenao] [google]
6474053

>>6473684
I'm more afraid of what private corps and governments can do with this tech. Shit, they're already doing it and won't stop, ofc, but at the very least some regulation is needed so that not just anyone with an inkling of reading comprehension can do some real harm with it. There's real degens out there. Photographic evidence, for instance, will be history in legal terms if the machine is good enough, which it will be. The arts are the least of my concerns when it comes to regulating shit like this, but it's an important start.

>> No.6474070
File: 30 KB, 640x336, al6f1lryi5m81.jpg [View same] [iqdb] [saucenao] [google]
6474070

>>6473655
>If you don't fall for this tech scam then you just hate progress!
Every time.
https://www.youtube.com/watch?v=wUlE02RU1AE

>> No.6474072

>>6474036
Most artists aren't against AI either. Just AI that fucking steals your data.

>> No.6474079

People need to remember that this is just a reinforcement of copyright & fair use law practices that already exist. Video game companies 40 years ago could have legal action taken (or threatened) against them for having copies of visual assets taken from established franchises, and music generators have had legal action taken against them for years every time they used non-public-domain data without permission.

>> No.6474083

>>6474021
>try to twist the same set of facts using biased language to push your point
The bugmen on /g/ do this all the time. They act like, and probably are, the Antifa goons on /pol/.
>>6474036
They also remind me of the pedo fags on a dead *chan I can't name here.

>> No.6474094

>>6473963
It's a combination of trying to seem "hip" and thumbing their nose at people who question the legality by giving a flippant "nothing gonna happen" type of statement.

>> No.6474111

>>6473774
who tf cares about allies. The allies of the AI shills are redditors, neckbeards, and street shitters.

>> No.6474126

Why the automatic sage? The thread isn’t bumping. Is this because this is a 100% one sided win for artists?

>> No.6474148

>>6473570
AIcels on suicide watch

>> No.6474189

>>6474126
Mods and jannies are fags that are paid off to let the spam happen for the past half a year.

>> No.6474207
File: 645 KB, 640x832, 1318575434.png [View same] [iqdb] [saucenao] [google]
6474207

>>6474021
Depends on what his point is I guess, but if the point is it's just doing interpolation then no, it's not and the info in the link shows what it's actually doing.

>> No.6474210

>>6474021
>Durr but total retards like me don't understand so basically your wrong

>> No.6474212

>>6474207
What’s interpolation?

>> No.6474213

Poojeets on suicide watch

>> No.6474214
File: 421 KB, 3040x720, clownworld.jpg [View same] [iqdb] [saucenao] [google]
6474214

>>6473631
it's very satisfying seeing them kvetch

>> No.6474221
File: 740 KB, 640x832, 336837942.png [View same] [iqdb] [saucenao] [google]
6474221

>>6474212
google it

>> No.6474227

>>6473774
How will punitive damages be handled if they win? I still hope they win but like, it's kind of shitty if those artists get millions just because they're the plaintiffs.
Reminds me of that lady that became a millionaire because she got burned by a McDonald's coffee. Something about the punitive aspects of the American legal system really bothers me.

>> No.6474259

>>6474221
I did. It doesn’t support your position, which is why I’m asking you for (your) definition

>> No.6474260

>>6474227
That lady got 3rd degree burns and had to have skin grafts

>> No.6474263
File: 603 KB, 1092x2048, 5BC790A9-E12D-4879-B722-ABC10D97F0A6.jpg [View same] [iqdb] [saucenao] [google]
6474263

>>6473655
As if progress is the be-all, end-all of the human experience. This guy is missing the forest for the trees.

>> No.6474264

>>6474227
The McDonald's coffee incident is unironically a good lawsuit; the only reason people see it as frivolous is precisely because there was a big demoralization campaign against it by the media, likely sponsored by McDonald's.

>> No.6474377

>>6473947
>it's not interpolation, it's called diffusion for a reason
Holy fuck, AI tards are the dumbest fucking faggots on the planet.

>> No.6474385

>>6473655
Average HFY hater showing his true colors

>> No.6474389

>>6473655
Non human

>> No.6474391

>>6473655
>5 upvotes
I want to start scamming people, people this retarded deserve to be exploited and there are so many of them

>> No.6474401

>>6473655
>5 updoots
Redditors don't deserve human rights

>> No.6474498

So what happens if they're charged?
Does it become illegal to use the tools?

What about ruDALLE?

>> No.6474531

>>6474263
Once progress is made, you can’t go back.
Stable Diffusion is open source and released to the masses. The genie is out of the pandora’s box.

>> No.6474535

>>6474531
You can slow progress by making it unprofitable to continue

>> No.6474539

>>6473570
>>6473587
>stable diffusion contains millions and possibly billions of copyright images
The tool or the individual model?
Has the existence of these copyrighted images been demonstrated, in a non-transformative form, in the released models?

This shit will go nowhere if they speak out of their ass. You have to have an understanding of the technology before you do lawsuits like this.

>> No.6474548

Implying I care about what an ameriburger court says.

>> No.6474553
File: 894 KB, 1629x633, AI Artist.png [View same] [iqdb] [saucenao] [google]
6474553

>> No.6474557

Klaus Schwab already revealed that the next step in his masterplan is a large scale cyber attack that will make Covid 19 look like a joke. It's over (for AIcels and digicucks).

>> No.6474561
File: 696 KB, 1398x1430, Screenshot_20230115_131510_Twitter.jpg [View same] [iqdb] [saucenao] [google]
6474561

lmao is this your legal dream team?

>> No.6474564

>>6474557
That old fart forgot Starlink exists.

>> No.6474633

>>6474539
Tell it to the judge Raj, literally LOL

>> No.6474648

>>6474561
>data can only be represented in one way

>> No.6474659

>>6474531
algorithmic disgorgement + ai to detect if it was made with open source model with DMCA suits after

Its over sirs. The genie need not be put back in bottle. It will simply die a slow death and be flushed down loo!

>> No.6474662

>>6474659
>trying to decipher Anything_Proto245v2.0_OrangeMix merges
Good luck. The models are so obfuscated at this point that it would be literally impossible.

>> No.6474673

>>6474662
just as impossible as making you shit in a toilet huh

>> No.6474682

>>6474673
Are you indian or something?

>> No.6474689

If this lawsuit succeeds, what will happen to my robot waifus in 30 years? Will they be postponed for another 30 years? I'm tired of waiting

>> No.6474701

>>6474682
>im not the indian, y-y-you are
thats all you pajeets got at this point, and it gladdens me

>> No.6474712

>thread discussing the lawsuit gets autosaged
>bait thread gets to stay up
>as soon as aicels in bait thread are asked to back up their claims, it immediately gets purged
Kek

>> No.6474730

>>6474701
>no y-you are the indian
Look at this poo in the loo, so sad

>> No.6474744
File: 193 KB, 1790x424, file.png [View same] [iqdb] [saucenao] [google]
6474744

This response is so awful in many ways. For example, pic related is just bad faith. The AI bros' general consensus is that it's okay to use artworks from artists without permission for their models. Even if the artist were on their knees, they wouldn't give a shit.

>> No.6474748

>>6474744
I mean it literally starts with the "it does not save pixels" strawman argument, this tells you everything you need to know about their deboooonking tactics

Kinda funny how angry they are getting about it, tho

>> No.6474774

>>6473825
If you use any of my sonic inflation porn in ur filthy training set then I will be coming for your ass with a lawsuit.

>> No.6474783

>>6474730
Cope however you want Ahmed, we're winning

>> No.6474796

>>6474783
>artists funding scam lawsuit kickstarters
>the arguments are all nonsensical, showing no knowledge of the technology and instead parrots the same Twitter arguments
>we are winning
This lawsuit stinks, literally. Good luck, pajeet anon.

>> No.6474803

I trust Joe Biden's Department of Art to micromanage my life. We are winning !

>> No.6474826

>>6474796
Ok anon, here's your chance to shine:
1. What are they getting wrong about the technology?
2. How would you define the latent space?
None of your aidditor buddies seem to be able to answer these simple questions, so maybe you can do it?

>> No.6474832
File: 42 KB, 694x339, file.png [View same] [iqdb] [saucenao] [google]
6474832

As much as i hate seeing AI art there is nothing wrong with StableDiffusion. Should've aimed their guns at pic related though

>> No.6474879
File: 321 KB, 1284x1472, a5ViiX3SYuJD.jpg [View same] [iqdb] [saucenao] [google]
6474879

>>6474826
>1
Calling it a collage tool when no one has found any evidence of exact replication of any of the training images. Exact replication has not been established even in cases of extreme overfitting (the Bloodborne image, etc.).
>2.
What you faggots are misunderstanding about latent space is that, as pic related explains, the AI is not stiching together pieces of encoded training images, instead, it encodes the distribution of pixels in conjuction with CLIP language tokens such as the position of features such as eyes, faces, etc., to create a coherent image. The AI learns the average position of pixels relating to langauge tokens (“these many pictures of faces have the average pixels relating to face roughly in this range) and using random noise it can create basically infinite variations of the same prompt, since what it hallucinates from the set of pixels will be different each time the noise seed changes.
It would be physically impossible to fit the number of images it was trained on (tens of billions of images), plus the language model as well, into the 4 GB files the SD models were released as, even with the most advanced image compression tools on the planet.
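As a rough order-of-magnitude check on that size point (numbers are approximate: LAION-5B is commonly cited as about 5 billion image-text pairs, and a released SD checkpoint is on the order of 4 GB):

model_bytes = 4 * 1024**3        # ~4 GiB checkpoint
num_images = 5_000_000_000       # ~5 billion training pairs (approximate)
print(model_bytes / num_images)  # ~0.86 bytes available per training image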

>> No.6474885

>>6474879
>Calling it a collage tool when no one has found any evidence of exact replication of any of the training images
And nowhere in the lawsuit does it say that the images are "exactly replicated", they are specifically talking about latent space representation.
>the AI is not stiching together pieces of encoded training images
Again, nobody is saying that; this is a strawman argument you aidditors keep using, including in the first paragraph of your debooonking article. What I'm asking is: WHAT data does the latent space actually represent? No matter how much you try to weasel your way out of this, you cannot change the answer.

>> No.6474896

>>6474885
Literally OP’s pic related calls Stable Diffusion a “collage tool”, you fucking faggot.
How am I strawmanning when all you anti-ai faggots parrot the same argument: “AI is just glorified photobashing”. You don’t have any idea of what you are talking about, you pseud.
>what data the latent space represents
Represents what the AI has learned from the training, in pure statistical mathematical form (like any fucking ML model ever created), averages, distribution, basically an analysis of the training data, not a replica of the data. This completely destroys your faggot ass argument of “latent space is just images compressed and interpolated through”.
Cope harder.

>> No.6474917

>>6474896
>basically an analysis of the training data, not a replica of the data.
It's not supposed to be a "replica", it's supposed to represent the SAME data in a compressed and mathematical form. Your strawman argument is that you keep repeating over and over again that "it doesn't store pixels" or "it can't produce exact copies", even though nobody ever made either of those claims and none of that changes the fact that latent space representation is still a form of REPRESENTATION

>> No.6474948

Seriously, fuck these tranny janny faggot fuckers for making this thread autosage, fucking NGMI AI cocksucking perma wage slave faggots.

>> No.6474952

>>6474917
>it’s supposed to represent the SAME data in a compressed and mathematical form
You don’t know an ounce of ML faggot, what is stored is the analysis, the averages, statistical data, NOT EXACT positions of pixels. This is where you sre fundamentally wrong.
>nobody ever made either of those claims
You retard. Read the thread >>6473887. ALL and I do mean all of you faggots make this exact argument and countless /ic/ threads talk about this single statement: “AI is just glorified interpolated photobashing”. You are coping hard now by obfuscating.
>it still is a form of representation
Yeah, a highly abstracted, statistical analysis of the training data. Every ML model is like this. You are trying really hard to shoehorn your “it’s just compressed data bro” argument but it’s not working.
If the latent space is a representation of highly PROCESSED data, then the data taken from the training data is new, transformative, by definition.

>> No.6474953

>>6474917
>it's supposed to represent the SAME data
Not that anon but this is very false. If this were true, then the outputs from programs that implement diffusion would not be any different from the input.

Diffusion models are not just advanced compression algorithms. They are training a neural network to learn (yes learn in the sense that humans learn) what an image looks like.
The input pixel data is not used to create the image. It is more analogous to the processes of the human brain than to compression algorithms. If you want to say that the mathematical representation counts as a copyright violation, then you will have to put forward an argument that makes humans exempt from storing images as a representation, because humans store every image they see as a biochemical representation in their brains.
>inb4 "but muh humans are different and special"
It is very likely that the functions of the human brain can be 1:1 mimicked using purely mathematical representations in the future. There is nothing special about the biochemical processes that govern the human brain which make it exempt from the laws of the universe.

Please learn how the tech works properly before you start making arguments against it.
Artists spewing arguments from emotion without any of the correct facts or use of logic is cancer.
For anyone who wishes to know how diffusion actually works and not this blatant misunderstanding by this midwit "lawyer": https://arxiv.org/pdf/1503.03585.pdf
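For anyone skimming the linked paper, here is a small sketch of the Gaussian forward (noising) process that the learned reverse process is trained to undo; the schedule and shapes are illustrative only:

import numpy as np

T = 1000
betas = np.linspace(1e-4, 0.02, T)
alpha_bars = np.cumprod(1.0 - betas)
rng = np.random.default_rng(0)
x0 = rng.uniform(0.0, 1.0, (3, 64, 64))   # stand-in for one training image

def noised(x0, t):
    # Sample x_t given x_0: sqrt(alpha_bar_t) * x_0 + sqrt(1 - alpha_bar_t) * noise.
    eps = rng.standard_normal(x0.shape)
    return np.sqrt(alpha_bars[t]) * x0 + np.sqrt(1.0 - alpha_bars[t]) * eps

print(noised(x0, 10).std(), noised(x0, T - 1).std())  # barely noised vs. nearly pure noise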

>> No.6474973
File: 108 KB, 500x280, jddcsi5pgk431.jpg [View same] [iqdb] [saucenao] [google]
6474973

>>6474796
*Yawn*

>> No.6474978
File: 142 KB, 502x372, 1970v1.jpg [View same] [iqdb] [saucenao] [google]
6474978

>>6474953
>They are training a neural network to learn (yes learn in the sense that humans learn)
HE SAID IT!
HE SAID THE SECRET WORDS

>> No.6474982

>>6474952
>>6474953
This argument redditors fall in love with, the "but humans aren't special we're just tiny in the universe god isn't real brrt I just shat my pants morty its the same as humans learn" one, is already making some big assumptions when it comes to neuroscience and the process of actual human learning.
It is still highly debated whether the human brain can ever be computed in the first place; you're just assuming tech will advance to infinity and prove your assumptions right.

>> No.6474983

>>6474953
No one's making the human comparison argument except you disingenuous fucking faggots. If you want to compare the human brain to a neural network then you better tell the class how the brain actually works and not hide behind garbage non-answers like "it's just biochemical bro".

>> No.6474989

>>6474952
>You don’t know an ounce of ML faggot, what is stored is the analysis, the averages, statistical data, NOT EXACT positions of pixels
Aaaand nobody ever claimed it did, you are still clinging to your strawman
>You are trying really hard to shoehorn your “it’s just compressed data bro” argument but it’s not working.
Except that's literally what it is and all you're doing right now is basically plugging your fingers into your ears screaming "lalala you're wrong because I say so". Look up literally any article on the latent space, hell even the goddamn wikipedia definition, and you will read that it is a form of data compression. Nothing you said ITT disproved it, you can keep calling it "analysis" if you want but it does not change the fact that the goal of this "analysis" is representing the input data
>>6474953
>If this were true, then the outputs from programs that implement diffusion would not be any different from the input.
False, it's a lossy compression. So the results will not be exactly like the input, but this does not make them into something brand new; it's still a form of representing the same data. Compression is not the same as copying

>> No.6475000

>>6474989
>Aaaand nobody ever claimed it did, you are still clinging to your strawman
Cope harder. I literally gave you in-thread examples.
>Look up literally any article on the latent space, hell even the goddamn wikipedia definition, and you will read that it is a form of data compression.
>False, it’s lossy compression
Look at this retard lol. Keep coping, faggot. I literally gave you a whole explanation of how latent space works and you still say “it’s just compression bro trust me”.
>so the results will not be exactly like the input but does not make it brand new
So it’s still a form of replication? So I am not strawmanning you then lmao.

>> No.6475005

>>6474978
lol I don't even need to read their replies anymore. It's all the same nonsensical shit.
Makes me wonder if their soulless bugman brains actually work like a machine, because they seem to think humans are capable of perfect replication in seconds like an AI.

>> No.6475011

>>6475005
Why does AI work so well then? That should tell you something about the human brain, and skeptical anons who said “AI will never do *insert thing*” have been consistently BTFO over the years.

>> No.6475016

>>6475000
Where in the posts you linked does anyone say anything about it copying pixels?
>So it’s still a forn of replication? So I am not strawmanning you then lmao.
Do you realize that you can "replicate" an image without literally copying it pixel by pixel? If I take a shitty photo of an image, is that not an example of a very lossy "replication"?

>> No.6475017
File: 110 KB, 688x823, 1670907419423288.jpg [View same] [iqdb] [saucenao] [google]
6475017

Good luck pulling your "repeat shit until it sticks" shtick in court.

>> No.6475019

>>6475005
mark my words, the whole "b-b-b-b-b-but it learns like a human your honor!" will be their futile "defense" in court

>> No.6475020

>>6475016
>Do you realize that you can "replicate" an image without literally copying it pixel by pixel? If I take a shitty photo of an image, is that not an example of a very lossy "replication"?
Too bad that latent space is not lossy compression.
Refer to this post for explanation >>6474879.

>> No.6475021

>>6475011
>work so well
>every image is nonsensical
>literally just reverse engineers the artistic process and shits out garbage forever with no understanding
it's tiring to look at AI images because every single one demands that you parse what the fuck it's trying to say despite the fact it's saying nothing and is utter garbage.
also no one gives a shit about your attempts to bait more yous with more AI talking points that have nothing to do with what I typed

>> No.6475022
File: 4 KB, 327x137, imagesl.jpg [View same] [iqdb] [saucenao] [google]
6475022

>thread is not being bumped to front page without having hit bump limit
the fuck is going on here

>> No.6475023

>>6475011
If AI images to you work well then that just reveals just how garbage your taste is.

>> No.6475027

>>6475021
>>6475023
If there can even be a “taste” for anything AI related, it has succeeded in breaching the human area it targets.
You can cope as much as you want about AI being kind of wacky at this very early stage, but the results speak for themselves. ChatGPT stirred up the programming world, Stable Diffusion stirred up the artistic world. You can’t take that away with just nitpicks.
Even so, those nitpicks begin to sound like criticism you’d have towards human creations.

>> No.6475032

>>6475020
>post literally says that it's representing the training data by storing relevant information about it in the latent space
>SD can then recreate images from the latent space that look similar to the training images but aren't exact copies
>a lossy compression is a method of compressing data while omitting certain information about the data, meaning the result will not be an exact copy but still represent the same data
>yet somehow, the latent space is not a lossy compression of the training data, even though everyone ITT is just a simple google search away from finding definitions that call it exactly that and none of your posts ITT managed to explain why it isn't
You are like a caricature of a redditor

>> No.6475036

>>6475027
>le taste
please gain a soul before you try and get some taste.

>> No.6475038

>>6475027
Who's arguing otherwise, retard? You are free to have garbage taste, no one's taking that away from you. You can meet up with other likeminded room-temperature-IQ dumb fucks just like yourself and talk about how great disfigured big-tittied humanoids with 30 fingers look, just leave the rest of us with actual taste alone.

>> No.6475041

>>6475032
>post literally says that it's representing the training data by storing relevant information about it in the latent space
And how the fuck did you equate it to “lossy compression”, retard? It literally says that it stores THE DISTRIBUTION, a statistical analysis, not compressed data in any form.
>sd can recreate images from latent space that look similar
“Look similar” refers to how the AI estimates the pixel positions based on the noise it’s given; that pixel estimation comes purely from the statistical analysis of the training data, no training data has been stored. Learn to read.
>yet somehow, the latent space is not a lossy compression of the training data,
Tell me how a 4GB model can hold lossy compressions of 50 billion images? I’m waiting for an answer.

>> No.6475044

>>6475036
>>6475038
>reeeee
My point still stands. Looks like you guys are stirred up, just like I told you. Funny.

>> No.6475051

>>6475041
>literally says that it stores THE DISTRIBUTION, statistical analysis
The "statistical analysis" is a form of compression because it is meant to represent the training data.
>no training data has been stored.
Stooooop repeating this, do you want to fuck your strawman or something
>Tell me how can a 4GB model can hold lossy compressions of 50 billion images?
By turning them into a mathematical function, the one that you keep calling "analysis". Tell me, what is the point of this "analysis" if not to represent the training data?
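Since the argument is over whether a statistical summary counts as "compression", here is a neutral illustration of the terminology using truncated SVD, a far simpler model than SD; it only shows what the words mean mechanically, not which side is right about SD:

import numpy as np

rng = np.random.default_rng(0)
data = rng.standard_normal((100, 784))       # 100 fake "images" of 28x28 pixels
U, S, Vt = np.linalg.svd(data, full_matrices=False)
k = 10                                       # keep only 10 components
codes = U[:, :k] * S[:k]                     # 100 x 10 per-item "latent" codes
basis = Vt[:k]                               # 10 x 784 shared basis (the fitted "statistics")
reconstruction = codes @ basis               # approximate, not exact, copies of the data
print(np.abs(data - reconstruction).mean())  # nonzero error: information was discarded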

>> No.6475054

>>6475044
you all sound like you're paid shills. I see more successful paid shills on other boards, you're not even hiding the fact that you're just never gonna accept objective truth. You're all failed creatives who wish you could get the instant gratification you crave like good goys. The most pathetic kind of jaded faggots on Earth. After this nft-like fad passes and it becomes a specific use tool you're still gonna be the same soul dead antihuman retards wishing for the next fad to give you a purpose in life.

>> No.6475069

>>6475051
>The "statistical analysis" is a form of compression because it is meant to represent the training data.
> By turning them into a mathematical function, the one that you keep calling "analysis".
Retard. If you take the average of n data points, you are left with ONE final value, which accounts for the analysis of all n objects. That’s why it is an analysis and not a compression: what is left in the latent space is an “observation” based on multiple training data points and not the individual training data.
You don’t even understand what analysis means lol.

>> No.6475071

>>6475054
>reeeeeeeeeeeeeeeee
U mad?

>> No.6475077

>>6475069
>what is left in the latent space is an “observation” based on multiple training data and not the individual training data.
Anon are you retarded or something, why do you keep arguing against a strawman? Please answer the following question: does the latent space represent the training data or does it not? A simple "yes" or "no" will be enough

>> No.6475082

>>6475077
>yes or no
Already answered here faggot.>>6474952

Now you tell me how the average of n data points is somehow compression. If you can’t answer, you are fucked.

>> No.6475088

>>6475082
Just give me a yes or no please
>Now you tell me how the average of n data points is somehow compression
Because it is a representation of the same data, are you really unable to grasp this simple fact?

>> No.6475093

>>6475088
Is there any point to watching him talk in circles any more

>> No.6475098

>>6475088
>Because it is a representation of the same data
That’s not how compression works at all, faggot.

>> No.6475101

AIniggers go back to /g/

>> No.6475102

>>6475101
>AIniggers go back to /g/
Go back to "cute boys'.

>> No.6475105

>>6475088
How can one data form represent multiple data forms in compression? Are you retarded?

>> No.6475111

>>6475098
>still not answering the question
There are different forms of compression

>> No.6475114

>>6475105
Literally just answer the question, does the latent space represent the training data or does it not? Stop trying to deflect with more bullshit ramblings

>> No.6475121

>>6475111
You haven’t answered how you can lossily compress 55 billion images into a 4GB model file. There’s no technique available today for that to be possible.
>>6475114
I alresdy told you here >>6474952, faggot.

>> No.6475122

Link to the full PDF of the litigation here:
t.co/tPZDQVjQA1

>> No.6475123

>>6475121
Give me a yes or no answer

>> No.6475124

>>6475123
Read the post I linked, faggot

>> No.6475126

>>6475124
Can you really not be bothered to type 2-3 letters?

>> No.6475128

>>6475126
Not until you tell me how you can compress 55 billion images in a 4gb model.
I’m waiting.

>> No.6475129

>>6475122
thanks. I wanted to check it out a bit
>Pursuant to Federal Rule of Civil Procedure 38(b), Plaintiffs demand a trial by jury of all the claims asserted in this Complaint so triable.
so this is just to start a trial

>> No.6475130

>>6475128
I was the first one who asked, tho

>> No.6475131

>>6475129
>trial by jury
Basically have random normalfags decide a highly technological debate. What a bunch of jews lol

>> No.6475134
File: 518 KB, 1194x1200, 1664614425396.png [View same] [iqdb] [saucenao] [google]
6475134

>>6475128
>>6475124
>>6475121
>>6475098
>>6475082
>>6475069
>>6475041
>>6475020
>>6475000
>>6474952
>>6474896
>>6474879
nigger

>> No.6475136

>>6475130
I already answered, you haven’t
You don’t know what compression means

>> No.6475137

>>6475136
No you didn't lmao, linking a random post is not a yes or no answer

>> No.6475140

>>6475137
Are you gonna answer or not?

>> No.6475142

>>6475140
Yea once you give me a yes or no. I just want to make sure I'm not wasting my time on a total retard here

>> No.6475145

>>6475131
it's ok, we are defending humans, while AIfags are soulles NPCs

>> No.6475147

>>6475142
Can you tell me why this post doesn’t answer your question >>6474952? I don’t want to repeat myself.

If you applied your compression definition to every ML model, each one would be considered “lossy compression”. Your definition is retarded.
Now answer my question.

>> No.6475153

>>6475147
>If you applied your compression definition to every ML model
It's not "my" definition, it's the standard one
>each one would be considered “lossy compression”.
Yes, now you're getting it. The truth is that this entire argument could have been avoided if you simply bothered to google the definition of "latent space".

>> No.6475159

>>6475153
>yes
So every ML model created is a lossy compression of the training data. Gotcha lol.
Are you fucking retarded?

>> No.6475163
File: 231 KB, 578x709, 1603361204218.png [View same] [iqdb] [saucenao] [google]
6475163

/v/ tourist here. I really don't get why that anon goes full defense mode for AI while trying to act like this lawsuit won't have an effect. I played around with SD (Anything v3, SD 1.5) for a month, and all you do is hope that one of the 200 images you generate from a prompt is somewhat close to what you want, but it's never exactly what you want, so then you proceed to fix a few things like the face and the hands. Basically the AI does the creative part while you fix the mistakes, the ultimate cuck.

>> No.6475166

>>6475159
Serious question, what else do you think it does?
>inb4 analysis
This "analysis" is still a mathematical representation of the training data. This is why I keep asking if you believe that the latent space represents the training data or not

>> No.6475182

>>6475166
>the analysis is still a mathematical representation of the training data
I never said it wasn’t. It is a highly abstracted form of the input data, to the point it can’t even be replicated, only “reasoned” about by the AI using random noise. That’s why I’m saying that ultimately what the AI creates is transformative >>6474952 and why saying it is just a collage is entirely wrong.

>> No.6475187

Dealing with AI retards has unironically improved my argumentative skills exponentially. I feel like a bona fide debatelord now.

>> No.6475193

>>6475187
You are not gonna last anon. Go to https://www.reddit.com/r/Destiny/ and AI chads will destroy your artist ass.

>> No.6475196

>>6475182
I do agree that "collage" is not the best term, not because it's not accurate but because people associate it with a specific way of creating images. I would call it a "merging tool" instead. However, the problem is that anything "new" it creates will be an interpolation from the latent space, since the entire diffusion process actually happens inside the latent space. That way, it will always be restricted by its training data and can never create anything truly "new"
>>6475193
>r/Destiny/
l m a o

>> No.6475198

>>6475193
Can you not post cringe, I would be most appreciative.

>> No.6475201

>>6475193
listen i like dgg and destiny but holy shit his community can be so cringe, he debated some art girl on AI the other day and sadly he supports AI...! sad!

>> No.6475207

>>6475196
>>6475198
They are good in discussion, much better than the tards in the stable diffusion subreddit. They are also surprisingly good faith. But I bet you are still scared.

>> No.6475209

>>6475207
Anon if you want us to argue with your reddit idols, tell them to come here instead of sending us to that shithole

>> No.6475214

>>6475207
I thought I told you not to post cringe, Raj.

>> No.6475219

>>6475196
>I would call it a “merging” tool
Whatever you call it, you can make arguments that humans also behave intellectually in the same manner.
It’s funny that we get more insights into the human brain by examining AI rather than the other way around.
>it can never create anything new
It can create infinite variations of the same thing with the same prompt, given a random set of noise each time. What it “reasons” about can be called transformative, and if the lawsuit hinges on arguments like “it is just a collage tool your honour”, they will fuck up badly.

>> No.6475226

>>6475219
>you can make arguments that humans also behave intellectually in the same manner.
Ok but machines don't have rights while humans do. So what are you gonna achieve by bringing up these arguments?
>if the lawsuit hinges on arguments like “it is just a collage tool your honour”, they will fuck up badly.
If you actually read it, you will see that it talks specifically about latent space representation of data

>> No.6475316

>>6473570
Does he have enough money? AI has made a lot and will be worth billions, even trillions, one day. It's easy for the court to rule in favor of AI.

>> No.6475350

>>6475219
>humans also behave intellectually in the same manner.
There is no human that can generate images at the scale and speed that the AI software does.
It's funny how ai-shills are willing to die on this semantic hill of comparing the AI to humans, when the logical conclusion of this line of reasoning, if AI someday becomes actual AGI, is slavery.

>> No.6475360

>>6475350
So you're saying that humans are actually worse at creating images than current soft AI?

>> No.6475368

>>6475360
in certain terms, like speed and volume, yes
not necessarily in terms of quality

>> No.6475379

>>6475368
Then in this case, I don't see why artists are saying it can't be used as a tool if there's still a massive gap in quality that humans can add on top of it.
Concept artists will no longer have to waste their time drawing tons of designs that won't get used.
I can see hypothetical specialised models being made purely for concept art iteration, with the artist just picking the best ones and refining those. Some companies already do this sort of thing with photobashing and I can't see how using AI would be an inferior technique.

>> No.6475384

>>6475360
>>6475368
The truth is that they're two entirely different processes with only a few superficial similarities. It's like saying "cars are just like humans" because both can move forward. This weird trend of comparing algorithms to human thinking probably comes from a desire to humanize machines, even though they really aren't similar at all

>> No.6475387

>>6473655
>>6473614
These people are fake futurists and it actually makes me angry. The original futurist dream was that we'd automate DRUDGERY, all of the stupid and DUMB SHIT that nobody wants to do; picking up the trash, oil drilling and resource mining, shit that kills and poisons people, grinds them up and buries them alive. The whole point was that we'd automate all that shit and then we'd do nothing but draw all day and play music in the park and go on road trips and spend time with our families.

But no, let's not focus on that. Let's focus on the profit motive; entertainment is big, everyone likes shoveling streaming sludge down their throats, so let's reduce the overhead on that instead. Let's make art less valuable so that we can pocket the money instead. Oh, what's that? Offshore oil drilling has awful working conditions? People are getting poisoned and blown to fucking pieces making chemicals? Nah, fuck automating that, having a bunch of monkeys press buttons to get waifus is far more profitable.

Pathetic.

>> No.6475395

>>6475379
if the models used only licensed photography then I think artists would have accepted it as a tool, but that's not what happened
instead the people behind Stable Diffusion paid off LAION for a giant dataset of whatever they could scrape off the internet (copyrighted or not) and classified as "aesthetically pleasing" with the intent to create something that replaces artists

>> No.6475400

>>6475384
>This weird trend of comparing algorithms to human thinking probably comes from a desire to humanize machines, even though they really aren't similar at all
Artificial neural networks are specifically designed to function similarly (in terms of learning) to the human brain. It is not a new thing. Artificial neural networks have existed since as early as the 1940s, and the reason the analogy keeps being made is that it is true.
Learn your facts before you argue please.

>> No.6475416

>>6475400
further:
https://en.wikipedia.org/wiki/Unsupervised_learning Preliminary reading for all artists in this thread, as you all seem to keep saying the same incorrect things. First get the facts right about how it actually works, then you can make arguments about whether it can be compared to human thinking/learning or not.
There have been too many anons who think that AI works the same as a traditional computer algorithm (it doesn't). I'd like to see some effort put in by artists to actually understand what it is they're against before they make their arguments from emotion.

>> No.6475420

>>6475400
Neural networks do not learn remotely like the human brain; if a model's dataset contains only photographs, it will never create a painting. You'd be laughed out of court if you tried that defense, because a neuroscientist expert witness would destroy you immediately.

>> No.6475429

>>6475420
This is not the gotcha you think it is.

The output of an artificial neural network is more accurately described as an imagined image. This is the term many researchers use in the research literature. The correct analogy would be: if a human brain is shown only photographs, it is only going to be able to imagine photographs and will never imagine a painting (we've seen a form of this on this board in the debate about learning from reference vs learning from life when learning how to draw). The same thing would obviously apply to any entity that can learn.

>> No.6475438

>>6475400
>Artificial neural networks are specifically designed to function similarly
Yea "similarly" but not the same. They function similarly in the sense that both are designed to interpret data but this does not mean that an ANN is just like a human neuron ... unless you are suggesting that humans learn new things by converting data into mathematical functions

>> No.6475439

https://www.youtube.com/watch?v=x1wlW4t9o1U&ab_channel=FutureAI
reminder AIfags unironically belive AI is real intelligence
lmao

>> No.6475443

>>6475429
computers do not "imagine"
just because researchers use semantic tricks and analogies to help make their work understandable does not mean there is a direct correlation

>> No.6475447

>>6475443
Just because you don't understand it doesn't mean it's "semantic tricks".

>> No.6475457

>>6475429
Not everyone can imagine (it's called aphantasia), yet Glen Keane is a world-renowned animator despite that. Strawman aside, I wasn't talking about imagining to begin with; I'm talking about the act of producing a painting from photographs, life, or even imagination, which the AI programs cannot do if no paintings are in their dataset to begin with.

>> No.6475461

>>6475447
Just because you gish gallop with technical jargon doesn't mean your argument is strong, in fact, it makes it weaker.

>> No.6475462

>>6475429
The "imagined image" is just an interpolation from the latent space.
>we've seen a form of this on this board about learning from reference vs learning from life when learning how to draw
That's a very weird way to make your point, anon. Even people who "only learn from photographs" are not gonna start reproducing only photographs, the reason why you're supposed to learn from life is in order to better understand how things look in 3D

>> No.6475471

>>6475457
It doesn't work that way though. The AI would need to be given a robot body and a paintbrush and told to paint its output onto a canvas for it to work in the way you are thinking. The output of an AI is more like just the brain, not the entire human. Humans don't think images onto the canvas, they use their hands to place paint onto it. If a human brain could be hooked up to a computer in the same manner as an artificial neural network and was only shown photographs for the first 20 years of its life, then you would see only photograph-like images as the output too.
>>6475461
It's not gish galloping with technical jargon; you are personally just unable to understand what is and isn't the case in terms of AI. There's a reason why it takes a degree and usually a PhD to start contributing to AI research, and a random tard from /ic/ isn't writing groundbreaking papers. You may not like this answer but it's reality.
>>6475462
Latent space is just a "visual library", the thing we talk about here all the time (toy sketch at the end of this post). Give a robot a binocular set of cameras, allow it to wander around the world, and it will be able to produce images with a better 3D understanding.
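Toy illustration of the "visual library" point in plain numpy, with PCA standing in for a learned decoder (an analogy only; real diffusion models are nonlinear and vastly larger):

import numpy as np

rng = np.random.default_rng(1)
# 200 fake "images" of 100 pixels each, secretly driven by 5 underlying factors
factors = rng.standard_normal((200, 5))
mixing = rng.standard_normal((5, 100))
images = factors @ mixing + 0.01 * rng.standard_normal((200, 100))

# "training": recover a 5-dimensional latent space with an SVD
mean = images.mean(axis=0)
_, _, vt = np.linalg.svd(images - mean, full_matrices=False)
decoder = vt[:5]                     # rows span the latent directions

def decode(z):
    # any coordinate in the latent space maps to an image, including
    # coordinates that never appeared in training: a learned function,
    # not a folder of stored pictures being collaged together
    return mean + z @ decoder

new_image = decode(rng.standard_normal(5))  # a latent no training image used
print(new_image.shape)                      # (100,)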

>> No.6475477

>>6475471
>Give a robot a binocular set of cameras, allow it to wander around the world, and it will be able to produce images with a better 3D understanding.
Okay, so do that, then convince your government that said robot should have human rights, and then we can talk about the "it learns just like humans" argument justifying it being trained on copyrighted data without permission

>> No.6475484

>>6475477
Relevant video on the current progress of robots that can navigate meatspace: https://www.youtube.com/watch?v=dCPHGwW9SOk
It will probably happen in the next 10 years (a high estimate), but I wouldn't be surprised if it happened sooner.
It doesn't need to have legal human rights to learn just like humans (no idea why you would think this, it doesn't make much logical sense). Governments could remove all your legal rights tomorrow and you would still be human physically, just not legally.

The rest of your post is really off-topic for what we are currently discussing, though.
This discussion is not about copyrighted work and the potential copyright infringement Stability may have committed (I would have preferred them to release the tech model-free, so this pointless argument about copyright could be omitted entirely). Training on copyrighted data is something all of us artists do anyway. Just look at the study generals: they are literally about grinding the copyrighted work of others to produce work of our own.

>> No.6475485

>>6475471
>We need to give the robot a body
Jesus, you are actually completely clueless.

>> No.6475491

>>6475485
https://www.youtube.com/watch?v=dCPHGwW9SOk
It's already being done. Maybe you are the one who is clueless.

>> No.6475492

>>6475484
Two more weeks until automated cars

>> No.6475493

>>6475484
>us artists
pyw

>> No.6475495

>>6475492
Depends on how automated we're talking. https://www.youtube.com/watch?v=wQkXcySUnJk
I'd already count this as an automated car.

>> No.6475497

>>6475495
https://www.youtube.com/watch?v=6Kf3I_OyDlI
yeah sure

>> No.6475502

>>6475484
You made the claim that given enough data, a robot would be able to "think just like humans do", yet your main argument to back up this claim is "it will probably happen in 10 years"? That's a prediction, not an argument.
>It doesn't need to have legal human rights to learn just like humans (no idea why you would think this, it doesn't make much logical sense)
Because it's a statement that is constantly being used to justify SD having been trained on copyrighted data. People are trying to claim that because it "learned just like humans", anything it spits out should be treated as "fair use", even though fair use exceptions can only apply to humans. It does matter whether machines have human rights or not when talking about legal issues

>> No.6475503

>>6475484
>This discussion is not about copyrighted work and the potential copyright infringement Stability may have committed (I would have preferred them to release the tech model-free, so this pointless argument about copyright could be omitted entirely). Training on copyrighted data is something all of us artists do anyway
Except the argument from AI shills like you is exactly that the latter justifies the former, because you equate AI with humans too closely

>> No.6475506

>>6475497
>NPC old guy can't use a car correctly, blames manufacturer
This has been happening since before the advent of automated cars. Likely the guy just thought he was going to get a fat check from Tesla.
I'll wait for an actual rebuttal.
>>6475502
I'm just here to clear up misconceptions about how it actually works so that artists don't look so painfully stupid when arguing against it; I don't really care about legal jewery. AI output wouldn't automatically qualify for copyright even if the AI could be proven to learn like a human anyway. The iconic monkey learns just like a human, yet the macaque was not assigned copyright because it is not human. Anyone who thinks AI will be given rights like a human doesn't really understand our own arrogance.

>> No.6475514

>>6475506
>I'm just here to clear up misconceptions
Ok? Maybe that's something better reserved for a place like twitter, not a half dead /ic/ thread on autosage
>Anyone who thinks that AI will be given rights like a human doesn't really understand our own arrogance.
That's a good thing tho, idk why some "people" are so eager to get replaced by the AGI dystopia

>> No.6475516

Will the cyber police finally jail me for having SD installed and a fuck ton of games and software pirated?

>> No.6475524

>>6475514
>Ok? Maybe that's something better reserved for a place like twitter, not a half dead /ic/ thread on autosage
I think I just let my Asperger's get a little out of control, because it bothers me when people are wrong about easily available information. I don't think AI technology is good or bad. Future AGI has the potential to both worsen human existence and enrich it, but I don't think we should stop all AI research just because we are scared of it being used for bad ends. Stability shouldn't have used artists' work without permission, whether it ends up being ruled legal use or not. It is just good manners to ask first.
The lawyer in the OP image does not care about artists or ethical use of AI at all and is just looking to get a fat paycheck.
I hope this clears up my personal perspective a little bit.
Thanks for the discussion.

>> No.6475538

>>6473570
>Writer, designer, programmer, and lawyer
How old is he again? Or is his expertise in half of these fields as shallow as a puddle?

>> No.6475572
File: 9 KB, 250x241, A3F7BC96-7059-48B9-9781-4BEC87135A98.jpg [View same] [iqdb] [saucenao] [google]
6475572

>ITT: AI shills can only argue semantics
God, you guys are more fucked than I previously thought. This lawsuit might be settled within 3 months with arguments like these. Emad will have to bend the knee and be forced to settle lol

>> No.6475585

>>6475572
Learning from something isn't illegal yet

>> No.6475589

>>6475585
>Learning

>> No.6475620

>>6475585
Learning how to shit in a toilet must be illegal the way you pajeets avoid it

>> No.6475625

>>6475620
>scat closet

>> No.6475666

>>6475497
"brackets not working" sounds very strange. And even if that was true, it has nothing to do with automatic cars, just a daily problem with cars in general

>> No.6475719

>>6474539
>non-transformative
only applies to humans

>> No.6475766

>>6475585
Machines don't learn or interpret information in the same way that humans do, and so there should be distinct differences in the way the two are regulated.

>> No.6475851

bump

>> No.6475854

>>6475851
It's on autosage for whatever reason

>> No.6475931

>>6473614
They deserve to be ruined. They can only thrive over and among completely demoralized animals.

>> No.6475949

>>6475625
>Scat-hole Diffusion

>> No.6475951

>>6475854
faggot ass jannies

>> No.6476361

Admit you lost, /ic/. The lawsuit has been proven to be an utter joke. Time to flip burgers.

>> No.6476409

>>6476361
>*nervous sweating increases*

>> No.6476422

>>6475766
Need a handicap to stay relevant, fellow human?

>> No.6476588

>>6475572
The shitty lawsuit is based on semantics retard. Look how hard the kike tries to call it a collage tool.