Jon Passig
Really appreciate him doing this interview, but I can't lie: the amount of apathy he had for art, artists, and the many who will be displaced and abused by this system is disturbing. It's not a requirement, either. They could just *stop* doing it this way. They don't *need* to use the art of non-consenting artists, much less actual [extremely illegal content related to children] and illegal content in their datasets. If what he says is true, then by his own logic he could literally train AI to draw on its own without feeding it other people's labor. Stealing grain from the farmer to sell them back their bread.

And when he does admit fault and any level of concern for what's happening, there's a complete reluctance to do anything about it. It's like the sheer act of talking to what he describes as a small number of developers is a Herculean task. Assault rifles were invented, but you don't see every Joe walking down the street with one. There's nuance about it. You can't just let this 'boulder' roll down a hill and crush a village, then blame the residents for not pushing it in another direction, when you're the person who pushed it that way in the first place.

Gross irresponsibility, delusional futurist 1930s rhetoric about an impossible utopia, and magical thinking about how the free market will magically solve any injustice. Trust me, if that were the case, my mother's insulin wouldn't be marked up 300% because of the arbitrary decisions of business people. The logic of 'competition inspiring better prices and products' is bull in most applicable situations, and I'm not going to gargle the backwash of Silicon Valley transhumanists who can't process that art may be enjoyable for more reasons than pushing a button and getting a dopamine rush from a pretty picture.

I don't hate the person; his genuine enthusiasm and clear love for what he's doing is charming. But Marshall help us all if this is emblematic of what the majority of developers in that field feel.
We are in for a *very* rough ride.
@flaque
2yr
> Really appreciate him doing this interview but I can’t lie, the amount of apathy he had for art, artists, and the many that will be displaced and abused by this system is disturbing. It’s not a requirement either.

Ah, I apologize if it came off that way. To be clear, programmers & designers (i.e. me) are ALSO in the same position as artists.

> They could just *stop* doing it this way. They don’t *need* to use the art of non consenting artists, much less actual [extremely illegal content related to children] and illegal content in their data sets.

I think most people actually are. I'm not sure the folks who trained Stable Diffusion actually knew that there were copyrighted images in the dataset, or even really thought about it initially.

> It’s like the sheer act of talking to what he describes as a small amount of developers is a Herculean task.

There's a small number of Silicon Valley developers. I have no control over foreign states' AI plans. It's probably feasible to get American AI companies to stop training on copyrighted data; for example, you could change the law. It's very infeasible to get Chinese AI projects to do so (maybe tariffs?). (Though again, I'm not sure that actually solves the problem for you.)

> Trust me, if it were the case my mothers insulin wouldn’t be marked up 300% because of the arbitrary decisions of business people.

There are only three manufacturers of insulin, and the US specifically has laws that prevent imports of prescription drugs from overseas competitors. If you don't think this is the case, you can go start an insulin company and bring the prices down! You'll make lots of money doing so, and do good for the world.

-Evan
@veryartthing
If I'm honest, I really hope you team up with other high-profile artists and teachers to do something to stop this. This video showed me that the developers of this technology are either lying or incredibly naïve, and given that they are charging money for these tools, it's probably the former. The fact that they are selling a program that was built off the copyrighted work of innumerable independent artists and corporations is absurd.

This guy basically responds to the issue of the copyrighted work in their dataset by saying it doesn't matter. That somehow, they are not responsible for the data they trawled off the web being full of stuff they had no legal permission to use. He then claims that even if they did remove it, it wouldn't change anything. But that is a lie: if they didn't need it, they wouldn't be using it. It's inexcusable, and the fact that they already did it is no reason we can't hold them accountable.

Please team up with other creators to file legal action against them. AI is going to be a big part of the future, but as Steven Zapata pointed out, we do not have to roll over and accept it being done in a way that has zero respect for the people who came before it. AI still has a long way to go before it overtakes the industry completely, so there's still time to at least try to soften the blow and make sure these people who blatantly stole from others are held accountable for it.

You've helped me so much with art, and for a few years I genuinely believed art could be my future. I believed it could be the future of so many others as well, and I eagerly pointed them to your content so that we could learn and grow together. Now I am pleading with you to do what you can to stop people from stealing the work of others. You have the power to really represent a lot of artists, both big and small. Please do something.
@flaque
2yr
To be clear, I very much agree that copyrighted work in the dataset DOES matter. But I am also letting you know that, even if there were no copyrighted work in the dataset, that wouldn't necessarily be "better" for artists. You can win the court case, but it won't actually solve your problem.

-Evan