I don’t get this. AI bros talk about how “in the near future” no one will “need” to be a writer, a filmmaker or a musician anymore, as you’ll be able to generate your own media with your own parameters and preferences on the fly. This, to me, feels like such an insane opinion. How can someone not value the ingenuity and creativity behind a work of art? Do these people not see or feel the human behind it all? And are these really opinions that you’ve encountered outside of the internet?
There are people out there without any kind of conscience or remorse. If they can somehow make a profit, they’ll sell something to you or anyone who is willing to listen.
How can someone not value the ingenuity and creativity behind a work of art?
Their point of view is that if people do actually value this then there will always be a market for it.
If they don’t, there won’t.
I suppose a long time ago the radio and gramophone looked like they’d be the end of live performing musicians, but they still exist; everything’s just continually changing…
The best example I can think of, and this is being very generous to the AI bros, is that they’re trying to compare it to obsolete creative positions. Think about animation. Each frame used to have to be drawn and colored entirely by hand. There was a lot of heavy lifting in the process that wasn’t necessarily creative but was still required for the final product. I think they’re trying to say that we’ll need less work like this.
I’m not sure I agree or how accurate their claims are.
Edit: I’m just explaining what I think their point of view is. It’s not my personal opinion.
I’m no AI bro, but I do think this concern is a bit overblown. The monetary value in art is not in simply having a picture of something; a whole infamous subset of “modern art” commands high prices despite being simple enough that virtually anybody could recreate it. A lot is simply in that people desire art created by a specific person, be it a painting that they made, or commissioning a still-active artist to create something, or someone buying a band’s merch to support their work. AI simply does not have the same parasocial association. And of course, it doesn’t at all replicate the non-monetary value that creating something can give to someone.
I can, at most, imagine it getting integrated into things like advertising, where one really doesn’t care who created the work; but even then there’s probably still value in having a human artist review the result to be sure of its quality, and that kind of art tends to add the least cultural value anyway.
That isn’t zero impact, obviously; that kind of advertisement or corporate clip art does still pay people, but it’s a far cry from the end of creative human endeavor, or even of people getting paid to be creative.
My daughter (15f) is an artist and I work at an AI company as a software engineer. We’ve had a lot of interesting debates. Most recently, she defined Art this way:
“Art is protest against automation.”
We thought of some examples:
- when cave artists made paintings in caves, perhaps they were in a sense protesting the automatic forces of nature that would have washed or eroded away their paintings if they had not sought out caves. By painting something that could outlast themselves, perhaps they wished to express, “I am here!”
- when manufacturing and economic factors made kitsch art possible (cheap figurines, mass reprints, etc.), although more people had access to “art” there was also a sense of loss and blandness, like maybe now that we can afford art, this isn’t art, actually?
- when computers can produce images that look beautiful in some way or another, maybe this pushes the artist within each of us to find new ground where economic reproducibility can’t reach, and where we can continue the story of protest where originality can stake a claim on the ever-unfolding nature of what it means to be human.
I defined Economics this way:
“Economics is the automation of what nature does not provide.”
An example:
- long ago, nature automated the creation of apples. People picked free apples, and there was no credit card machine. But humans wanted more apples, and more varieties of apples, and tastier varieties that nature wouldn’t make soon enough. So humans created jobs–someone to make apple varieties faster than nature, and someone to plant more apple trees than nature, and someone to pick all of the apples that nature was happy to let rot on the ground as part of its slow orchard re-planting process.
Jobs are created in one of two ways: either by destroying the ability to automatically create things (destroying looms, maybe), or by making people want new things (e.g. the creation of jobs around farming Eve Online Interstellar Kredits). Whenever an artist creates something new that has value, an investor will want to automate its creation.
Where Art and Economics fight is over automation: Art wants to find territory that cannot be automated. Economics wants to discover ways to efficiently automate anything desirable. As long as humans live in groups, I suppose this cycle does not have an end.
Art is subjective, and AI is a buzzword; even if-statements get called AI, especially in the gaming world.
And with the current state of LLMs, the smartest and brightest in the industry have only managed to produce utter trash, while sacrificing the planet and its inhabitants. I like your daughter more; she will create more value and at the same time not be a total corporate tool ruining the planet for generations to come. Mad respect.
(not calling you a tool, but people who work with LLMs)
I do work with LLMs, and I respect your opinion. I suspect if we could meet and chat for an hour, we’d understand each other better.
But despite the bad, I also see a great deal of good that can come from LLMs, and AI in general. I appreciated what Sal Khan (Khan Academy) had to say about the big picture view:
There’s folks who take a more pessimistic view of AI, they say this is scary, there’s all these dystopian scenarios, we maybe want to slow down, we want to pause. On the other side, there are the more optimistic folks that say, well, we’ve gone through inflection points before, we’ve gone through the Industrial Revolution. It was scary, but it all kind of worked out.
And what I’d argue right now is I don’t think this is like a flip of a coin or this is something where we’ll just have to, like, wait and see which way it turns out. I think everyone here and beyond, we are active participants in this decision. I’m pretty convinced that the first line of reasoning is actually almost a self-fulfilling prophecy, that if we act with fear and if we say, “Hey, we’ve just got to stop doing this stuff,” what’s really going to happen is the rule followers might pause, might slow down, but the rule breakers–as Alexander [Wang] mentioned–the totalitarian governments, the criminal organizations, they’re only going to accelerate. And that leads to what I am pretty convinced is the dystopian state, which is the good actors have worse AIs than the bad actors.
https://www.ted.com/talks/sal_khan_how_ai_could_save_not_destroy_education?subtitle=en
They absolutely hate anyone who is better than them at anything. They hate programmers. They hate artists. They hate their secretary that knows more about them than they do.
Getting rid of everyone would soothe their egos.
What a cynical view to live under.
It’s legitimate to question why we would want to replace human artistry with AI. Somebody might have asked the same question about replacing hand tools with power tools. But I wouldn’t be a longtime amateur woodworker if all I had to work with was hand tools - the work would be far too time-consuming and the learning curve much too high. Or ask content creators who can get their ideas in front of the public without learning HTML, CSS or JavaScript what they think of content creation tools. Was making MySpace etc. available 20 years ago a bad thing because it changed our view of programming?
Enabling millions of people to jump traditional entry barriers is a good thing, even if it means we no longer look at the creative process as being reserved for people with natural talent or years of training. TBH you might as well object to Bob Ross teaching people easier ways to paint, or to people who teach bread baking on YouTube - it turns out bread is dead simple, btw; you should try it.
But more to the point, the genie is out of the bottle, and no amount of objection is going to stuff it back in.
Enabling millions of people to jump traditional entry barriers is a good thing
Except often it’s not even traditional entry barriers. Look how bad Google search has gotten, overrun by AI blogposts and advertising slop. Those aren’t entry barriers, those are “hold up, is this even content?” barriers.
But more to the point, the genie is out of the bottle, and no amount of objection is going to stuff it back in.
We regulated the assembly line and gave laborers compensation and safety rights when power tools increased their capacity. So too, we could force OpenAI et al to compensate the copyright holders from whom they scraped data. No one is calling for the genie to go back in, only for the corporatists to stop being the ones with all the wishes.
So too, we could force OpenAI et al to compensate the copyright holders from whom they scraped data
Fuck expanding copyright’s power in any way. Effort is better spent on making AI content illegal to sell, or on finding another way of ending corporate profit off of it.
Because AI bros love the smell of their own farts and they get off by convincing other people that they should also smell their farts. (Only partly /s)
But more seriously, I’d say it’s just a symptom of the world we live in where there is tremendous pressure to commodify and commercialize everything in the most “efficient” way possible, including creativity.
Why do people who post loaded questions approve of pedophilia and torturing kittens?
approve of pedophilia and torturing kittens?
what the actual fuck?
Doesn’t sound like a denial - I thought so!!!
Probably put the TP on backwards, too.
With AI making content, they will never have to worry about some sort of original content upsetting the selling of continuous reboots.
Hello,
Let me chime in as someone who would probably fall under your definition of an AI defender.
How do I defend AI? Well, I think AI really flips the world on its head, including all the good and the bad that comes from it. I still think industrialization is a good metaphor. Things changed a lot. A lot of people were pissed. Now we don’t mind as much anymore, because it’s the new normal, but at the time, most people weren’t happy about it.
Same with AI. I think overall it’s a plus, but obviously it comes with new pitfalls: LLM hallucinations, the need for more complex copyright and licensing definitions, impersonation, etc. It’s not entirely great, but in totality, when the dust settles, it will be a helpful tool to make our lives easier.
So why do I defend AI? Basically, because I think it will happen whether you like it or not. Even if the law is initially really strict about it, society will change its mind. It might happen slowly, but it’s just too useful to outlaw.
Going back to the industrialization metaphor: we adapted to it over a longer period of time. Yes, it forever changed how most things are made, but it wasn’t necessarily a bad thing. It’s just a thing. And even though lots of logistics chains are streamlined, there’s always gonna be handmade things and unique things. Ofc, not everything is handmade, but some important things still are. And for both of them, there’s some stuff that’s totally fine to be automated, and then there’s some stuff that just loses its value if we gloss over it with automation.
Now, I don’t want AI to just roam free (ofc not, there’s some really bad stuff happening and I’m not pretending there isn’t), but what we need is laws and enforcement against that misuse, not against AI itself.
Imagine if most countries outlawed AI. All the AI companies and users would move operations to the one country that still allows it, making it impossible to oversee and enforce anything. So we’d better find a good strategy to allow it for all the things where it doesn’t do damage.
Now let me address some specific points you brought up:
In the near future no one will “need” to be a writer
But isn’t this already how it’s going? Only people who want to be a writer are one, and it’s good that way.
Also, AI can only remix the art that’s already there, so if you’re doing something completely unique, AI won’t ever be able to replace you. I find that somehow validating for the people who make awesome and unique art. I think that’s how it should be.
Do these people not see or feel the human behind the art at all?
I do. And that’s the exact reason I’m not concerned. Everyone who puts in the work to make something very particular to them should not be impacted in any way.
Now there’s an argument to be made how consent for training data is given (opt-in / opt-out) and what licensing for the models can and should look like, but this is my very basic opinion.
Are these really opinions you have encountered outside of the internet?
I may have about one friend out of 30 who thinks like me.
I mean I am living proof we exist, but I can’t say this is a popular opinion, which is fair.
I don’t want people to mindlessly agree; I want them to come to their own opinions through their own research and presumptions.
I also don’t expect you to agree with me, but I hope some people will understand my perspective, and maybe this brings a bit more nuance to this polarized conversation.
In addition to this, the current state of AI is basically just advanced algorithms. It would be extremely difficult, but in theory you could still trace the connections between nodes and run the optimization calculations yourself.
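To make that point concrete, here’s a minimal sketch in Python (entirely illustrative, with made-up weights; real models do the same kind of arithmetic across billions of parameters) of the node-by-node calculation a neural network boils down to: weighted sums and simple nonlinearities you could, in principle, trace by hand.

```python
# Illustrative only: a tiny two-layer network with made-up weights.
# Real models run this same arithmetic over billions of parameters.

def relu(x):
    return max(0.0, x)

# Hypothetical weights and biases (assumptions, not from any real model)
w_hidden = [[0.5, -0.2], [0.8, 0.1]]   # 2 inputs -> 2 hidden nodes
b_hidden = [0.0, 0.1]
w_out = [1.0, -1.5]                    # 2 hidden nodes -> 1 output
b_out = 0.2

def forward(inputs):
    # Each node is just a weighted sum of its inputs plus a bias,
    # passed through a simple nonlinearity -- all traceable by hand.
    hidden = [relu(sum(w * x for w, x in zip(weights, inputs)) + b)
              for weights, b in zip(w_hidden, b_hidden)]
    return sum(w * h for w, h in zip(w_out, hidden)) + b_out

print(forward([1.0, 2.0]))  # every intermediate value can be checked on paper
```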
Soon enough, we will have AGI. I’m not a big fan of LLMs, because they’re a fundamentally flawed idea. The only way to get that much data is without consent, and they will always be prone to hallucinations. AGI, on the other hand, is fundamentally different. It’s capable of learning just like a human, and capable of doing tasks just like a human. By all measurements it will be able to do anything a human can do, and by most measurements, it will do it better.
The issue most people have is that they do not understand that the current state of AI is like the OG printing press. It’s crazy to a layperson, and it has its uses, but since almost everyone is an illiterate farmer, it’s not that useful. But to claim that transcribing text is pointless is to ignore an entire world of possibilities, to the point where people who rail against AI almost seem malicious or willfully ignorant. Why do you not want us to be able to almost instantly diagnose new diseases? Or have a nursebot babysitter that is literally a better parent than you are, and doesn’t have to sleep or eat? What’s the issue with making cars safer, making construction more efficient, and taking corruption out of the government? Why do people hate the idea of people no longer having to be alone, or of having a therapist that is available at all times, perfectly tailored to help you with your specific issues and with no biases?
Yes, these things are impossible with modern AI. But to claim that AI is useless… It’s either malice or ignorance.
Completely agree, I think of industrialization as well when comparing it.
Steel plow comes to mind.
I absolutely don’t agree with your perspective.
AI is just another way to ensure control of the means of production stays in the hands of capitalists.
It empowers the techno-feudalist monopolies to put further pressure on more industries. Not content to own a portion of every retail purchase, every digital payment, every house, and every entertainment property. They now get to own a portion of every act of creation, every communication that could possibly challenge their power.
They can subvert any act of independent impactful art by copying it and remanufacturing lesser versions over and over until the original’s impact is lost. And they can do it faster than ever before, cashing in on the original creative’s effort and syphoning returns away from creators into their own pockets.
You might think it’s inevitable and inescapable, but that’s what people once thought of the divine right of kings.
You’re basically saying AI can’t be used in any other way than it’s being used right now. I think you are the one who’s taking the current state of things as inevitable and inescapable.
I mean, he basically said industrialization is bad. Not sure why he’s saying that online, via his computer.
There’s nothing wrong with opposing technology as it currently stands. Maybe there’s room for nuance in language, but that doesn’t break their argument.
As it currently stands, the user above is right, and the labor of human artists is being siphoned into corporate profit with zero compensation. In the same way, at the beginning of the industrial revolution the labor of children was siphoned into profit with low compensation and deadly work conditions.
The way the textile industry was “fixed” was by opposition: speaking about the issues related to the technical developments and advocating for better treatment of the laborers. The only way AI as it currently stands can be “fixed” is also by opposition. Being critical of AI doesn’t mean “turn it off,” it means speaking about the issues related to the new technology and advocating for better treatment of the laborers.
Because they like money, and anything they say about creative industries is just silly words they don’t mean that you shouldn’t take seriously. Zero meaning in anything they say 🤷‍♂️
Everyone’s frame of reference is their own IQ…
So for some people AI seems as smart as their frame of reference, or even better.
They assume their frame of reference is everyone’s, so we’re in that weird period where dumb people are super excited about AI, and smart people still think it’s a gimmick.
Those people who find AI impressive see it as a means to level the playing field, and it will eventually.
It just means the smarter you are, the longer it’s going to take to be impressive. Because your frame of reference is just a higher standard.
They’d never be as creative as a creative person, so to them it’s switching from relying on a person they have no control over or influence on to a computer program that will do whatever is asked. To them it generates the same quality as a person; don’t forget the most popular media caters to the lowest common denominator, and this is the same thing.
Like, it makes sense from their perspective. You just need to realize everyone has a different perspective.
It’s human variation
Pretty good points there, though I’d argue it’s not just pure numerical IQ, but mostly life experience. The more variety of life you experience, the more you know of human history, different cultures, ways of thinking and seeing the world - the harder it is for you to get impressed by something as shallow as AI.
Tech bros live in a bubble of their own creation and don’t understand the true richness of the human condition.
it’s not just pure numerical IQ,
We talk about IQ like it’s a single number, but it’s like the SAT/ACT: a bunch of different specific scores averaged into one number. So yeah, it’s not as simple as a single number. I was thinking mostly of processing speed and associative memory, but obviously you need the general knowledge as well.
The more variety of life you experience, the more you know of human history, different cultures, ways of thinking and seeing the world - the harder it is for you to get impressed by something as shallow as AI.
This is a very specific and easily fixable problem. It’s trained by a certain class of people, so it’s going to regurgitate stuff from that class and ignore everyone who didn’t train it.
Tech bros live in a bubble of their own creation and don’t understand the true richness of the human condition.
Nobody is gonna argue with that tho
Creative is great, industry not so much.