I'm not happy thinking about what this does to small companies, hobbyists, open source programmers and so on, if it becomes a necessity to stay competitive.
Especially since so many of those models have just freely ingested a whole bunch of open source software to be able to do what they do.
If you make $10k/mo -- which is not that much! -- $500 is 5% of revenue. All else held equal, if that helps you go 20% faster, it's an absolute no-brainer.
The question is: does it actually help you do that, or do you go 0% faster? Or 5% slower?
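The arithmetic behind "no-brainer" can be sketched, under the (big, unstated) assumption that going X% faster translates linearly into X% more revenue:

```python
# Back-of-the-envelope break-even for the numbers in the comment above.
# Assumes, naively, that a speedup converts one-for-one into extra revenue.

def net_monthly_gain(revenue, tool_cost, speedup):
    """Extra revenue from the speedup, minus the tool subscription."""
    return revenue * speedup - tool_cost

revenue, tool_cost = 10_000, 500  # $/month, from the comment above

for speedup in (-0.05, 0.0, 0.05, 0.20):
    gain = net_monthly_gain(revenue, tool_cost, speedup)
    print(f"{speedup:+.0%} faster -> net ${gain:+,.0f}/mo")
```

Under that assumption the break-even speedup is exactly $500/$10,000 = 5%: a 20% speedup nets +$1,500/mo, but 0% or -5% means paying $500-$1,000/mo for nothing, which is the parent comment's point.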
This is the sort of statement that immediately tells me this forum is disconnected from the real world. ~80% of full-time workers in the US make less than $10k a month before tax.
And yet, the average salary of an IT worker in the US is somewhere between $104k and $110k. Since we're discussing coders here, and IT workers in general tend to be at the lower end of that range, maybe there is some context you didn't consider?
>And yet, the average salary of an IT worker in the US is somewhere between $104k and $110k.
After tax, $500/mo is something like 8% of your take-home pay. I don't know why it's unreasonable to scoff at having to pay that much to get the most out of these tools.
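A quick sanity check of that "~8% of take-home" figure, using the salary range cited upthread. The ~30% effective tax rate is my assumption, not a number from the thread:

```python
# Sanity-check of the "~8% of take-home pay" claim above.
# The effective tax rate is an assumed round number, not from the thread.

salary = 107_000          # midpoint of the $104k-$110k range cited upthread
effective_tax = 0.30      # assumption: combined effective rate of ~30%
tool_cost = 500           # $/month

take_home_monthly = salary * (1 - effective_tax) / 12
share = tool_cost / take_home_monthly
print(f"${tool_cost}/mo is {share:.0%} of take-home pay")
```

With those assumptions the share works out to roughly 8%, consistent with the comment.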
>maybe there is some context you didn't consider?
The context is that the average poster on HN has no idea how hard the real world is, because they work really high-paying jobs. To state that "$10k a month is not a lot" makes you sound out of touch.
>A lot of us dismiss AI because "it can't be trusted to do as good a job as me"
Some of us enjoy learning how systems work, and derive satisfaction from the feeling of doing something hard, and feel that AI removes that satisfaction. If I wanted to have something else write the code, I would focus on becoming a product manager, or a technical lead. But as is, this is a craft, and I very much enjoy the autonomy that comes with being able to use this skill and grow it.
I consider myself a craftsman as well. AI gives me the ability to focus on the parts I both enjoy working on and that demand the most craftsmanship. A lot of what I use AI for and show in the blog isn’t coding at all, but a way to allow me to spend more time coding.
This reads like you maybe didn't read the blog post, so I'll mention that there are many examples there.
Nobody is trying to talk anyone out of their hobby or artisanal creativeness. A lot of people enjoy walking, even after the invention of the automobile. There's nothing wrong with that, there are even times when it's the much more efficient choice. But in the context of say transporting packages across the country... it's not really relevant how much you enjoy one or the other; only one of them can get the job done in a reasonable amount of time. And we can assume that's the context and spirit of the OP's argument.
>Nobody is trying to talk anyone out of their hobby or artisanal creativeness.
Well, yes, they are, some folks don't think "here's how I use AI" and "I'm a craftsman!" are consistent. Seems like maybe OP should consider whether "AI is a tool, why can't you use it right" isn't begging the question.
Is this going to be the new rhetorical trick, to say "oh hey surely we can all agree I have reasonable goals! And to the extent they're reasonable you are unreasonable for not adopting them"?
>But in the context of say transporting packages across the country... it's not really relevant how much you enjoy one or the other; only one of them can get the job done in a reasonable amount of time.
I think one of the more frustrating aspects of this whole debate is the idea that software development pre-AI was too "slow", despite the fact that no other kind of engineering has nearly the same turnaround time as software engineering does (nor do they have the same return on investment!).
I just end up rolling my eyes when people use this argument. To me it feels like favoring productivity over everything else.
Again, this article is not discussing the quality of generative AI. Sanderson clearly believes that AI is already able to produce things that, to his eyes, are indistinguishable from human art.
What this article is trying to get across is that art is a transformative process for the human who creates it, and that using LLMs to quickly generate results robs the would-be artist of the chance for that transformation to take place. Here's a quote from Sanderson:
"Why did I write White Sand Prime? It wasn’t to produce a book to sell. I knew at the time that I couldn’t write a book that was going to sell. It was for the satisfaction of having written a novel, feeling the accomplishment, and learning how to do it. I tell you right now, if you’ve never finished a project on this level, it’s one of the most sweet, beautiful, and transcendent moments. I was holding that manuscript, thinking to myself, “I did it. I did it."
I'm not sure it is. I think his whole stance here is that you should create art for yourself, not because there is some intrinsic use to whatever you create, but because the artist has an insatiable need to create __something__. Creating art is therefore as much an act of personal growth as it is a pastime. In his eyes, to skip that process is to discard that growth.
>I don't know if it will always stay this way, though. If one day I read a novel and I think, this is a great novel. I appreciated it, I felt myself growing from it. And then later I learn it was written by an AI. That's it, that will prove that great AI novels are possible. I will know it when I see it. I haven't seen it yet, but if it happens, I'll know.
That's not what the essay is about. Sanderson spends the first half of the essay examining reasons for his strong feelings against AI. He also touches on the fact that he already struggles to discern generative AI from human art.
Eventually, he concludes that his real objection to generative AI has nothing to do with the quality, and everything to do with the process by which the work was created. He believes (as do I) that focusing solely on the end product of generating a painting or a novel robs would-be artists of the valuable learning experience of failing repeatedly to create art, and then eventually rising past that failure to finish something. In this way, he thinks one of the real hallmarks of art is that it's transformative for the human who creates it, going so far as to state that __humans are the art__ itself.
I mean if I'm totally honest, it could be beneficial to me if something like this comes to be. Even in the developed world, we have a bunch of annoying people who complain/cry constantly about dumb things. They do that instead of doing something, whereas I can excuse the people trapped in hellholes overseas because it really isn't their fault they were oppressed and mistreated.
They'd have their own economy and "life" and leave the rest of us alone. It would be completely transactional, so I'd have zero reason to feel bad if they do it voluntarily.
If they can be happy in a simulated world, and others can be happy in the real world, then everyone wins!
>I think we'll eventually get to the point where these are real time and have consistent representations
You have a dangerously low opinion of your fellow man, and while I sympathize with your frustration, I would humbly suggest you direct that anger at owners of companies/politicians, rather than aim it at your everyday citizen.
Are people seriously dropping hundreds of dollars a month on these products to get their work done?