

Really?


Really?


The thing is: LLMs do not accelerate the progress of proper software development. Your processes have to be truly broken for you to experience a net gain from using LLMs. They enable shitty coders to output pull requests that look like they were written by someone competent, thereby wasting the time of the skilled developers who review those pull requests out of respect for the contributor, only to find out they are utter garbage.


The problem is that the widespread use of (and thereby provision of your data to) LLMs contributes to the rise of totalitarian regimes, wage slavery and the destruction of our planet’s ecosystem. Not a single problem in any of our lives is important enough to justify this. And convenience, because we are too lazy to think for ourselves or to do longer (more effortful) web research, is definitely not a good excuse to be complicit in murder, torture and ecoterrorism.


I’ve become increasingly comfortable with LLM usage, to the point that my self from last year would hate me. Compared to projects where I’d be deep into Google, Reddit and Wikipedia, ChatGPT gives me pretty good answers much more quickly, and far more tailored to my needs.
Please hate yourself, reflect on that, and walk back from contributing to the destruction of the environment by furthering the widespread adoption of this shitty technology. The only reason you seem to get “useful answers” is search engine and website enshittification. What you are getting is still tons worse than good web research was 10 years ago.
Basically you were taught to enjoy rancid butter because all restaurants around you had started tasting like shit first, then someone opened a rancid butter shop.


let’s make it “every million dollars over it” and these greedy fucks would still not survive it.


I can’t keep you from doing what you want, but I will continue to view software developers using LLMs as script kiddies playing with fire.


While I appreciate your differentiated opinion, I strongly disagree. As long as there is no actual AI involved (and considering that humanity is dumb enough to throw hundreds of billions at a gigantic parrot, I doubt we would stand a chance of developing true AI, even if it were possible to create), the output has no reasoning behind it.
A good developer has zero need for non-deterministic tools.
As for potential use in brainstorming ideas / looking at potential solutions: that’s what Usenet was good for, before those very corporations fucked it up for everyone. Those same corporations are now force-feeding everyone snake oil, pretending it has some semblance of intelligence.


I didn’t say your particular application that I know nothing about is slop, I said success does not mean quality. And if you use statistical pattern generation to save time, chances are high that your software is not of good quality.
Even solar energy is not harvested waste-free (chemical waste from producing the cells). Nevertheless, even if it were, you would still be contributing to the spread of slop and harming other people: both by spreading acceptance of a technology used to harm billions of people for the benefit of a few, and by wasting energy and resources.


There’s a difference between vibe coding and responsible use.
There’s also a difference between the occasional evening getting drunk and alcoholism. That doesn’t make an occasional event healthy, nor does it mean you are qualified to drive a car in that state.
People who use LLMs in production code are - by definition - not “good developers”. Because:
This already means the net gain from using LLMs is negative. Can you use them to quickly push out some production code and impress your manager? Possibly. Will it be efficient? It might be. Will it be bug-free and secure? You’ll never know until shit hits the fan.
Also: by using LLMs to generate code, a dev will likely be violating open-source copyrights left and right, effectively copy-pasting other people’s licensed code without attributing authorship, i.e. they exhibit parasitic behavior and outright violate the law. Furthermore, everything that applies to all users of LLMs applies here:


Look, bless your heart if you have a successful app, but success and sales are not exclusive to quality products. Just look around at all the slop that people buy nowadays.
As long as AI helps you achieve your goals and your goals are grounded, including maintainability, I see no issues.
Two issues with that:


I can’t just call everything snake oil without some actual measurements and tests.
With all due respect, you have not understood the basic mechanic of machine learning and the consequences thereof.


As you said, “boilerplate” code can be script generated - and there are IDEs that already do this, but in a deterministic way, so that you don’t have to proof-read every single line to avoid catastrophic security or crash flaws.
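To illustrate the point (a minimal sketch, not any particular IDE’s implementation): a template-based generator is a pure function of its input, so the same input always produces the same, auditable output. You review the template once instead of proof-reading every generated line.

```python
# Deterministic boilerplate generation: same input -> same output, every time.
# The class name and field list below are illustrative examples.

def generate_class(name: str, fields: list[str]) -> str:
    """Emit Python source for a class whose __init__ stores each field."""
    lines = [f"class {name}:"]
    params = ", ".join(fields)
    lines.append(f"    def __init__(self, {params}):")
    for field in fields:
        lines.append(f"        self.{field} = {field}")
    return "\n".join(lines)

source = generate_class("Point", ["x", "y"])
print(source)

# Unlike an LLM, repeated calls are guaranteed identical:
assert generate_class("Point", ["x", "y"]) == source
```

Because the generator is deterministic, it can be unit-tested once and trusted thereafter; there is no per-invocation risk of a hallucinated security or crash flaw.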


The kind of useful article I would expect, then, is one explaining why word prediction != AI.


Except that outright dismissing snake oil would not at all be bad business. Calling a turd a diamond neither makes it sparkle, nor does it get rid of the stink.


And then there are actual good developers who could or would tell you that LLMs can be useful for coding
The only people who believe that are managers and bad developers.


That’s because you are not a proper developer, as proven by your comment. And you create legacy tech that will have a net cost in terms of maintenance or downtime.


The problem is that statistical word prediction has fuck-all to do with AI. It is not AI and never will be. By “giving it a try” you contribute to the spread of this snake oil. And even if someone came up with actual AI: if it used enough resources to impact our ecosystem instead of being a net positive, and if it were in the greedy hands of billionaires, then using it would be equivalent to selling your executioner an axe.


So there are actual developers who could tell you from the start that LLMs are useless for coding, and then there’s this moron and similar people who first have to fuck up an ecosystem before believing the obvious. Thanks, fuckhead, for driving RAM prices through the ceiling… and for wasting energy and water.
If they were reasonable, they wouldn’t be a ruling “class”.