

- you have no clue about licenses
- you have no clue what deterministic means
I can’t keep you from doing what you want, but I will continue to view software developers using LLMs as script kiddies playing with fire.


While I appreciate your differentiated opinion, I strongly disagree. As long as there is no actual AI involved (and considering that humanity is dumb enough to throw hundreds of billions at a gigantic parrot, I doubt we would stand a chance of developing true AI, even if it were possible to create), the output has no reasoning behind it.
A good developer has zero need for non-deterministic tools.
As for potential use in brainstorming ideas / looking at potential solutions: that’s what the Usenet was good for, before those very corporations fucked it up for everyone; the same corporations are now force-feeding everyone snake oil they pretend has some semblance of intelligence.


I didn’t say your particular application, which I know nothing about, is slop; I said success does not mean quality. And if you use statistical pattern generation to save time, chances are high that your software is not of good quality.
Even solar energy is not harvested waste-free (chemical energy and production of cells). Nevertheless, even if it were, you are still contributing to the spread of slop and harming other people. Both through spreading acceptance of a technology used to harm billions of people for the benefit of a few, and through energy and resource waste.


There’s a difference between vibe coding and responsible use.
There’s also a difference between the occasional evening getting drunk and alcoholism. That doesn’t make an occasional event healthy, nor does it mean you are qualified to drive a car in that state.
People who use LLMs in production code are - by definition - not “good developers”. Because:
This already means the net gain from using LLMs is negative. Can you use them to quickly push out some production code & impress your manager? Possibly. Will it be efficient? It might be. Will it be bug-free and secure? You’ll never know until shit hits the fan.
Also: by using LLMs to generate code, a dev will likely be violating open-source copyrights left and right, effectively copy-pasting licensed code from other people without attributing authorship, i.e. they exhibit parasitic behavior & outright violate laws. Furthermore, everything that applies to all users of LLMs applies here too:


Look, bless your heart if you have a successful app, but success / sales is not exclusive to products of quality. Just look around at all the slop that people buy nowadays.
As long as AI helps you achieve your goals and your goals are grounded, including maintainability, I see no issues.
Two issues with that:


I can’t just call everything snake oil without some actual measurements and tests.
With all due respect, you have not understood the basic mechanic of machine learning and the consequences thereof.


As you said, “boilerplate” code can be script generated - and there are IDEs that already do this, but in a deterministic way, so that you don’t have to proof-read every single line to avoid catastrophic security or crash flaws.
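For illustration, here is a minimal sketch of that kind of deterministic generation (a made-up example, not taken from any particular IDE): the generator is just a fixed template, so the same specification always yields the same code, and nothing has to be proof-read more than once.

```python
from string import Template

# The entire "generator" is a fixed template: identical input spec in,
# identical code out, every single time.
CLASS_TEMPLATE = Template(
    "class ${name}:\n"
    "    def __init__(self, ${args}):\n"
    "${assignments}\n"
)

def generate_class(name: str, fields: list[str]) -> str:
    """Emit a plain Python class with a constructor for the given fields."""
    args = ", ".join(fields)
    assignments = "\n".join(f"        self.{field} = {field}" for field in fields)
    return CLASS_TEMPLATE.substitute(name=name, args=args, assignments=assignments)

if __name__ == "__main__":
    # Same spec, same output - review the template once instead of every result.
    print(generate_class("Point", ["x", "y"]))
```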


The kind of useful article I would expect then is one explaining why word prediction != AI.
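Such an article could start with a toy example of what statistical word prediction actually is. The sketch below uses bigram counts; real LLMs are huge neural networks rather than lookup tables, but the point in dispute is the same: the output is the statistically most likely continuation, not reasoning.

```python
from collections import Counter, defaultdict

def train_bigrams(text: str) -> dict[str, Counter]:
    """Count which word follows which in the training text."""
    words = text.lower().split()
    table: dict[str, Counter] = defaultdict(Counter)
    for current, following in zip(words, words[1:]):
        table[current][following] += 1
    return table

def predict_next(table: dict[str, Counter], word: str) -> str | None:
    """Return the most frequent follower of `word`, or None if it was never seen."""
    followers = table.get(word.lower())
    return followers.most_common(1)[0][0] if followers else None

if __name__ == "__main__":
    corpus = "the cat sat on the cat and the cat saw the dog"
    model = train_bigrams(corpus)
    print(predict_next(model, "the"))    # "cat" - the most frequent continuation
    print(predict_next(model, "saw"))    # "the"
    print(predict_next(model, "xyzzy"))  # None - no statistics, no answer
```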


Except that outright dismissing snake oil would not at all be bad business. Calling a turd a diamond neither makes it sparkle, nor does it get rid of the stink.


And then there are actual good developers who could or would tell you that LLMs can be useful for coding
The only people who believe that are managers and bad developers.


That’s because you are not a proper developer, as proven by your comment. And you create legacy tech that will have a net cost in terms of maintenance or downtime.


Problem is that statistical word prediction has fuck-all to do with AI. It isn’t AI and never will be. By “giving it a try” you contribute to the spread of this snake oil. And even if someone came up with actual AI: if it used enough resources to impact our ecosystem instead of being a net positive, and if it were in the greedy hands of billionaires, then using it would be equivalent to selling your executioner an axe.


So there are actual developers who could tell you from the start that LLMs are useless for coding, and then there’s this moron & similar people who first have to fuck up an ecosystem before believing the obvious. Thanks, fuckhead, for driving RAM prices through the ceiling… And for wasting energy and water.
With all due respect, you’re being a pain in the ass by contributing to stealing millions from people who need to buy RAM for a new computer. Pardon my French, but fuck off with that LLM bullshit already.


Because everyone left at the company is as bad as he is or worse?


If designing fonts is part of the work, all you need is a versioning repository like a forgejo or gitlab instance.


I was being aggressive, but not towards another poster; it was aimed at the corporate bullshit culture that requires such a thing as an online repository for free fonts.


I think you are not following the news very attentively - those corrupt EU fuckers are currently working overtime to expose us completely to those very corporations that are so evil that I have no words to insult them as harshly as they deserve.


A mix of any subset of cultures is a net positive compared to isolation from “others”. It’s impossible to jump straight from isolation to one big happy family world-wide. A “subset” that you seem to consider worthy of criticism is a necessary step towards a better future.
And I do take offense when people insinuate that I am proud of anything but my very own accomplishments, because pride in something you had no part in (e.g. patriotism) is among the most moronic sentiments mankind ever came up with.
Let’s make it “every one million dollars over it” and these greedy fucks would still not survive it.