

Not a new low, it’s an old low applied in new ways.
Also the free cupholder era.
Simple. They get a president to fund and expand their operations.
Why not?
I agree that even if the continent turns into a nuclear wasteland, there will still be people shouting this kind of thing pointlessly into the void.
There’s something that was happening in the past. I’m not sure it still is, since there’s been no news about it. It was called “glamour modeling,” I think, or an extension of it.
Basically, official, legal photography studios took pictures of child models in swimsuits and revealing clothing, sometimes in suggestive positions, and sold them to interested parties.
Nothing untoward happened to the children directly. They weren’t physically abused; they were treated like regular fashion models. And yet it’s still CSAM. Why? Because of the intention behind making those pictures.
The intention to exploit.
I don’t know personally. The admins of the fediverse likely do, considering it’s something they’ve had to deal with from the start. So, they can likely answer much better than I might be able to.
Schools generally mean underage individuals are involved, which makes any content using them CSAM. So in effect, the “AI” companies are generating a ton of CSAM, and nobody is doing anything about it.
You’re expecting it to be used responsibly when we ourselves are, in general, very lacking in that department.
This here is a very good example of the actual use that will happen: a rush job to meet unrealistic deadlines. And that’s what will keep happening, because it’s the norm.
“like it or not, gen AI is becoming an invaluable tool for developers”…
…who wish to take a dump on their work.
They’ve rebranded. It’s N.ICE now.