Lvxferre [he/him]

I have two chimps within, Laziness and Hyperactivity. They smoke cigs, drink yerba, fling shit at each other, and devour the face of anyone who gets close to either.

They also devour my dreams.

  • 0 Posts
  • 15 Comments
Joined 2 years ago
Cake day: January 12th, 2024

  • I saw in a recent YouTube video that between web services and AI, Windows licensing is only about 10% of Microslop’s business.

    That’s correct. Here’s some data on Microsoft’s revenue:

    40%     Server Products and Cloud Services
    22%     Office Products and Cloud Services
    10%     Windows
     9%     Gaming
     7%     LinkedIn
     5%     Search and News Advertising
    

    IDK if that number is true, but it sure would explain how much they’ve put into user experience.

    It does, but it’s really short-sighted on MS’s part. Sure, Windows might be only 10% of its business, but the other 90% heavily relies on it. Or rather on Windows being a monopoly among desktop OSes; without that monopoly, Windows Server, Office, and MS “cloud services” (basically: we shit on your computer so much you need to use ours) wouldn’t see the light of day.

  • IMO commenters here discussing the definition of CSAM are missing the point. Definitions are working tools; it’s fine to change them as you need. The real thing to talk about is the presence or absence of a victim.

    Non-consensual porn victimises the person being depicted, because it violates the person’s rights over their own body — including its image. Plus it’s ripe material for harassment.

    This is still true if the porn in question is machine-generated and the sexual acts being depicted never happened, like the sort of thing Grok is able to generate. This is what Timothy Sweeney (as usual, completely detached from reality) is missing.

    And it applies to children and adults alike. The only difference is that adults can consent to having their image shared as porn; children cannot. As such, porn depicting children will always be non-consensual, thus always victimising the children in question.

    Now, someone else mentioned that Bart’s dick appears in The Simpsons Movie. The key difference is that Bart is not a child; he is not even a person to begin with, but a fictional character. There’s no victim.


    EDIT: I’m going to abridge what I said above, in a way that even my dog would understand:

    What Grok is doing is harmful, there are victims of that, regardless of some “ackshyually this is not CSAM lol lmao”. And yet you guys keep babbling about definitions?

    Everything else I said here was contextualising and detailing the above.

    Is this clear now? Or will I get yet another lying piece of shit (like @Atomic@sh.itjust.works) going out of their way to misinterpret what I said?

    (I don’t even have a dog.)