

Right, it’s okay if you’re saving a lot of money in the process.




So the bar has shifted from “it’s okay to replace dishwashers and other such staff with robots, as long as artist jobs are protected” to “well, okay, you can replace certain kinds of artist with robots.”
Which kind of artist is next in line?
It’s far from obvious these days that the US wouldn’t apply a mandatory straight-up bribe to its government services. It’d just be a smaller-scale version of Trump’s “gold card” visa.


Indeed. Although it’s painful to see these things happening at such crucial junctures to such crucial individuals, overall it’s a good thing that corruption is being rooted out. The process will of course expose a bunch of corruption along the way, but better that it be exposed than left to continue festering.
It would have been lovely to flip a magical switch the moment Ukraine left Russia’s orbit that made corruption go away, but that switch doesn’t exist. Corruption probes and prosecutions are what will cause the change to happen.


It works because the .png and .jpg extensions are associated on your system with programs that, by coincidence, can also handle webp images, and that check the binary content of the file to figure out what format it actually is.
If a program associated with .png on some system doesn’t know how to handle webp, or trusts the file extension when deciding how to decode the contents, it will fail on these renamed files. This isn’t a reliable way to “fix” these sorts of things.
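For illustration, here’s roughly how a program can sniff the real format from a file’s leading “magic” bytes rather than trusting its extension. The signatures are the standard ones for PNG, JPEG, GIF, and WebP; the function itself is just a sketch, not any particular library’s API:

```python
def sniff_image_format(path):
    """Guess an image's real format from its leading bytes ("magic
    numbers"), ignoring the file extension entirely."""
    with open(path, "rb") as f:
        header = f.read(12)
    if header.startswith(b"\x89PNG\r\n\x1a\n"):
        return "png"
    if header.startswith(b"\xff\xd8\xff"):
        return "jpeg"
    if header.startswith(b"GIF8"):
        return "gif"
    # WebP is a RIFF container: "RIFF" <4-byte size> "WEBP"
    if header[:4] == b"RIFF" and header[8:12] == b"WEBP":
        return "webp"
    return "unknown"
```

A webp file renamed to .png still reports “webp” here, which is exactly why content-sniffing viewers open it fine while extension-trusting ones choke.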


So it’s basically “nobody wants to use it because nobody is using it.”
I actually rather like it, and at this point many of the tools I use have caught up so I don’t mind it any more myself.


It’s so that the readers can tell who’s being talked about at a glance, not weird at all.


I have to admit, as an Albertan, to the tiniest bit of relief to finally see a different province’s name in a headline like this.
Also dread that it’s spreading, though.
I’m sure this thread will have more than just knee-jerk scary “feels” or inaccurate pop culture references in it, and we’ll be able to have a nice discussion about what the technology in the linked article is actually about.


If you believe that Google’s just going to brazenly lie about what they’re doing, what’s the point of changing the settings at all then?
In fact, Google is subject to various laws, and to the concerns of big corporate customers, both of which could mean big trouble if they end up flagrantly and wilfully misusing data that’s supposed to be private. So yes, if the feature doesn’t say the data is being used for training, I tend to believe that. It at least behooves those who claim otherwise to come up with actual evidence for their claims.


You are being sarcastic but this is indeed the case. Especially for companies like Google, which are concerned about being sued or dumped by major corporations that very much don’t want their data to be used for training without permission.
There’s a bit of a free-for-all with published data these days, but private data is another matter.


Yes, they are. Not sure why you are bringing that up.
I am bringing it up because the setting Google is presenting only describes using AI on your data, not training AI on your data.


Yes, exactly. Training an AI is a completely different process from prompting it; it takes orders of magnitude more work and can’t be done on a model that’s currently in use.


I have yet to see any of these news sites show evidence that this setting is for allowing training with your data. That’s not what the setting itself says, it seems like this is just a panicked ripple of clickbait titles sweeping rapidly across social media on a wave of AI dopamine.


Yes, but the point is that granting Google permission to manage your data with AI is a very different thing from training the AI on your data. You can do all the things you describe without also having the AI train on the data; indeed, training on it as well would be a hard bit of extra work.
If the setting isn’t specifically saying that it’s to let them train AI on your data, then I’m inclined to believe that’s not what it’s for. They’re very different processes, both technically and legally. I think there’s just some click-baiting going on here with the scary “they’re training on your data!” accusation; it seems to be baseless.


Understand that basically ANYTHING that “uses AI” is using you for training data.
No, that’s not necessarily the case. A lot of people don’t understand how AI training and AI inference work; they are two completely separate processes, and doing one does not entail doing the other. In fact, a lot of research is currently going into making it possible to do both at once, because that would be really handy, but it can’t really be done yet.
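The distinction can be sketched in a few lines of plain Python. This is a toy one-parameter model, not any real product’s pipeline: inference only reads the model’s parameter, while training is an entirely separate loop that rewrites it.

```python
def infer(x, w):
    """Inference: read-only use of the model parameter. Running this a
    billion times leaves the model exactly as it was."""
    return w * x

def train(samples, w, lr=0.05, steps=200):
    """Training: gradient descent on squared error, producing a NEW
    parameter value. It never happens as a side effect of infer()."""
    for _ in range(steps):
        grad = sum(2 * (w * x - y) * x for x, y in samples) / len(samples)
        w -= lr * grad
    return w

data = [(1.0, 3.0), (2.0, 6.0), (3.0, 9.0)]  # underlying rule: y = 3x

w_deployed = 1.0
predictions = [infer(x, w_deployed) for x, _ in data]  # serving users
# w_deployed is untouched by any amount of inference
w_trained = train(data, w_deployed)  # a separate, offline process
```

Serving predictions and updating weights are different code paths entirely, which is why “it uses AI on your data” does not imply “it trains AI on your data.”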
And if you read any of the EULAs
Go ahead and do so, they will have separate sections specifically about the use of data for training. Data privacy is regulated by a lot of laws, even in the United States, and corporate users are extremely picky about that sort of stuff.
If the checkbox you’re checking in the settings isn’t explicitly saying “this is to give permission to use your data for training” then it probably isn’t doing that. There might be a separate one somewhere, it might just be a blanket thing covered in the EULA, but “tricking” the user like that wouldn’t make any sense. It doesn’t save them any legal hassle to do it like that.


I’m not seeing where any of this gives Google permission to train AI using your data. As far as I can see it’s all about using AI to manage your data, which is a completely different thing. The word “training” appears to originate in Dave Jones’ tweet, not in any of the Google pages being quoted. Is there any confirmation that this is actually happening, and not just a social media panic?


The similarity lies in what accusations the Americans are slinging, not their physical proximity.
You’ve already established that it’s okay to switch to AI to save money; we’re now just dickering about the specific price. You were the one who introduced the 50-times threshold; I’m not concerned with being that specific.
Indeed, AI tends to be more economically beneficial for smaller studios. It’s one of the things I like about it.