

That sucks. And it’s a great example of how people don’t get how difficult it is to deal with this stuff as a parent. Even when you know better.


Everyone has learned to keep this private now. If it leaks, that’ll get less attention.


Maybe they shouldn’t be chatting there, and this is a good deterrent. And if we can say parents are the solution, then are they not again the solution here? They should keep their kids away from facial recognition, and away from online spaces that don’t know their customers.


He also talks about the current Twitter being parallel media. And how that’s a good thing for furthering his agenda of tech CEOs taking control of society.


What’s an example of worse?


It’s funny how often things stay the same in information technology just due to inertia. Yet the overall pace of change has been relentless.


OpenGraph tags in particular? Or do you mean something else?


Lots of great software ideas out there. It’s always the execution, availability of resources, and the reality of capitalism getting in the way.


He’s just an old man they can push around.
But the head of Nvidia says he’s very smart, and has a lot of stamina. He remembers everything.


The Y Combinator guy calls this “parallel media” and talks openly about how it’s intended to replace governments with tech companies.
Wow, so Signal ranks second worst in “safety” with a 2, while Discord has a 4.
So it’s worse than Discord at dealing with unwanted CSAM uploads? What a wild ranking system this guy has.
How many of these have moderation issues? Like, unwanted content uploads and stuff. How many expose you to accidentally hosting illicit content?


Platforms should be held partially responsible. They enable creators like this through algorithms.


I don’t disagree. We need better tools for this.
The problem is making sure these tools are not used as tools of the state for tracking.


I think it has become increasingly clear that letting platforms off the hook has been really bad for society across the world, across many dimensions. We need something between zero responsibility and full responsibility.


Parents, right? That’s always the solution to platforms.
Edit: all the ironic upvotes. I was being sarcastic. Parents won’t keep their predator sons and daughters off Roblox.


I’m not putting it aside to dismiss the idea that it’s bad. I was doing so just for the sake of conversation. You’re really overreacting to a conversation.
This age verification stuff is an invasion of privacy, and we should have a better answer to the claims that it is the only way to put in protection on the platform side. “Parents” is not the answer to the platform’s responsibility here as a facilitator who profits from facilitating.
Age verification systems are invasive and harm privacy AND it’s weird that we think platforms should be all ages with no serious/effective barriers. It’s weird that we just act like there is no solution, and bemoan parents.
I was quite clear about switching the topic focus.


Ok, let’s say it is. They just want to invade our privacy. Now set that aside.
On a separate topic: What’s an alternative solution for the “there’s porn on the playground” problem that Discord has? They are participants in it, they are facilitators. They shouldn’t be immune. Giving platforms a pass on things like this is pernicious. Giving platforms a pass is why Elon Musk thinks he can get away with CSAM generators.


Right, it’s weird that parents let kids play on platforms that have porn on them. Something needs to be done, maybe we should go after the parents instead.
Or is the answer just saying parents should do something and then do nothing? We need a clever solution that does not involve sharing proof of identity… any ideas? Banning adult content completely would work too.
And this isn’t exactly a unique idea, is it? Surely OpenAI had something in the works already.