Peter Link
Retired healthcare IT programmer/analyst; supporter of Palestine, the Cuban Revolution, women’s rights, FOSS, Linux, and Black Lives Matter. Lives in Michigan, USA.
- 105 Posts
- 5 Comments
Joined 1 year ago
Cake day: August 23rd, 2024
Peter Link @lemmy.ml (OP) to World News @lemmy.world • “Why the British Medical Association is speaking out on Gaza” • English • 2 points • 2 months ago
You’re welcome, thanks for letting me know!
Peter Link @lemmy.ml (OP) to World News @lemmy.world • “Palestinians Are Collapsing in Gaza’s Streets From Israeli-Imposed Starvation Campaign” • English • 11 points • 2 months ago
What are you referring to? The “uncommitted” vote in the Democratic presidential primary? Pro-Trump voters, maybe, but uncommitted voters didn’t back Trump; they were trying to move Biden/Harris to stop supporting the genocide, which was well underway back then.
Peter Link @lemmy.ml (OP) to World News @lemmy.world • “UK, France and 23 other countries say the war in Gaza ‘must end now’” • English • 4 points • 2 months ago
And what are they going to DO now?
Peter Link @lemmy.ml (OP) to World News @lemmy.world • “In an Attack at Sunset, Israelis Set a Palestinian Village Ablaze” • English • 13 points • 3 months ago
A Palestinian friend who lived in my neighborhood in the US is there now, in his family’s home. He said it was like three nights in a war zone.
I don’t understand your logic here. Clearly, the kid had problems that were not caused by ChatGPT, and his suicidal thoughts were not started by ChatGPT. But OpenAI acknowledged that the longer an engagement continues, the more likely ChatGPT is to go off the rails, which is what happened here. At first, ChatGPT gave the standard correct advice about suicide hotlines and so on. Then it got darker, telling the kid not to let his mother know how he was feeling. Then it progressed to actual suicide coaching. So I don’t think the analogy to video games holds here.