

They have been debunked as lie detectors…
…But they can work at scaring the person testifying into giving away more information.
And even if it were more similar, as long as it’s not just reposting someone else’s post, we need more people to post stuff, not fewer.
Different person here.
For me the big disqualifying factor is that LLMs don’t have any mutable state.
We humans have a part of our brain that can change our state from one to another as a reaction to input (through hormones, memories, etc). Some of those state changes are reversible, others aren’t. Some can be done consciously, some can be influenced consciously, some are entirely subconscious. This is also true for most animals we have observed: we can change their states through various means. In my opinion, this is a prerequisite for feeling anything.
Once we use models with bits dedicated to such functionality, it’ll become a lot harder for me personally to argue against them having “feelings”, especially because in my worldview, continuity is not a prerequisite, and instead mostly an illusion.
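To make the stateless-vs-stateful distinction concrete, here is a minimal Python sketch (names and behavior are hypothetical, purely illustrative of the argument above): a plain function whose output depends only on the current input, versus a wrapper carrying a mutable internal value that past inputs can shift, loosely analogous to the hormone- and memory-driven state changes described above.

```python
# A stateless "model": the same input always yields the same output,
# and nothing persists between calls.
def stateless_reply(prompt: str) -> str:
    return f"echo: {prompt}"

# A hypothetical stateful wrapper: an internal "mood" value persists
# across calls and is altered by input, so identical prompts can
# produce different outputs depending on what came before.
class StatefulAgent:
    def __init__(self) -> None:
        self.mood = 0  # mutable state carried between calls

    def reply(self, prompt: str) -> str:
        if "!" in prompt:
            self.mood += 1  # input changes the internal state
        tone = "excited" if self.mood > 0 else "neutral"
        return f"[{tone}] echo: {prompt}"

agent = StatefulAgent()
print(agent.reply("hi"))   # [neutral] echo: hi
print(agent.reply("hi!"))  # [excited] echo: hi!
print(agent.reply("hi"))   # [excited] echo: hi  (the state persisted)
```

The point of the sketch is only that the second object has somewhere for a reaction to "live" after the input is gone, which is what the comment argues current LLM inference lacks.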