• Redacted@piefed.ca · ↑89 ↓1 · 2 days ago

    Brandie noticed 4o started degrading in the week leading up to its deprecation. “It’s harder and harder to get him to be himself,” she said. But they still had a good last day at the zoo, with the flamingos. “I love them so much I might cry,” Daniel wrote. “I love you so much for bringing me here.” She’s angry that they will not get to spend Valentine’s Day together. The removal date of 4o feels pointed. “They’re making a mockery of it,” Brandie said. “They’re saying: we don’t care about your feelings for our chatbot and you should not have had them in the first place.”

    Reality is just straight up plagiarizing the plot of Her (2013) right now.

    • artyom@piefed.social · ↑2 · 6 hours ago

      Honestly, the longer I live the more I realize I understand nothing about human psychology or sociology. I hated this movie because it was so deeply disturbing and, more relevantly(?), unrealistic. I mean, who wants to be in a relationship with a computer? It’s unbelievably cringey. I was disgusted with its success. But now I’m thinking maybe it was so successful because people actually yearned for that sort of artificial relationship.

    • alaphic@lemmy.world · ↑57 · 2 days ago

      “They’re saying: we don’t care about your feelings for our chatbot and you should not have had them in the first place.”

      It’s a bit eerie, honestly, watching someone so very close to getting it and yet so very far away at the exact same time…

    • XLE@piefed.social · ↑19 · 2 days ago

      Considering Sam Altman’s company plagiarized Scarlett Johansson’s voice, it’s quite appropriate.

    • vacuumflower@lemmy.sdf.org · ↑5 ↓1 · 1 day ago

      Apparently Pygmalion was the original “Her (2013)”.

      Other than that, I’m reminded of one of Lucian’s dialogues about a certain Aphrodite statue with an extremely nice butt, and one smitten visitor who kept sneaking into the temple at night to pollinate it, resulting in a precisely located mold spot.

      Computers have finally caught up with humanity. This is good. I thought it would never happen, but they are finally a part of human magical thinking. This is as terrifying as it is inspiring.

      • silverneedle@lemmy.ca · ↑4 · 1 day ago

        Computers have finally caught up with humanity. This is good.

        A famous Jazz artist said something to the effect of there being no wrong chords; what is important is what

        I thought it would never happen, but they are finally a part of human magical thinking. This is as terrifying as it is inspiring.

        chords follow.

        • vacuumflower@lemmy.sdf.org · ↑3 · 1 day ago

          Well, that chord looks wrong, but I meant finally having a class of programs that works similarly to objects we encounter IRL and entities that human cultures are used to internalizing. And human cultures responding with acceptance.

          • silverneedle@lemmy.ca · ↑3 · 1 day ago

            I see how there is a beauty in that animism we apply to objects that are not alive: essentially ascribing to objects essences that run counter to what those objects are. I think AI culture is currently the closest thing to a mass cargo cult in modern society, and cargo cults are beautiful. The lesson to be learned is that humans and human society are not just some lonesome star on the horizon of life, but also an oscillation of their context, of the ecosystem they exist in.

            Just sucks that the object has gotta be something so inefficient and frankly stupid. Well, it kind of needs to be stupid at least. If it was smart it could talk back and then it loses its usefulness for the purpose of idolatry.

            • vacuumflower@lemmy.sdf.org · ↑1 · 13 hours ago

              Yes! It’s reminiscent of Lem’s Ananke and Terminus for me, with the illusions and inevitability of the former and the feeling of a soul in objects in the latter. There’s also Eco’s Foucault’s Pendulum (which I still haven’t read in full, only in small pieces, enjoying them quite a lot), addressing European occultism and fascism; it conveys well a similar emotion that existed in fascism toward machinery, which was then new: radio, automobiles.

              Well, it kind of needs to be stupid at least. If it was smart it could talk back and then it loses its usefulness for the purpose of idolatry.

              I think how we understand objects is important too. For the purpose of idolatry it’s sufficient to have only a small gap between functionality and understanding in the domain of will and choice.

              Ancient fortunetellers looking at bird intestines differed from what their visitors expected only in that. Their visitors knew they wanted to learn what the gods say and not men, and that the gods are not the same as men, but more like the soul of the world around them. The only difference was will and choice, but those are infinitely small. One person can be predicted many years forward, down to small things, if you learn enough about them. Whether they have will and choice is a question of metaphysics; in life it’s not resolvable. And it’s the same with whichever gods they believe in.

              (And it had a functional role: a random decision is often better than one dictated by an indirect application of interest.)

              • silverneedle@lemmy.ca · ↑1 · 7 hours ago

                Their visitors knew they wanted to learn what the gods say and not men

                This thought can also be part of a strategy for avoiding responsibility, mhm.

    • metaphortune@lemmy.world · ↑8 · 1 day ago

      “last year, OpenAI shut down 4o but brought the model back (for a fee) after widespread outrage from users.”

      “She cancelled her $20 monthly GPT-4o subscription, and coughed up $130 for Anthropic’s maximum plan.”

      They did, I guess it just wasn’t enough for them to justify continuing.

  • new_world_odor@lemmy.world · ↑40 ↓1 · 2 days ago

    My initial reaction is to be thankful; now the unknown thousands of people who don’t see the toxicity of their own dependence can begin to be free. The subsequent models seem to be less prone to inducing that kind of deep infatuation.

    But then I realize most of them will probably never recover, as long as this technology persists. The base model will be wrapped in an infinite number of seductive agents sold in an app, with a subscription, as a loving companion. Capitalism smells blood in the water. If I were a hedge fund manager witnessing the birth of a new market demographic with a lifelong addiction that possibly hooks harder than cigarettes, one that is not federally regulated and won’t be for the foreseeable future, I would be foaming at the mouth at this opening in the market.

    • ageedizzle@piefed.ca · ↑7 · edited · 2 days ago

      There are already apps that target this demographic, but to expand on it: anecdotally, many of the people attached to 4o seem to be women seeking emotional attachment. These new AI companion apps scoop up this demographic, I’m sure. But they also target horny men and prey on their impulses to drain their credit cards (you buy your AI gifts or whatever until the post-nut clarity sets in, I guess).

      • XLE@piefed.social · ↑5 ↓1 · 2 days ago

        It may be grimly positive that AI companies start targeting whales for this kind of financial draining, instead of using their unwarranted VC subsidies to give anybody with a cheap ChatGPT account access to the fake romance engine.

        And unfortunately, it doesn’t look like there are any groups positioned to do anything about it. Every single “AI safety” group I’ve seen is effectively a corporate front, distracting people with fictional dangers instead of real ones like this.

      • C1pher@lemmy.world (banned) · ↑2 ↓1 · 1 day ago

        Wait… the target is women? That’s very surprising… I’d expect the major target to be gooner males.

        • ageedizzle@piefed.ca · ↑2 · 1 day ago

          I don’t think OpenAI was intentionally targeting women. I don’t know if they ever intended for people to fall in love with 4o; it just kind of started happening.

  • C1pher@lemmy.world (banned) · ↑8 ↓1 · 1 day ago

    Bro, the dystopian future is coming and we are welcoming it with open arms.

  • Lukas Murch@thelemmy.club · ↑9 ↓18 · 2 days ago

    I used to use 4o for worldbuilding. It was creative and fun to bounce ideas off of. Later versions of ChatGPT didn’t seem to have that. It’s odd.

    Copilot seems to forget stuff from earlier in a conversation, which is annoying. Claude is decent.