• Lev@europe.pub · ↑89 · 2 days ago

    Daily reminder that Codeberg is always a good alternative to corporate bastards like this idiot

  • rimjob_rainer@discuss.tchncs.de · ↑100 ↓1 · edited · 2 days ago

    I don’t get it. AI is a tool. My CEO never cared what tools I used, as long as I got the job done. Why do they suddenly think they have to force us to use a certain tool to get the job done? They are clueless, yet they think they know what we need.

    • buddascrayon@lemmy.world · ↑24 · 1 day ago

      Because, unlike the other tools you use, the CEO of your company is investing millions of dollars into AI, and they want a big return on their investment.

    • bless@lemmy.ml · ↑58 · 2 days ago

      GitHub is owned by Microsoft, and Microsoft is forcing AI on all the employees

      • ksh@aussie.zone · ↑3 · 1 day ago

        They all need to be sued for unethical “Embrace, Extend and Extinguish” practices again

    • sobchak@programming.dev · ↑16 · 2 days ago

      I think part of it is because they think they can train models off developers, then replace them with models. The other is that the company is heavily invested in coding LLMs and the tooling for them, so they are trying to hype them up.

    • Jhex@lemmy.world · ↑18 · 2 days ago

      Why do they suddenly think they have to force us to use a certain tool to get the job done?

      Not just that… why do they have to threaten and push people to use a tool that is allegedly fantastic and makes everything better and faster? The answer is that it does not work, but they need to pump the numbers to keep the bubble going.

    • MajorasMaskForever@lemmy.world · ↑14 · 2 days ago

      It’s not about individual contributors using the right tools to get the job done. It’s about needing fewer individual contributors in the first place.

      If AI actually accomplishes what it’s being sold as, a company can maintain or even increase its productivity with a fraction of its current spending on labor. Labor is one of the largest chunks of spending a company has, if not the largest, so reducing it greatly reduces spending, which means that for the same or higher income the net profit goes up. And as always, the line must go up.

      tl;dr Modern Capitalism is why they care

    • 0x0@lemmy.zip · ↑14 · 2 days ago

      They are clueless, yet they think they know what we need.

      Accurate description of most managers I’ve encountered.

    • CeeBee_Eh@lemmy.world · ↑2 ↓1 · 2 days ago

      They are clueless, yet they think they know what we need.

      AI make money line go up. They’re not clueless; he’s trying to sell a kind of snake oil (OK, not quite “snake oil”, I don’t think AI is entirely bad).

  • Jocker@sh.itjust.works · ↑18 · edited · 1 day ago

    Contrary to the title, this message is not aimed at developers; developers don’t care what the GitHub CEO thinks, and he should know it. It’s more likely aimed at the management of other companies, to get them to allow, or even force, AI usage.

  • redlemace@lemmy.world · ↑25 · edited · 2 days ago

    such an easy choice …

    (edit: I followed through and got out. This too is now self-hosted, with Codeberg when needed)

  • zarkanian@sh.itjust.works · ↑234 · 2 days ago

    This part really stuck out for me:

    This is the latest example of a strange marketing strategy by AI companies. Instead of selling products based on helpful features and letting users decide, executives often deploy scare tactics that essentially warn people they will become obsolete if they don’t get on the AI bandwagon.

    If hype doesn’t work, try threats!

    • A_Random_Idiot@lemmy.world · ↑58 · 2 days ago

      Which is how you know they have a good product that they have full faith in:

      when they have to blackmail, threaten, coerce, and force people to accept it.

    • vacuumflower@lemmy.sdf.org · ↑7 · 2 days ago

      Threats work well for scams. People who couldn’t be bothered to move by promises of something new and better can be motivated by fear of losing what they already have.

      It’s really unfortunate that psychology is looked down upon and psychologists are viewed as some “soft” profession. Zuck is a psychology major. It’s been two decades, and most of the radical changes in that time were radical in nothing other than their approach to human psychology.

      BTW, I learned recently that in their first few years the Khmer Rouge were not known as a communist organization even to many of their own members. Just an “organization”. Their rhetoric was agrarian (of course peasants are hard-working, virtuous people, all wisdom comes from peasants working the earth, and those corrupt and immoral people in the cities should be made to work for their food), Buddhist (of course the monk-feudal system of obedience, work and asceticism is the virtuous way to live, though of course we are having a rebirth now, so we are even wiser), monarchist (they invoked Sihanouk’s authority almost to the end) and anti-Vietnamese (the Vietnamese filled the role Jews did for the German Nazis: the designated evil). Even for some time after taking power they still didn’t communicate anything communist. They didn’t even introduce their leadership. Nobody knew who made the decisions in that “organization” or how it was structured. It didn’t have a face. They only made themselves officially visible as Democratic Kampuchea, with communism and actual leaders, when the Chinese pressured them. They didn’t need to, because they were obeyed through the threat (and frequent delivery) of violence anyway.

      This is important in the sense that when you have power, you don’t need to officially tell the people over whom you have it that you rule them.

      So in these two decades it has also come into fashion to deliberately, stubbornly ignore the fact that psychology works on the masses. Everybody acts as if, when there’s no technical means to make people do something, it’s neither likely nor possible.

    • Echolynx@lemmy.zip · ↑12 · 2 days ago

      For some odd reason, this calls to mind an emotionally immature parent trying to get their toddler to eat vegetables… no reason at all…

      • uzay@infosec.pub · ↑6 · 2 days ago

        Except that the vegetables in this case are actually fast food and gummy bears.

  • Fedditor385@lemmy.world · ↑35 ↓1 · edited · 2 days ago

    AI can only deliver answers based on training code that developers wrote by hand, so how do they expect to train AI in the future if there are no more developers writing code themselves? Train AI on AI-generated code? Sounds like guaranteed enshittification down the line. Inbreeding, basically.

    Also, the simple fact is that they’ve invested so much money into AI that they can’t allow it to fail. Such comments never come from people who don’t depend on AI adoption.

    • Showroom7561@lemmy.ca · ↑7 · 2 days ago

      It’s like all those companies that fast-tracked their way to profits by ignoring the catastrophic effects they were having on the environment… down the road.

      Later is someone else’s problem. Now is when AI-pushers want to make money.

      I hate where things have been heading.

    • WhyJiffie@sh.itjust.works · ↑6 · 2 days ago

      Same as how it goes on the stock market: they don’t care about the long term, only the short term. What happens in the long term is somebody else’s problem; you just have to squeeze out everything and know when to exit.

      They are gambling with our lives, but not with theirs. That’s (one of) the problems: they don’t fear for their own lives.

  • ZILtoid1991@lemmy.world · ↑39 · 2 days ago

    Expectation: High quality code done quickly by AI.

    Reality: low-quality AI-generated bug reports being spammed in the hope that the spammers can collect bug bounties for fixing them, with AI of course.

  • medem@lemmy.wtf · ↑38 · 2 days ago

    “Managing agents to achieve outcomes may sound unfulfilling to many”

    No shit, man.

  • ipkpjersi@lemmy.ml · ↑13 ↓1 · edited · 1 day ago

    Threatening remarks like that are why I learned PHPUnit and XDebug, and yeah, it made me a better developer, but oftentimes these are just empty statements.

    AI is just another tool in my toolbox, but it’s not everything.

      • Trapped In America@lemmy.dbzer0.com · ↑84 ↓2 · edited · 2 days ago

        Don’t worry, they’re gonna eat themselves doing shit just like this. It’s not a matter of if, but when.

        “AI” has its uses (medicine, engineering, etc.), but 99.99% of the snake oil they’re selling is just gimmicky cash grabs. Classic case of “just because you can, doesn’t mean you should”.

        Let them burn their money, I say. Fuck it. Just sit back and enjoy the fire.

        • Pika@sh.itjust.works · ↑58 · 2 days ago

          Hard agree. AI is not currently at the stage CEOs think it’s at. A few years down the road there’s going to be a hard crash, when the problems outweigh the benefits and they realize they are just throwing money away. Sadly, this will also be accompanied by an IT/software “sinkhole”, because many who were competent in the field will have moved on to the next thing once the jobs weren’t there anymore.

          Something similar happened with the Nursing field during COVID, prior to the event, there was a steady if not overflow of medical professionals, but when COVID occurred they started being treated like tools, and medical facilities had to pay mad amounts of money for traveling staff who jumped from facility to facility, just to partially make up for the many who left the field. Jump to today and the problem still exists. An educated field like IT or nursing can’t have an event that results in tons of people leaving the profession, as you can’t just snap your fingers and get that knowledge back. It will take years to regain that trust and get people back into these fields.

          • CmdrShepard49@sh.itjust.works · ↑36 · 2 days ago

            AI is not currently at the stage CEOs think it’s at. A few years down the road there’s going to be a hard crash, when the problems outweigh the benefits and they realize they are just throwing money away.

            I think they’re aware, which is why they’re posturing with BS statements like his. They wouldn’t need to force it on people if it were actually as good as they want people to think it is. They want to cash in now because they know the house of cards will crumble sooner rather than later.

          • WanderingThoughts@europe.pub · ↑12 · edited · 2 days ago

            Even more fun: the stock market is propped up by Nvidia and by AI companies buying their chips. If AI crashes, it’s a new financial crisis. And if the market crashes, the layoffs so far were just a warmup.

            • Tollana1234567@lemmy.today · ↑4 · 2 days ago

              They already laid off so many people. And when it does crash (it will), do they expect the programmers/devs they didn’t fire to hold their company over until their next grift, with so few people?

              • WanderingThoughts@europe.pub · ↑1 · 2 days ago

                They’ll expect that and in a lot of cases fail, while China, India and the EU try to buy everything up for cents on the dollar. Then the USA starts to lose its dominant position in digital services, which is now a big part of its exports. Or the government could panic and nationalize the whole sector. It’s not clear how things will turn out, but it’ll be a weird time.

          • jaybone@lemmy.zip · ↑10 · 2 days ago

            I predict it will also hit even more of the somewhat lesser-skilled white-collar office jobs. Insurance adjusters and other insurance-policy-related jobs come to mind. AI will completely fuck this up. There will be massive lawsuits and these companies will go out of business. Same thing with other industries. Once they realize the massive fuckup they made, they’ll try to switch back, but no one will be available to come back. And then they’re fucked. The more industries this happens to, the worse the crash will be, since it affects many diverse industries. It’s a huge recipe for disaster, Great Depression style. And with Trump’s tariffs to fan the flames, we are well on our way.

            • Endymion_Mallorn@kbin.melroy.org · ↑4 · 2 days ago

              Don’t worry, until Trump gets his insurance adjusted by an LLM trained on real data about him (won’t happen), he’ll make executive orders exempting the LLM users from legal action.

              Also, love the Johnny Mnemonic-inspired username.

            • Tollana1234567@lemmy.today · ↑2 · edited · 2 days ago

              It has a cascading effect. It’s already affecting enrollment at state universities out West, because students don’t see a future in their degree; they’re either not choosing a particular four-year university at all, or looking at universities in other areas.

          • MasterBlaster@lemmy.world · ↑5 · 2 days ago

            This is exactly what happened to manufacturing and chip-making over 40 years of “free trade”: we now lack the skilled staff for these jobs.

            Continuing on the nursing topic, well before COVID there was a shortage of nurses; then the media blitz convinced many people to get degrees… There were so many looking for work that wages plummeted.

            It’s all a shell game. The goal is to make the labor supply huge so they can dictate wages, which they did.

            They did it with programmers over the last ten years… Now nobody can find a job.

            I’m shocked! Shocked, I tell you!

            • Tollana1234567@lemmy.today · ↑1 · edited · 2 days ago

              Traveling nursing seems to be the way to go to earn bank. Being staff at a hospital or medical center doesn’t seem attractive, unless you’re in a really backwoods state, like a red one, where they let nurses fall through the cracks just to get hired. Also, because of the pandemic, people weren’t learning anything during their university years, since everything was online and not in person, so they were fucked from the start; that’s why I’m seeing such bad reviews of the universities in my area. The first two years are pretty much crucial for determining your strength in your degree, along with some experience, which was probably nonexistent during COVID, especially for labs and research.

          • corsicanguppy@lemmy.ca · ↑7 · edited · 2 days ago

            Something similar happened with the Nursing field during COVID, prior to the event, there was a steady if not overflow of medical professionals, but

            What drew me to this collection of a full sentence and another fragment spliced in wasn’t the comma splice: it was the perfect example of begging the question.

            I’m still not sure whether the bad writing was accidental or an attempt to divert from the false premise.

            At no time has there been sufficient medical staff.

        • ☂️-@lemmy.ml · ↑3 · edited · 2 days ago

          Yes, but we will be burdened with the consequences somehow. We will be the ones to pay the price, as always.

      • the_q@lemmy.zip · ↑4 · 2 days ago

        You can’t point this out! People will flip the responsibility to you!

    • takeda@lemmy.dbzer0.com · ↑38 · 2 days ago

      Copilot is shit.

      Exactly. My company provides a Copilot license and I use it, and while it has some highlights, most of the time it’s more of a nuisance than a help.

      It especially annoys me because it hijacks the type-based autocomplete with its own suggestions, which frequently have subtle bugs, so if I have it enabled I need to be on guard all the time. With traditional autocomplete I could just trust it to be correct.
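
      For example (a made-up illustration, not from my actual codebase), the suggestions usually look right at a glance but are subtly off:

      ```python
      from datetime import datetime

      def parse_order_date(raw: str) -> datetime:
          """Parse dates like '2025-03-04' (year-month-day)."""
          # Correct, matching the documented format:
          return datetime.strptime(raw, "%Y-%m-%d")

      # The kind of plausible-looking completion that sneaks in instead:
      #     return datetime.strptime(raw, "%Y-%d-%m")
      # It still parses '2025-03-04' without raising an error, it just swaps
      # month and day, which is exactly the sort of bug you only catch if
      # you're on guard.
      ```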

      • Buckshot@programming.dev · ↑19 · 2 days ago

        This is my experience. It saves a bit of typing sometimes, but that’s probably cancelled out by the time spent correcting it, rewriting the nonsense it produced, and reviewing my coworkers’ PRs where they didn’t notice the nonsense.

        • Blooper@lemmynsfw.com · ↑2 · 1 day ago

          Here’s where I’ll give it credit:

          1. It can spit out a beautiful readme.md file
          2. It will insert comments to explain the more nuanced aspects of my code for those viewing it for the first time

          Doesn’t make up for the annoying-ass autocomplete hijacking though. The stupid thing keeps making up non-existent functions and APIs and inserting them all over the place.

      • AdamBomb@lemmy.sdf.org · ↑1 · 2 days ago

        Turn off the autocomplete, it’s shit. Do use agent mode for targeted tasks that are easy but laborious. Don’t give it open-ended or subjective prompts. Don’t ask it to do anything creative or novel. It has its uses: nowhere near what the snake oil salesmen would have you believe, and probably not worth the unsubsidized cost, but for now it has uses.

      • mx_smith@lemmy.world · ↑6 · 2 days ago

        You have to put it in Ask mode so it doesn’t touch your code. Also, the ChatGPT models are free, so if you want to ring up an AI bill, use the Claude and Sonnet models.

      • Tollana1234567@lemmy.today · ↑1 · 2 days ago

        I wonder if this is the reason why it’s so bad on phones; it autocompletes with words that aren’t even close to what you’re typing.

    • NotSteve_@piefed.ca · ↑6 ↓4 · edited · 2 days ago

      Copilot is shit

      Yes and no. I find it’s terrible at solving more complex problems, but it’s great at writing out tests for a function/view that cover every flow. My team went from having like 40% (shit) coverage to every PR having every case tested (inb4 “they’re not good tests”: they are good).
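
      To give a flavour of that (a completely hypothetical sketch, the helper and tests here are invented for illustration, not actual generated output):

      ```python
      # A tiny helper plus the kind of flow-covering pytest cases Copilot is good at churning out.
      import pytest

      def apply_discount(price: float, percent: float) -> float:
          """Return price reduced by percent, never below zero."""
          if percent < 0 or percent > 100:
              raise ValueError("percent must be between 0 and 100")
          return max(price * (1 - percent / 100), 0.0)

      @pytest.mark.parametrize(
          "price, percent, expected",
          [
              (100.0, 0, 100.0),   # no discount
              (100.0, 25, 75.0),   # normal case
              (100.0, 100, 0.0),   # full discount
              (0.0, 50, 0.0),      # free item stays free
          ],
      )
      def test_apply_discount_flows(price, percent, expected):
          assert apply_discount(price, percent) == pytest.approx(expected)

      @pytest.mark.parametrize("percent", [-1, 101])
      def test_apply_discount_rejects_bad_percent(percent):
          with pytest.raises(ValueError):
              apply_discount(100.0, percent)
      ```

      Tedious to write by hand, trivial to review, and the reviewing is the part that actually matters.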

      With that being said, fuck CEOs and fuck AI. At least you could (mostly) escape the blockchain hype

  • aliser@lemmy.world · ↑28 · 2 days ago

    Does “embracing AI” mean replacing all these execs with it? Or is that “too far”?

    • Soup@lemmy.world · ↑9 · 2 days ago

      No, they’re all super special and have an “instinct” that a robot could never have. Of course the same does not go for artists or anyone who does the actual work for these “titans of industry”.

      *by “instinct” we, of course, mean survivorship bias based on what is essentially gambling, exploitation, and being too big to fail.

  • Blackmist@feddit.uk · ↑20 ↓1 · 2 days ago

    I asked an AI to generate me some code yesterday. A simple interface to a REST API with about 6 endpoints.

    And the code it made almost worked. A few fixes here and there to methods it had pulled out of its arse, but they were close enough to the real ones to be an easy fix.

    But the REST API it made code for wasn’t the one I gave it. Bore no resemblance to it in fact.
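
    For context, the kind of interface I was after was roughly this shape (endpoint and method names invented here for illustration, not the real API):

    ```python
    # Minimal sketch of a small REST client wrapper, the sort of thing I asked for.
    import requests

    class WidgetClient:
        """Thin wrapper around a hypothetical /widgets REST API."""

        def __init__(self, base_url: str, token: str):
            self.base_url = base_url.rstrip("/")
            self.session = requests.Session()
            self.session.headers["Authorization"] = f"Bearer {token}"

        def _get(self, path: str, **params):
            resp = self.session.get(f"{self.base_url}{path}", params=params)
            resp.raise_for_status()
            return resp.json()

        def list_widgets(self):
            return self._get("/widgets")

        def get_widget(self, widget_id: int):
            return self._get(f"/widgets/{widget_id}")

        def create_widget(self, payload: dict):
            resp = self.session.post(f"{self.base_url}/widgets", json=payload)
            resp.raise_for_status()
            return resp.json()
    ```

    Not rocket science, which makes it all the more telling that it went off and wrote a client for some other API entirely.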

    People need to realise that MS isn’t forcing its devs to write all code with AI because they want better code. It’s because they desperately need training data so they can sell their slop generators to gullible CEOs.