• @kescusay@lemmy.world
    140
    4 months ago

    Headline in six months: Salesforce Hires Software Engineers After Realizing Middle Managers Don’t Know How To Turn AI-Generated Code Into Actual Applications

    Being a software engineer is a hell of a lot more than just the actual act of writing code.

    • @SlopppyEngineer@lemmy.world
      31
      4 months ago

      Maybe if we put LLM-powered puppets in the meetings with management, so developers could just continue with their actual work, we’d get a lot more done.

    • @NotSteve_@lemmy.ca
      9
      4 months ago

      Knowing companies, they won’t realise anything and will just make their existing employees pick up the slack

    • @invertedspear@lemm.ee
      35
      4 months ago

      Compete? They don’t need to compete. Their vendor lock in strategy is unbeatable. I have no idea how they continue to scam companies onto their platform, but I don’t know anyone that’s happy with it after a few years (except that one ass hat at every company that somehow keeps moving more business processes to it), and yet I’ve never seen any company successfully get off it.

  • @doeknius_gloek@discuss.tchncs.de
    43
    4 months ago

    “We will have more salespeople next year because we really need to explain to people exactly the value that we can achieve with AI. So, we will probably add another 1,000 to 2,000 salespeople in the short term.”

    Well, good luck!

    I can’t wait for the AI bubble to burst. It’s going to be hilarious to see these kinds of CEOs falling flat on their faces. Unfortunately, it will not be the CEOs who will suffer the most from the consequences.

    • Natanael
      24
      4 months ago

      The funny thing is it’s easier to replace salespeople with AI than developers. They should be losing salespeople first!

      • @peoplebeproblems@midwest.social
        6
        4 months ago

        No man, salespeople are far more important to the bottom line. Profits first, then a working product in the future. It’s genius, no way that model could go wrong.

    • @scarabic@lemmy.world
      5
      4 months ago

      I have never interacted with an enterprise software salesperson as a customer. But I’ve had a ton of them as coworkers since I work in software development. Knowing them from the inside, so to speak, it is impossible for me to imagine how anyone takes them seriously. The only things they actually know or care about are their quota and bonus. How anyone bases a large cash spend on the things they say boggles my mind.

    • @scarabic@lemmy.world
      3
      edit-2
      4 months ago

      I hope it bursts soon. It’s not creating any hiring activity, which is what we little people in the industry need. But it is disruptively shifting things around and stealing funding from everything else as companies panic to put forth some kind of trash so they aren’t seen as being “behind.”

    • @Wooki@lemmy.world
      2
      edit-2
      4 months ago

      Lol, that ain’t happening. They are doing this for short-term gain. The line must go up, and CEO tenures are medium-term: perfect for pumping up their stock value and cashing out as they leave. The next CEO will come in to a crash in stock value, and hard cuts will be the only option.

      So in this case, it’s good for devs, as it’s only happening now while it’s early. Gtfo while you can!

      It’s also worth mentioning that the explanation could be simpler: it’s a distraction from signs of real financial trouble.

  • @WalnutLum@lemmy.ml
    22
    4 months ago

    But he went on to say: “We’re not adding any more software engineers next year because we have increased the productivity this year with Agentforce and with other AI technology that we’re using for engineering teams by more than 30% – to the point where our engineering velocity is incredible. I can’t believe what we’re achieving in engineering.”

    This announcement is just advertising for Agentforce (their AI); they’re likely not being serious about it.

  • @Vipsu@lemmy.world
    15
    4 months ago

    Maybe some dude in his mother’s basement will use AI to develop a good replacement for Salesforce.

  • @Kazumara@discuss.tchncs.de
    12
    4 months ago

    Lol, one of our suppliers switched to them just 1.5 years ago.

    Someone managed to fuck the portal software up so much that all the ö you type in a support case get replaced by o, both in the webview and the emails. The ä and ü work fine. It’s extra fucked.

    And our support team sits in Germany, we write in German sometimes. When we use English it is only for the benefit of their Tier 3 guys.

    Plus, the implementation of two-factor sign-in is now delayed by half a year already. It seems to me a few more developers could be helpful.

  • @Telodzrum@lemmy.world
    -3
    4 months ago

    Makes sense; it’s only reasonable to expect an economy-wide reduction in tech workers and positions as the global workforce recovers from the overtraining and overhiring that were the hallmark of the 2000s and 2010s. This is a good thing; society’s responsibility is to make retraining easy and accessible for the millions of trained tech workers who represent the overage.

  • @gencha@lemm.ee
    -6
    4 months ago

    The sad truth is, we hardly have any software engineers anymore. Trying to find one who is not a prompt monkey has become a serious challenge. New “talent” in particular is a waste of money. You wish it weren’t so, but AI is on par with engineers, especially when those engineers just end up using LLMs. Even people who want to learn now face a poisoned well where facts are impossible to find.

    • @ShittyBeatlesFCPres@lemmy.world
      8
      4 months ago

      I disagree. I used to be a software engineer (and may be again at some point) and the problem with avoiding junior developers is that we need them if we ever want to have any senior developers.

      Also, LLMs don’t replace 90% of what a software engineer does. Copilot or whatever is a nice tool that spits out code. It’s not able to architect shit or choose the right tech to use in the first place.

      And to be honest, it seems like A.I. progress has hit a bit of a wall and the reality is that it may take decades, trillions of dollars, and maybe even an energy revolution to ever reach its imagined potential. Look at full self-driving cars. The tech seemed like it was 90% there about a decade ago but that last 10% of any big project is the real challenge.

      • @gencha@lemm.ee
        2
        4 months ago

        I actually personally fully agree with you.

        I just see a different picture in the industry. Decision makers also use AI to evaluate your work. If the AI judges that your solution is not good, you face more resistance than if you had submitted a solution close to the AI’s expectations. You are inherently incentivized not to introduce original thought beyond what your executives can have explained to them by an AI anyway.

        I fully understand that this is short-sighted behavior, but it’s real bottom-line-thinking of today.

    • @lobut@lemmy.ca
      3
      4 months ago

      I’m a software engineer and you got any sources for this? We use ChatGPT and Copilot and stuff and it helps but it doesn’t seem as dire as what you’re saying from what I can see? At least not yet.

      Salesforce overhired during the pandemic like everyone else and is now selling AI as their efficiency boost or whatever.

      • @gencha@lemm.ee
        2
        4 months ago

        There are few reports of this directly from the industry, because nobody wants to admit a talent shortage. It’s a much better sell to claim that you’re pivoting towards AI.

        I’m an enterprise consultant for technology executives, and work mostly as a platform architect for a global enterprise. The scale of this issue is invisible to most people.

        I know this is basically “trust me, bro”, and I wish I had more to show, but this evolution is in plain sight. And it’s not like AI introduced this problem either. I’m old. Still, take my Internet connection away from me, and watch me struggle to figure out if I want .includes() or .contains() on a JS array. There is a scale.
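        (For anyone wondering about that particular brain fart: JS arrays have .includes(); .contains() is a DOMTokenList method, e.g. on element.classList, not an Array one. A quick sketch, with a hypothetical array name:)

        ```javascript
        // Array.prototype.includes() is the array membership check (ES2016).
        const langs = ["de", "en", "fr"];
        console.log(langs.includes("de"));    // true
        console.log(langs.includes("nl"));    // false

        // .contains() does not exist on arrays; in the browser it lives on
        // DOMTokenList, e.g. element.classList.contains("active").
        console.log(typeof langs.contains);   // "undefined"
        ```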

        The problem is that we’ve reached a point where it’s easier to generate a convenient result that communicates well, instead of the “correct” solution that your executives don’t understand. Decision makers today will literally take your technical concept from your presentation to have it explained to them by an LLM afterwards. They will then challenge you and your concept, based on their interactions with the LLM.

        LLMs are continuously moved towards a customer-pleasing behavior, they are commercial products. If you ask them for something, they are likely to produce a response that is as widely understood as possible. If you, as a supposed expert, can’t match those “communication skills”, AI-based work will defeat you. Nobody likes a solution that points out unaddressed security issues. A concept that doesn’t mention them, goes down a lot easier. This is accelerated by people also using AI to automate their review work. The AI prefers work that is similar to its own. Your exceptional work does not align with the most common denominator.

        You can’t “just Google it” anymore, all results are LLM garbage (and Google was always biased to begin with as well). All source information pools are poisoned by LLM garbage at this point. If you read a stack of books and create something original, it’s not generally understood, or seen as unnecessarily complicated. If you can ask an AI for a solution, and it will actually provide that, and everyone can ask their LLM if it’s good stuff, and everyone is instantly happy, what are the incentives for developers to resist that? Even if you just let an LLM rewrite your original concept, it will still reach higher acceptance.

        You also must step outside of your own perspective to fully evaluate this. Ignore what you believe about LLMs helping you personally for a moment. There are millions of people out there using this technology. I attended seminars with 100+ people where they were instructed on “prompting” to generate technical documentation and compliance correspondence. You have no chance to win a popularity contest against an LLM.

        So why would I need you, if the LLM already makes me happier than your explanations I don’t understand, and you yourself are also inherently motivated to just use LLM results to meet expectations?

        Yes, I know, because my entire enterprise will crumble long-term if I buy into the AI bullshit and can’t attract actual talent. But who will admit it first, while there is so much money to be made with snake oil?