I saw another article today saying how companies are laying off tech workers because AI can do the same job. But no concrete examples… again. I figure they are laying people off so they can pay to chase the AI dream. Just mortgaging tomorrow to pay for today’s stock price increase. Am I wrong?
Yeah, kinda. My coworkers talk to ChatGPT like it actually knows stuff and use it to fix their broken Terraform code.
It takes them a week or longer to get simple tickets done this way. One dude asked for my help last week; we actually LOOKED at the error codes and fixed his shit in about 15 minutes. Got his clusters up within an hour. Normally a week-long ticket, crunched out in 60 minutes by hand.
It feels ridiculous because it’s primarily senior tech bro engineer types who fumble their work with this awful tool.
I have never seen a clearer correlation between the value I observe being produced and whether someone understands the limitations (and value) of LLMs.
It’s exhausting, because, yes, LLMs are extremely valuable, but only insofar as they solve the problem of “possible suggestions”, never “answers and facts”. For some reason (I suppose the same reason bullshit is a thing) people conflate the two. And not just any “people”, either, but IT developers and IT product managers, all the way up. The ones who have every reason to know better are the ones who seem utterly clueless about what problems it solves well, what it’s irresponsible to use it for, how to correctly evaluate the ethics, privacy, and security, etc. Sometimes I feel like I’m in a madhouse, or just haven’t found the same hallucinogen everyone else is on.
I’m seeing layoffs of US workers, who are then being replaced by Indian, South American, and Irish nationals… not AI. But they’re calling it AI.
We need to figure out two words for AI that represent offshoring. I can’t think of any though.
AI = Actually Indians
If you want an example: my last job in telecom was investing hard in automation. It did a poor job at the beginning, but it got better and better, and while humans were still needed, we had less work to do. Of course, that meant that when someone left, my employers didn’t look for a replacement.
To be honest, I don’t see AI doing the job of tech workers to the point where we should worry now… But in 20 years? That’s another story. And in 20 years, if I get fired, probably no one will want to hire me, so I’m already working on a plan B.
20 years? The way they talk it’s gonna happen in 20 weeks. Obviously, they exaggerate, but it does seem we are on the edge of something big.
Yes, IMO tech is moving towards getting easier.
I’m not saying it is, but I bet that in a couple of years you’ll be able to spin up a corporate website-management platform on a 50€ Raspberry Pi instead of having a whole IT department managing email, web servers, and so on.
Things are getting easier and easier IMO.
Yeah, when I said 20 years I wanted to express something that feels distant; I think we’ll see a big change sooner. To be honest, the plan B I’m working on, I’m trying to make happen ASAP, hopefully next year or in two. I may be overreacting, but personally I’m not going to wait for the drama to really begin before taking action.
deleted by creator
I called Roku support for a TV that wasn’t working and 90% of it was an LLM.
All the basic troubleshooting, including factory-resetting the device, seemed to be covered, and then they would forward you on to the manufacturer if it wasn’t repaired, because at that point they assume it’s likely a hardware issue (backlight or LCD), and I’m sure they want to get you to someone who can confirm that and sell you a replacement.
Some things, like image recognition and text classification, are way, way easier using pretrained transformers.
As for generating code: I already used to spend a lot of time chasing bugs juniors made but couldn’t figure out. The process of making such bugs has now been automated.
I took some obviously AI-generated code (it had comments, so I know they didn’t write it) from an offshore senior engineer, asked ChatGPT what was wrong with it, and sent the result back to the guy… because it was right.
I think quite the opposite: AI is making each tech worker more efficient at the simple tasks AI is capable of handling, while leaving the complex, high-skill tasks to humans.
I think people see human output as a zero-sum game: if AI takes a job, then a human must lose a job. I disagree. There are always more things to do: more science, more education, more products, more services, more planets, more stars, more possibilities for us as a species.
Horses got replaced by cars because a horse can’t invent more things to do with itself. A horse can’t get into the road-building industry or the drive-through industry, etc.
More science to do… made me think of portal. :)
I was channelling the lemons
There are lots of types of work in the tech space. The layoffs I’ve seen have impacted sales and marketing (it probably happens elsewhere too), because AI makes the day-to-day work efficient enough that they don’t need as many people.
Had a new hire try to do all his automation programming in python with an AI. It was horrifying.
Lists and lists and lists of if/else statements that caught when a button errored, but never checked whether it did the right thing. 90% of their bug reports were directly due to their own code. Trivially provable.
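The anti-pattern described above can be sketched in a few lines. This is a made-up illustration, not the new hire’s actual code: `Page`, `click_button`, and `broken_button` are hypothetical stand-ins.

```python
# Hypothetical sketch: a check that only notices explicit errors,
# vs. one that verifies the intended outcome actually happened.
# None of these names come from the thread.

class Page:
    def __init__(self):
        self.submitted = False

def click_button(page):
    page.submitted = True   # the UI action actually takes effect

def broken_button(page):
    pass                    # silently does nothing, raises no error

def error_only_check(page, action):
    # the reported pattern: "pass" as long as nothing blows up
    try:
        action(page)
        return True
    except Exception:
        return False

def outcome_check(page, action):
    # verify the state we actually wanted, not just the absence of errors
    action(page)
    return page.submitted
```

With the error-only check, `broken_button` still “passes”; only the outcome check catches that nothing happened, which is roughly why so many of those bug reports traced back to the test code itself.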
Work keeps trying to tell us to use more AI but refuses to say whether the training data includes company emails. If it does, then a buttload of unlabeled non-public data is getting fed into it. So it’s only a matter of time until a “fun fact” from the AI turns into a nightmare.
Most of our stuff is in an obscure field with outdated code, so any coding assistance is not really that impressive.
Without saying too much, my company implemented innovative “AI” applications to reduce time wasted by certain workflows. I think I don’t have to worry about job security for the next decade…
I work for a web development agency. My coworkers create mobile apps; they start off with AI building the app skeleton, then refine things manually.
I work with PHP and some JavaScript, and AI helps me optimize my code.
Right now AI is an automation tool that helps developers save time for better code, and it might reduce the size of development teams in the near future. But I don’t see that yet, and I certainly don’t see it replacing developers completely.
No, it’s basically filling the role of an autocomplete and search function for codebases. We’ve had this for a while, and it generally works better than a lot of what we’ve had in the past, but it’s certainly not replacing anyone any time soon.
I don’t know Python, but I know bash and PowerShell.
One of the AI things completely reformatted my bash script into a Python one the other day (that was the intended end result), so that was somewhat impressive.
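For a sense of what that kind of conversion looks like: the commenter’s actual script isn’t in the thread, so this is a made-up stand-in. A bash loop like `for f in *.log; do wc -l "$f"; done` could become stdlib Python along these lines:

```python
# Toy stand-in for a bash-to-Python rewrite; NOT the commenter's script.
# Mirrors `for f in *.log; do wc -l "$f"; done` using only the stdlib.
from pathlib import Path

def line_counts(directory, pattern="*.log"):
    # one entry per matching file, like `wc -l` inside the bash loop
    return {p.name: len(p.read_text().splitlines())
            for p in sorted(Path(directory).glob(pattern))}
```

The translation is mechanical enough (glob, iterate, count) that it’s the kind of task current tools handle reasonably well.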