This feels like a more cogent opinion piece on AI’s potential impact on programming:
AI is coming for your shitty “journalism” jobs writing about AI taking your jobs.
This is a much better article. OP’s article just shows the author’s surface-level understanding of how coding works and of how well an LLM can actually code. There’s way more that goes into a programming task than just the coding itself.
I see LLMs as having the potential of being almost like a super library. I can prompt GPT, Claude, etc. to write me a custom function that I copy, paste, test, scrutinize, and almost certainly change. It’s a tool that will make someone a more productive programmer. It won’t completely subsume a human’s ability to be creative and put the pieces together.
At the absolute worst over the next decade, I could see programming changing from writing and debugging code to prompting, stitching together, and debugging.
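To make that concrete, here’s a toy sketch of that loop (the `parse_duration` helper, its behaviour, and the checks are all made up for illustration, not taken from the article or any real model output): you paste in what the model gave you, then write the tests yourself and decide what actually counts as correct.

```python
# Hypothetical example of the "copy, paste, test, scrutinize" loop.
# parse_duration is the kind of small utility you might ask an LLM for;
# everything here is invented for illustration.

def parse_duration(text: str) -> int:
    """Parse strings like '1h30m' or '45s' into a number of seconds."""
    units = {"h": 3600, "m": 60, "s": 1}
    total, number = 0, ""
    for ch in text.strip():
        if ch.isdigit():
            number += ch
        elif ch in units and number:
            total += int(number) * units[ch]
            number = ""
        else:
            raise ValueError(f"unexpected character {ch!r} in {text!r}")
    if number:
        raise ValueError(f"trailing number without a unit in {text!r}")
    return total

# You still write the checks and decide what "correct" means.
assert parse_duration("1h30m") == 5400
assert parse_duration("45s") == 45
```

The model saves you the typing; the judgment about edge cases, and about what the function should even do, is still on you.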
It’s the same with CAM software in CNC, like sure, If you set it up right (which is a skill in and of itself) it can spit out a decent toolpath, but there’s tons of magic to be done by hand and understanding the way the underlying G code works allows you to make small changes on the fly.
You can’t write this kind of thing if you understand what a programmer does. The biggest part of the job is finding a good way to break down a problem into executable steps, not just actually writing the code.
Executable and maintainable
And AI can’t, as of yet, go back into the code it generated and fix a bug.
Stakeholders struggle to give accurate requirements most of the time; they’re not gonna be programming with ChatGPT anytime soon. AI can really improve a good developer’s output, though.
TL;DR for the linked article
The article discusses how the rise of AI may impact computer science careers going forward. While coding jobs have long been seen as stable career paths, chatbots can now generate code in various languages. Developers are using AI tools like Copilot to accelerate routine coding tasks. Within a decade, coding bots may be able to do much more than basic tasks. However, programmers will still be needed to guide AI toward productive solutions. Teaching coding is also becoming more challenging, as students could use chatbots to cheat. Conceptual problem-solving skills will remain important for programmers to apply their expertise where AI falls short. The future may belong to those who can think entrepreneurially about how technology solves problems.
In the end, what students study may matter less than their ability to apply knowledge to technology challenges.
This comment was generated by a bot. Send comments and complaints via private message.
However, programmers will still be needed to guide AI toward productive solutions
So the career would still be safe; programmers would just be doing different work from what they do now. Same as how other advances in tech stacks mean we do things differently now than we did 30 years ago.
People are very adaptable
Indeed. Do people still use emacs to code, for example?
Technologies evolve. People coding today in COBOL or Fortran are few and far between (but very well compensated).
Do people still use emacs to code, for example?
Umm. Yes.
Hell yea we do use emacs!
Yeah sure, I use Emacs to code. In evil mode. A lot of Emacs users use it to code. Why would you think otherwise?
Do people still use emacs to code, for example?
Sure. Why wouldn’t they?
Yes, that’s the key. I haven’t written assembly code since the 1990s; I use higher-level abstractions to get to the goal more quickly now. AI-generated code is just yet another layer of abstraction away from machine language.
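As a quick illustration of those layers (just a sketch, nothing from the article): Python ships a `dis` module that shows the bytecode your high-level code compiles down to, i.e. the kind of layer most of us haven’t touched by hand in decades.

```python
# Peek one abstraction layer down: the low-level instructions the CPython
# interpreter actually runs for a trivial high-level function.
import dis

def total(prices, tax):
    return sum(prices) * (1 + tax)

dis.dis(total)  # prints the interpreter's low-level instruction listing
```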
I just want to mention the clever graphic design of the illustration by Ben Kothe.
It’s a multilayered visual pun. A visual punion, if you will.
There is still going to be a need for programmers, and I don’t think “AI” is going to be working on its own anytime soon. Even using it as an assistant, you need to know what you want and have some understanding of the code. I feel like most of what we see of “AI” right now is just propaganda so investors will throw money at these companies.