We used to build software, but increasingly we build coding assistants


I think I've changed my mind on this: the software engineering profession as we know it is now under threat from AI. A few months ago I had not yet concluded this. I argued that the cost of certain types of software development would surely drop, but that there was so much latent demand it wasn't clear this would necessarily devastate employment and salaries. But I think the balance of evidence has shifted: it's now a distinct possibility (though not a certainty) that AI will noticeably erode software employment and compensation, and soon.

A couple thoughts:

Many line-of-business problems that used to require a software engineer somewhere along the way will soon fall to no-code / low-code platforms. In other words, the Excels and Google Sheets of the world are about to become even more important. I've seen plenty of evidence that enterprising folks (in marketing, sales, product, and miscellaneous generalist roles) can iteratively prompt and tweak their way to working solutions with AI, even if the first version has bugs. This will lead to an explosion of customized software purpose-built for every use case (the sort of thing that already accretes at larger businesses, but will now happen at every business). People will whine that this creates unmaintainable messes of spaghetti code, a fate worse than death. In all likelihood, the world will keep turning, even if (far fewer) professionals have to come in and clean up the occasional mess.

Elsewhere, the low end of frontend web development is rapidly being mechanized. We used to hand-code a lot of HTML, and HTML generation is almost uniquely suited to large language models. Old-timers will remember how we laughed at graphical builders like Dreamweaver because of how bad their output was[1], but HTML as a language is *much* more error-tolerant than traditional Turing-complete programming languages, so LLMs can make mistakes and still produce working pages. That makes it the perfect place to automate away a lot of code generation, and I think this is bad news, because low-end frontend work is a common way to break into the software engineering profession: the feedback loops are short and visual. Cutting off the very bottom of the career ladder of software development is not good for industry employment in the long term, to say the least. It is already destroying the informal gig economy of web development, especially in the developing world:

Free AI tools are killing South Africa's web designer job market
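As an aside, that error tolerance is easy to demonstrate. Here's a small sketch using Python's built-in (lenient) HTML parser: it happily digests broken markup and recovers the structure, while the same level of sloppiness in a programming language is a hard failure.

```python
# Illustration of HTML's error tolerance: a lenient parser recovers from
# broken markup, while sloppy Python is simply rejected by the compiler.
from html.parser import HTMLParser

class TagCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.tags = []

    def handle_starttag(self, tag, attrs):
        self.tags.append(tag)

# Unclosed <p> and <b>, plus a stray </i>: no exception, tags still recovered.
collector = TagCollector()
collector.feed("<p>Hello <b>world</i>")
print(collector.tags)  # ['p', 'b']

# An equally sloppy line of code in a strict language is fatal:
try:
    compile("def f(:\n    pass", "<snippet>", "exec")
except SyntaxError:
    print("SyntaxError")  # one typo, zero output
```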

In the short term, current full-time professionals won't notice, because there is no direct and visible connection to job loss. It recalls the boiling frog analogy: the current generation of coding assistants is nowhere near good enough to replace a human outright. It's just that a bunch of needs that used to build up over time, and eventually result in an engineer being hired somewhere, will now be delayed. Zoom out to the entire economy and this becomes a very big deal, even if no one ever hears the words "we're laying you off and replacing you with an AI that writes software." The demand curve bends gently downward as productivity continuously increases, just as the frog is slowly boiled.

How to program anything in 2024: 1. Write a very good documentation. 2. Let Opus generate the code from it. 3. Verify, polish and test. 4. Done.
Via Twitter: some engineers now have software development workflows like this (Opus refers to Claude 3 Opus, currently the most capable model from Anthropic, an artificial intelligence company).
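The workflow in the tweet is simple enough to sketch in code. In this sketch, all names are hypothetical and the model call is stubbed out with a stand-in function; a real version would call an LLM API in step 2.

```python
# A minimal sketch of the documentation-first workflow from the tweet.
# Names are hypothetical; `generate` stands in for a real LLM API call.

def documentation_first(spec: str, generate, verify) -> str:
    """1. Write good documentation (spec). 2. Generate code from it.
    3. Verify, polish, and test. 4. Done."""
    code = generate(spec)                 # step 2
    if not verify(code):                  # step 3: never trust raw output
        raise ValueError("generated code failed verification")
    return code                           # step 4

# Stand-in "model" and verifier, purely for illustration:
def fake_generate(spec: str) -> str:
    return "def add(a, b):\n    return a + b"

def verify_add(code: str) -> bool:
    ns = {}
    exec(code, ns)
    return ns["add"](2, 3) == 5

result = documentation_first("add(a, b) returns the sum of a and b",
                             fake_generate, verify_add)
print("def add" in result)  # True
```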

By the way, coding assistants used by professional engineers will get better as feedback from the first generation (essentially prototypes built in vitro) becomes training data for the next. This is likely a genuine and durable first-mover advantage: there's a positive feedback loop in which increased coding assistant adoption directly results in a better product. I think there are also innovations coming in interaction models that will make assistants far more powerful than "autocomplete, but for code." GitHub's Copilot Chat and Cursor (made by a company called Anysphere) are giving us a glimpse of that future: they can analyze code and suggest refactors, explain subtle errors and workarounds, read and summarize modules, and even directly make changes or add features in response to inline prompts:

Video from: https://cursor.sh

The underlying base models of coding assistants are improving with more training and better architectures - the exact degree of stepwise improvement on a specific task is hard to predict from generation to generation, but we have enough data now that the long-term trend is undeniable: more model parameters and more data lead to better capabilities. And this is before we factor in hardware improvements that give us free performance - more compute and more memory for both better training and faster inference.

The implications for total software employment are not encouraging. My argument is not that software engineering as we know it will become obsolete. Rather, we have introduced an additional layer of abstraction (code generation capability) that makes the previous layer (writing code in a high-level language) much more productive for specialists and far more accessible to generalists. The result is that demand is shifting away from the straightforward work that used to be entry-level software development and toward AI/ML engineers and GPUs, for a net loss. You don't need large changes for large effects: in 2023, it's likely that at most 6% of tech workers in the United States were laid off [2]. This was widely regarded as a bloodbath! If we're conservative and say AI makes software engineers 5% more productive every year, that could imply a bloodbath every year for the foreseeable future, unless demand picks up.

I am a little worried. It's possible that the signs were there all along, and that I didn't realize it sooner because of wishful thinking. No more.

[1] There were a lot of interesting reasons for this, but one of them was that simple mechanical transformations and edits, repeatedly applied to a nontrivial HTML document, create a bizarre, nonsensical mess of mostly-useless junk markup that no human would ever write - and consolidating and fixing it without introducing errors requires a higher-level understanding of the document's structure. The technology just didn't exist back then.
[2] There were 4.4 million workers with jobs classified as "Computer and Information Technology Occupations" in the United States in 2022, according to the BLS. In 2023, layoff trackers estimated that technology companies laid off about 263,000 employees. If we assume everyone counted was in the United States (they weren't), that works out to just under 6%. This number is imperfect: the BLS includes nearly 1 million "computer support specialists" in that 4.4 million, and those aren't software engineering jobs - but the layoff trackers also include non-software jobs, so we're overcounting on both sides and an unknown quantity of error cancels out.
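For the record, the back-of-the-envelope arithmetic behind that figure:

```python
# The math behind footnote [2], using the numbers cited there.
tech_workers_2022 = 4_400_000  # BLS, "Computer and Information Technology Occupations"
layoffs_2023 = 263_000         # layoff tracker estimate for technology companies

share = layoffs_2023 / tech_workers_2022
print(f"{share:.2%}")  # 5.98% - i.e. just under 6%
```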