Opinion: The End of Software - "Majoring in computer science today will be like majoring in journalism in the late '90s"

Link 1 | Link 2 | Archive 1 | Archive 2

To understand how software will change, we can benefit from studying how technology has changed other industries. History tends to rhyme, if you listen.

Before the internet, media behaved very differently—it was expensive to create. You had to pay people to make content, edit it, and distribute it. Because content was expensive to create, it had to make money. And consumers paid—newspapers, magazines, books, cable, and pay per view. Warren Buffett famously loved newspapers—and who wouldn’t love a predictable subscription business with local monopolistic dynamics?

When the internet happened, media companies viewed it as a way to reach broader audiences and reduce their distribution costs. But what no one saw coming was that the internet not only reduced distribution costs to zero, but it also drove the cost of creating content to zero. User-generated content flourished, and when content doesn't cost anything to create, it no longer has to make money. How does content behave when it no longer has to make money? The relaxation of this economic constraint led to a Cambrian explosion–you can take a picture of a cup of coffee and post it, and whether it gets a million views or none at all, the market-clearing price is still met. This produced a deluge of content that none of us could reasonably consume. This necessitated products to direct attention, merchandise this content, and route us effectively–we understand these now as user-generated content platforms.

These platforms completely T-boned media companies. As a media company, you were competing for the same attention of users, but with a strictly higher COGS. The more people you had on your payroll that were creating content, the more exposed you were to being flanked by user-generated content platforms. Structurally, investing in media has been a losing value proposition ever since and value creation has shifted entirely to the platforms that control distribution.

Software is expensive to create. You have to pay people to create it, maintain it, and distribute it. Because software is expensive to create, it has to make money. And we pay for it–software licenses, SaaS, per-seat pricing, etc. Software margins have historically been the envy of other business models–90+% margins and zero marginal cost of distribution.

Software is expensive because developers are expensive. They are skilled translators–they translate human language into computer language and vice-versa. LLMs have proven themselves to be remarkably efficient at this and will drive the cost of creating software to zero. What happens when software no longer has to make money? We will experience a Cambrian explosion of software, the same way we did with content.

Vogue wasn’t replaced by another fashion media company, it was replaced by 10,000 influencers. Salesforce will not be replaced by another monolithic CRM. It will be replaced by a constellation of things that dynamically serve the same intent and pain points. Software companies will be replaced the same way media companies were, giving rise to a new set of platforms that control distribution.

SaaS, ARR, magic numbers–these are all shorthand to understand the old model of business building in software, one where the expense associated with creating software was a moat. The invisible hand has been stayed in software for a long time, but LLMs will usher in its swift, familiar corrective force. Majoring in computer science today will be like majoring in journalism in the late '90s.
 
Coding is more like manufacturing than journalism. It's much cheaper to get Ranjeet, Chang or Oleg to write code while living 25 to a room in a shithole than it is to use white nerds.
The majority of coders writing shitty JavaScript will be culled by crappy LLM bots. A smaller amount of coders who actually know what they are doing and are critical to the world working will stick around for at least another decade or two. You need a real AI to replace those, not LLM junk.
 
The majority of coders writing shitty JavaScript will be culled by crappy LLM bots. A smaller amount of coders who actually know what they are doing and are critical to the world working will stick around for at least another decade or two. You need a real AI to replace those, not LLM junk.
Coding is more like manufacturing than journalism. It's much cheaper to get Ranjeet, Chang or Oleg to write code while living 25 to a room in a shithole than it is to use white nerds.
Its a spending and education problem, its not like working in journalism per se but its like working in any modern industry where the quality of work goes down due to spending going down and that spirals off into education deprioritizing work quality as the industry does not incentivize that. Like how anime embraces 3d, Disney embraces calarts and everybody is embracing AI, tech is embracing Javascript niggers who know nothing about code basics. Also all softwares are sold as services nowadays which is just cancer.
 
Has AI become such a dirty word we need euphemisms like "LLM" to pretend what they're doing is something different? In any event, that article was corporate jargonspeak to the point of gibberish.

I suspect this whole AI thing is going to be analogous to the flying-car or "electricity too cheap to meter" claims of the 1950s, but time will tell, I guess. I mean, I'm basically impressed by what Elon Musk has accomplished, but even I wince at some of the nonsense the guy continues to spout.

Warren Buffett famously loved newspapers
He didn't love newspapers as newspapers; he loved that in pre-internet times they had a stranglehold on classified advertising, free from any real competition in their local markets, and generated a steady, predictable cash flow. Once this was no longer the case, he pretty quickly divested himself of them. Not really sure what that proves.

This necessitated products to direct attention, merchandise this content, and route us effectively–we understand these now as user-generated content platforms.

Salesforce will not be replaced by another monolithic CRM. It will be replaced by a constellation of things that dynamically serve the same intent and pain points

As an aside, it's kind of impressive that you can contradict yourself in no more than three sentences. Unless somebody is positing that what we have currently is a "constellation" of anything but Facebook, Twitter, Instagram, Reddit, YouTube, TikTok, etc. ... a bunch of Cokes, all without a credible Pepsi in sight. Hell, I'm not even sure there's a competitor to any of them that rises to the level of RC Cola.
 
Has AI become such a dirty word we need euphemisms like "LLM" to pretend what they're doing is something different?

The joke lately in the software world is that AI stands for "Anonymous Indians" since so many "AI" companies are just pajeet mechanical turk operations carefully hidden behind a flashy corporate memphis homepage.
 
Software is expensive because developers are expensive. They are skilled translators–they translate human language into computer language and vice-versa. LLMs have proven themselves to be remarkably efficient at this and will drive the cost of creating software to zero. What happens when software no longer has to make money? We will experience a Cambrian explosion of software, the same way we did with content.
Of course it's bullshit because, even if there were some magic black box that writes any code you want, you'd still need some guy to actually translate what you want into the logical blocks. Yeah, AI can create shitty code (and it's already been dumbed down to irrelevance), but it will never reach the level of giving you a finished product for everything you can imagine.
 
LLMs have proven themselves to be remarkably efficient at this and will drive the cost of creating software to zero.
this is an extremely optimistic view of the situation.

i remember articles about how AI was gonna put lawyers out of business because you can tell chatGPT to write legal documents, only for those legal documents to end up containing hilarious errors and nonsense like false citations and references to cases that do not exist at all.

LLMs are good at creating things that superficially resemble other things that the model has seen, without any actual understanding of the things. this is great for applications like image generation where 'it looks very similar to the real thing' is really all you need. but if you apply this to code generation you get a wall of text that is formatted to look like code and (mostly) follows the correct syntax, but what happens when you actually try to run the code is a complete coinflip. maybe it just refuses to compile because it tries to include external libraries that do not exist, maybe it crashes with a segmentation fault, maybe it produces outputs that are nonsensical.
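to make the 'libraries that do not exist' failure concrete, here's a minimal hypothetical sketch of what that kind of output looks like: the package name "fastchartz" is made up to stand in for a hallucinated dependency, so the snippet reads like plausible Python but dies at the first import:

```python
# hypothetical LLM output: syntactically valid Python, but the imported
# package "fastchartz" does not exist, so running this fails with
# ModuleNotFoundError before any of the actual logic executes.
import fastchartz


def plot_revenue(rows):
    chart = fastchartz.BarChart(title="Monthly revenue")
    for month, value in rows:
        chart.add_bar(label=month, height=value)
    return chart.render()
```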

to give an analogy that non coders can understand - you can use generative AI to create pictures of big tiddy anime girls that look pretty hot, and that's fine because looking pretty is all you need your anime pictures to do. but if you use it to create a blueprint for a skyscraper, looking like a real blueprint isn't good enough, and if you try to actually construct a building in real life based on this blueprint you have absolutely no idea if it will hold up or if it will randomly collapse, because the AI does not have the capacity of understanding the structural engineering that goes into making sure that real blueprints are actually safe and sound.

that's kind of where AI generated code is at.
 
I asked AI to generate a grand staff with all the notes in position with note names, and it gave me something with moon runes. Will AI replace coders? Maybe, but I wouldn't trust the programs they write. It's a lot like companies going to India for their new consulting company. It will probably cost them more in the long run, with people either needing to redo large portions of the codebase or shoulder the cost of maintaining a car built out of duct tape.
I've personally seen 5 such projects as a consultant. The company thought it would be better to hire a cheaper firm, and they got what they paid for.
 
Has AI become such a dirty word we need euphemisms like "LLM" to pretend what they're doing is something different? In any event, that article was corporate jargonspeak to the point of gibberish.
"AI" means too many things to too many people and marketing departments. It particularly evokes ideas of literal strong/sapient intelligence, which isn't in play yet. Large Language Model (LLM) narrows it down to the dumb text generating algorithms which seem like magic sometimes but are massively overhyped.
 
Coding is more like manufacturing than journalism. It's much cheaper to get Ranjeet, Chang or Oleg to write code while living 25 to a room in a shithole than it is to use white nerds.
Good.
Custom and critical parts will still go to British, Scandinavian and North American staff because the pajeets, gooks and slav doesn't have mentality to do a good job. Just like engineering manufacturing.
 
There are unsolvable problems here. You can't stop an LLM from "hallucinating."

Also, the reason they say "LLM" instead of "AI" is that it's becoming very clear that LLMs aren't actually intelligent. You probably wouldn't want to hire a person who acted like an LLM does.

They're not going to stop making up software packages. And because they can't stop making up packages, it creates fertile ground for the next generation of hackers to make those packages exist and contain malicious code. It's going to be a carnival for stupid development practices.
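One partial hedge against that, assuming a Python/PyPI workflow, is to check whether a dependency an LLM suggests actually exists on the package index before installing it. A minimal sketch using PyPI's public JSON endpoint (the candidate names below are arbitrary examples):

```python
import json
import urllib.request
from urllib.error import HTTPError


def pypi_package_exists(name: str) -> bool:
    """Return True if `name` is a published package on PyPI."""
    url = f"https://pypi.org/pypi/{name}/json"
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            json.load(resp)  # real packages return valid JSON metadata
        return True
    except HTTPError as err:
        if err.code == 404:
            return False  # hallucinated or typo'd name
        raise


# vet an LLM-suggested requirements list before running pip install
for candidate in ["requests", "definitely-not-a-real-package-xyz"]:
    status = "exists" if pypi_package_exists(candidate) else "not on PyPI"
    print(f"{candidate}: {status}")
```

Note the limit: this only proves the name is registered, not that it is trustworthy. A squatted package created by an attacker would pass the check, so it is a first filter against typos and hallucinations, not a substitute for actually vetting the dependency.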

Hiring is starting to happen again.

One of the problems with LLM outputs is that you start to see the cracks in the procedural generation very early.

I was showing a good use case for it to my kids the other day, and used it to make a little bedtime fairy tale. We gave it some inputs and told it to make a story. It did that, the story was boring and had a very unmotivated shift to the ending but fine. Like the output of a talented 9 year old. Then, we told it to make a fairy tale with totally different inputs. Gave it a new general theme, new character types. 80% of what it came up with was exactly the same. It was a lot more impressive when it made the first one than when it made the second. Realizing it's a good Mad Libs machine disappointed the kids but it's going to disappoint some CEOs and VCs a lot more, and I will laugh.
 
I've tried using AI to write code/configs. Every single response it's given has had at least one major problem, and in some cases been completely wrong in almost every word.

I ain't too worried.

Same experience here.

Reminds me of this quote from the AlphaGo documentary: "If DeepMind has figured out how to write code that doesn't have bugs, that is a bigger news story than AlphaGo."
 
I know Chris Paik is a VC guy but does he even write code?
Yeah, I use Copilot in VS and I have ChatGPT make me retardo Python so I can avoid writing the mundane bits, but I always have to go back in and fix shit. Same goes with any other task I give it (not going to lie, it does really save time for some stuff).

Programming is not simply translating. It is taking Desire or Need and turning it into Product. Even the human brain fucks that up on occasion. There are times I have made a detailed mock-up for my dev(s) and they completely misunderstand something even after we have talked about it. AI is nowhere near improving on that process, and to be honest, once it gets to that point no human will need to work (or really even exist).
Vogue wasn’t replaced by another fashion media company, it was replaced by 10,000 influencers.
Was Vogue replaced? They still receive over 11 million visits monthly. Sure, their medium changed, but I would be willing to bet that to the people who matter, Vogue is a more important voice than some dipshit snorting cinnamon on TikTok. The top 1% of influencers might have a rivaling voice, but they often fall off rapidly because, quite honestly, they are just not that interesting once you see them enough. They are always cloying. They are desperate for views and attention because advertising is kind of shit for influencers. That is why Chase Bank still advertises with Vogue but Logan Paul does not have their logo tattooed on his face.

Also, what top-tier fashion events, or any event for that matter, are going to let those 10,000 influencers in? Who is going to give an interview to 10,000 influencers? Is every designer or important figure going to fire up a stream? No. They are going to talk to the top outlets, and then the influencers are going to munch up that data and spit out their interpretation of that data.

Still, I get what he is saying, but I feel like he is not getting the scenario correct. Newspapers were diminished by radio; radio by TV; and TV, magazines, and general sanity and decency by the Internet.

Like I get that this guy has made a lot of money through VC but I do not see anything that indicates he actually has a profession beyond being at Thrive Capital during one of the most gimme moments in funding history.
 