
My advice to everyone feeling existential vertigo over these tools is to remain confident and trust in yourself. If you were a smart dev before AI, chances are you will remain a smart dev with AI.

My experience so far is that to a first approximation, the quality of the code/software generated with AI corresponds to the quality of the developer using the AI tool surprisingly well. An inexperienced, bad dev will still generate a sub-par result while a great dev can produce great results.

The choices involved in using these tools are also not as binary as they are often made out to be, especially since agents have taken off. You can very much still decide to dedicate part of your day to chiseling away at important code to make it just right and make sure your brain is engaged in the result and exploring and growing with the problem at hand, while feeding background queues of agents with other tasks.

I would in fact say the biggest challenge of the AI tool revolution in terms of what to adapt to is just good ol' personal time management.





> If you were a smart dev before AI, chances are you will remain a smart dev with AI.

I don't think that's what people are upset about, or at least it's not for me. For me it's that writing code is really enjoyable, and delegating it to AI is hell on earth.


> For me it's that writing code is really enjoyable, and delegating it to AI is hell on earth.

It's very sad, for me.

Like I told someone recently - letting the LLM write my code for me is like letting the LLM play my video games for me.

If all I wanted was the achievement on my steam profile, then sure, it makes sense, but that achievement is not why I play video games.

I'm looking at all these people proudly showing off their video game achievements, gained just by writing specs, and none of them seem to realise that writing specs is a lower-skill activity than writing programs.

It also pays far, far less - a BA earns about half what an average dev earns. They're cosplaying at being BAs, not realising that they are now employed for a skill that pays less, and it's only a matter of time before the economics catch up to them.

I don't see a solution here.


My job for the last 8 years has involved

Talking to sales to get an idea of what the customer wanted from the business side (first B2B at a product company and now consulting) -> talking to the customer and hashing out more detailed requirements -> designing the architecture and a proposed technical plan -> presenting it to the stakeholder (sometimes internal, sometimes external) -> doing the work or delegating and leading the work -> presenting the work to the stakeholder and leading the UAT -> getting it to production.

The coding part has been a commodity for enterprise developers for well over a decade. I knew a decade ago that I wasn’t going to be 50 years old reversing b trees on a whiteboard trying to prove my worth.

Doing the work is the only thing that the AI does.

While I don’t make the eye popping BigTech comp (been there, done that, and would rather get a daily anal probe than go back), I am making more than I could make if I were still selling myself as someone who “codez real gud” as an enterprise dev.


Look, there are at least dozens of us who like and enjoy programming for programming's sake and got into this crazy industry because of that.

Many of these people made many of the countless things we take for granted every day (networking, operating systems, web search; hell, even the transformer architecture before they got productized!).

Seeing software development --- and software engineering by proxy --- get reduced to a jello that will be stepped on by "builders" in real-time is depressing as shit.

It's even more depressing to see folks on HACKER news boost the "programming never mattered" mentality that's taken hold these last few years.

Last comment I'll make before I step off my soapbox: the "codez real gud" folks that makes the big bucks bring way more to the table than their ability to code...but their ability to code is a big contributor to why they bring more to the table!


> Look, there are at least dozens of us who like and enjoy programming for programming's sake and got into this crazy industry because of that.

You and me both, and I truly sympathise, but really we were just lucky that we could enjoy our passion at work.

> It's even more depressing to see folks on HACKER news boost the "programming never mattered" mentality that's taken hold these last few years.

Delivering stuff to customers for money is always what we've been paid for; that's not new, it's just that perhaps many of us didn't really pay much mind to that in the past. That's perhaps why there's traditionally been so much complaining about artificial deadlines and managers and sales teams; many of us also didn't really notice that the programming was never the thing that our employers cared about; it is just a link in a long chain from idea to income.

The way I'm looking at our current situation is this: I spent my whole career and much of my free time learning to become a great furniture maker, and I take a lot of pleasure producing functional and elegant items. Now someone has handed me some power tools. I can mourn the loss of care and love that goes into hand-crafting something, but I can also learn to use the tools to crank out the good-enough cabinets that my employer wants me to make, focussing on the more abstract elements of the craft and doing less of the laborious stuff. I think I can still take pleasure and pride in my work in this way, and personally I find the design aspect of software development to be a lot of fun. I can still hand-craft things sometimes too; there will no doubt always be important difficult parts of a project that would take as long to describe to an LLM as they would to write by hand, at least for those of us with sufficient experience of the latter.

I can also, hopefully, finally knock out some of those side projects that I have had on my list for many years but never had time to make. I would prefer that those things existed in a less than perfect state, than that they were perfect but only in my head :-)


Well as depressing as it is, check out the 2024 and 2025 YC batches. Guess how many of them are “ai” something or other? It’s never been about “hackers”. Not a single founder who takes VC funding is thinking about a sustainable business - at least their investors aren’t - they are hoping for the “exit”.

It’s always been jello. I at 51 can wax poetically about the good old days or I can keep doing what I need to do to keep money appearing in my account.


> Talking to sales to get an idea of what the customer wanted from the business side (first B2B at a product company and now consulting) -> talking to the customer and hashing out more detailed requirements -> designing the architecture and a proposed technical plan -> presenting it to the stakeholder (sometimes internal, sometimes external) -> doing the work or delegating and leading the work -> presenting the work to the stakeholder and leading the UAT -> getting it to production.

You are not the first person to say things like this.

Tell me, you ever wondered why a person with a programming background was filling that role?


If not the technical person, then who? It’s a lot easier for a technical person to learn how to talk the language of the business than a business person to have a deep understanding of technology.

On the enterprise dev side of the industry where most developers work, I saw a decade ago that if I were just a ticket taker who turned well defined requirements into for loop and if statements, that was an undifferentiated commodity.

You’re seeing now that even on the BigTech side knowing how to reverse a binary tree on the whiteboard is not enough.

Also if you look at the leveling guidelines of any major tech company, the levels above mid level are based on scope, impact and dealing with ambiguity - not “I codez real gud”


Those levels bake in the expectation of "codez real gud" at FAANG/MANGA/whatever style tech companies, since the technical complexity of their operations is high and a high skill bar has to be cleared to contribute to most of those codebases and make an impact at the scale they operate at.

One's ability to reverse a binary tree (which is a BS filter, but it is what it is) hasn't been an indicator of ability in some time. What _is_ though, is the wherewithal to understand _when_ that's important and the tradeoffs that come with doing that versus using other data structures or systems (in the macro).

My concern is that, assuming today's trajectory of AI services and tooling, the need to understand these fundamentals will become less important over time as the value of "code" as a concept decreases. In a world where prompting is cheap because AI is writing all the code and code no longer matters, then, realistically, tech will be treated even more aggressively as a line item to optimize.

This is a sad reality for people like me whose love for computers and programming got them into this career. Tech has been a great way to make a wonderful living for a long time, and it's unfortunate that we're robbing future generations of what we took for granted.


You give way too much credit to the average mid level developer at BigTech. A lot of the scalability is built in and they just built on top of it.

There are millions of people that can code as well as you or I, and a lot cheaper if you are in the US. Thousands of developers have been laid off over the last three years and tech companies keep going strong - what does that tell you?

I’m just as happy to get away from writing for loops in 2026 as I was to get away from LDA, LDX and BRA instructions once I could write performant code in C.

And how are we robbing future generations? Because some of us (not that I can take credit for any of it) moved the state of technology forward from the 1 MHz Apple //e I had in 1986?


> Also if you look at the leveling guidelines of any major tech company, the levels above mid level are based on scope, impact and dealing with ambiguity - not “I codez real gud”

Your entire comment is this specific strawman - no one, and I mean no one, is making this claim! You are the only one who is (ironically, considering the job you do) too tone-deaf and too self-unaware to avoid making this argument.

I'm merely pointing out that your value-prop is based on a solid technical foundation, which I feel you agree on:

> If not the technical person, then who? It’s a lot easier for a technical person to learn how to talk the language of the business than a business person to have a deep understanding of technology.

The argument is not "Oh boo hoo, I wish I could spend 8 hours a day coding for money like I used to", so stop pretending like it is.


There is an entire contingent of commenters here who miss translating requirements into code.

Even the comment I replied to mentioned “being a BA” like the most important quality of a software engineer is their ability to translate requirements into code.


> The argument is not

Then what is it.

be blunt and obvious in your reply or go home.


> Then what is it.

It's that the erosion and atrophying of the fundamental skill that made you (or, in this case, the GP) valuable is a matter of concern, because you (or GP, as the case may be) are willingly embracing the fact that you will be no more valuable than the average office worker, and so can expect that compensation will drop to match.

As an example, moving to Python from C was moving to a higher level of abstraction, but it still didn't jettison the need for actually knowing how to program!

Moving to LLMs from Python does jettison any need to know what an object is, what "parse, don't validate" actually means, etc.

If the problem you are solving with the LLM doesn't need that knowledge, then that job doesn't need all those valuable programming skills anyway, and thus you are no more valuable than the average clerk toiling away in the middle of some organisation.

> be blunt and obvious in your reply or go home.

Very classy.


I guess the entire thing is I like building working systems.

I love talking to business folks, I love when I can do that “git init”. I love that new AWS account smell and molding a complete architecture.

Now I can do a lot more of it by myself. It was a time problem before - not a knowledge problem.

What has made me valuable for 30 years is an ability to go from business goal -> to working implementation. They can pay someone a lot less than me (or any American - I’m in no way bragging about comp) to code.

Companies don’t pay my employer the bill rate they charge for me based on how well I code. While I’ve been expected to produce production level code as part of my job across 5 companies in the past decade not a single one asked me to write a line of code as part of the interview. They were much more concerned about ability to get things done.

Ironically, even the job at BigTech that landed in my lap was all behavioral (AWS ProServe). I damn sure didn’t get that job because of my whopping two years of AWS experience at the time. Most of my answers for “tell me about a time when…” were leading non AWS projects.

I’m not bragging - I’m old. My competitive advantage should be more than just my coding ability.


> What has made me valuable for 30 years is an ability to go from business goal -> to working implementation.

Look, it seems we are at about the same level of industry experience. I'm not even a f/time programmer anymore, and haven't been for some time (technically, I'm a professional problem solver, I suppose).

I am saying that, while I don't need to delve into details (unless it's a hobby project), what makes me valuable (in a similar position that you have, except that I don't write a line of code) is the current ability to program.

I (and you, no doubt) would be useless in the type of position that you are in if you didn't sweat blood earlier in your career getting things right while programming.

What I am saying is that my entire value proposition is built on a high skill level in programming. Letting those skills atrophy is, in my opinion, devaluing myself.


I hate to sound like a broken record, but I consider my skillset at 51 to be all of the things I said that involve getting from signed contract to happy customer at the end. I’m actually slowly working on moving even further up and becoming halfway competent at pre-sales.

You can substitute customer for “the business”.

When you step back, “the code” is the smallest part. I learned how to take a holistic view of the entire system - I specialize in AWS architecture + app dev - including how to deal with people.

In enterprise dev - no one cares about the code - they care about functionality. They never cared about the code. In large tech companies they have to care about the code.


> When you step back “the code” is the smallest part.

For me, the coding part is not even small, it's non-existent!

I still feel that the coding skills I have make me much more valuable.


I play video games for fun. I also enjoy automation games where the point is to get the game to play itself. I like achievements, but I won't "cheat" to get them.

My company pays me to build software that helps make them money. They don't care how I write that software as long as I do it fast and correctly. If that's by hand, I'll do it by hand. If vibe coding can get the job done, then I'll do that.

"Vibe coding" isn't just writing specs. It's ensuring that the vibe coding process doesn't introduce regressions, new bugs, etc. My boss writes specs for me, which, if I were to naively plop them into cursor or Claude code, would generate stuff that kinda works but not in a way that could be considered production ready. I plan, adjust the plan, generate, regenerate, refine. Could it be done faster by hand? Maybe. But it's the tool I've chosen for the job and the bosses are happy with it.


I've been coping by reminding myself that I was absurdly lucky to have found a job that was also enjoyable and intellectually stimulating for so long, and if all AI does is bring software engineering down to the level of roughly every other job in the world in terms of fun, I don't really have much ground to complain

I cannot figure out what you mean by "BA" in this context

> I cannot figure out what you mean by "BA" in this context

Business Analyst - those people who learn everything about what the customer's requirements, specs, etc. are. What they need, what they currently have, how to best advise them, etc.

They know everything, except how to program.


> They know everything, except how to program

In my experience, they know nothing, including how to program.


> In my experience, they know nothing, including how to program.

In my experience (Banking, Insurance, Fintech, etc) they were invaluable.

If, while developing, you hit some ambiguity in the spec, you could always go to the BA that wrote the spec and clarify, and I've never had the situation where they responded "Wait, let me ask the customer"; they knew what the business process should be, what the workflow should be, etc.

It worked for the customer as well - when they had trouble deciding "should our workflow do $FOO or $BAR?", a quick chat to our BA would be enough for them to make a decision.

Now, having worked in Agile shops (which I believe are the majority), there is no space for a BA - the ethos is "throw something together, and if the customer doesn't want it, refine it until they do" - so any BAs in these shops tend to be superfluous because there is no place for them in the process.

That's a failure of the process, not a failure of the role.


I was a BA forever ago during a summer job in college. That job wasn't for me at all! Looking back on the experience, putting together a FRD felt much like writing a CLAUDE.md with some prompts thrown in!

> I was a BA forever ago during a summer job in college. That job wasn't for me at all! Looking back on the experience, putting together a FRD felt much like writing a CLAUDE.md with some prompts thrown in!

But soon enough, all s/ware dev jobs will be that, because LLMs can write code faster than humans can.


Business Analyst

I think you were incredibly lucky to get to write code that you enjoyed writing.

Most of the commercial code I've written, over a 30+ year career, has been shite. The mandate was always to write profitable code, not elegant code. I started (much like the OP) back in the 80's writing code as a hobby, and I enjoyed that. But implementing yet another shitty REST CRUD server for a shitty website... not so much.

I totally see a solution: get the LLM to write the shitty REST CRUD server, and focus on the hard bits of the job.
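To be concrete about the kind of boilerplate I mean: nearly every CRUD service boils down to a handful of near-identical handlers over a data store. Here's a minimal in-memory sketch of the pattern (Python; names are illustrative, not any real service - in production this sits behind a database and an HTTP framework):

```python
# The shape of every CRUD layer: four near-identical handlers over a
# store. In-memory dict here; in a real service it's a database table
# behind REST routes, but the boilerplate is structurally the same.
from typing import Optional

store: dict[int, dict] = {}
_next_id = 0

def create(payload: dict) -> int:
    """Insert a record and return its new id."""
    global _next_id
    _next_id += 1
    store[_next_id] = dict(payload)
    return _next_id

def read(item_id: int) -> Optional[dict]:
    """Fetch a record, or None if it doesn't exist."""
    return store.get(item_id)

def update(item_id: int, payload: dict) -> bool:
    """Merge new fields into an existing record; False if missing."""
    if item_id not in store:
        return False
    store[item_id].update(payload)
    return True

def delete(item_id: int) -> bool:
    """Remove a record; False if it was already gone."""
    return store.pop(item_id, None) is not None
```

Multiply that by every entity in the system, wrap it in routes and validation, and you have most of a typical enterprise codebase - exactly the stuff I'm happy to delegate.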


^ This. People bemoan the death of coding, but easily 80%+ of the code I've written commercially was just CRUD or ETL shite. I've done a few interesting things (a formula parser, a WYSIWYG survey builder for signature pads, a navigation controller for line-guided industrial vehicles, etc.) but yeah, I don't miss writing reams of boilerplate. I always tried to take a Kent Beck-inspired Smalltalk/TDD approach to the code I wrote and took pride in my work, but ultimately you're working in a shitty corporate environment where none of your colleagues cares because they're burning at both ends, the management only does lip service to Quality, and Deadlines and the Bottom Line are Everything. If LLMs make this shit more bearable then bring 'em on, I say!

> I always tried to take a Kent Beck-inspired Smalltalk/TDD approach to the code I wrote and took pride in my work, but ultimately you're working in a shitty corporate environment where none of your colleagues cares because they're burning at both ends, the management only does lip service to Quality, and Deadlines and the Bottom Line are Everything.

LLMs are a multiplier. If this depressed you, then there is no way I can see the following happening.

> If LLMs make this shit more bearable then bring 'em on, I say!

What LLMs are going to do is multiply the amount of "none of your colleagues care" and "Management only does lip-service to ..."


It's not that none of my colleagues care, or that management is necessarily bad. That doesn't help, but it's not the cause.

It's the nature of the job. A CRUD REST server needs to be built. It's a shitty job, but someone has to do it. The interesting part of the job is over there, in whatever actually-novel part of the system is being built. But someone still has to build the CRUD REST server. There are frameworks and patterns that help, but not as much as you'd think, or they claim.

It's just part of the job. By far the largest and least interesting part of the job.


This is a part of it, but I also feel like a Luddite (the historical meaning, not the derogatory slang).

I do use these tools, clearly see their potential, and know full well where this is going: capital is devaluing labor. My skills will become worthless. Maybe GP is right that at first only skilled developers can wield them to full effect, but it's obviously not going to stop there.

If I could destroy these things - as the Luddites tried - I would do so, but that's obviously impossible.

For now I'm forced to use them to stay relevant, and simply hope I can hold on to some kind of employment long enough to retire (or switch careers).


> know full well where this is going: capital is devaluing labor

But now you too can access AI labor. You can use it for yourself directly.


Kind of. But the outcomes likely do not benefit the masses. People "accessing AI labor" is just a race to the bottom. Maybe some new tools get made or small businesses get off the ground, but ultimately this "AI labor" is a machine that is owned by capitalists. They dictate its use, and they will give or deny people access to the machine as it benefits them. Maybe they get the masses dependent on AI tools that are currently either free or underpriced, as alternatives to AI wither away unable to compete on cost, then the prices are raised or the product enshittified. Or maybe AI will be massively useful to the surveillance state and data brokers. Maybe AI will simply replace a large percentage of human labor in large corporations, leading to mass unemployment.

I don't fault anyone for trying to find opportunities to provide for themselves and loved ones in this moment by using AI to make a thing. But don't fool yourself into thinking that the AI labor is yours. The capitalists own it, not us.


As someone who has leaned fully into AI tooling this resonates. The current environment is an oligopoly so I'm learning how to leverage someone else's tool. However, in this way, I don't think LLMs are a radical departure from any proprietary other tool (e.g. Photoshop).

Indeed. Do you know how many small consultancies are out there which are "Microsoft shops"? An individual could become a millionaire by founding their own and delivering value for a few high-roller clients.

Nobody says there's no money to make anymore. But the space for that is limited, no matter how many millions hustle, there's only 100 spots in the top 100.

what makes you think that's actually possible? maybe if you really had the connections and sales experience etc...

but also, if that were possible, then why wouldn't prices go down? why would the value of such labor stay so high if the same thing can be done by other individuals?


I saw it happen more back in the day compared to now. Point being, nobody batted an eyelash at being entirely dependent on some company's proprietary tech. It was how money was made in the business.

Software development was a race to the bottom for the majority of developers aside from the major tech companies for a decade. I’ve seen salaries on the enterprise/corp dev side - where most developers work - stagnate for a decade and not keep up with inflation in tier 2 cities.

That is a fiction. None of us can waste tens of thousands of dollars whipping out a C compiler or web browser on a whim to test things.

If these tools improve to the point of being able to write real code, the financial move for the agent runners is to charge far more than they are now but far less than the developers being replaced.


> it’s obviously not going to stop there.

I don’t think it is obvious actually that you won’t have to have some expert experience/knowledge/skills to get the most out of these tools.


I think the keyword here is "some".

It already seemed like we were approaching the limit of what it makes sense to develop, with 15 frameworks for the same thing and a new one coming out next week, lots of services offering the same things, and even in games, the glut of games on offer was deafening and crushing game projects of all sizes all over the place.

Now it seems like we're sitting on a tree branch and sawing it off on both sides.


Originally spinners and weavers were quite happy. One spun, the other weaved, and the cloth was made.

Then along came the flying shuttle and the weavers were even happier - producing twice as much cloth and needing half as many spinners.

Then the spinning jenny came along and spinners (typically the wife of the weaver) were basically unemployed, so much so that the workers took to breaking into the factories to destroy the jennys.

But the weavers were on the same track. They no longer owned their own equipment in their own home, they were centralised in factories using equipment owned by the industrialists.

Over the entire period first spinners, then weavers, lost their jobs, even with the massive explosion in output.

Meanwhile lower skilled jobs (typically with barely paid children) abounded (with no safety requirements)

Fortunately in the 1800s English industrialists had some amount of virtue, and the workers organised into unions, so economic damage wasn't as widespread as it could have been.

This power imbalance between the owners and workers was only really arrested after the world wars - first with ww1 where many owner's sent their children to battle and lost their heirs, then later with strong government reacting to the public post ww2.


Today. Ask again in 6 months. A year.

People have been saying this for multiple years in a row now.

And it has been getting more true for years in a row.

Disagree entirely.

If you state “in 6 months AI will not require that much knowledge to be effective” every year and it hasn’t happened yet then every time it has been stated has been false up to this point.

In 6 months we can come back to this thread and determine the truth value for the premise. I would guess it will be false as it has been historically so far.


> If you state “in 6 months AI will not require that much knowledge to be effective” every year and it hasn’t happened yet then every time it has been stated has been false up to this point

I think that this has been true, though maybe not quite as strongly worded as your quote says.

The original statement was "Maybe GP is right that at first only skilled developers can wield them to full effect, but it's obviously not going to stop there."

"full effect" is a pretty squishy term.

My more concrete claim (and similar to "Ask again in 6 months. A year.") is the following.

With every new frontier model released [0]:

1. the level of technical expertise required to achieve a given task decreases, or

2. the difficulty/complexity/size of a task that an inexperienced user can accomplish increases.

I think either of these two versions is objectively true looking back and will continue being true going forward. And, the amount that it increases by is not trivial.

[0] or every X months to account for tweaks, new tooling (Claude Code is not even a year old yet!), and new approaches.


Using an LLM to program is simply another abstraction level, just as C was to assembly.

I feel like the nondeterminism makes LLM-assisted programming a different sort of concept than using a compiler. Your prompt isn't your source code.

Fortran to Assembly.

Six months ago, we _literally did not have Claude Code_. We had MCP, A2A and IDE integrations, but we didn't have an app where you could say "build me an ios app that does $thing" and have it build the damn thing start to finish.

Three months ago, we didn't have Opus 4.5, which almost everyone is saying is leaps and bounds better than previous models. MCP and A2A are mostly antiquated. We also didn't have Claude Desktop, which is trying to automate work in general.

Three _weeks_ ago, we didn't have Clawdbot/Openclaw, which people are using to try and automate as much of their lives as possible...and succeeding.

Things are changing outrageously fast in this space.


> Six months ago, we _literally did not have Claude Code_.

Claude Code came out a year ago.


If I could destroy these things - as the Luddites tried - I would do so

Would travel agents have been justified in destroying the Internet so that people couldn't use Expedia?


Society was better without the internet. We have lost all our privacy, our third spaces, the concept of doing hobbies for fun instead of as content, and much more.

> capital is devaluing labor

I guess the right word here is "disenfranchising".

Valuation is a relative thing based mostly on availability. Adding capital makes labor more valuable, not less. This is not the process happening here, and it's not clear what direction the valuation is going.

... even if we take for granted that any of this is really happening.


> If I could destroy these things - as the Luddites tried - I would do so, but that's obviously impossible.

Certainly, you must realize how much worse life would be for all of us had the Luddites succeeded.


If the human race is wiped out by global warming I'm not so sure I would agree with this statement. Technology rarely fails to have downsides that are only discovered in hindsight IMO.

Sure, but would it have been better or worse for the Luddites?

Or perhaps they would have advanced the cause of labor and prevented some of the exploitation from the ownership class. Depends on which side of the story you want to tell. The slur Luddite is a form of historical propaganda.

Putting it in today's terms, if the goal of AI is to significantly reduce the labor force so that shareholders can make more money and tech CEOs can become trillionaires, it's understandable why some developers would want to stop it. The idea that the wealth will just trickle down to all the laid-off workers is economically dubious.


Reaganomics has never worked

> Reaganomics has never worked

Depends how you look at it.

Trickle down economics has never worked in the way it was advertised to the masses, but it worked fantastically well for the people who pushed (and continue to push) for it.


> it worked fantastically well for the people who pushed (and continue to push) for it.

That would be "trickle up economics", though.


Sure, because it all trickles into their pockets.

problem today is that there is no "sink" for money to go to when it flows upwards. we have resorted to raising interest rates to curb inflation, but that doesn't fix the problem, it just gives them an alternative income source (bonds/fixed income)

I'm not a hard socialist or anything, but the economics don't make sense. if there's cheap credit and the money supply perpetually expands without a sink, of course people with the most capital will just compound their wealth.

so much of the "economy" orbits around the capital markets and number going up. it's getting detached from reality. or maybe I'm just missing something.


Yeah it's called wealth transfer and the vast majority is on the wrong end.

For those who survived sure. For those at the time, I'm sure they would disagree

[flagged]


You can reject the ideas in the aggregate. Regardless, for the individual, your skills are being devalued, and what used to be a reliable livelihood tied to a real craft is going to disappear within a decade or so. Best of luck

> The historical luddites are literally the human death drive externalized. Reject them and all of their garbage ideas with extreme prejudice.

Yes, because fighting for the rights of laborers is obviously what most people hate.


For a different perspective:

"Except the Luddites didn’t hate machines either—they were gifted artisans resisting a capitalist takeover of the production process that would irreparably harm their communities, weaken their collective bargaining power, and reduce skilled workers to replaceable drones as mechanized as the machines themselves."

https://www.currentaffairs.org/news/2021/06/the-luddites-wer...


[flagged]


Either you're thinking of the "room temperature semi-conductor" thing out of Korea, or you're some boomer who forgot that cold fusion was in the 80s.


I resonate with that. I also find writing code super pleasurable. It's immediate stress relief for me, I love the focus and the flow. I end long hands-on coding sessions with a giddy high.

What I'm finding is that it's possible to integrate AI tools into your workflow in a big way without giving up on doing that, and I think there's a lot to say for a hybrid approach. The result of a fully-engaged brain (which still requires being right in there with the problem) using AI tools is better than the fully-hands-off way touted by some. Stay confident in your abilities and find your mix/work loop.

It's also possible to get a certain version of the rewards of coding from instrumenting AI tools. E.g. slicing up and sizing tasks to give to background agents that you can intuit from experience they'll be able to actually hand in a decent result on is similar to structuring/modularization exercises (e.g. with the goal to be readable or maintainable) in writing code, feelings-wise.


I'll use some small code completion but that's it. And only when I can do it locally with non-proprietary software.

I'm in the enjoy writing code camp and do see merits of the hybrid approach, but I also worry about the (mental) costs.

I feel that for using AI effectively I need to be fully engaged with both the problem itself and an additional problem of communicating with the LLM - which is more taxing than pre-LLM coding. And if I'm not fully engaged those outcomes usually aren't that great and bring frustration.

In isolation, the shift might be acceptable, but in reality I'm still left with a lot of ineffective meetings - only now without coding sessions to clear my brain.


I think an additional big part of why LLM-aided coding is so draining is that it has you constantly refreshing your mental model of the code.

Making sense of new or significantly changed code is very taxing. Writing new code is less taxing as you're incrementally updating the model as you go, at a pretty modest pace.

LLMs can produce code at a much higher rate than humans can make sense of it, and assisted coding introduces something akin to cache thrashing, where you constantly need to build mental models of the system to keep up with the changes.

Your bandwidth for comprehending code is as limited as it always was, and taxing this ability to its limits is pretty unpleasant, and in my experience, comes at a cost of other mental capabilities.


I get a dopamine hit with AI by being able to accomplish tasks fast, mostly in frontend work or with a dynamic language like Python, because you see the changes in real time

Hope: I want to become a stronger dev.

Reality: Promoted to management (of AI) without the raise or clout or the reward of mentoring.


LLMs are similar in a lot of ways to the labor outsourcing that happened a generation or two ago. Except that instead of this development lifting a billion people out of poverty in the third world a handful of rich people will get even more rich and everyone else will have higher energy bills.

> ...the reward of mentoring.

I really feel this. Claude is going to forget whatever correction I give it, unless I take the time and effort to codify it in the prompt.

And LLMs are going to continue to get better (though the curve feels like it's flattening), regardless of whatever I do to "mentor" my own session. There's no feeling that I'm contributing to the growth of an individual, or the state-of-the-art of the industry.


AIs have made me realize that I don't actually care about writing code, even though it's all I've done for my entire career.

I care about creating stuff. How it gets from the idea in my brain to running on the computer, is immaterial to me.

I really like that I go from idea to reality in half the time.


Same here, and I also really enjoy the high level design/structure part of it.

THAT part doesn't mesh too well with AI, since it's still really bad at autonomous holistic-level planning. I'm still learning how to prompt in a way that results in a structure that is close to what I want/reasonable. I suspect going a more visual block diagram route, to generate some intermediate .md or whatever, might have promise, especially for defining clear bounds/separation of concerns.

Related, AI seems to be the wrong tool for refactoring code (I recently spent $50 trying to move four files). So, if whatever structure isn't reasonable, I'm left with manually moving things around, which is definitely un-fun.


Definitely go for that middle step. If it's something bigger I get them to draw out a multi-phase plan, then I go through and refine that .md and have them work from that.

Same.

I've been exploring some computer vision recognition stuff. Being able to reason through my ideas with an LLM, and make visualizations like t-SNE to show how far apart a coke can and a bag of cheetos are in feature-space has been mind blowing. ("How much of a difference does tint make for recognition? Implement a slider that shows that: regenerate the 512-D feature array and replot the chart.")

It's helping me get an intuitive understanding 10x faster than I could reading a textbook.
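The feature-space distance idea above can be sketched without a full vision pipeline. This is a toy illustration using random 512-D vectors as stand-ins for real model embeddings (the object names and the tint-as-perturbation model are assumptions for the example, not anything from an actual recognition system):

```python
import numpy as np

rng = np.random.default_rng(0)

# Mock 512-D feature vectors standing in for real vision-model embeddings
# of two different objects (e.g. a coke can vs. a bag of cheetos).
coke_base = rng.normal(size=512)
cheetos_base = rng.normal(size=512)

def tinted(base, strength, rng):
    """Simulate a tint change as a small perturbation of the embedding."""
    return base + strength * rng.normal(size=512)

def cosine_sim(a, b):
    """Cosine similarity: ~1.0 for near-identical directions, ~0 for unrelated ones."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

light = tinted(coke_base, 0.1, rng)
print(cosine_sim(coke_base, light))          # stays close to 1.0 for a light tint
print(cosine_sim(coke_base, cheetos_base))   # small in magnitude for unrelated vectors
```

A tint slider would just sweep `strength` and replot; t-SNE adds a 2-D projection on top of the same distance structure so you can eyeball the clusters.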


The thing with factories is that only something like 25% of the original employees are left to take care of the belt and the remaining tasks not covered by the robots.

Everyone is hoping to be part of those 25%.


In your own open source projects you can ban AI. A number of projects did just that. Programming doesn't need to be an activity people do just to earn money.

> I don't think that's what people are upset about, or at least it's not for me. For me it's that writing code is really enjoyable, and delegating it to AI is hell on earth.

Maybe this is the reason why I don't care that much about coding agents or have a strong opinion about them, because code was only a means to an end for me. What I enjoy is to learn about and understand systems and designing those systems, whether it's computers, operating systems or software architectures. I never did enjoy just hacking away or writing CRUD stuff.


> For me it's that writing code is really enjoyable, and delegating it ...

This.

On my fun side project, I don't accept pull requests because writing the code is the fun part.

Only once did someone get mad at me for not accepting their pull request.


I think this is subjective, I personally enjoy "managing" agents more than handwriting code and dealing with syntax

At the very least, it feels ergonomic and saves me keystrokes in the same way as stuff like snippets & aliases


Likewise! It's a fun puzzle to solve to figure out the right incantation to get the LLM to do what I want it to do without going off the rails.

There's room for both. Give AI the boilerplate, save the exciting stuff for you.

but are employers going to be fine with that?

That remains to be seen. As long as the work gets done... Don't ask, don't tell.

It does NOT remain to be seen. https://www.cnbc.com/2025/09/26/accenture-plans-on-exiting-s... Big players are already moving in the direction of "join us or leave us". So if you can't keep up and you aren't developing or "reinventing" something faster with the help of AI, it was nice knowing you.

I didn't say don't use AI at all, I said give it the boilerplate, rote work. Developers can still work on more interesting things. Maybe not all the interesting things.

That may be fine ... if it remains your choice. I'm saying companies are outmoding people (programmers, designers, managers, et al) who don't leverage AI to do their job the fastest. If one programmer uses AI to do boilerplate and then codes the interesting bits personally and it takes a week and another does it all with AI (orchestrating agents, etc) and it takes 2 hours and produces the same output (not code but business value), the AI orchestrator/manager will be valued above the former.

I get your point, but I think smart people will figure out a balance. That 2 hours of output could take a week to debug and test.

Yes! I am not advocating for the 2 hours and the "vision" of managers and CEOs. Quite the contrary. But it is the world we live in for now. It's messy and chaotic and many people may (will?) be hurt. I don't like it. But I'm trying to be one of the "smart people". What does that look like? I hope I find out.

I don't like it, either. I hear people ranting about doing "everything with AI" on one meeting, and what a productivity boost it is, then I get tagged on a dumpster fire PR full of slop and emoji filled log statements. Like did you even look at your code at all? "Oh sorry I don't know how that got in there!"

These are the same employers that mandate return to office for distributed teams and micro-manage every aspect of our work. I think we know how it's going to play out.

exactly

thankfully I started down the FIRE route 20 years ago and now am more or less continuing to work because I want to

which will end for my employer if they insist on making me output generative excrement


Why do you need to delegate writing code to AI? I don't do that. Is your job mandating that you vibe code? How would they know?

They would know when you are two or three times slower producing code because you insist on having your own handcrafted, bespoke artisanal code that doesn’t meet the requirements any better than Claude could do.

Two to three times faster? Has productivity doubled or tripled? Stuff shipping in 1/3 the time? I'm not aware of this - which company is now shipping features twice as fast?

AI can produce greenfield code faster, sure, but you spend more time debugging it. If you write the code, it's slower to get the first version out, but then you understand the code and can debug much faster going forward.

You can also use AI to write unit tests, documentation, and stuff like that, while writing the code yourself.


It’s copium to think that LLM code is buggier than what your median to slightly-above-median developer produces, or that most companies need anything more than median developers.

And debugging code is also easier with AI. Just today I had to revisit code that I personally wrote from the design, the implementation the refactoring, etc from the first git init and I couldn’t remember half the decisions I made. I launched Codex and started asking it questions about the code.

Where is the productivity gain? How many junior developers and mid-level ticket-takers are struggling to find a job now because the market is saturated, and those true seniors who can operate at a larger scope and impact can do the work themselves without having to delegate?

My personal anecdote is that I had four offers within 3 weeks after being Amazoned in late 2023. One was from the company that acquired the startup I left in 2020, where I would have been responsible for leading the integration between all of the companies they acquired [1], and the other was a former coworker who was now a director at a well-known non-tech F500 company. He wanted me to lead the migration and “modernization” efforts. I decided to stick with consulting.

Those offers didn’t come because of my coding abilities. That’s a commodity.

I was looking again in 2024. It took one outreach and talking to the right people. Absolutely no one asked me the first thing about coding even though I do it maybe 60%-70% of the time.

Going way back to 2016, I had two offers - one interview was me doing a merge sort on the whiteboard, the other interview was me talking about strategy with the then-new director who needed to build up a software development team. He asked me about my experience. He didn't make me stand up and do some algorithmic test on the whiteboard. He treated me like an industry professional.

[1] I did the whole “lead integration efforts by a company owned by private equity acquiring other companies” thing before I joined the startup - never again.


> It’s copium to think that LLM code is more buggy than your median to slightly above median developer or that’s all that most companies need - median developers.

You are not understanding the point. AI has to be properly supervised because it makes mistakes. Now if you are making more or as many mistakes as the AI, then you should look for a new career. You should understand the code better than the AI, because the AI has a limited context window, and for a large codebase, you should know that codebase better than the AI.

Now, you can use AI to help you understand code that someone else wrote. You can use AI to check your code. You can use AI to write unit tests. You can use AI to debug. You can use AI to summarize code. There are so many uses of AI.

But you -- you as a developer -- need to understand your codebase. If you do not understand the codebase, you can't properly supervise the AI. And there is one efficient way for you to understand a big complicated codebase. The most efficient way possible for you to learn it. That is by you writing code in that codebase, and debugging that code, and learning how to code in that codebase.

If you don't do that, then you are not qualified to supervise the AI. Now you are letting the AI loose on a codebase much larger than its context window, and you will fill the codebase with bugs.

It's like a student. Sure you can use AI to help you study, to explain things to you, but the moment you let the AI do your homework, then you are no longer learning. The homework is the practice of solving problems in that area, and as a developer, you need to write code in a codebase otherwise you have no value and the AI has no value.


I don't understand the problem. Nothing is stopping you from writing code.

Woodworking is still a thing despite IKEA, big box furniture stores, etc.

People will pay for quality craftsmanship they can touch and enjoy and can afford and cannot do on their own - woodworking. Less so for quality code and apps because (as the Super Bowl ads showed us) anyone can create an app for their business and it's good enough. The days of high-paid coders are nearly gone. The seniors and principals will hang on a little longer. Those that can adapt to business analyst mode and project manager will as well (CEOs have already told us this: adapt or get gone), but eventually even they will be outmoded because why buy an $8000 couch when I can buy one for $200 and build it myself?

I like writing new, interesting code, but learning framework #400 with all its own idiosyncrasies has gotten really old.

I just rebuilt a fairly simple personal app that I've been maintaining for my family for nearly 30 years, and had a blast doing it with an AI agent - I mostly used Claude Sonnet 4.5. I've been dreading this rebuild mostly because it's so boring; this is an app I built originally when I was 17, and I'm 43 now. I treated Claude basically like I'd treat my 17-year-old self, and I've added a bunch of features that I could never be assed to do before.


i agree. it seems like an expectation these days to use AI sometimes... for me i am happy not using it at all, i like to be able to say "I made this" :)

I suppose the question is "Do you feel Steve Jobs made the iPhone?"

Not saying right/wrong but it's a useful Rorschach Test - about what you feel defines 'making this'?


it's more just a personal want to be able to see what I can do on my own tbh; i don't generally judge other people on that measure

although i do think Steve Jobs didn't make the iPhone /alone/, and that a lot of other people contributed to that. i'd like to be able to name who helps me and not say "gemini". again, it's more of a personal thing lol


So not disagreeing as you say, it is a personal thing!

I honestly find coding with AI no easier than coding directly; it certainly does not feel like AI is doing my work for me. If it was, I wouldn't have anything to do. In reality I spend my time thinking about much higher-level abstractions, but of course this is a very personal thing too.

I myself have never thought of code as being my output, I've always enjoyed solving problems, and solutions have always been my output. It's just that before I had to write the code for the solutions. Now I solve the problems and the AI makes it into code.

I think that this probably the dividing line, some people enjoy working with tools (code, unix commands, editors), some people enjoy just solving the problems. Both of course are perfectly valid, but they do create a divide when looking at AI.

Of course when AI starts solving all problems, I will have a very different feeling :-)


If you managed an AI (or rather, an AI system) that wrote a compiler or web browser like Claude Code or Cursor did, would you feel like you did it?

Just a curious question, not trying to be combative or anything.

I myself will go into planning mode and ask it to implement a feature, and ask it to give me tradeoffs between implementation details. Then I might chat with it a bit to further understand the implementation before it writes the plan.

I find it to be very effective and gives me a sense of agency in my features.


Then don't delegate it to AI.

I’m not worried about being a good dev or not but these AI things thoroughly take away from the thing I enjoy doing to the point I’d consider leaving the industry entirely

I don’t want to wrangle LLMs into hallucinating correct things or whatever, I don’t find that enjoyable at all


I've been through a few cycles of using LLMs and my current usage does scratch the itch. It doesn't feel like I've lost anything. The trick is I'm still programming. I name classes and functions. I define the directory structure. I define the algorithms. By the time I'm prompting an LLM I'm describing how the code will look and it becomes a supercharged autocomplete.

When I go overboard and just tell it "now I want a form that does X", it ends up frustrating, low-quality, and takes as long to fix as if I'd just done it myself.

YMMV, but from what I've seen all the "ai made my whole app" hype isn't trustworthy and is written by people who don't actually know what problems have been introduced until it's too late. Traditional coding practices still reign supreme. We just have a free pair of extra eyes.


I also use AI to give me small examples and snippets, this way it works okay for me

However this still takes away from me in the sense that working with people who are using AI to output garbage frustrates me and still negatively impacts the whole craft for me


Having bad coworkers who write sloppy code isn't a new problem, and it's always been a social problem rather than a technical one. There was probably a lot less garbage code back when it all only ran on mainframes because fewer people having access meant that only the best would get the chance, but I still think that opening that up has been a net benefit for the craft as a whole.

Hah. I don't work with (coding) people, so thankfully I don't have that problem

Serious question: so what then is the value of using an LLM? Just autocomplete? So you can use natural language? I'm seriously asking. My experience has been frustrating. Had the whole thing designed, the LLM gave me diagrams and code samples, had to tell it 3 times to go ahead and write the files, had to convince it that the files didn't exist so it would actually write them. Then when I went to run it, errors ... in the build file ... the one place there should not have been errors. And it couldn't fix those.

The value is pretty similar to autocomplete in that sometimes it's more efficient than manually typing everything out. Sometimes selecting the right completion would take longer than typing it manually, so you type it instead, and sometimes what you want isn't going to be something you can autocomplete at all, so you do it manually because of that.

Like autocomplete, it's going to work best if you already know what the end state should be and are just using it as a quicker way of getting there. If you don't already know what you're trying to complete, you might get lucky by just tabbing through to see if you find the right result, or you might spend a bunch of time only to find out that what you wanted isn't coming up for what you've typed/prompted and you're back to needing to figure out how to proceed.


I think there is more existential fear that is left unaddressed.

Most commenters in this thread seem to be under the impression that where the agents are right now is where they will be for a while, but will they? And for how long?

$660 billion is expected to be spent on AI infrastructure this year. If the AI agents are already pretty good, what will the models trained in these facilities be capable of?


> My advice to everyone feeling existential vertigo over these tools is to remain confident and trust in yourself. If you were a smart dev before AI, chances are you will remain a smart dev with AI.

We replaced the chess board in the park with an app that compares the Elo score of you and your opponent, and probabilistically declares a winner.

But don't worry, if you were a good chess player before we introduced the app, chances are you will remain a good one with the app. The app just makes things faster and cheaper.

My advice to the players is to quit mourning the loss of the tension, laughter and shared moments that got them into chess in the first place.


Good news, AI coding assistants aren't a magic button that gives you the final result without having to play the game at all. You'll still need to make plenty of moves on your own at your job, and you're free to use or not use them as much as you want outside it. Your job was never to play chess in this analogy though, which is where it misses pretty hard; you were being paid to produce software, and the process was incidental to it.

> you were being paid to produce software, and the process was incidental to it.

Yes, the people who write articles like the one in this post understand this. Previously, they could do it and get paid while doing a thing they loved.

Now that process is no longer economically viable: they can get paid, or they can do the thing they loved. They lost something, so they mourn the loss. At least they would, but a bunch of tone-deaf people keep interrupting them to explain why they shouldn't.


There's a time and a place for certain tools.

Sometimes I like playing chess at the park with strangers or friends. Sometimes I like playing chess online with friends in another country.

Sometimes I like to play games online with my siblings. Sometimes I like to invite people over to play video games with me on the couch.

Sometimes I wanna watch a movie in the theater. Sometimes I wanna fire up Netflix and watch that same movie, but on my couch.

Sometimes I wanna vibe code an entire app in a weekend. Sometimes I wanna play code golf to solve a puzzle, where LLM usage defeats the purpose.

None of these are being replaced in my life despite having more "advanced" options. If anything, I get to enjoy things more because I have more options and ways to enjoy them.


>We replaced the chess board in the park with an app that compares the Elo score of you and your opponent, and probabilistically declares a winner.

The chess board is still there, not sure I see how LLM tools compels one to stop writing personal projects without AI assistance.


Yes, absolutely. I think the companies that don't understand software, don't value software and that think that all tech is fundamentally equivalent, and who will therefore always choose the cheaper option, and fire all their good people, will eventually fail.

And I think AI is in fact a great opportunity for good devs to produce good software much faster.


I agree with the quality comments. The problem with AI coding isn't so much the slop, it's the developers not realizing it's slop and trying to pass it off as a working product in code reviews. Some of the stuff I've reviewed in the past 6 months has been a real eye-opener.

So fire them and hire the experienced people excluded due to ageism who can't get a foot in the door anywhere because their resume shows that they went to college before the internet became commercialized.

I think if companies are gonna allow engineers to vibe code they really need to provide training on how to vibe code and not produce slop.

If a company is gonna REQUIRE vibe coding as an accelerator then it is in the company's best interest to invest in education!


I think the issue is that, given the speed at which a bad dev can generate sub-par results that at face value look good enough, they overwhelm any procedures in place.

Pair that with management telling us to go with AI to go as fast as possible means that there is very little time to do course correction.


I think no one is better positioned to use these tools than experienced developers.

For me the problem is simple: we are in an active prisoner's dilemma with AI adoption. The collective outcome is worse because we aren't asking the right questions for optimal human results, yet we defect and use AI selfishly because we are individually rewarded for it. There's lots of potential for our use to be turned against us as we train these models for companies that have no commitment to the common good, and no commitment to return money to us or to common welfare if our jobs are disrupted and an AI replaces us fully.

For some of us, our jobs have already been completely disrupted by other factors (e.g., ageism) so there is nothing to lose here after 2 years and 2400+ job applications with nothing to show for it.

Clearly society wants me to strike out on my own; and that has been facilitated by the rise of agentic coding.

If you've not been paying attention to the news, caring for the common good/welfare is now obsolete and self destructive. We are in survival mode. It's everyone for themselves now.


I think it represents a bigger threat than you realize. I can't use an AI for my day job to implement these multi-agent workflows I see. They are all controlled by another company with little or no privacy guarantees. I can run quantized (even more braindead) models locally but my work will be 3-5 years behind the SOTA, and when the SOTA is evolving faster than that timeline there's a problem. At some point there's going to be turnover - like a lake in winter - where AI companies effectively control the development lifecycle end-to-end.

> My advice to everyone feeling existential vertigo over these tools is to remain confident and trust in yourself.

I do try to do that and have convinced myself that nothing has really changed in terms of what is important and that is systems thinking. But it's just one more barrier to convincing people that systems thinking is important, and it's all just exhausting.

Besides perhaps my paycheck, I have nothing but envy for people who get to work with their hands _and_ minds in their daily work. Modern engineering is just such a slog. No one understands how anything works nor even really wants to. I liken my typical day in software to a woodworker who has to rebuild his workshop every day just to be able to do the actual woodworking. The amount of time I spend in software merely on being able to "open the door to my workshop" is astounding.


It used to be fun before companies figured out how to put claustrophobic guardrails on our autonomy.

One thing I'm hoping will come out of this is the retiring of coders that always turn what should be a basic CRUD app (just about everything) into some novelty project trying to pre-solve every possible concern that could ever come up, and/or a no-code solution that will never actually get used by a non-developer and will frustrate every developer forced to use it.

It's a combination of things... it's not just that AI feels like it is stripping the dignity of the human spirit in some ways, but it's also that the work we are doing is often detrimental to our fellow man. So learning to work with AI to do that faster (!!) (if it is actually faster on average), feels like doubling down.


