Is it hard to imagine that things will just stay the same for 20-30 years or longer? Here is an example of the B programming language from 1969, over 50 years ago:
printn(n, b) {
    extrn putchar;
    auto a;

    if (a = n/b)        /* assignment, not test for equality */
        printn(a, b);   /* recursive */
    putchar(n%b + '0');
}
You'd think we'd have a much better way of expressing the details of software 50 years later. But here we are, still writing ASCII text separated by curly braces.
I observed this myself at least 10 years ago. I was reflecting on what I had done in the approximately 30 years I had been programming at that time, and how little had fundamentally changed. We still programmed by sitting at a keyboard, entering text on a screen, running a compiler, etc. Some languages and methodologies had their moments in the sun and then faded, the internet made sharing code and accessing documentation and examples much easier, but the experience of programming had changed little since the 1980s.
“Drucker makes a manager’s life easier, Deming makes it harder.”
This is why companies can’t do Agile, especially Scrum: Scrum demands the most from the people in power, who typically can’t be bothered and who get to dictate process.
Some of them feel bad about it and some of them refined metallurgy to build Saturn V rockets and go to space. We are very much living in the new space race. The discussion here is split 50/50 between the “Thank you! I feel the same way” folks and the “I am having the time of my life!” folks.
That is a fucking travesty. If there’s one thing we should be able to rely on C for, it’s that it works with assembly, and it’s always been the case that 0 is false and any other value is true. That’s a compiler bug as far as I’m concerned. I don’t use C++ because it’s gone in a ludicrous, unhelpful direction since 2000 or so, but it’s sad to learn that C of all languages decided to favor pedantry over working code.
Note that this is explicitly comparing two values, which is very different from checking whether a single value is true. Surely you wouldn't expect -1 == 0 to evaluate to true.
> Surely you wouldn't expect -1 == 0 to evaluate to true.
I wouldn't, no - but that's exactly what's happening in the test case.
Likewise, I wouldn't expect -1 == 1 to evaluate to true, but here we are.
The strict semantics of the new bool type may very well be "correct", and the reversed-test logic used by the compiler is certainly understandable and defensible - but given the long-established practice with integer types - i.e. "if (some_var) {...}" and "if (!some_var) {...}" - that non-zero is "true" and zero is "false", it's a shame that the new type is inconsistent with that.
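To make the hazard concrete, here is a minimal C sketch of the situation the thread describes. The memcpy, the 0xFF byte (standing in for a flag written by assembly), and the variable names are all illustrative assumptions, not taken from the article, and the program deliberately invokes undefined behavior:

    #include <stdbool.h>
    #include <stdio.h>
    #include <string.h>

    int main(void) {
        bool b;
        unsigned char raw = 0xFF;   /* e.g. a "true" flag set by assembly */
        memcpy(&b, &raw, 1);        /* assumes sizeof(bool) == 1; b's object
                                       representation is now 0xFF, which is
                                       undefined behavior for _Bool */

        /* Because the compiler may assume b is exactly 0 or 1, it is free
           to lower either test below as "compare against 1", "test the low
           bit", or the reversed test described in the article - so the two
           branches can disagree with each other and with the raw byte. */
        if (b)
            puts("if (b): taken");
        if (b == true)
            puts("if (b == true): taken");
        return 0;
    }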
I still remember one of my first teachers of programming softly shaming me for writing a condition like
if (something == true)
I haven't done so since (that was in 1997), and thus I avoid the converse (== false) as well, using ! instead. But I would be a lot less ashamed if I knew that such conditions exist in production software.
I would also never have guessed that the problem described in the article could occur...
Buy your teacher a drink! I went into university with the baggage of ten years of programming experience, mentored by my industry-experienced father. One of our profs had the exact reverse point of view (i.e. "foo == true" was according to him "good practice"), and I wisely chose to disregard his opinions on coding practices from that point on.
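For what it's worth, the teacher's advice is load-bearing even with plain ints, where nonzero-but-not-1 values are perfectly legal. A minimal sketch (the variable name is made up for illustration):

    #include <stdbool.h>
    #include <stdio.h>

    int main(void) {
        int some_var = -1;          /* nonzero, hence "true" by C convention */

        if (some_var)               /* taken: any nonzero value is true */
            puts("if (some_var): taken");

        if (some_var == true)       /* NOT taken: true is the int 1,
                                       and -1 != 1 */
            puts("if (some_var == true): taken");

        return 0;
    }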
The (minor, but still) optimization that is enabled by assuming _Bool can contain only 1 or 0 is that negating a boolean value can be done with x^1, without requiring a conditional.
That being said, for just testing the value, the zero/nonzero test that every (?) CPU has is enough; I'm not sure what is achieved here with this more complex test.
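A sketch of that negation trick, assuming the compiler-maintained 0/1 invariant (function names are illustrative):

    #include <stdbool.h>

    /* With the 0/1 invariant, logical negation needs no comparison or
       branch: flipping the low bit is enough. */
    bool bool_not(bool b) {
        return b ^ 1;   /* valid only because b is known to be 0 or 1 */
    }

    /* Without the invariant (a plain int), negation has to normalize the
       value first - typically a compare-against-zero plus a set. */
    int int_not(int x) {
        return x == 0;
    }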
Tech companies never last. Apple will miss a disruptive innovation or make a key strategic error causing them to lose their dominant spot. Look at the top tech companies 50 years ago: how are they doing today?
It's like the transition from monarchies to nation-states.
By the 19th century, the rise of nation-states accelerated due to the spread of nationalism, the decline of feudal structures, and the unification of countries like Germany (1871) and Italy (1861). Centralized governments, uniform laws, national education systems, and a sense of collective identity became defining features. The French Revolution (1789) played a pivotal role by promoting citizenship, legal equality, and national sovereignty over dynastic rule.
Maybe in 2300 they'll say something similar about nationalism.
What comes to mind is Java vs assembly. Claude is just a really, really high-level language compiler. I work with senior Java devs who have never written assembly.
On the learning front, I spent the weekend asking Claude questions about Rust, and then getting it to write code that achieved the result I wanted. I also now have a much better understanding of the different options, because I've gotten three different working examples and gotten to tinker with them. It's a lot faster to learn how an engine works when you have a working engine on a dyno than when you have no engine. Claude built me a diesel, a gasoline and an electric engine, and then I took them apart.
>If you largely accept their edits unchanged, your codebase will accrue massive technical debt over time and ultimately slow you down vs semi-automatic LLM use.
Worse, as it's planning the next change, it's reading all this bad code that it wrote before, but now that bad code is blessed input. It writes more of it, and instructions to use a better approach are outweighed by the "evidence".
People can take on debt for all sorts of things. To go on vacation, to gamble.
Debt doesn't imply it's productively borrowed or intelligently used. Or even knowingly accrued.
So, given how the term technical debt has historically been used, it seems the most appropriate descriptor.
If you write a large amount of terrible code and end up with a money-producing product, you owe that debt back. It will hinder your business or even lead to its collapse. If it were quantified in accounting terms, it would be a liability (though the sum of the parts could still be net positive).
Most "technical debt" isn't buying the code's author anything; it is accrued through negligence rather than by intelligently accepting a tradeoff.
All those examples were borrowing money. What you're describing as "technical debt" doesn't involve borrowing anything. The equivalent for a vacation would be to take your kids to a motel with a pool and dress up as Mickey Mouse and tell them it's "Disney World debt". You didn't go into debt. You didn't go to Disney World. You just spent what money you do have on a shit solution. Your kids quite possibly had fun, even.
> term technical debt has historically been used
There are plenty of terms that we no longer use because they cause harm.
Ask HN 1800: How to avoid losing spinning wheel skills in new spinning jenny era?
Ask HN 1920: How to avoid losing farrier skills in new automobile era?
Ask HN 1980: How to avoid losing typewriting and shorthand skills in new microcomputer era?
Ask HN 1990: How to avoid losing assembly language skills in new C++ era?
Ask HN 1995: How to avoid losing DOS TUI app dev skills in new Windows era?
Ask HN 2000: How to avoid losing Visual Basic skills in new web application era?
(The answer, btw, is that if you are still interested in such niche skills, you just have to practice on your own, or find a niche product or marketplace.)
Funny enough, I lived through all of those eras (starting with assembly in 1986, in sixth grade):
1996 - C and Fortran on DEC VAX and Stratus VOS mainframes
2001 - C/C++ on PCs and mainframes and starting to work on VB
2006 - JavaScript/C#/some Perl
2011 - C# on Windows ruggedized devices
2016 - .NET Core
2021 - Working as an L5 at AWS (ProServe)
2026 - Staff consultant at a 3rd-party consulting company. Every single project I've done has used Bedrock (the AWS service that hosts most of the popular models), and I constantly have three terminal sessions open: one to run code, one running Codex, and the other running Claude.
The issue is not that “FSD” is $99/mo. The issue is that a feature of $22k cars (lane keeping) is behind the $99/mo paywall.
Basically not enough people were buying the subscription for Elon to get his payout. But someone who just wants auto steering isn’t going to decide they’ll pay $99/mo for that. So this is just going to make people who like that feature not buy a new Tesla when the time comes.