We should normalize "finished" software products that stop feature creep and focus strictly on bug fixes and security updates.
It takes real courage for a builder to say, "It’s good enough. It’s complete. It serves the core use cases well." If people want more features? Great, make it a separate product under a new brand.
Evernote and Dropbox were perfect in 2012. Adding more features just to chase new-user growth often comes at the expense of confusing the existing user base. Not good.
And people liked that model; see the huge backlash when Adobe went subscription for Creative Suite.
I do wedding photography as a side hustle, I upgrade my camera maybe once every ~7 years. Cameras have largely been good enough since 2016 and the 5D Mark IV. I have a pair of R6 mk II that I'll probably hold onto for the next 10 years.
Point being, Lightroom has more or less been feature complete for me for a very, very long time. For about the price of 1/year subscription, I could have purchased a fixed version of Lightroom with support for my camera and not had to buy it again for another 10 years.
We are getting milked for every nickel and dime for no reason other than shareholder value.
It actually discourages real improvements. Before the subscription model, if Adobe wanted to sell me another copy of Lightroom, they had to work really hard to build useful features that people actually wanted, enough that they'd buy the new version.
Now, they don't have to. You have to keep paying no matter what they decide to do.
> And people liked that model; see the huge backlash when Adobe went subscription for Creative Suite.
That backlash was short lived. Adobe went from $4.4 billion in revenue in 2012 to $23.7 billion. It used to cost $2500 for the "master collection". Now it's $50 a month.
I was one of those people who disliked switching to subscription. I stayed on CS6 for years. I'm also only a relatively casual user, though. I once tried Affinity Photo for some work. Their workflow, for my needs, would have taken me ~6hrs longer than the equivalent workflow in Photoshop. So I paid the $120 a year for Photoshop/Lightroom, because $120 is way less than 6hrs of my life. Of course, that was my specific case; it might not be true for others. The point, though, is that $120, at least for me, is not that much money relative to what I charge/get paid. So I gave in.
Further, Photoshop is a good example (to me) of software that can't stop updating. New formats come out, HEIC for example. New cameras with new raw formats come out. New tech comes out. HDR displays are ubiquitous at this point (all Apple products, some large percentage of Android devices, PCs, and TVs), which, BTW, Photoshop does not yet truly support, so expect an upgrade.
It appears to be $69.99 per month with an annual contract, $104.99 per month if month-to-month. But the point of subscription-based things is to make you forget and not notice the price increases.
> That backlash was short lived. Adobe went from $4.4 billion in revenue in 2012 to $23.7 billion
So? Anecdotally, the vast majority of Adobe product users are still upset about the subscription model (but not upset enough to switch to worse software)
> It used to cost $2500 for the "master collection". Now it's $50 a month.
They're upset, yet they're paying for it. It sounds like the software was underpriced, because people are still using it. Honestly, blame the consumers, not the businesses in these scenarios.
For Lightroom at least, no, because there are very few, or even no, good alternatives. It looks like there are a lot of photo editor apps out there, but most of them are crap or designed for different workflows. I can say that because I evaluated various options before begrudgingly accepting that Lightroom was the only decent choice.
The subscription model irks me because it's a bit overpriced and they keep trying to shove subscription features on us. No, I don't, and never will, care about ridiculously overpriced cloud storage or generative AI tools. How about Adobe fixes issues in the core product first? If given the choice, I would definitely choose a pay-once, no-upgrades licence. But Adobe saw their opportunity and started squeezing us for more on a product that was fine.
The plus side of this is it's motivated me to consider building my own photo editing software.
I’m surprised Capture One wasn’t able to meet your needs, as an ex-heavy Lightroom user who has been very happy with the transition to C1 with a perpetual license.
> Before the subscription model, if Adobe wanted to sell me another copy of Lightroom they had to work really hard to make useful features that people actually wanted, enough to the point they'd buy [the new] version.
What backlash against Adobe? I think you are mistaking comment section consensus for reality. People on forums and social media complain, but the comment section consensus is often dead wrong!
There was no real backlash against Adobe. They added subscriptions and grew revenue. Some people grumbled online, but they paid, which means they don’t like the old model, they like the new one.
There is absolutely no monopoly in photo editing software. Entering this market with a new product is fairly easy. I wonder what market (in software or outside it) you could name as more competitive.
The catch was that old boxed software eventually breaks on new OS versions or devices.
However, SaaS has the potential to "freeze" features while remaining functional 20+ years down the road. Behind the scenes, developers can update server dependencies and push minor fixes to ensure compatibility with new browsers and screen sizes.
From the end-user's perspective, the product remains unchanged and reliable. To me, that’s very good!
In the old days there was no expectation of when, or if, users would upgrade anything, so vendors had to take extra care to ensure compatibility or they would lose business. People in a single office could be running 6 different versions of Microsoft Office, and the same file had to be viewable and editable in all of them. A company could decide to upgrade to Office 2010 but stay on Windows XP, so the Office division had a financial incentive to ensure that newer versions would work on an older OS.
Nowadays the standard is "you must be on the newest version of everything all the time, or the app won't work". Don't want to upgrade to Win 11? Want to use Firefox instead of Chrome? Don't want all the bells and whistles that come with the newest version of the software? Too bad.
Because security fixes don't get backported, when they could, and few are still doing separate security vs. feature updates.
Even Windows is doing it now with CUs, bundling feature & vulnerability patches together, then deprecating the last version. You don't have a choice anymore, it's "accept the features or else"
Which resulted in businesses holding on to extremely old software and OSs because migration to the latest system was expensive and difficult. Now it's effortless.
SaaS has that potential but the reality is more often that the vendor gets acquired, or they just decide to stop supporting it, and shut it down. You have no options to keep running what you had, only to migrate to a replacement, which is likely another SaaS which will do the same thing.
> The catch was that old boxed software eventually breaks on new OS versions or devices.
We have great emulators and virtual machines these days. As long as something runs offline and isn't doing anything weird you can just install it once and keep using it and it will never break.
Or, as in the case of Microsoft Publisher, announce that it will be going away on a certain date with no recourse.
Before 10/26 I have to re-work my desk position manual and a deposit sheet which use Publisher and which MS Word is _not_ suited for. Probably will do them in LyX or LaTeX.
This is one of the biggest issues in software development: So few projects are willing to admit that they are finished. I can probably count on one hand how many software products I use every day that actually get better (or stay the same) on update. The vast majority of them peaked somewhere around v1.0, and are just getting worse every time the developer touches them.
Dropbox is a great example. It's now a fundamentally different product than the original, and has re-created exactly the problem the original solved. There's no longer a good cloud-synced folder tool; everybody has gone back to implementing network filesystems that are much more complex and a badly leaky abstraction.
I remember the first time I experienced this was an iOS app called Task Eater. It was simple To-dos. Attractive, snappy, everything you could need. The dev released a "final update" where he basically declared it was done. This was pretty early iOS (iPhone 4 era?).
The only problem is he never updated it to roll forward to future iOS version/iPhone models and it hasn't been usable for years (and years).
This made me search it up - the world moves so fast it's difficult to find any information on it whatsoever.
I think it's common in libraries of small to medium size. I often see Haskell and Rust packages that are no longer updated because full functionality has been achieved: no bugs and 100% test coverage.
I wrote my comment before I read yours, but it perfectly explains why software doesn't do that. Sadly I can't take credit for the core idea: https://news.ycombinator.com/item?id=47272024
I think the hard part is understanding if “finished” software is still maintained.
I’ll pick on AppZapper here. It doesn’t need to do much; it’s a Mac app that finds the files related to an application and deletes them, for a complete uninstall. It was released in 2006, with v2 coming out in 2010. The website looks like it’s still from that era, and the last update was in 2020, almost 6 years ago. To be fair, it’s a cool app, from the “delicious” design era. It makes a ray gun “zap” sound as it uninstalls an app, for no reason other than to be fun. It’s great.
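(For anyone curious, the core of an uninstaller like that is mostly a filesystem sweep over the handful of places macOS apps conventionally leave files. A minimal sketch; the directory list and function name are mine, not AppZapper's actual logic:)

```python
from pathlib import Path

def related_files(bundle_id, home=Path.home()):
    """Collect leftover files for an app, given a bundle id like
    'com.example.app'. Only a few conventional macOS locations are
    checked; a real uninstaller scans more (Logs, Containers, etc.)."""
    candidates = [
        home / "Library/Application Support",
        home / "Library/Preferences",
        home / "Library/Caches",
    ]
    hits = []
    for root in candidates:
        if root.is_dir():
            # e.g. Preferences/com.example.app.plist, Caches/com.example.app/
            hits.extend(root.glob(f"*{bundle_id}*"))
    return hits
```

(The real app presumably also matches on the app's display name and asks for confirmation before deleting anything; this only lists candidates.)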
Is it still in active maintenance with nothing to do, and there have been no meaningful changes in how apps are installed in the last 6 years, or has it been abandoned? I really don’t know. If I’m a new user trying to decide if I should pay $20 for this app, what do I do?
I don’t want updates for updates’ sake, as that leads to enshittification, but there needs to be some sign that a user isn’t throwing money at a dead product. Or, as in the examples above, developing a workflow around a dead tool.
I ran into a similar question with Yojimbo. It also launched in 2006. When v2 came out it was a paid upgrade, and it seemed fairly minor for how much was being charged (at least to a broke college student). It felt like they didn’t really care about it. But here we are 20 years later and it still seems to be going. It was 14 years between v2 and v3, though, and the last release was in 2023. With 14 years of support, maybe that v2 would have been a worthwhile upgrade for me in hindsight, but when they go years without an update, I start to question whether I should be looking at something with more support. Should a new user buy and invest time in putting their data into something that hasn’t been updated in 3 years, or is that a red flag?
Apple tends to expect a lot from its developers. Changing processor architectures is a big one. Unlike other platforms, Apple cuts support and moves on. If an app is abandoned, it starts to show sooner on Apple platforms than anywhere else.
If it’s something that will become critical to a user’s workflow, that can be a big problem. Why invest in an app without a future?
I prefer the perspective that a computer program is akin to a mathematical constant. This was true in the old days. A program I wrote in C64 BASIC, way back in 1980s, should still work precisely the same today (even on one of the freshly-manufactured Commodore 64 Ultimates).
You've honed right in on what's changed since the old days: Platform vendors (such as Apple) now continuously inject instability into everything.
You might argue that developments such as "changing processor architectures" justify such breaks from stability (though I myself have qualms with one of the richest companies in the world having a general policy of "cutting support and moving on"). But I would point out that Apple (and other vendors) create instability far beyond such potentially-justifiable examples.
To me, it appears as if Apple actively works to foster this modern "software is never finished" culture. This is seen perhaps most clearly in the way they remove apps from the iOS App Store if they haven't been updated recently (in as little as 2 years!): https://daringfireball.net/linked/2022/04/27/apple-older-gam...
Shouldn't we be demanding stability from our platforms? Isn't the concept of "finished software" a beautiful one? Imagine if you could write a useful program once, leave it alone for 40 years, and then come back to it, and find that it's still just as useful. Isn't this one of the most valuable potential benefits of software as a premise? Are the things we're trading this for worth it?
I keep looking for more useful DOS applications. I've had a git repo for 15+ years that has my config and virtual C:. Once something is installed it will just keep working, and I sync that repo to all my machines so I have the exact same things running with the exact same versions. Binaries from 40 years ago still run fine. I just wish more modern applications supported it.
I also have QEMU installations of some Windows versions and old Debian versions (I like Debian 3.0, since the entire library of packages fits neatly on a single DVD ISO... that was the last release that small). Those are also useful for having stable platforms to run offline applications on without having to be bothered by upgrades.
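(For the DOS side, a setup like this is mostly a matter of keeping the config next to the synced drive. A sketch of the idea using DOSBox's stock config format; the repo path here is hypothetical:)

```ini
# dosbox.conf, kept in the same git repo as the virtual C: drive
[sdl]
fullscreen=false

[autoexec]
# Runs on every launch: mount the repo's C: directory and switch to it,
# so every machine sees an identical drive with identical versions.
mount c ~/dos-repo/drive_c
c:
```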
> You mean that dumb app that forced you to move files into a single folder instead of adapting to your workflow was perfect?
Users in 2012 were overwhelmingly of the cohort whose metaphor for doing work on a computer was the filesystem. You opened Files with Programs, worked on them, saved them. You wanted your latest Files on all your computers (1), and you wanted to Share them (2).
Unless users were on a system managed by a sysadmin there were only really two solutions for problems (1)&(2): you would Email the File (A), or you would copy it to a Floppy/CD/USB (B) and physically move it.
Note the caveat of "in absence of a sysadmin". So either on a school or corporate work environment, or if you happened to have a geek in your family/social group who did it as a passion project. Or y'know, if _you_ were the geek you could roll your own.
While you're there, note the tag line in the title of the post "Throw away your USB key"
Now rereading your comment it is clearly an example of exactly what OP was referring to:
> It takes real courage for a builder to say, "It’s good enough. It’s complete. It serves the core use cases well." If people want more features? Great, make it a separate product under a new brand.
If Dropbox did not "adapt to your workflow", then just _don't use Dropbox_.
Instead you attack it as "dumb" and demand it change...and those users for whom Dropbox _was_ perfectly adapted don't have their solution anymore.
Software doesn't have to be forever changing and chasing user growth; it's not a zero-sum game. The bits don't care if no one uses them. But _people_ care if you take away their bits.
> ...we shouldn't normalize that, we should push for improvements
Agreed, you should create a new solution and put it out there! Just as suggested by the post you've replied to :)
AI will create ever more AI-generated synthetic content because current systems still can't determine with 100% certainty whether a piece of content was produced by AI. And AIs will, intentionally or unintentionally, train on synthetic content produced by other AIs.
AI generators don't have a strong incentive to add watermarks to synthetic content. They also don't provide reliable AI-detection tools (or any tools at all) to help others detect content generated by them.
Maybe some of them already embed some simple, secret marker to identify their own generated content. But people outside the organization wouldn’t know. And this still can’t prevent other companies from training models on synthetic data.
Once synthetic data becomes pervasive, it’s inevitable that some of it will end up in the training process. Then it’ll be interesting to see how the information world evolves: AI-generated content built on synthetic data produced by other AIs. Over time, people may trust AI-generated content less and less.
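(For what it's worth, the "simple, secret marker" idea has a well-studied statistical form for text: bias generation toward a keyed pseudo-random "green list" of tokens, then detect by measuring how over-represented green tokens are. A toy sketch of the detection side only; the hashing scheme and numbers are illustrative, not any vendor's actual method:)

```python
import hashlib
import math

def green_fraction(tokens, key="demo-key", gamma=0.5):
    """Fraction of tokens that land in the keyed pseudo-random
    'green list', where greenness is seeded by the previous token."""
    green = 0
    for prev, tok in zip(tokens, tokens[1:]):
        h = hashlib.sha256(f"{key}:{prev}:{tok}".encode()).digest()
        # token counts as "green" if its hash falls in the lower gamma slice
        if h[0] / 255.0 < gamma:
            green += 1
    return green / max(len(tokens) - 1, 1)

def z_score(frac, n, gamma=0.5):
    """How many standard deviations the observed green fraction sits
    above the expectation (gamma) for unwatermarked text of length n."""
    return (frac - gamma) * math.sqrt(n) / math.sqrt(gamma * (1 - gamma))
```

(Text generated with the matching bias should score a z-value well above chance, while plain text hovers near 0. The catch the thread points at remains: this only works if the generator cooperated, and paraphrasing, or retraining on the text, can wash the signal out.)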
I really hope SynthID becomes a widely adopted standard - at the very least, Google should implement it across its own products like NotebookLM.
The problem is becoming urgent: more and more so-called “podcasts” are entirely fake, generated by NotebookLM and pushed to every major platform purely to farm backlinks and run blackhat SEO campaigns.
Beyond SynthID or similar watermarking standards, we also need models trained specifically [0] to detect AI-generated audio. Otherwise, the damage compounds - people might waste 30 minutes listening to a meaningless AI-generated podcast, or worse, absorb and believe misleading or outright harmful information.
Earlier this year, we at Listen Notes switched to Better Stack [0], replacing both Datadog and PagerDuty, and we couldn’t be happier :) Datadog offers a rich set of features, and as a public company, it makes sense for them to keep expanding their product and pushing larger contracts. But as a small team, we don't have a strong demand for constant new features. By switching to Better Stack, we were able to cut our monitoring and alerting costs by 90%, with basically the same things that we used from Datadog previously.