
> It is also a demonstration of how much performance we have left behind as we have adopted laptops as the mainstream computing platform.

I don't know. Desktops simply tend to have a much longer lifespan, and so far every new laptop I've owned in the past 10 years has outperformed my then-current desktop PC for that precise reason (at the very least in certain aspects, like performance per watt).

What I'm rather afraid of is other manufacturers adopting Apple's insane vendor lock-in and total hardware lock-down. Any failure in an M1 system results in complete data loss and a logic board replacement: RAM on the SoC, SSD soldered to the board, etc. No more choice and zero upgradability. The oldest components in my current desktop are 12+ years old, but the rest has been gradually upgraded (SSD, GPU, CPU, additional RAM, HDD replacements, etc.)

Such things won't be possible in the brave new world of SoCs...



When it comes to PC desktops, I do feel there might be more money to be made in preserving the modularity that lets PC enthusiasts and builders assemble their own computers. There is a massive consumer market for it.

People aren't going to buy new desktops every 1-2 years, but many of them will buy new GPUs, or cases, or fans, etc. The best part is that we get to upgrade the components that will actually provide better performance. For instance, why would I upgrade my entire machine just for a better graphics chip if my SSD, CPU and RAM are still in tip-top shape? Short answer: I won't.


In reality those things also change depending on your luck and how 'in sync' you are with the industry. Just upgraded your DDR3 RAM in 2013? Get ready for DDR4, which will require both new RAM and a new motherboard - and any new high-performance CPU will only be compatible with DDR4, so you're forced to upgrade lest you be stuck on an old standard. This is about to happen with DDR5 as well, by the way.
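
(Aside: before planning a RAM upgrade, it's worth checking what memory generation and speed your board is actually running. On Linux, something like this should show it - a quick sketch, assuming the dmidecode tool is installed:

    # Print the type (e.g. DDR4) and speed of each installed module
    sudo dmidecode --type memory | grep -E '^\s+(Type|Speed):'

Each populated slot shows up as a Type/Speed pair, e.g. "Type: DDR4".)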


Absolutely, there are certain upgrades that require replacing several components at once, but those are few and far between. Even then, it doesn't require a new GPU, case, case fans, storage media, etc.

DDR4, as an example, was released 7 years ago, and DDR5 has still yet to be released, let alone become a required upgrade for a new CPU.


Depends. I always used my desktops for so long that when it came time to replace anything, I just had to buy a new one anyway due to the cascade effect of upgrades.

There were a couple of exceptions, where I just got lucky that old stuff was still around to buy.


I have had the same experience. I built a desktop telling myself I'd be able to upgrade parts when I was no longer satisfied with the performance. It turned out that by then the sockets had changed enough that it didn't make much sense.


It's a sliding scale, and the original commenter was talking about Apple's approach, where the entire board would need to be replaced. When it comes to PC desktops, you can still upgrade the CPU, RAM and mobo without needing to replace the GPU (costly), power supply and storage devices.

PCs have always been this way: sometimes you get lucky and can use the same socket for a long time, other times you don't.

For me, if I wanted the latest CPU I would need to upgrade my mobo and CPU, but not my RAM, my GPU ($1500), power supply or my SSDs. If I were on a Mac and one of these parts broke, I would need to buy all of the internals again, since they're soldered to the main board.


That is why around 2006 I went 100% into workstation-class laptops + a docking station, and I don't miss desktops at all.


Right, and that's fair, but clearly my post was in response to the user who commented that, with increased integration/soldering of parts, if one part fails you need to throw the entire thing out rather than upgrading that specific component.

In your case, if nothing fails in your machine, both options don't impact you at all. Whereas if a part does fail, you might be up for a more expensive repair bill than before.



