Having used CRT monitors, 1920x1080 displays, 4K displays and 5K displays, as well as various Retina MacBooks over many years, mostly for coding, here's my opinion:
The only good solution today is the iMac 5K. Yes, 5K makes all the difference — it lets me comfortably fit three columns of code instead of two in my full-screen Emacs, and that's a huge improvement.
4K monitors are usable but annoying: the scaling is just never right, and fonts end up blurry.
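There's a concrete reason behind the blurriness, at least on macOS (I'm assuming a macOS setup here, which fits the iMac context): the system renders the UI at 2x the chosen "looks like" resolution and then downsamples to the panel's native pixels. A 5K panel at the default "looks like" 2560x1440 downsamples at exactly 1:1; a 4K panel at the same setting needs a fractional ratio, so every glyph gets resampled. A quick sketch:

```python
def downsample_ratio(native_w, native_h, looks_like_w, looks_like_h):
    """Ratio between the 2x-rendered backing store and the native panel.

    macOS draws at 2x the 'looks like' resolution, then scales that
    backing store down to the panel. A ratio of exactly 1.0 means
    pixel-perfect output; anything else means resampled (softer) text.
    """
    return (2 * looks_like_w / native_w, 2 * looks_like_h / native_h)

# iMac 5K panel (5120x2880) at the default "looks like" 2560x1440:
print(downsample_ratio(5120, 2880, 2560, 1440))  # (1.0, 1.0) -- pixel-perfect

# 4K panel (3840x2160) at the same "looks like" 2560x1440:
print(downsample_ratio(3840, 2160, 2560, 1440))  # (1.333..., 1.333...) -- resampled
```

So it's not that 4K is "too few pixels" in the abstract; it's that 4K sits awkwardly between the clean integer scale factors.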
Built-in Retina screens on MacBooks are great, but they're small. And they fit only two columns of code, not three.
One thing I've noticed is that as I (a) get older and (b) work on progressively more complex software, I need to hold more information on my screen(s). Those three columns of code? I often wish for four: ClojureScript code on the frontend, API event processing, domain code, database code. Being older matters, too, because short-term memory gets worse, and it's better to have things on screen at the same time than to switch contexts. I have high hopes for 6K and 8K monitors, once they cost less than an arm and a leg.
So no, I don't think you can develop using "tiny laptops with poor 1366x768 displays". At least not all kinds of software, and not everyone can.
> So no, I don't think you can develop using "tiny laptops with poor 1366x768 displays". At least not all kinds of software, and not everyone can.
This opinion seems bizarre to me. You start by offering a personal (and valid) anecdote, then end up saying "I don't think you can develop [...]". But this flies in the face of the evidence. The vast majority of people do not use your preferred monitor setup (iMac 5K), and in my country a huge number of developers use 1366x768 to develop all sorts of high-quality software.
It's one thing to say "as I grow older, I find I prefer $SETUP". No one can argue with that; it's your opinion (and it might very well become mine as I grow... um, older than I already am!). It's an entirely different thing to claim, as you do here and as I think TFA does in similar terms, "you cannot prefer lower-tech setups", "you cannot develop software this way", "it's very difficult to develop software without $SETUP". The latter is demonstrably false! I've seen it done, again and again, by people who were masters of their craft.
I don't think they were doubting that somebody does develop on those random setups; they were disagreeing with the people who say it doesn't matter and you can code anywhere. The "you" in your quote is a generic you.
But they are not random setups. They are extremely common setups in my part of the world. People -- who are pretty good at what they do -- can and do develop on these tiny screens. In this regard, "it doesn't matter". Or, taking it less literally: they wouldn't complain if they got a better monitor, but it's not a primary concern for them. So, taking a cue from TFA's title: "no, it's not time to upgrade your monitor".
Following your logic, I can't claim anything, because there is always someone, somewhere, who will come up with a contrary opinion.
I do respect your opinion, but I still hold to mine. I also think this discussion won't lead anywhere, because we are glossing over the terms. Not every "developer" is the same, and not every piece of "software" is of the same complexity. I can fix CSS in a single 80x25 terminal; I can't do that when thinking through changes to an ERP system.
Note I do not dispute your opinion. You're entitled to it and you know what works for you.
Regrettably, following (my) logic does mean you cannot say "you [the generic you] cannot develop like this", because this is indeed easily disproven. People can and do (and sometimes even prefer to). That's the problem with generalizing from your personal opinion ("I don't like this") to the general ("people cannot like this").
Ya, the iMac 27" 5K is ideal. I got an LG 28" 4K just to interface with an MBP... it was a painful compromise, but standalone 5K monitors are just too expensive right now.
When we're talking pixel count, we also have to talk about the size of the display. A 28" 4K is acceptable; a 40"+ 4K is best used as a TV.
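One way to make "pixel count plus size" concrete is pixel density (PPI). A quick sketch, using the sizes and resolutions mentioned in this thread:

```python
import math

def ppi(h_px, v_px, diagonal_in):
    """Pixels per inch: diagonal pixel count divided by diagonal size."""
    return math.hypot(h_px, v_px) / diagonal_in

print(round(ppi(3840, 2160, 28)))  # 28" 4K:  ~157 PPI, dense desktop monitor
print(round(ppi(3840, 2160, 40)))  # 40" 4K:  ~110 PPI, TV-like density
print(round(ppi(5120, 2880, 27)))  # 27" 5K:  ~218 PPI, Retina territory
```

Same 4K pixel count, very different densities, which is why the 40" panel reads as a TV at desk distance.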
The best display I use right now is an 11" iPad Pro with a 120 Hz refresh rate. You really can feel it, especially for stylus work.