
No, it's not.

Dividing one day into just 1000 units is way less precise unless one uses decimals, in which case it's just plain inconvenient.



I think the idea is that for most human uses of time we don't specify start or end times to a precision of more than about 5 minutes. For stuff like train timetables you might want to go down to about a minute. So one could argue that we have at least 60 times the resolution we really need for day-to-day use.

If you absolutely need more precision (accurate timestamping) then decimals are available.


I take it you haven't used a microwave recently. Or done any other cooking.


Yep, though most people use microwaves by pressing the "30s" button (I guess it would be labelled 1/2 or 1/3) n times. Other cooking seldom requires time precision of less than 1 minute; for finicky, precise things you usually watch the process and manage it by eye rather than relying on absolute time.


"Way less precise"? There are only 1440 minutes in a day, so a beat is 1 minute and 26.4 seconds, which is precise enough. And then, if you want more precision, you can divide a beat by 100 (@500.12), just as we use seconds to subdivide minutes; that's no less convenient than using seconds.
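The arithmetic above can be sketched in a few lines. This is a minimal illustration assuming a day is simply split into 1000 beats of 86.4 seconds each (it ignores the UTC+1 offset that Swatch Internet Time actually uses); `to_beats` is a hypothetical helper name, not a real API:

```python
BEAT_SECONDS = 86_400 / 1000  # 86.4 s: one beat, i.e. 1 minute and 26.4 seconds

def to_beats(hours, minutes, seconds=0.0):
    """Convert a local clock time to beats since midnight (0 to 999.99...)."""
    total_seconds = hours * 3600 + minutes * 60 + seconds
    return total_seconds / BEAT_SECONDS

# Noon is halfway through the day, so it lands exactly on @500.
print(f"@{to_beats(12, 0):.2f}")          # noon
# A centibeat (1/100 of a beat) is 0.864 s, comparable to one-second precision.
print(f"@{to_beats(12, 1, 26.4):.2f}")    # exactly one beat after noon
```

Two decimal places give resolution of about 0.9 seconds, which is the sense in which "dividing a beat by 100" recovers second-like precision.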



