
`git add` can be quite slow when handling large files, in large repositories, with a large index, or on slow platforms. For instance, this optimization in git brought the runtime of `git add .` on Windows with 200k files from 6s to 3s: https://github.com/git/git/commit/d1664e73ad96aa08735bf81d48....

100ms, let alone 3s, is much too long a wait, so Sublime Merge predicts the outcome of staging and presents that immediately. This made a noticeable improvement to responsiveness even on small repositories under Linux.



I've never noticed a delay in my life.

So, "predicts outcome", what does that even mean? I know the outcome of `git add` is... the file being added. And it has to run the command in the end anyway.


It means that the outcome is shown immediately on the next rendered frame instead of waiting for the command to complete, then reading the index and updating the UI from that. You knowing the outcome is exactly the point; the UI needs to be responsive immediately because you're not waiting to find out what happens.

As an example: let's say it takes 200ms to run `git add` and then another 100ms to read the index, and you want to stage 3 files. You click the first stage button and nothing happens, so you move to click the next stage button; now the UI updates and removes that first file, shifting what you're about to click on. This behavior was extremely annoying when it came up in testing.

With prediction, here's what happens instead: you click the first stage button, and the UI immediately updates to remove that file; you move to click the 2nd, which again is immediately removed. Transparently in the background, `git add` is run and we confirm at the end that the result is as predicted.
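The pattern described above (predict on the next frame, run `git add` in the background, then verify against the real index) can be sketched roughly as follows. This is a minimal illustration, not Sublime Merge's actual implementation; the `OptimisticStager` class, its threading layout, and the rollback callback are all assumptions for the sketch:

```python
import subprocess
import threading


class OptimisticStager:
    """Optimistic staging sketch: UI-facing state is updated immediately,
    while `git add` runs in the background and the prediction is checked
    against the real index afterwards."""

    def __init__(self, repo_path):
        self.repo_path = repo_path
        self.staged = set()          # predicted staged files (drives the UI)
        self.lock = threading.Lock()

    def stage(self, path, on_mismatch):
        """Predict the result now; confirm it in the background.
        Returns the background thread so callers can join it if needed."""
        # 1. Predict: the UI can render this file as staged on the
        #    very next frame, with no wait on git.
        with self.lock:
            self.staged.add(path)
        # 2. Run `git add` and verification off the UI thread.
        t = threading.Thread(
            target=self._add_and_verify, args=(path, on_mismatch),
            daemon=True,
        )
        t.start()
        return t

    def _add_and_verify(self, path, on_mismatch):
        subprocess.run(["git", "add", "--", path],
                       cwd=self.repo_path, check=True)
        # 3. Confirm: re-read the index and compare with the prediction.
        #    `git ls-files --cached` lists files in the index and works
        #    even before the first commit.
        listed = subprocess.run(
            ["git", "ls-files", "--cached"],
            cwd=self.repo_path, capture_output=True, text=True, check=True,
        ).stdout.splitlines()
        with self.lock:
            if path not in listed:
                self.staged.discard(path)  # roll back the wrong prediction
                on_mismatch(path)          # e.g. trigger a full UI refresh
```

The key property is that the click-to-render path never touches git at all; git only runs to confirm (or, rarely, correct) what the UI already showed.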



