Hacker News
Why Did Thomas Harriot Invent Binary? (springer.com)
117 points by nanna on April 22, 2023 | hide | past | favorite | 48 comments


Fascinating! I was unaware of Thomas Harriot.

If you're interested in Leibniz and his invention of binary, I highly recommend Wolfram's post on that: https://writings.stephenwolfram.com/2013/05/dropping-in-on-g...

I have a project on the backburner to create a "Handwriting Simulator", that you could give a task/equation to, and have it generate animations of performing that calculation as a human would with a pen and paper according to a named writing system (binary, roman numerals, hindu-arabic, et cetera). I would like to compare and contrast the efficiency of handwriting systems, and perhaps explore if there are some potential new ones yet to be invented. I've come full circle to the opinion that I've been undervaluing 2D handwritten notations as tools for thought. If anyone has any pointers I'd appreciate it! This reminds me of that, because I'd expect binary notation by hand to be extremely inefficient compared to other systems, so there was little/no value for it until logic machines.


Harriot is quite an interesting character. He spent time in the Virginia colony and taught himself the Native American Algonquian language. He kept extensive journals of the natural world in the Americas, but they were jettisoned when he was leaving Virginia, and he had to try to recreate his notes from memory when he returned to England.


note also that napier invented binary arithmetic independently a few years later, also before leibniz

https://en.wikipedia.org/wiki/Location_arithmetic

unlike harriot, he published his method, in detail, in 01617, in the same book as napier's bones (but three years after his natural logarithms had replaced prosthaphaeresis with trig functions)

https://archive.org/details/rabdologiae00napi
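For the curious: per the linked article, location arithmetic writes a number as a sum of distinct powers of two named by letters (a = 1, b = 2, c = 4, ...), and multiplication becomes addition of letter positions. A rough sketch in Python (function names are mine, not Napier's):

```python
def to_location(n):
    """Encode a positive integer as location numerals: the letters
    a, b, c, ... stand for the powers of two 1, 2, 4, 8, ..."""
    letters = []
    bit = 0
    while n > 0:
        if n & 1:
            letters.append(chr(ord('a') + bit))
        n >>= 1
        bit += 1
    return ''.join(letters)

def multiply_locations(x, y):
    """Each letter pair (i, j) across the two operands contributes
    the letter at position i + j, since 2**i * 2**j == 2**(i + j)."""
    total = 0
    for i in range(x.bit_length()):
        for j in range(y.bit_length()):
            if (x >> i) & 1 and (y >> j) & 1:
                total += 1 << (i + j)
    return total
```

to_location(9) gives 'ad' (1 + 8); multiply_locations mirrors how counter positions on Napier's board get added pairwise.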


https://wikipedia.org/wiki/Ancient_Egyptian_multiplication

"Ancient Egyptian multiplication" is also worth mentioning if slightly off topic.
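The Egyptian method is essentially binary multiplication by doubling and halving; a minimal sketch:

```python
def egyptian_multiply(a, b):
    """Multiply by doubling and halving: halve b, double a, and add up
    the doublings of a wherever b's remainder is odd (a set binary digit)."""
    total = 0
    while b > 0:
        if b % 2 == 1:   # this row is "marked" in the Egyptian layout
            total += a
        a *= 2           # double one column
        b //= 2          # halve the other
    return total
```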


Worth noting that since Common Core came out in 2010, this is how students in most US states are taught multiplication, only instead of partitioning by powers of two they partition by decimal digits (e.g. 4 * 56 --> 4 * 50 + 4 * 6). Conceptualizing multiplication this way equips you well for handling large numbers in your head.

A high school student today might be shocked to see the sort of complexity third- and fourth-graders in prior generations used to contend with for basic operations:

4 * 56 ---> 4 * 6 -> carry the 2, keep the 4 -> 4 * 5 -> (previous step's result) + (carried 2) -> (previous steps' result) * 10 + (kept 4)

This algorithm had to be expanded for each additional nonzero digit in each operand. Most of us who did well before Common Core passively learned the Common Core method independently, out of necessity.
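The decimal partitioning described above is just the distributive law over place values; a quick sketch of the partial-products idea (function name is mine):

```python
def partial_products(a, b):
    """Split b by decimal place value and multiply a by each part,
    e.g. 4 * 56 -> 4 * 6 and 4 * 50; summing the parts gives the product."""
    parts = []
    place = 1
    while b > 0:
        digit = b % 10
        if digit:
            parts.append(a * digit * place)
        b //= 10
        place *= 10
    return parts
```

partial_products(4, 56) gives [24, 200], and 24 + 200 = 224.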

Lattice method is badass though.


> A high school student today might be shocked to see the sort of complexity third- and fourth-graders in prior generations used to contend with for basic operations

Really? The classic multiplication algorithm is completely mechanistic. You just need to learn the multiplication table and then even a trained monkey can do it.

Then once a student learns and practices the mechanistic way, you can easily explain the math that lies behind it. The reverse order doesn't really work.

I'm working with high-school students, and I'm already seeing the effects: students who can explain to me how the algorithm works, but then struggle to actually multiply even 3-digit numbers.

Oh, and the lattice multiplication is the worst. It completely obscures the math behind it.


> The classic multiplication algorithm is completely mechanistic. You just need to learn the multiplication table and then even a trained monkey can do it.

So's the 'Common Core' one though? (I didn't go to school in the US, learnt both ways in the UK before 2010.)

The difference is it requires some addition, vs. 'long multiplication' requires some carry. I would always prefer addition personally, assuming I'm doing it in my head. And frankly if I'm not doing it in my head then I'm typing it into a calculator, not writing it out.


I completely agree. I recently taught my 10-year-old daughter how to do multiplication and division the old-fashioned way, and she was blown away by the fact that it was so easy, faster, and worked with any number of digits.


Some people can't memorize times tables, so the common core algorithm might be more generally applicable for those people even if it's less efficient than using a table. There's some value in finding an approach that works for everyone as an educator instead of one that works for Most if you can do that, I think.

In CPU terms, maybe you don't always have the die space for all the tables you need to make a blazing fast implementation of mul or sin or cos, but you can afford to fit a slow iterative solution instead.


> Some people can't memorize times tables

Well, perhaps then schools need to be fixed? Memorization is a skill like anything else, and it can be trained via a variety of ways (memory palaces, mnemonics, writing stuff down by hand).

If a student can't remember ~20 non-trivial entries in the multiplication table, then they likely will struggle with anything else non-trivial.

> There's some value in finding an approach that works for everyone as an educator instead of one that works for Most if you can do that, I think.

Also known as "dumbing down the education".

There are trade-offs everywhere. The lattice method simplifies bookkeeping on paper, but try to visualize it for mental math and you'll likely fail, while the classic textbook algorithm is fairly easy to do in your head.

So the lattice method is optimized for a use-case that rarely matters in practice (multiplying numbers on a piece of paper without access to a calculator), while it's pessimized for a use-case that actually matters once in a while: pure mental math.


I am one of those people that cannot memorise times tables.

That hasn’t stopped me from getting a degree in computer science, or from working as a software engineer at a FANG for the nearly 10 years since I graduated. I have provided value.

I can do just fine in the real world without having memorised the times tables.

Different people have their heads wired differently. I was lucky enough to be in a state school that had a great learning support program, and some teachers that saw potential in me despite struggling with some conventionally trivial aspects of education.

The solution after diagnosis[1]? Just give me a calculator (I was allowed to use one in the normally “non calculator” exams). And in a pinch, I can still do it by hand but slowly. I often don’t just “know” the result, and that’s something I’m fine with not being “fixed” about me.

[1] https://www.bdadyslexia.org.uk/dyscalculia/how-can-i-identif...


> I am one of those people that cannot memorise times tables.

Have you actually tried that? Just rote-learning it as a chant or using any other method? Or by copying it down by hand several times?

Memorization is a trainable skill, and it's often overlooked in the US.

> I can do just fine in the real world without having memorised the times tables.

You can live just fine without long multiplication, division, and even basic math, physics or chemistry. You can even get university degrees. It's absolutely true.

It's not the question of necessity.


I'm not iglio but I also have issues with memorization in general, despite effort. Probably due to a head injury I got as a child, but hard to really be sure.

I work around it the same way in practice; having access to google so I can look things up is really helpful. I can do the actual work on paper if I have access to formulas or reference materials, but I've forgotten things like long division as many times as I've learned them, and I was never able to memorize times tables successfully. Naturally this was a hindrance to classwork. I had one great math teacher who recognized that memorization was a problem and allowed me a reference book of formulas during my finals, so I passed with no issues since I could do the math, but most other uni courses were a huge challenge. Courses where you're allowed to use a calculator were great in comparison.


My memory was not great either. It's still not that great, so I'm learning Chinese to train it (memorizing Hanzi characters helps).

For me it's outlandish that somebody can forget long division. I literally haven't done it in 20 years, but I've just tried it and I can do it fine.

I guess not using a calculator at school helped a lot. My math and physics classes all had arithmetic that could be done mentally, or occasionally on a piece of paper.


The way I learned it was by keeping an actual printed table handy to look at during multiplication problems. For me at least, that didn't take long and never required memorizing "for its own sake" -- I don't know if that'd generalize, since I don't know of anyone else going that way.

Learning by chant or whatever just seemed super boring (and prone to teaching sequential access rather than random access).

The same principle helped for touch typing and dvorak: tape a layout card to your screen at first.


Back when I was at school, we had the multiplication table printed on the back cover of all students' workbooks. After a while, I just remembered it.

We also had mnemonic chants for some non-trivial entries.


Your conclusion that schools should be fixed is correct, but the important part is making it flexible enough to handle different kinds of students.

As long as the school system teaches and treats everyone exactly the same, there will be people failing because they can't keep up, and others who are so bored out of their minds (and then punished for it) that they start hating education and learning.


> There's some value in finding an approach that works for everyone as an educator instead of one that works for Most if you can do that, I think.

"Dumbing down" really seems to be an apt description of this. It's interesting that the US school math scores started to go down after the Common Core implementation, after decades of gradual rise: https://nces.ed.gov/fastfacts/display.asp?id=38


For some reason we had something like Common Core math in my elementary school in the early 2000s, and I just had a traumatic flashback. Our teacher forced us to do EVERY sum like this:

17890 + 456 = 1 x 10000 + 7 x 1000 + 8 x 100 + 9 x 10 + 0 x 1 + 4 x 100 + 5 x 10 + 6 x 1 = 1 x 10000 + 7 x 1000 + (8 + 4) x 100 + (9 + 5) x 10 + (0 + 6) x 1 = 1 x 10000 + 7 x 1000 + 12 x 100 + 14 x 10 + 6 x 1 = 10000 + 7000 + 1200 + 140 + 6 = 10000 + 7000 + 1 x 1000 + 2 x 100 + 1 x 100 + 4 x 10 + 6 = 10000 + 7000 + 1000 + 200 + 100 + 40 + 6 = 10000 + 8000 + 300 + 40 + 6 = 18346

We had to write down everything and weren't allowed to skip ANY step, let alone write down the answer directly. Being taught sums and multiplications graphically is cool, I guess, but this was outright psychological torture.
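For what it's worth, the drill above is ordinary place-value expansion with the carries resolved at the end; a sketch of the same idea (names are mine):

```python
def expanded_sum(x, y):
    """Add two numbers the expanded way: split each into digit-per-place
    coefficients, combine coefficients per place, then total it all up."""
    coeffs = {}
    for n in (x, y):
        place = 1
        while n > 0:
            coeffs[place] = coeffs.get(place, 0) + n % 10
            n //= 10
            place *= 10
    # e.g. 17890 + 456 -> {1: 6, 10: 14, 100: 12, 1000: 7, 10000: 1};
    # summing coefficient * place resolves oversized coefficients like 14 x 10
    return sum(c * p for p, c in coeffs.items())
```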


My English teacher in the seventh grade had us write a short story without a single pronoun and without repeating any adjective, verb, or noun. It was basically a hazing, and probably the biggest factor in my being an ok writer.


Wait, did it reduce the quality of your writing or increase it?


It was a helpful exercise for stretching one's ability to express meaning. The constraints were there to force you to write more thoughtfully; the stories were probably trash even for that level, but the pacing and practice were what mattered.


This has some value in reinforcing that decimal number representation is just a shorthand for polynomials, but it's definitely misguided as a "normal" method of addition.


> A high school student today might be shocked to see the sort of complexity third- and fourth-graders in prior generations used to contend with for basic operations

Evidently a college graduate today would be shocked to see the sort of complexity a high school student was expected to master, if the paper of record is to be trusted[1].

[1] http://graphics8.nytimes.com/packages/pdf/education/harvarde...


humorous, but notice how few subjects there are in that exam. Students in that era were not subject to relentless streams of ad images, entertainment, or other media. The experiment in media that has emerged over the last few decades has reached no conclusion with regard to attention span, developmental years for education, or emotional coping.


I dunno. I never heard of the "common core" multiplication algorithm, and my 3rd-grader kid only knows "regular" multiplication. He grasped it from the first few examples and had no problems with it. "Common core" seems like unnecessary complication compared to the regular algo, which works on arbitrary numbers: 7593 x 267, everything done in one go, versus tons of workarounds and additions. Again, my kiddo does these without blinking and I see no reason to teach him differently.


For anyone else not making sense of the old method, that's because it should have '(previous step's result) + (carried 2)'


woops thank you


Now do the necessary steps to see how many it takes to multiply large numbers, let's say 1'785'375 * 7'367'594, in one method vs. the other, and then come back and tell me if you'll ever use anything other than the lattice one.


If I needed to multiply a pair of seven digit numbers, I would probably approximate them by dropping all but, say, the two most significant digits of each. If that amount of wrongness is a big problem, then I probably should have used a calculator in the first place.
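Rounding both operands to two significant digits, as suggested, keeps the product within a couple percent here; a quick sketch (helper name is mine):

```python
from math import floor, log10

def round_sig(n, digits=2):
    """Round n to the given number of significant digits."""
    if n == 0:
        return 0
    scale = 10 ** (floor(log10(abs(n))) - digits + 1)
    return round(n / scale) * scale

a, b = 1785375, 7367594
approx = round_sig(a) * round_sig(b)   # 1800000 * 7400000
error = abs(approx - a * b) / (a * b)  # about 1.3% off for these inputs
```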


I hate when I'm trying to figure out how many guests my friend's wedding venue will fit with 1,785,375 to a table and 7,367,594 tables available, Common Core fails me and I've forgotten how to lattice ;)


> only instead of partitioning by powers of two they partition by decimal digits (e.g. 4 * 56 --> 4 * 50 + 4 * 6).

When I was in secondary school we called this “factoring.” Distributivity is great.


Being in elementary school in the 70s, I do two things:

- On paper I do 4*56 etc.

- In my head I always did 4*50 + 4*6


I've always heard it described as Ethiopian Binary, surprised to see the wiki credit discovery to Egyptians.


we have evidence from the 3975-year-old moscow papyrus that the egyptians knew the algorithm then

well, one egyptian hacker at least

possibly they had learned it from ethiopian hackers but we don't have any evidence of what math the ethiopians did or didn't know at the time; the earliest ge'ez inscriptions are from the iron age, a thousand years later, and as far as i know don't talk about multiplication algorithms

it's hard to know what mathematical sophistication the oral traditions or possible other lost forms of writing contained; an ethiopian version of something like khipu or wampum or beeswax tablets or tamil palm-leaf books could have been arbitrarily sophisticated but completely lost by now

but that means the people who describe it as ethiopian binary don't know either; they're just making up guesses based on no evidence


Now I know that one of the ways I do it in my head has been formalized and given a name.


The quest for the root and archeology of ideas and applications is fascinating.

With all due respect to well-known scientists like Leibniz, the distribution of your ideas was, and remains, the key to getting your name attached to them. It is always nice to keep searching for prior influences and inventions.


Unless people are willing to forge "unpublished manuscripts" to gain priority. Need to be careful with that stuff: even when those manuscripts are legit, the fact that they were not published immensely increases the complexity of establishing priority. Imagine some goat herder invented binary in 4000 BC without telling anybody.


Vernor Vinge had an interesting take on this idea (1969):

https://en.m.wikipedia.org/wiki/Tatja_Grimm%27s_World


A similar notation for fractional units is used to write formulas for tinting paint. Sometimes you'll see paint color formulas written as a matrix of fractional fluid ounces for each colorant, but they skip some fractions so that the positional values are sometimes 2 or 3. I suppose this makes it a base 4 representation.


Random:

Anytime is a good time to learn to count in binary on your fingers. It's a nifty trick, especially handy for computer folks.
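Finger binary assigns one bit per finger, so one hand counts to 31 and two hands to 1023; a tiny sketch of the mapping (the finger ordering is my arbitrary choice):

```python
def fingers_for(n, num_fingers=10):
    """List which fingers to raise (0 = the first finger, the 1s bit)
    to show n in finger binary."""
    if not 0 <= n < 2 ** num_fingers:
        raise ValueError("out of range for this many fingers")
    return [i for i in range(num_fingers) if (n >> i) & 1]
```

fingers_for(5) raises fingers 0 and 2; counting up just increments the pattern like a binary odometer.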


Hrm.. (remembering the first episode of Sons of Anarchy)


[flagged]


His name is "Leibniz" not "Liebnitz". Sorry but that's two mistakes in a seven character word.


Per the first paragraph of the article, Harriot at the very least beat Leibniz by a century or so.


But it appears Leibniz gave credit to the ancient Chinese.


Sounds like Harriot invented it independently and didn't know about the ancient Chinese.


They didn't do anything meaningful with it.


Vinary?



