This may be slightly OT - but the one thing I have always wondered about the 19th century is why Babbage didn't go down the route of Boolean logic/math for computation. Furthermore, why didn't he incorporate and use relays (from telegraphy)? For that matter, why didn't Hollerith?
Ultimately, to me it seems all connected to this "fixed" idea (that is, thinking "inside the box") that computation was about numbers - that there was only calculation and tabulation - counting, iow.
While Boole came up with the math/logic - few pursued the idea of making machines to work with it (a few -did- exist, but were obscure, iirc); embodying logic using electrical circuits would have to wait. Then there was the "revelation" of computation being about symbols, not numbers - which opened everything up.
I'm just kinda rambling here; my apologies. And, hindsight is always easier than foresight, of course. It just bugs me, and makes me wonder what "obvious" ideas we are missing today, that will change the world, that we won't know until we look back. I guess I wonder why many of the most world-changing ideas (in the long run) are often very simple, and why it took so long for humanity to see/discover them. I wonder if anyone has studied this - or if it is something that can be studied...?
The first relay was invented by Henry in 1835. Babbage built his first difference engine in 1822. Mechanical arithmetic dates from the Leibniz stepped reckoner of 1694, which, like Leibniz, was way ahead of its time. It had a hardware multiplier. Babbage clearly knew about that. He was basically adding control logic around mechanical arithmetic.
As for Hollerith, he did use relays. The 1890 Census gear was just counters, but he followed up, establishing the Tabulating Machine Company, which became the Computing-Tabulating-Recording Company, which became International Business Machines Corporation - today's IBM. Tabulators gained more relays and electrical components over the decades, although they still used mechanical adding wheels.
19th century relays were bulky, unreliable, and needed manual adjustment and contact cleaning. With telephony, telegraph, railroad signaling, and IBM efforts, they were made much more reliable and somewhat smaller in the early 20th century.
I would think that computers were being made to be useful first and foremost. And binary computing was not really useful for logic and number crunching until a number of later advances.
To be fair, Ada Lovelace was probably the first to think about using it for something other than pure computation. I may be wrong, but I believe there's nothing special about the representation being binary. You just map "logic" to digits (one or more). The base (or a generic set) you obtain the digits from doesn't matter.
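To make that point concrete, here's a minimal sketch (my own, purely illustrative, in Python for brevity): the same Boolean operations work over any two-symbol alphabet, whether the symbols happen to be the binary digits, two decimal digits, or letters - the particular symbols below are my arbitrary choices, not anything Boole or Babbage specified.

    # Illustrative sketch: Boolean logic over an arbitrary two-symbol alphabet,
    # showing the base/representation is incidental to the logic itself.

    def make_logic(false_sym, true_sym):
        """Return AND, OR, NOT that operate directly on the chosen symbols."""
        def to_bool(x):
            return x == true_sym              # interpret the symbol
        def from_bool(b):
            return true_sym if b else false_sym
        def AND(a, b): return from_bool(to_bool(a) and to_bool(b))
        def OR(a, b):  return from_bool(to_bool(a) or to_bool(b))
        def NOT(a):    return from_bool(not to_bool(a))
        return AND, OR, NOT

    # Same logic, three different "alphabets":
    for false_sym, true_sym in [(0, 1), (0, 9), ("F", "T")]:
        AND, OR, NOT = make_logic(false_sym, true_sym)
        print(AND(true_sym, false_sym), OR(true_sym, false_sym), NOT(true_sym))
    # -> 0 1 0   /   0 9 0   /   F T F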
> Lovelace was probably the first to think about using it for something other than pure computation.
This is deeply misleading.
For example, the idea of having automata make music, and more generally of automata being creative, predates Lovelace. Music-making automata were a staple of the Renaissance. For example, the mathematician and astronomer Johannes Kepler, when visiting the "Kunstkammer" of Rudolf II in 1598, was amazed at an automaton representing a drummer who could "beat his drum with greater self-assurance than a live one" [1]. Incidentally, Kepler corresponded with Wilhelm Schickard on the latter's "arithmeticum organum", the first ever proper mechanical calculator (it could do addition, subtraction, multiplication and division). Automating creativity was an idea with much currency in the Renaissance. Indeed some of the key advances in mechanical automata, which later evolved into computers, were driven by the desire to automate creativity [2]. The "conceptual leap" that some people lazily ascribe to Lovelace wasn't hers!
Using numbers to represent syntax (what you lazily attribute to Lovelace too) is much older and can be found for example in Leibniz's 1666 "Dissertatio de arte combinatoria", where one finds a detailed exposition of a method to associate numbers with linguistic notions. I have no idea if Leibniz was the first to do this. Leibniz also built some early calculators/computers, and is thus a cornerstone of the tradition that led to the emergence of computers. This tradition was known in Babbage's time, and most likely to Babbage.
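For readers unsure what "associating numbers with linguistic notions" might look like in practice, here is a toy sketch in that spirit (my own illustration, not a reconstruction of Leibniz's actual 1666 scheme): give each primitive concept a number, and derive a number for a composite notion from its parts so that the parts can be recovered again.

    # Toy illustration (not Leibniz's actual scheme): assign each primitive
    # concept a prime number and encode a composite notion as the product of
    # its parts' primes, so the composite can be decomposed by factoring.
    from math import prod

    primitives = {"rational": 2, "animal": 3, "mortal": 5}   # arbitrary assignment

    def encode(concepts):
        """Number of a composite concept = product of its primitives' numbers."""
        return prod(primitives[c] for c in concepts)

    def contains(composite_number, concept):
        """A composite 'contains' a primitive iff that primitive's prime divides it."""
        return composite_number % primitives[concept] == 0

    human = encode(["rational", "animal"])      # 6
    print(human, contains(human, "animal"), contains(human, "mortal"))
    # -> 6 True False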
[1] W. Zhao, Rendering the Illusion of Life: Art, Science and Politics in Automata from the Central European Princely Collections.
[2] D. Summers Stay, Machinamenta: The Thousand Year Quest to Build a Creative Machine.
I don't think that is what is meant by "be the first to" here.
I'm no expert in English, but in my native language the analogous expression could mean "if somebody would think of X, she surely would".
> Babbage considered using number systems other than decimal including binary as well as number bases 3, 4, 5, 12, 16 and 100. He settled for decimal out of engineering efficiency - to reduce the number of moving parts - as well as for their everyday familiarity.