Computers use base two because their most natural unit can occupy one of a couple states: 0 or 1. One, two, base two.
Human hands, on the other foot, have ten fingers. Since our favorite mapping between things and integers is finger counting, we naturally end up with more than two states. Zero, one, two, three, four, five, six, seven, eight, nine, and the fully-extended state, ten. That's eleven states, which is why the global human standard is base-eleven.
Maybe because by the time hands were closed into fists, people had forgotten about the negotiating and gotten down to fighting.
In other words, the "fist" symbol was a reserved word for a very different purpose than counting.
Or possibly more realistically: if the way you communicated numbers didn't rely on position to indicate magnitude, you didn't need zeros as placeholders. You could just "sign" 3 * 100 + 5 * 10 (whatever the symbols for those were). Hell, maybe the symbol for 10 was 1 x "fist" and 100 was "pump your fist twice."
There are a lot more possible states than eleven: 2^5 = 32 per hand if you count just whether each finger is extended, and 2^10 = 1024 across both hands. More if you allow crossing fingers, interaction between hands, etc. And this isn't theoretical either; finger binary is a real counting technique.
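As a sketch of what that gives you (the helper name is mine, just for illustration), treat each finger as one bit:

    # Each finger is one bit: extended = 1, folded = 0.
    # One hand (5 fingers) gives 2**5 = 32 states; two hands give 2**10 = 1024.
    def fingers_to_number(fingers):
        """fingers: list of 0/1 per finger, most significant first."""
        value = 0
        for bit in fingers:
            value = value * 2 + bit
        return value

    print(fingers_to_number([0, 0, 1, 0, 1]))  # 5 on one hand
    print(fingers_to_number([1] * 10))         # 1023, the two-hand maximum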
Maybe a coincidence (I don't know the details of computer history), but another interesting reason binary is used might be information density.
Base e (2.718...) has the highest information density. Of course, computers can't use an irrational number as their base, so base 3 would be the closest pick. However, base 3 is much harder to represent in digital logic, so base 2 is the better pick and is still relatively close to e.
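To make the density claim concrete (a minimal sketch; this cost measure is usually called radix economy): representing values up to N in base b takes about log_b(N) digits of b states each, so the cost scales with b * log_b(N) = (b / ln b) * ln N, and b / ln b is minimized at b = e:

    import math

    # Radix economy: hardware "cost" of base b is proportional to b / ln(b);
    # smaller is better. The minimum over real-valued b is at b = e.
    for b in (2, 3, 4, 10, math.e):
        print(f"base {b}: b/ln(b) = {b / math.log(b):.4f}")

Base 3 comes out slightly ahead of base 2 (about 2.73 vs 2.89), matching the point that 3 is the closest integer pick to e.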
What actually happened, probably, is that humans first counted by putting things in correspondence with fingers: actual bijections between fingers and things. This naturally led to ten, the number of fingers total and thus a grouping size of understandably frequent use, becoming a very significant number (culturally, linguistically, etc.). Ten was a very significant number for a very long time before positional notation with its digits and bases was invented. (The Romans had no digits, but they loved their fives and tens.)
And thus, when such notation for numerals WAS invented, it felt quite natural to take ten as base, it being such a significant number.
The Sumerians counted the individual finger bones on one hand to reach twelve, but adopted base sixty. I guess they combined it with the five fingers on the other hand? (12 x 5 = 60.)
> Computers use base two because their most natural unit can occupy one of a couple states
Not really. That's just the way things worked out, mainly due to how electronics (transistors) work and how well they miniaturize.
Babbage's Analytical Engine was designed as a base-10 machine, mainly because this was the most efficient way to represent things using gears (with all the friction that entails, especially with large numbers; he designed lots of ingenious mechanisms to work around this limitation).
Even ENIAC, which used vacuum tubes, was base-10; this was due to a number of reasons (reliability of vacuum tubes, possibly war rationing issues, at least when it was designed; there are probably other reasons as well).
Other bases have also been tried, but base-2 is the default mainly because it's easy to create simple circuits (especially miniaturized ones) to represent two states, and to replicate/connect them efficiently (essentially, once you have a NAND or NOR gate, implemented with transistors or anything else, you have the basic building block for the rest of a computer).
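As a concrete version of the NAND point (a minimal sketch, not from the original comment):

    # NAND is functionally complete: the other gates fall out of it.
    def nand(a, b):
        return 1 - (a & b)

    def not_(a):
        return nand(a, a)

    def and_(a, b):
        return not_(nand(a, b))

    def or_(a, b):
        return nand(not_(a), not_(b))

    def xor_(a, b):
        n = nand(a, b)
        return nand(nand(a, n), nand(b, n))

    # Truth-table check that the derived XOR really is XOR:
    for a in (0, 1):
        for b in (0, 1):
            assert xor_(a, b) == (a ^ b)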
Other than that, there isn't anything particularly unique in regards to binary representation (base-2) for computation versus other bases.
I'm a Lisp programmer, damn it! I will use one of the fingers for a tag bit which indicates a pointer into the heap when zero, and from there I will bootstrap bignums somehow, and the rest of the numeric tower.
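For anyone who hasn't met tag bits, a sketch of the scheme the joke leans on (assuming the low bit is the tag, as in many Lisp runtimes; names are mine):

    # Low bit as tag: 0 means "pointer into the heap",
    # 1 means the remaining bits hold a small integer (fixnum).
    def make_fixnum(n):
        return (n << 1) | 1        # shift the value up, set the tag bit

    def is_fixnum(word):
        return word & 1 == 1

    def fixnum_value(word):
        return word >> 1           # drop the tag to recover the value

    w = make_fixnum(21)
    print(is_fixnum(w), fixnum_value(w))   # True 21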
You can count from zero to two on two fingers, so two-fingered aliens may invent numerals for zero, one and two. If they develop that into a place value system, it will be (fingers+1)-ary, or in their case trinary.
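In general (a small illustrative helper, not anyone's standard API), counting states on f fingers gives a base-(f+1) place-value system:

    def to_base(n, fingers):
        """Digits of n in base (fingers + 1), most significant first."""
        base = fingers + 1
        digits = []
        while True:
            n, d = divmod(n, base)
            digits.append(d)
            if n == 0:
                break
        return digits[::-1]

    print(to_base(10, 2))   # [1, 0, 1] in trinary: 9 + 0 + 1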
It's been speculated that prior to base ten (or maybe alongside it) people used base twelve (using knuckles) hence anachronisms like dozens (inches, hours), 60 (minutes) and 360 (degrees), etc.
Not correct: the frequency of multiples of 12 in human counting is thought to come from the lunar cycle and from the fact that each human hand has 12 phalanges (finger bones) in the fingers, excluding the thumb. See how the Chinese count to ten on one hand, using combinations of the thumb and fingers, bent for some numbers, for six through ten. There are very few languages with a duodecimal system.
What you're missing is that positional numeral systems developed relatively late--in recorded history, which is well after number systems start to be encoded in language. Earlier numeral systems are counting systems, where zero isn't a proper number but rather a signifier for a lack of stuff.
Let's look at numbers in natural languages. In English, we start with 12 basic numbers--one through twelve--and then we start counting "three-ten", "four-ten", etc. through "nine-ten." After that, we say "two tens" (the "tens" gets corrupted to -ty in Modern English), then "two tens one", etc. Note that we're not saying "two tens zero"--that's a sign that zero is not really fundamental in our counting system (etymologically, the term "zero" in English appears to date only to around 1600, contrast that to the -ty affix that dates back to at least Proto-Germanic, although many of the numbers themselves have roots back in Proto-Indo-European).
You can also see this effect in early numeral systems. Note that Roman numerals--the most common numeral system in Europe until the Early Modern period--have distinct letters for 5 (V), 50 (L), and 500 (D), which is the usual case among their contemporary numeral systems. The Greek numerals for, say, 666, would be χξϛ--same general principle as Roman numerals, even though the system has distinct numerals for every digit rather than just ones and fives.
The actual development of a true zero and true positional numeral system appears to have only independently happened very few times. The Mesoamericans probably developed it around the same time as the Long Count calendar (exact date uncertain, but roughly contemporary with the Roman Empire). Hindu-Arabic numerals developed probably slightly later (thought to be around 400 AD or so)--and it's from this system that pretty much every modern numeral system comes. The quipu could definitely represent numbers in true positional fashion, although the dating of this is unknown to me.
Base 10 predominates in modern numeral systems primarily because of the primacy of Hindu-Arabic numerals. The derivation of number terms in natural languages shows a rather confusing panoply of numbering bases. The vigesimal and sexagesimal number systems of Mesoamerica and Mesopotamia do show residual base-5 and base-10 in their construction, and the terminology in relevant native languages tends to indicate a base-10 stratum (so the number 78 in Mayan and Nahuatl boils down to "three twenty ten eight"), which strongly suggests that these systems were chosen for accounting purposes, not for things like "counting on fingers and toes." It's also worth pointing out that the human visual system subitizes small numbers--basically, you don't need to count three objects, you just take a glance and immediately know "there are three"--and this process tends to break down around 4-6 objects. It's not hard to imagine that number systems like duodecimal or vigesimal are based on counting subitized groups.
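To spell out that sub-base reading (an illustrative decomposition, not a claim about the actual Mayan or Nahuatl word forms):

    def vigesimal_with_subbase(n):
        """Split n into (twenties, tens, ones), as in 'three twenty ten eight'."""
        twenties, rest = divmod(n, 20)
        tens, ones = divmod(rest, 10)
        return twenties, tens, ones

    print(vigesimal_with_subbase(78))   # (3, 1, 8): three twenties, a ten, eight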
Yep!! That's what's commonly misunderstood about the "invention of zero". Positional numbering is the part that's actually innovative. This hit me when I was a math teacher and trying to consider our algorithms for arithmetic from first principles. I was teaching high school students how to work with exponential expressions with variables, which seems very esoteric, unless you realize that you've been doing it for years implicitly.
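A concrete version of that classroom point (my illustration, not the poster's lesson plan): a numeral is a polynomial evaluated at the base, so digit manipulation is polynomial manipulation with x = 10:

    # 4305 = 4*10**3 + 3*10**2 + 0*10 + 5, i.e. 4x^3 + 3x^2 + 5 at x = 10.
    def eval_digits(digits, base=10):
        value = 0
        for d in digits:
            value = value * base + d    # Horner's rule
        return value

    print(eval_digits([4, 3, 0, 5]))    # 4305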
> What you're missing is that positional numeral systems developed relatively late--in recorded history,
Mesoamericans were carving base-20 long count dates at least as early as they were writing words, actually. The oldest complete date is 36 BCE. This was a true number system, including a glyph for zero.
We're still talking ~2,000 years ago, which I consider recorded history (basically, the distinction between history and prehistory is roughly ~6,000 years ago). Certainly much, much newer than language.
It sounds like pedantry, but it's important: history can't be recorded if there isn't any recording being done. Again, current evidence has the Long Count preceding written language in Mesoamerica. That time frame only works for the Mesopotamian and Nile river civilizations.
While we’re at it, the (positional) Mesopotamian sexagesimal system evolved around 2000 BC, and had a placeholder “zero” symbol for empty places sometime before the common era.
Fully extended is one set over none extended: base ten.
It makes sense if you think of all of them extended as a set, rather than one less than a set. And so by the time positional notation came around, it was cemented that all of them extended was a set. (It took us a while to get 0 and positional notation.)
Just look at Roman numerals:
A finger. A hand. Both hands. A hand representing both hands. Both hands representing both hands. Etc.
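Reading that mapping literally (a sketch; additive-only form, without the subtractive IV/IX shorthand):

    # Roman numerals alternate "a hand of the previous symbol" (x5)
    # and "both hands of it" (x2): I=1, V=5, X=10, L=50, C=100, D=500, M=1000.
    SYMBOLS = [("M", 1000), ("D", 500), ("C", 100), ("L", 50),
               ("X", 10), ("V", 5), ("I", 1)]

    def to_roman_additive(n):
        out = []
        for sym, value in SYMBOLS:
            count, n = divmod(n, value)
            out.append(sym * count)
        return "".join(out)

    print(to_roman_additive(78))   # LXXVIII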
> Human hands, on the other foot, have ten fingers. Since our favorite mapping between things and integers is finger counting, we naturally end up with more than two states. Zero, one, two, three, four, five, six, seven, eight, nine, and the fully-extended state, ten. That's eleven states, which is why the global human standard is base-eleven.
Wait, what?