Posted by apollinaire 1 day ago
Only two things about zero were understood much later: that it is a quantity which in many circumstances should be treated in the same way as any other number, and that it requires a special symbol in order to implement a positional system for writing numbers.
The concept of zero itself had already been known many thousands of years before the invention of the positional system for writing numbers.
All the recorded ancient languages could answer the question "How many sheep do you see there?" not only with words meaning "one", "two", "three", "four", "few", "many" and so on, but also with a word meaning "none". The same goes for answers to a question like "How much barley do you have?".
Nevertheless, the concept of the quantity "zero" must have been understood later than the concepts of "one", "two", "many" and of negation, because in most languages the words similar in meaning to "none", "nobody", "nothing", "null" are derived from negation words together with words meaning "one" or denoting small or indefinite quantities.
Because the first few numbers, especially 0, 1 and 2, have many distinctive properties compared with bigger numbers, not only was 0 initially perceived as belonging to a class somewhat separate from the other numbers, but usually 1 and 2 were as well, and sometimes 3 or even 4.
In many old languages the grammatical behavior of the first few numbers differs between them and is also quite different from that of the bigger numbers, which behave more or less uniformly. This is consistent with the expectation that the big numbers were added to the language later, at a time when they were perceived as a single uniform category.
> Inside the Chaturbhuj Temple in India (left), a wall inscription features the oldest known instance of the digit zero, dated to 876 CE (right). It is part of the number 270.
This is the real oldest known zero:
> Because we know that the Çaka dynasty began in AD 78, we can date the artifact exactly: to the year AD 683. This makes the “0” in “605” the oldest zero ever found of our base-10, “Hindu-Arabic” number system.
> But the Cambodian stone inscription bears the first known zero within the system that evolved into the numbers we use today.
Maybe the Khmers came up with it or maybe they got the idea from India. Either way, we should set the facts straight.
Not only in old languages: in many languages today (notably Slavic ones), nouns decline differently with "1", "2", "3", "4" and, say, "5".
In many languages, including living ones (Lithuanian, Irish and Slovenian come to mind), there exists a grammatical number "dual", in addition to singular and plural.
Hence a wonderful piece of Soviet humour: A factory needs 5 fireplace pokers, but none of the workers knows the correct plural form for 5 fireplace pokers; not wanting to appear ignorant when they send their request to management, they request "3 fireplace pokers and 2 more". Some months later they receive the fireplace pokers with a note saying "here are 4 fireplace pokers and 1 more" -- because management didn't know the word either!
https://www.unicode.org/cldr/charts/46/supplemental/language...
grep for "two" (note: in these charts "two" means the dual category, not literally '2')
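For a concrete feel of how those charts translate into rules, here is a minimal hand-written sketch of the published CLDR plural categories for Russian cardinals (the language behind the poker joke below) -- this is an illustration of the rules, not a call into any i18n library:

```python
def russian_plural_category(n: int) -> str:
    """Approximate CLDR plural categories for Russian integer cardinals.

    Russian uses three categories for whole numbers: "one", "few", "many".
    Hand-written sketch of the published CLDR rules.
    """
    if n % 10 == 1 and n % 100 != 11:
        return "one"    # 1, 21, 31, ... but not 11
    if n % 10 in (2, 3, 4) and n % 100 not in (12, 13, 14):
        return "few"    # 2-4, 22-24, ... but not 12-14
    return "many"       # 0, 5-20, 25-30, ...

for n in (1, 2, 5, 11, 21, 22, 100):
    print(n, russian_plural_category(n))
```

Note how 5 falls into "many" together with 0 and 11, which is exactly why "5 fireplace pokers" demands a different noun form than "3 fireplace pokers".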
In the context of the brain, Buddhism and various branches of Hinduism spent a lot of time pondering emptiness. Several meditative techniques involve meditating on this hollowness, the ultimate absence.
I do know that zero the numerical placeholder is different from the Buddhist idea of "zeroness", but in the context of the brain it should be similar, I feel. The absence of something means a cardinality of zero, but Buddhists literally asked this question about reality itself.
Is it? I ate all the apples. There are no more apples left. There are zero apples. Don’t we come face to face with nothing left all the time? There are zero guests left. There are zero episodes of the show to watch. There are zero days until Christmas. There is zero money in my wallet.
When there is zero of something, what is there zero of? That question becomes a lot more difficult to answer. In theory you could say there is zero of anything not observed, but this isn't a very precise or useful definition and isn't what's done in practice. The things we choose to actively describe there being zero of depend on knowledge of what existed previously, on cultural and linguistic norms, on context, etc.
For example, I can say "there are zero apples in the basket", but this requires me to know that there previously existed apples in the basket as opposed to oranges, or that the addressee of my statement was expecting apples to be there. This knowledge wouldn't be required if there were at least one apple.
Using zero fundamentally requires more mental reasoning than using small positive integers.
There's something or there's nothing
Level 2 if there's something:
There's one or more of this thing and it's not going to transform into the same amount of something different.
Also with stolen paintings: you have no painting, but you have evidence of its existence.
And lastly, in song: "can't buy me love."
And of course physics: T₀. As well as space flight: T-0.
These are not equivalent statements though. You simply learned at some early age that zero represents absence, but that is not a natural concept.
The following statement does not work, though: "there are no oranges left", as this implies that oranges were available at some time before.
"There are 0 apples left." is the answer to the question "How many apples are left?".
"There are 0 oranges." is the answer to the question "How many oranges are on the table?" (or "in the box" or wherever).
Everywhere where 0 appears in speech, it is the result of a counting or measuring operation, which provides the answer to a question, expressed or implied.
That counting or measuring operation could have had any other number as its result instead of 0, which demonstrates that the nature of 0 is the same as that of any other cardinal number, i.e. it is a quantity (a term introduced already by Aristotle in his "Categories", where the various kinds of concepts, and the words that name them, are classified by the kinds of questions they answer).
"are" and "zero", meaning non-existence/non-"are"-ness, in the same sentence is seen as strange by many, yes, and that include me.
I've also read the linked article https://www.quantamagazine.org/why-the-human-brain-perceives... which discusses the discovery of "number neurons" and then goes on to suggest that there might be no "number neurons" after all.
Declaring 2×2 = 0+2+2, 2×1 = 0+2, 2×0 = 0, while 2^2 = 1×2×2, 2^1 = 1×2, 2^0 = 1, seemed arbitrary.
What helped was learning about negative exponents and exponent simplification, so 2^0 = 2^2 × 2^(-2) = 2^2 / 2^2 = 1.
That said, I clearly still take issue with unintuitive interpretations of "nothing" https://stackoverflow.com/questions/852414/how-to-dynamicall...
Your teacher didn't tell you (or told you, but you didn't recognize it as something valuable and forgot it) that exponentiation to zero is a new definition over exponentiation to positive integers. Exponentiation to positive integers is defined somehow, and that definition says nothing about exponentiation to zero. It is a new definition, not something that you deduce.
The same holds for 0^0 or 0/0 (with some amount of confusion, lies, and hypocrisy).
Which I found incredibly silly once I learned about negative exponents and could then deduce the pattern myself.
Similarly, 0/0 became much more tractable to me once I learned L'Hôpital's rule https://en.wikipedia.org/wiki/L'H%C3%B4pital's_rule
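As a standard textbook illustration of the rule applied to a 0/0 form, differentiating numerator and denominator separately:

```latex
\lim_{x \to 0} \frac{\sin x}{x}
  = \lim_{x \to 0} \frac{\cos x}{1}
  = 1
```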
2×0 = 2×1 + 2×(-1) = 2 - 2 = 0
and 2^0 = 2^1 × 2^(-1) = 2/2 = 1
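Both patterns can be checked numerically; a quick sketch in Python:

```python
# Additive pattern: multiplying by 0 follows from
# 2*0 = 2*1 + 2*(-1) = 2 - 2 = 0
assert 2 * 0 == 2 * 1 + 2 * (-1)

# Multiplicative pattern: raising to 0 follows from
# 2**0 = 2**1 * 2**(-1) = 2 / 2 = 1
assert 2 ** 0 == 2 ** 1 * 2 ** (-1)

# Each "empty" operation lands on its identity: 0 and 1
print(2 * 0, 2 ** 0)  # → 0 1
```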
Inverting the concept and having those patterns stem from a fundamental identity can wait until it's not seen as the mathematical equivalent of "Because I said so". Or, if I were using LaTeX:
2 \times 0 = 1_{+}
Because multiplication, being repeated addition, and exponentiation, being repeated multiplication, both behave the same way: when asked to repeat the operation zero times, they return the unit, the "1" of the underlying group, typically denoted 1_{whatever}. However, the "1" of addition is 0 in the standard notation for integers.
This has to do with rings and the relationship between the two identities of the two underlying groups. It ultimately stems from the distributive property between multiplication and addition
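That "repeating an operation zero times yields its identity" behavior is exactly what a fold with an explicit identity element does. A minimal sketch (the helper `repeat` is mine, purely for illustration):

```python
from functools import reduce

def repeat(op, identity, x, n):
    """Apply binary op to n copies of x, starting from the identity element."""
    return reduce(op, [x] * n, identity)

# Repeated addition is multiplication; its empty case is 0, the additive identity.
assert repeat(lambda a, b: a + b, 0, 2, 3) == 6   # 2*3
assert repeat(lambda a, b: a + b, 0, 2, 0) == 0   # 2*0

# Repeated multiplication is exponentiation; its empty case is 1.
assert repeat(lambda a, b: a * b, 1, 2, 3) == 8   # 2**3
assert repeat(lambda a, b: a * b, 1, 2, 0) == 1   # 2**0
```

With `n = 0` the list is empty, so `reduce` simply returns the identity you passed in; the two "Because I said so" rules fall out of the same mechanism.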
I think we invent abstractions because they allow us to reason about patterns in reality in a consistent way. The fact that division by zero is undefined is (to my mind) because it doesn't correspond to any useful pattern in the parts of reality we typically apply arithmetic to (accounting, estimation, etc).
What I like about this point of view is that it encourages thinking about what you are trying to accomplish, rather than fixating on formal rules. In some contexts "division by zero" can correspond to a meaningful pattern - look up the geometry of the projective line, for instance. In such cases you might want to include it in your model, rather than declaring it to be undefined as a convention!
> it doesn't correspond to any useful pattern in the parts of reality we typically apply arithmetic to (accounting, estimation, etc).
You can of course re-define it, but then we aren't talking about the same thing any more. The operation of inverting multiplication is not defined for zero.
Similarly the fact that multiplication and division are inverses is a property of this model. Conceptually you can imagine splitting and copying groups of objects quite independently of one another (and which one you view as fundamental is really a post-hoc choice).
In general these days we mostly see clean mathematical abstractions because all the scaffolding has already been removed by mathematicians past. And as a result people come to believe that this is how mathematics is done. But always there is an initial period of exploration (which eventually gets forgotten) as people try to work out how to axiomatise the various systems they are interested in.
Contrast 0^0 with 0/0. Both of them are considered undefined, but the former is often defined "locally", e.g. in a given textbook or article, to have the value 1. That's not true for 0/0, because no such local definition has been found useful.
In measure theory it's often useful to augment your reals with positive/negative infinity. In projective geometry it's meaningful to allow division by zero (to counter one detail of your comment).
Take out a scientific calculator and start doing division by smaller and smaller units:
1/0.9
1/0.09
1/0.009
...
1/0.000000000000000000009
So we can then ask: what happens if we keep doing this in ever smaller steps? It seems that we head toward infinity.
Thus the more you do this, the larger the result, and I think we've just rediscovered (I should refresh my memory, and I hope I'm not saying something very wrong) an idea at the heart of calculus. Then we can use a new concept, limits: what happens as that 0.000000000............N gets as close to zero as possible.
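The same calculator experiment, sketched in a few lines of Python:

```python
# As the divisor shrinks toward 0, the quotient 1/x grows without bound --
# the informal idea behind the limit of 1/x as x approaches 0 from above.
divisors = [9 * 10 ** -k for k in range(1, 8)]   # 0.9, 0.09, 0.009, ...
quotients = [1 / d for d in divisors]

for d, q in zip(divisors, quotients):
    print(f"1/{d:.7f} = {q:,.2f}")

# Each step multiplies the result by 10; it never settles on a finite value.
assert all(b > a for a, b in zip(quotients, quotients[1:]))
```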
Negative numbers, however, have no good physical models, at least none that I know of. They are mostly a tool to make accounting easier, or to denote one of two indistinguishable cases (direction along the x axis, direction of angular velocity, electric charge). Yet teachers don't hesitate to confuse the kids with them.
> The whole numbers were synonymous with the integers up until the early 1950s. In the late 1950s, as part of the New Math movement, American elementary school teachers began teaching that whole numbers referred to the natural numbers, excluding negative numbers, while integer included the negative numbers. The whole numbers remain ambiguous to the present day.
https://en.wikipedia.org/wiki/Integer
> In mathematics, the natural numbers are the numbers 0, 1, 2, 3, and so on, possibly excluding 0. Some start counting with 0, defining the natural numbers as the non-negative integers 0, 1, 2, 3, ..., while others start with 1, defining them as the positive integers 1, 2, 3, ... . Some authors acknowledge both definitions whenever convenient. Sometimes, the whole numbers are the natural numbers plus zero. In other cases, the whole numbers refer to all of the integers, including negative integers. The counting numbers are another term for the natural numbers, particularly in primary school education, and are ambiguous as well although typically start at 1.
https://en.wikipedia.org/wiki/Natural_number
What a shame.
This sentence no verb. Self referentially.
*edited to say feel instead of fill
(no, but since in English I can verb my adjectived nouns into all sorts of noun-y verb craziness, nothing can get nouned and verbed out of adjectiving all day long, because English.)
0 is a mathematical convention. We can't have 0 apples in a basket: we either have some apples or we don't have any.
If 0 is weird, negative numbers are also weird. Infinity is weird. Spaces with more than three dimensions are weird.
Complex numbers are weird from an algebraic perspective but not that weird from a geometric perspective.
The empty set is weird.
Everything that's abstract is weird.
To my mind, abstract algebra is weirder than calculus and mathematical analysis, which is why I enjoy calculus and analysis more.
But being weird and abstract can also be useful.
I just checked my basket, it contains 0 apples.
Zero was invented approximately a thousand years before the pendulum was scientifically developed for use in technology like clocks.
I wonder where zero is when I point a web cam at a screen and see the tunnel effect. Am I, the observer the zero camera?