There are two ways to grasp a-zero. One is the usual manner in which 'users of computing' understand and apply it: it involves infinitesimal values, numbers less than zero (minus numbers), the concurrent use of linear, 2D and 3D numbers 'more than' and 'less than' zero, and so on.
Computers apply the other possibility, which relates zero to a 'zero-sensing part' inside each computer. Absolute nothingness (a-zero) is thus related to each use of a number.
Incidentally, a digital concept of "before-units counting" was used by ancient Indians to carry zero into number applications, and it remains unchanged in Vedic Mathematics.
Why should two different computing principles exist in the twenty-first century: 'digital whole number states' in computers, and 'whole number states plus parts of a unit, in plus/minus combinations' used by people? The mismatch seriously affects human-computer interaction.
Why not use digital numbers alone (excluding minus numbers), which could simplify maths?
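As a rough illustration of the point about computers using only 'digital whole number states' (this sketch is my own, not part of the original question, and it assumes 8-bit two's complement): the machine stores a 'minus number' as a plain unsigned bit pattern, and the minus sign is only a convention for reading that pattern back.

# Minimal Python sketch: -5 is held by the machine as the whole number 251.
def to_twos_complement(value, bits=8):
    # Mask to `bits` bits: the unsigned pattern the hardware actually holds.
    return value & ((1 << bits) - 1)

def from_twos_complement(pattern, bits=8):
    # Reinterpret the same unsigned pattern as a signed number, by convention.
    if pattern >= 1 << (bits - 1):
        return pattern - (1 << bits)
    return pattern

pattern = to_twos_complement(-5)       # 251, a plain whole number state
print(pattern)                         # 251
print(bin(pattern))                    # 0b11111011
print(from_twos_complement(pattern))   # -5 again, only by interpretation

So in one sense the hardware already works with 'digital numbers alone'; the plus/minus system is layered on top for people.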
2007-03-11 19:53:49 · 1 answer · asked by kkr in Science & Mathematics ➔ Mathematics