[lit-ideas] Re: Philosophy Of Maths

  • From: "John McCreery" <john.mccreery@xxxxxxxxx>
  • To: lit-ideas@xxxxxxxxxxxxx
  • Date: Tue, 17 Jun 2008 09:31:53 +0900

On Tue, Jun 17, 2008 at 8:41 AM, Donal McEvoy <donalmcevoyuk@xxxxxxxxxxx>
wrote:

>
>
> We are tottering it seems on the large issue of realism/anti-realism in the
> theory of knowledge and, in particular, of mathematical knowledge.


Indeed. So perhaps it might be worthwhile to look a bit more carefully at
what we are talking about. According to a Wiki article that looks pretty
good:

In mathematics, a *natural number* (also called *counting number*) can mean
either an element of the set {1, 2, 3, ...} (the positive integers) or an
element of the set {0, 1, 2, 3, ...} (the non-negative integers). The former
is generally used in number theory, while the latter is preferred in
mathematical logic, set theory, and computer science. A more formal
definition will follow.

Natural numbers have two main purposes: they can be used for counting
("there are 3 apples on the table"), and they can be used for ordering
("this is the 3rd largest city in the country").

Properties of the natural numbers related to divisibility, such as the
distribution of prime numbers, are studied in number theory. Problems
concerning counting, such as Ramsey theory, are studied in combinatorics.


The immediately striking thing about this discussion is that claims for or
against the reality of natural numbers prior to human thinking about them
must deal with not one but two definitions: (1) natural numbers excluding
zero and (2) natural numbers including zero. If we imagine ideas about the
nature of numbers emerging from the act of counting {one, two, three, ...},
then definition (1) appears more natural, a better candidate for prior
reality. This intuition is reinforced by history, in which zero appears much
later than simple counting.

The natural numbers had their origins in the words used to count things,
beginning with the number one.

The first major advance in abstraction was the use of numerals to represent
numbers. This allowed systems to be developed for recording large numbers.
For example, the Babylonians developed a powerful place-value system based
essentially on the numerals for 1 and 10. The ancient Egyptians had a system
of numerals with distinct hieroglyphs for 1, 10, and all the powers of 10 up
to one million. A stone carving from Karnak, dating from around 1500 BC and
now at the Louvre in Paris, depicts 276 as 2 hundreds, 7 tens, and 6 ones;
and similarly for the number 4,622.

A much later advance in abstraction was the development of the idea of zero
as a number with its own numeral. A zero digit had been used in place-value
notation as early as 700 BC by the Babylonians, but they omitted it when it
would have been the last symbol in the number. The Olmec and Maya
civilizations used zero as a separate number as early as the 1st century BC,
apparently developed independently, but this usage did not spread beyond
Mesoamerica. The concept as used in modern times originated with the Indian
mathematician Brahmagupta in 628. Nevertheless, medieval computists
(calculators of Easter), beginning with Dionysius Exiguus in 525, used zero
as a number without using a Roman numeral to write it. Instead *nullus*, the
Latin word for "nothing", was employed.

The first systematic study of numbers as abstractions (that is, as abstract
entities) is usually credited to the Greek philosophers Pythagoras and
Archimedes. However, independent studies also occurred at around the same
time in India, China, and Mesoamerica.

In the nineteenth century, a set-theoretical definition of natural numbers
was developed. With this definition, it was convenient to include zero
(corresponding to the empty set) as a natural number. Including zero in the
natural numbers is now the common convention among set theorists, logicians
and computer scientists. Other mathematicians, such as number theorists,
have kept the older tradition and take 1 to be the first natural number.


This account, from the same Wiki source
(http://en.wikipedia.org/wiki/Natural_number), brings us back to the reason
that two definitions persist. Set theory, with zero defined as the empty
set, is far and away the theory in terms of which modern mathematics and its
applications in fields like computer science (remember "1" and "0") have
developed. From this perspective, the number theorists' decision to cling to
the earlier definition is a move like that made in many different fields of
modern mathematics: begin with conventional assumptions, eliminate one or
more, and see what happens to your mathematics.
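
To make the set-theoretic definition concrete: in the usual von Neumann
construction, 0 is the empty set and each successor n+1 is the set n ∪ {n},
so the set standing for n has exactly n elements. A minimal sketch in Python
(my own illustration, not something from the article):

def zero():
    # 0 is represented by the empty set.
    return frozenset()

def successor(n):
    # n + 1 is represented by n ∪ {n}.
    return n | frozenset([n])

def von_neumann(k):
    # Build the set representing the natural number k.
    n = zero()
    for _ in range(k):
        n = successor(n)
    return n

assert len(von_neumann(0)) == 0   # 0 is the empty set {}
assert len(von_neumann(3)) == 3   # 3 = {0, 1, 2}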

In this case, the interesting questions of the theory of natural numbers
come down to what remains when you have no zero. What, for example, are the
consequences of having an identity element (1) for multiplication but no
corresponding identity element (0) for addition? Given 1 and any number n,
n*1 = n. But without a zero, the corresponding identity n+0 = n is
unavailable: there is no number that, when added to n, equals n itself. And
any x = n+m, where n and m are both natural numbers, must be strictly larger
than both n and m (x > n, x > m).
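
The point can be checked mechanically. A quick sketch in Python (again my
own illustration), taking the zero-free definition (1) to be the positive
integers, with a finite range standing in for the infinite set:

positives = range(1, 101)  # stand-in for {1, 2, 3, ...}

# 1 is a multiplicative identity: n * 1 == n for every n.
assert all(n * 1 == n for n in positives)

# No additive identity exists among the positives: there is no e with
# n + e == n for all n, because n + e > n whenever e >= 1.
assert not any(all(n + e == n for n in positives) for e in positives)

# Any sum of two positives is strictly larger than both summands.
assert all(n + m > n and n + m > m for n in positives for m in positives)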

Thus, the decision to restrict ourselves to the more "natural" definition
(1) has results that will seem decidedly unnatural to anyone who has learned
ordinary arithmetic, which requires a zero for addition.

Which, then, of the two definitions, (1) and (2), is more "real," more W3.3
than the other? That is a deep, deep question.

John


-- 
John McCreery
The Word Works, Ltd., Yokohama, JAPAN
Tel. +81-45-314-9324
http://www.wordworks.jp/
