[accesscomp] Dan's Tip of the Day

  • From: "Robert Acosta" <boacosta@xxxxxxxxxxx>
  • To: "Bob Acosta" <boacosta@xxxxxxxxxxx>
  • Date: Mon, 20 Apr 2015 07:32:29 -0700

Fact of the Day:

The following can be read forward and backwards: Do geese see God?

A baby octopus is about the size of a flea when it is born.

The first man to fly over the North Pole (and indeed the South Pole) was
called Dickie Byrd.

*****

Moore's Law at 50: Its past and its future

When Gordon Moore, then at Fairchild Semiconductor, was asked in 1965 to
theorize about the future of the newly developed integrated circuit, he had
one in his lab with a then-amazing 64 transistors on it - double the 32 that
were state of the art only a year earlier. Connecting those dots on a graph
with the single-component planar transistor invented in 1959, Moore noticed
that the number of components was roughly doubling every year. In an article
he contributed to a special issue of Electronics magazine published that
spring, he speculated that it could continue to do so for at least a decade.
It wasn't until that decade had passed, and Moore's friend Carver Mead
noticed that the trend had held up, that the term Moore's Law was coined.

As we look forward to the future of Moore's Law
<http://www.extremetech.com/tag/moores-law>
after its amazing 50-year run - 50 years officially as of April 19th - it is
helpful to look back at how it came to be, and how much it has evolved to
fit a changing industry already. That provides a basis for speculating on
what will happen to the pace of computing innovation going forward.


1965: Gordon Moore's very-educated guess


Moore's prediction was the result of combining two very important
observations he made in the process of writing his original article. First,
that at any given time there was an optimal number of components to put on a
chip. More components meant a lower cost per component, except that as the
number of components increased yield decreased, so at some point there were
diminishing returns to cramming more components on a chip. He graphed the
tradeoff between complexity and yield in the chart below, with an
extrapolation out to 1970.

Moore's first observation was that there is a natural optimal density of
components to achieve the lowest overall costs - one that changes with time
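
To make the shape of that first observation concrete, here is a minimal
sketch in Python of the cost/yield tradeoff. The numbers (a fixed die cost
and an exponential yield falloff) are illustrative assumptions, not Moore's
actual data; the point is only that the curve has a single lowest-cost
density.

import math

# Toy model of the cost/yield tradeoff (illustrative numbers only):
# spreading a fixed die cost over more components lowers cost per component,
# until falling yield overwhelms the savings.

DIE_COST = 10.0            # assumed cost to fabricate one die, in dollars
DEFECT_SENSITIVITY = 0.04  # assumed yield penalty per added component

def yield_fraction(n_components: int) -> float:
    """Fraction of dies that work; assumed to fall as complexity grows."""
    return math.exp(-DEFECT_SENSITIVITY * n_components)

def cost_per_component(n_components: int) -> float:
    """Effective cost of one good component on the die."""
    return DIE_COST / (n_components * yield_fraction(n_components))

# Sweep candidate densities and report the lowest-cost point.
best = min(range(1, 201), key=cost_per_component)
print(f"optimal components per chip: {best}")          # 25 with these numbers
print(f"cost per component there: ${cost_per_component(best):.2f}")

Lowering the assumed defect sensitivity shifts the optimum to the right,
which is exactly the "changes with time" part of the caption above.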

Second, he realized that the optimal number of components on a chip was
increasing rapidly - it had doubled every year since the first planar
transistor had been created in 1959. That produced an exponential curve,
which he graphed in the chart below. He extended the line of historic data
into the future, predicting that the doubling could continue for at least
ten years into the future. While Moore had been inspired to think about the
rapid progress in the miniaturization of components by hearing Douglas
Engelbart speak on the subject, Moore was the first to plot the points on
paper and make a specific prediction about how it would progress. Moore
never thought of his prediction as a law, or even anything related to
underlying physical principles. But he did explain in the article in some
detail how he thought each possible technical problem that needed to be
solved over the next decade could be successfully addressed.


Moore's second observation was that the optimal density of components had
doubled each year since the first integrated circuits were created


1975: Carver Mead immortalizes an already-modified Moore's Law


By the time Carver Mead coined the term Moore's Law around 1975, Moore
himself had already modified it. Even though Moore never expected his
projections to be very precise, they had predicted semiconductor progress
almost perfectly for a decade. However, Moore felt that gains in
component density would begin to taper off, and suggested that by 1980 a
doubling every two years was a more likely prospect.


Intel's House reshapes Moore's Law into its current form



As Moore predicted, transistor count has increased exponentially since the
invention of the integrated circuit

While component density gains were slowing by 1975, Intel's Dave House
observed that individual components were themselves getting faster. He
theorized that this meant computing power on a chip could double about every
18 months - slower than Moore's original 1965 prediction, but faster than
the 1975 revision. This is the form of the Law that has become popular and
has been carefully - almost slavishly - tracked and relied on by the
semiconductor industry.
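
Some quick, purely illustrative arithmetic shows how different those three
doubling periods are once they compound over a decade:

# Compare the three doubling periods mentioned above over a ten-year span.
def growth_factor(years: float, doubling_period_years: float) -> float:
    """How much a quantity grows if it doubles every doubling_period_years."""
    return 2 ** (years / doubling_period_years)

for label, period in [("1965 version, every 12 months", 1.0),
                      ("House's version, every 18 months", 1.5),
                      ("1975 revision, every 24 months", 2.0)]:
    print(f"{label}: x{growth_factor(10, period):,.0f} over a decade")
# roughly x1,024 vs. x102 vs. x32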

If you work in the semiconductor industry, the specifics of Moore's Law are
very important to you. There are some great reasons to question whether
progress in integrated circuit technology can continue at the pace it has.
My colleague, Joel Hruska, will have plenty to say on that in another
article. For many of us, though, the primary impact of Moore's Law has been
an increasing abundance of computing power at a reduced cost - we don't
really care how the industry makes it happen. So it is worth considering
those innovations in the larger context of computing before, and perhaps
after, the integrated circuit.


From the abacus to the supercomputer


Despite the focus given to the computing revolution brought on by the
invention of the transistor and the integrated circuit, computers existed
long before anyone thought of using silicon to create them. A stroll through
the Computer History Museum's chronologically arranged exhibits starts with
the abacus, which in turn gives way to the slide rule, mechanical
calculators dating back to Babbage, and then decades of increasingly
powerful mainframes that relied on vacuum tubes. Before integrated circuits,
discrete transistors even made early supercomputers like the Atlas and the 3
MFLOP CDC 6600 possible.

If we look at the progress of computing in the 30 years before Moore wrote
his article, we can chart the gains in processing power from the 1 cycle per
second of Konrad Zuse's 1938 Z1 mechanical computer - arguably the first
true programmable machine with a modern architecture - to the 3 MFLOPS of
the 1965 CDC 6600. Even if we charitably grant the Z1 1 FLOPS, the gain
corresponds to a doubling of compute power every 12 to 18 months during
that time - similar to the rate projected by Moore for integrated circuits,
but across several different physical implementations. In his book The
Singularity Is Near, Ray Kurzweil goes even further back, compiling data
since 1900 and the mechanical tabulator. If we graph that data on a log
scale, we can see that we've been making exponential progress for over a
century:


Since as far back as 1900, compute performance per dollar has roughly
doubled every one to two years
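
A back-of-the-envelope check of the 12-to-18-month figure quoted above,
using only the numbers in the text (the article's own charitable 1 FLOPS for
the 1938 Z1, and 3 MFLOPS for the 1965 CDC 6600):

import math

z1_flops, z1_year = 1.0, 1938          # charitable figure from the text
cdc6600_flops, cdc6600_year = 3e6, 1965

doublings = math.log2(cdc6600_flops / z1_flops)   # about 21.5 doublings
years = cdc6600_year - z1_year                    # 27 years
months_per_doubling = years * 12 / doublings

print(f"{doublings:.1f} doublings over {years} years")
print(f"one doubling roughly every {months_per_doubling:.0f} months")  # ~15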


After the integrated circuit


The modern integrated circuit is running into all sorts of limits in size
and power that may spell the end of the strictly defined version of Moore's
law. But we have plenty of new technologies waiting in the wings to pick up
the pieces, in the same way the integrated circuit took over from
transistors, and transistors did from vacuum tubes. Perhaps the most obvious
is massively parallel computing <http://www.extremetech.com/tag/parallel-computing>,
best typified today by the modern GPU. It has given us massive increases in
performance not just for graphics, but for more and more applications that
are being rewritten to take advantage of a large number of processing cores.
Beyond that lies the weird world of quantum computing
<http://www.extremetech.com/tag/quantum-computing>, which is slowly starting
to take practical shape. Or perhaps new kinds of physical computer
architectures, like those using light or perhaps graphene
<http://www.extremetech.com/tag/graphene>.

Many children are familiar with the fable about the mathematician who asked
the king for a single grain of rice on the first square of a chessboard,
doubled on each square thereafter, and how it ran the king out of rice. In
the same way, we are in
the fortunate position in computing that despite successive technologies
running out of steam, innovators always seem to come up with "the next big
thing" in the nick of time to keep our amazing progress going. When
interviewed on the subject, Moore himself reflected both that semiconductor
technology couldn't keep up its rapid progress, and that other technologies
like nanotechnology and graphene <http://www.extremetech.com/tag/graphene>
might step up to fill the need.

[Moore's charts as reprinted in Understanding Moore's Law
<http://www.chemheritage.org/community/store/books-and-catalogs/understanding-moores-law.aspx>.
Transistor count chart from Wikimedia
<http://commons.wikimedia.org/wiki/File:Transistor_Count_and_Moore's_Law_-_2008_1024.png>.
Kurzweil data from The Singularity is Near, page 70
<http://www.singularity.com/charts/page70.html>.]



David Cardinal

ExtremeTech, April 16, 2015



http://www.extremetech.com/extreme/203031-moores-law-at-50-its-past-and-its-future









"If any- one is in Christ, he is a new creation; old things have passed
away; behold, all things have become new."

2 Corinthians 5:17,





To subscribe to Dan's Tips or the HotSpot with God Daily Devotional, send a
blank message to dthompson5@xxxxxxxxx and include "Subscribe Dan's Tips" or
"Subscribe HotSpot with God" in the subject line.







Robert Acosta, President

Helping Hands for the Blind

(818) 998-0044

www.helpinghands4theblind.org


