Tuesday, October 21, 2008

NANOTECHNOLOGY - from Nano-Dream to Nano-Realm

2008©B.F.Dorfman Extended and updated on November 02, 2008
NANOTECHNOLOGY TODAY
FROM A BIRD'S EYE VIEW

Nanotechnology is not a specific technology
- It is specific time

There are many reasonable definitions of technology in general, and of nanotechnology in particular. Depending on the definition, nanotechnology may be perceived as a dream about a relatively remote future, or as the actual state of today's hi-tech.

1st Definition: Technology becomes “nano” when it produces a new kind of artifact owing to nanometer-range resolution in at least one of three dimensions. By this definition, ‘nano’ first started in the time of WWII with thin optical coatings, and took hold in the middle of the 1950s with the introduction of the ‘double-diffused’ transistor.

2nd Definition: Technology becomes “nano” when it produces a new kind of artifact owing to nanometer-range resolution in all three dimensions. By this definition, ‘nano’ was born at the border of the 1980s and 1990s, when the design rule in the semiconductor industry was firmly established below the 1.0-micrometer ‘waterline’. It matured in the early 2000s while submerging below 100 nm.

3rd Definition: Technology becomes “nano” when it produces intelligent products without direct intervention of external intellect (human or digital), based on complete self-organization, self-repair, and even self-development.

By this definition, we are now almost exactly in the middle of the road to nanotechnology from its conception in the 1950s.

‘Football field’ on an Intel chip: up to ~2,000,000,000 transistors at the 45-nanometer ‘design rule’. To make 45 nm visible, we must magnify a ~1 cm chip to the size of a real football field (the heads of the players could be hit by satellites). This is the current cutting edge of the "top-down" approach: the steady progress of hi-tech resolution toward the atomic scale.

Two players fight for a 540-atom buckyball, a cutting edge in one direction of progress in the "bottom-up" approach: from the atomic scale to products. Even at such magnification, the buckyball is almost unnoticeable; two more orders of magnification would be needed to distinguish its details.
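The magnification in this analogy is easy to check. The sketch below assumes a ~1 cm chip edge, a ~100 m field length, and a ~2 nm diameter for the 540-atom fullerene; the latter two figures are my assumptions for illustration, not from the text:

```python
# Scaling check for the 'football field on a chip' analogy.
# Assumed figures: chip edge ~1 cm, field length ~100 m, C540 diameter ~2 nm.
chip_edge = 1e-2        # m
field_length = 100.0    # m
mag = field_length / chip_edge          # linear magnification factor
design_rule = 45e-9                     # 45 nm feature
buckyball = 2e-9                        # assumed C540 diameter

print(f"magnification: x{mag:.0f}")
print(f"45 nm feature appears as {design_rule * mag * 1e3:.2f} mm")
print(f"buckyball appears as {buckyball * mag * 1e6:.0f} micrometers")
```

At x10,000 a 45 nm line becomes a visible ~0.45 mm stroke, while the buckyball remains a barely perceptible ~20 µm speck, consistent with the "two more orders of magnification" remark above.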


Time of the Great Unification of the Technologies

True nanotechnology will only start at the point where the “design rules” of the “top-down” and “bottom-up” approaches become equalized and synergized, ~3-4 decades from now, as was shown two decades ago: Pictures 1 and 2 below are from the book ‘Evolution of Technologies, or a New History of Time’ (B.F. Dorfman, Moscow, 1990, in Russian).

Nanotechnology is not a specific technology, or specific technologies. It is specific time.

Time when all the major branches of the current development will be united and synergized.

Time of the Great Unification of the Technologies. Plausibly, just starting at ~2040.

1. Nanotechnology ≡ Great Unification of the Technologies.

Progress in "design rules" from 1 cm to 1 mm took ~10,000 years; from 1 mm to 100 µm, a century; from 100 µm to 100 nm, ~half a century. It will take another half a century to reach the atomic scale.
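Taking those spans at face value, the acceleration can be expressed as orders of magnitude gained per century; the durations below are the rough figures from the text:

```python
import math

# Pace of 'design rule' progress, in orders of magnitude per century.
eras = [
    ("1 cm -> 1 mm",     1e-2, 1e-3, 10_000),  # ~10,000 years
    ("1 mm -> 100 um",   1e-3, 1e-4,    100),  # ~a century
    ("100 um -> 100 nm", 1e-4, 1e-7,     50),  # ~half a century
]
for name, start, end, years in eras:
    decades = math.log10(start / end)   # orders of magnitude covered
    pace = 100 * decades / years        # orders of magnitude per century
    print(f"{name}: {decades:.0f} order(s) in {years} y -> {pace:.2f} per century")
```

The pace climbs from ~0.01 to ~6 orders of magnitude per century, which is why the historical time scale on the diagram cannot be linear.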

2. Approaching the atomic scale in geometric dimensions, most physical phenomena nearly equalize in the time (i.e., speed) dimension.

This may change our ‘common sense’. For instance, electromechanical relays, which were once the starting point for both the theory and the first practically working computers, may return as the best-known digital element, but on a molecular base. (To make a relay equivalent of a PC of 2010, one would need some megatons of relays, gigatons of wire, gigawatts of power, a millennium for realization, and astronomical time for system operation.)

On the diagram "Great Unification of Technologies", the historical time scale is not linear because the real time of progress was not linear. The new universe was in a latent state up to Middle-Age mechanics (clocks and automata); it became visible with the first Industrial Revolution and exploded with the second one.



3. Some milestones of the “top-down” approach to ‘nano’ from the conception to the current moment.

The real ‘Big Bang’ was the transistor. From that moment and over half a century, the time scale of progress has been almost strictly linear: diagram 3 on the following screen. (NOTE: This is a strongly simplified diagram. The complete diagram, showing all the milestones of the “top-down” approach to ‘nano’ from its conception to the current moment, and envisioned beyond that, will be published in a book.)
The term transistor itself denotes two basically different solid-state devices:
1. Bipolar semiconductor triode where three electrodes are mutually separated by two p-n-junctions.
2. Unipolar triode, or ‘field-effect’ transistor (FET): a three-pole solid-state device where two electrodes (‘source’ and ‘drain’) are directly connected to the area of the path of flow of free electrical charges (the ‘channel’), while the third electrode (the gate) is insulated from the channel either by a thin dielectric layer (MOSFET, or metal-oxide-semiconductor FET) or by a Schottky barrier (MESFET, or metal-semiconductor FET).
While the bipolar transistor is basically a semiconductor device, the FE triode (at least in principle) may be realized with different kinds of channel matter, even without any semiconductor, for instance with a “poor metal” or a “poor dielectric”, or in a molecular structure, or even without any matter, in vacuum, similarly to the original vacuum-triode prototype, once the device structure advances into the deep “nano” range. But the FE triode was originally conceived long before the ”nano” era: in a series of three patents of 1925-1928, Julius Edgar Lilienfeld disclosed solid-state analogs of the vacuum triode: US patent 1745175, "Method and apparatus for controlling electric current", first filed in Canada on 22.10.1925 (similar to a MESFET); US patent 1900018, "Device for controlling electric current", filed on 28.03.1928 (a thin-film MOSFET); and US patent 1877140, "Amplifier for electric currents", filed on 08.12.1928 (where the current flow is controlled in a porous metal layer). No silicon or germanium had yet been explored as workable semiconductors, and p-n junctions were not known either. Only in 1939 did Russell Shoemaker Ohl discover the key role of impurities in semiconductors, the significance of their ultra-purification (at that time, for germanium), the p-n junction barrier, and the semiconductor diode.
The practical and essentially reciprocal history of unipolar and bipolar transistors started after the Second World War [1-5]. The junction transistor was invented by William Shockley* soon after WWII, still as a FET; however, Shockley’s FET did not work, in spite of tremendous and diversified efforts by Shockley himself, John Bardeen, Walter Brattain, and their colleagues. Then, working on their own, Brattain and Bardeen created the first [germanium] bipolar transistor.
* Shockley finally filed his patent for FET with p-n-junction and mesa-channel in 1951 (granted in 1958).
Amazingly, Bardeen provided the theoretical explanation of why Shockley’s FET did not work (an imperfect surface “killing” the free carriers of electrical charge), while Shockley developed the fundamental theory of Electrons and Holes in Semiconductors (1950), first employed for bipolar transistors. For the next two decades, the foundation of silicon transistor electronics, integrated circuits, planar technology, and microelectronics was laid down exclusively with bipolar transistors.
The next two ideas were crucial.
Jean Hoerni proposed to preserve the silicon oxide layer in place on the silicon substrate (instead of etching it away after using the oxide as a diffusion mask) in order to protect the p-n junctions. The “given name” ‘planar technology’ originally served to distinguish the newly born flat device structure (in Hoerni’s idea, in December 1957; in industry, in 1961) from the preceding, entrenched “mesa” transistor design (developed by M. Tanenbaum and D. Thomas). However, the current meaning of this term is essentially broader and deeper.
As soon as the concept of the planar transistor was established, Robert Noyce suggested forming the interconnects on the same silicon substrate. Independently, the idea of forming wires connecting undivided transistors on a substrate was suggested at Texas Instruments by Jack Kilby (actually a few months earlier, but not yet within planar technology).
By the middle of the 1960s, the planar technology of silicon integrated circuits, including surface treatment, had matured enough to return to the FET, which better matches the basic principles of planar technology but, most importantly, consumes less energy, especially in complementary CMOS pairs (p-channel and n-channel transistors). In the 1990s, when the formerly steady increase of FET frequency with the scaling down of design rules could no longer continue at the same pace, there were some efforts to revitalize bipolar transistors, naturally without a chance of success: low-energy CMOS sustained its critical advantage.

***

So far, the achievements altering the world, ~800,000,000 PCs, >3,000,000,000 cell phones, the Internet, the library on a disk, the iPod, digital cameras, medical sensors, flat TVs, more accurate weather forecasts, and computer-assisted design of the technology itself, are all based on ‘top-down’ nano-progress.

Technology may safely continue along this line for ~7 years. Then keeping this “schedule” will be increasingly challenging, and technology may diversify.

The “top-down” development in the domain of physical technologies is predominantly unified, and the major past milestones of this decisive development, only some of which are shown on diagram 3, follow well the linear course in the ‘Time – Resolution’ coordinates.

The “bottom-up” approaches are still diversified, and the ‘bottom-up’ diagrams shown below are just some examples.


From PICO to NANO

1. SUPRAMOLECULAR CHEMISTRY

and

SELF-ASSEMBLING MOLECULAR SYSTEMS


SUPRAMOLECULAR CHEMISTRY is one of the most important fields in the "bottom-up" approach: creating pre-designed hierarchical structures from molecules as building blocks (instead of atoms), binding them with relatively weak bonds (≤ ~2 eV, instead of ~3.5 eV in covalent C-C bonds) or even without chemical bonds, just “topologically” (the two Stoddart supramolecules above). Self-assembly (but in a desirable direction!) is the engine driving this research toward the true “nano”. All specific results achieved so far are just examples, road signs indicating the progress. Unlike semiconductor “nano”, where the world leader Intel alone has ~90,000 employees, supramolecular chemistry is still a more academic field, and every step of progress is associated with the individual name of its discoverer.

SUPRAMOLECULAR SYNTHESIS & SELF-ASSEMBLING MOLECULAR SYSTEMS are marked by a few complementary lines of progress:

  • Structural hierarchy => Intrinsic rotation and axial movement
  • 2D “molecular architecture” => 3D => Complex topology => Pre-designed deformation =>
  • Increasing complexity leads to a growing number of synthesis steps and building constituents =>
  • Another side and line of progress: more effective and complex selectivity of the synthesized supramolecules toward chemical and biochemical reactions => More complex functionality =>

It is actually a “from pico to nano” approach: in the first ‘crown ethers’, the ancestors of the entire field (shown in the bottom-left corner of the diagram), the size of their most remarkable feature, the cavity, is in the range of 120 to 320 picometers.

Hierarchical supramolecules synthesized in the 2000s preserve such pico-features while also adding the next floors, or ‘nano’ levels.

Structural Hierarchy => Intrinsic rotating and axial moving=>
2D=>3D=>Complex topology => Pre-designed deformation=>
Steps of synthesis and number of building constituents =>
Selectivity => Functionality => Self- functioning =>

Indeed, the length of chemical bonds, ~100-200 picometers, is the base of all technologies, starting from ancient metallurgy. ‘Nano’ is growing complexity. Pico-technology is a predecessor rather than a successor of the “nano” world of stable atomic structures.

Perhaps, in the metastable world of intra-atomic electronic states, lasers and future quantum computers may be considered pico-technologies.

Starting even earlier, inorganic materials explored both ways, “up” and “down” (see an example below, after the references).

***

Some useful links for further reading.
The following articles are available free:
1. Michael Riordan, From Bell Labs to Silicon Valley, The Electrochemical Society Interface, Fall 2007. The full article, extending the story up to very recent times, is available free at:
http://www.electrochem.org/dl/Interface/fal/fal07/fall07_p36-41.pdf
2. Howard R. Huff. From The Lab to The Fab: Transistors to Integrated Circuits
https://www.chiphistory.org/exhibits/ex_howard_huff_transitors_integrated_circuits/howard_huff_section1.pdf
For the further reading:
3. Riordan and Hoddeson, Crystal Fire, W. W. Norton & Co., New York, (1997)
4. Christophe Lécuyer, Making Silicon Valley: Innovation and the Growth of High Tech, 1930-1970. The MIT Press, Cambridge, MA (2006).
To read the history firsthand:
5. Ian M. Ross (President of Bell Labs from 1979 to 1991), The Foundation of the Silicon Age, Bell Labs Technical Journal, Autumn 1997, pp. 3-14.
This article gives the principal narrative. Access to the full article requires a subscription.
6. The early history of magnetic disk storage in IBM:
http://www.magneticdiskheritagecenter.org.
Latest news:
7. Nanotech Breakthroughs! Intel Unveils Industry's First 32-nm Chips
ED Online ID #16875, September 18, 2007
http://electronicdesign.com/Articles/Index.cfm?AD=1&ArticleID=16875
8. IBM shines light on 22 nm chip manufacturing . Trendwatch. By Rick C. Hodgin
Thursday, September 18, 2008
http://www.tgdaily.com/content/view/39378/113/
9. UMC Announces Foundry Industry's First 28nm SRAMs
HSINCHU, Taiwan, October 27, 2008 -- UMC (NYSE: UMC; TSE: 2303), a leading global semiconductor foundry, today announced that it has manufactured the foundry industry's first fully functional 28nm SRAM chips, using advanced double-patterning immersion lithography and strained silicon technology to produce the chips, which feature very small six-transistor SRAM cell sizes of approximately 0.122 square microns. http://www.umc.com/English/news/20081027.asp
(Note: this is approximately 100,000,000 times smaller than the earliest ICs. - BD)
11. Interconnect Metrology Confidently Looks at 32 nm
Alexander E. Braun, Senior Editor -- Semiconductor International, 10/1/2007
http://www.semiconductor.net/article/CA6482818.html
12. ISMI Outlines 450 mm Wafer, NGF Roadmaps
David Lammers, News Editor -- Semiconductor International, 10/27/2008
http://www.semiconductor.net/article/CA6608829.html?industryid=47301
The newest trends in magnetic disks’ physics and technology:
13. Spintronics: Poised for Next Great Memory Breakthrough? Spin-polarized current revolutionized digital storage on disk drives. The next step for spintronics may mean replacing flash memory with magnetic tunnel junction MRAMs. Stuart Parkin, IBM Almaden Research Center, San Jose -- Semiconductor International, 10/1/2008
http://www.semiconductor.net/article/CA6602518.html?industryid=47573
14. Relationship of the solid-state technology, progress in underlying physics with computer characteristics from the very beginning up to the current moment and into the future: http://secondbang.blogspot.com/search/label/Forecast

http://www.clarkson.edu/camp/reports_publications/dorfman/Dorfman_Computers_Part1.pdf

http://www.clarkson.edu/camp/reports_publications/dorfman/Dorfman_Computers_Part2.pdf

A very early example of self-organized functional (Giant Magneto-Resistance) nano-structured materials:

http://www.clarkson.edu/camp/reports_publications/dorfman/To%20%20History%20of%20Discovery%20of%20GiantMagRes_1976-2009.pdf

Utmost physical limit for nano-structured composite solids:

http://www.clarkson.edu/camp/reports_publications/dorfman/Dorfman_SynergeticMatter_2005.pdf

Example of progress in relatively simple non-organic technology from "micro" to "nano":

http://www.clarkson.edu/camp/reports_publications/dorfman/Dorfman_Rohring_AWJ_May2006.pdf

For the “bottom-up” Supramolecular chemistry:

1. Charles J. Pedersen nobelprize.org/nobel_prizes/chemistry/laureates/1987/pedersen-lecture.pdf

2. J-M. Lehn, Supramolecular Chemistry, VCH, 1995. The link to the copyrighted book (you may read it but not download it) is:
http://books.google.com/books?id=PKWFOei609kC&dq=supramolecular+chemistry&printsec=frontcover&source=bl&ots=IfPGkr_Ui2&sig=MK_PVCTuryw1NPqxZNkaSarue0w&hl=en&sa=X&oi=book_result&resnum=
3. Supramolecular chemistry: Functional structures on the mesoscale
SonBinh T. Nguyen, Douglas L. Gin, Joseph T. Hupp, and Xi Zhang
http://www.pnas.org/content/98/21/11849.full.pdf+html
PNAS October 9, 2001 vol. 98 no. 21 11849-11850
4. This site gives a brief review based on Lehn’s book, with numerous continually updated links:
Supramolecular chemistry. Nanotechnology Encyclopedia
http://www.edinformatics.com/nanotechnology/supramolecular_chemistry.htm

General “nano”-topics:
1. This site gives numerous links to recent applications of nanotechnology:
Nanularity Nanotech Breakthroughs!
http://nanularity.com/Breakthroughs.aspx
2. The recent NIST study of Nanomaterials’ Trek Through Food Chain described in article by Alexander E. Braun, Senior Editor, Semiconductor International, 9/16/2008:http://www.semiconductor.net/article/CA6596520.html

***

NANO-CLUSTERS

Examination of nano-clusters is an important, and possibly the earliest, frontier of nanotechnology.

The following picture represents the results of theoretical research conducted over 40 years ago.



1. V.F. Dorfman, M.B. Galina, To the Theory of Nucleus Formation and Growth from Molecular Beams and Gas Phase, Sov. Acad. Science Reports, 182 (1968), n. 2, p. 372-375.

From as early as the second half of the 1960s, nano-cluster formation was systematically examined by the author and co-workers. We used analytical solutions and computer simulation of 4 different mathematical models: so-called non-Markov chains of non-linear differential equations; a 3D anisotropic poly-nuclear multi-phase statistical kinetic model; a 2D contour-representation model; and a Monte Carlo model. The first three models had been specifically developed for this problem, which was especially challenging at a time when experimental observation of the smallest nano-clusters was not yet available, while computers were in an early phase of progress and their performance was limited. Although non-Markov chains of non-linear differential equations were well known, this model was essentially advanced as well, to two or three parallel chains reflecting the competitive growth of clusters differentiated by structural and/or compositional variation. This approach allows analyzing free cluster formation as well as the origination of structural defects during the growth of single-crystalline films.

The plots shown above were obtained on the M-2 computer, the first large universal computer in Russia, created at the end of the 1950s (chief designer M.A. Kartzev; I.S. Brook's Institute of Electronic Controlling Machines). The odd-numbered and even-numbered plots were produced by the solution of parallel non-Markov chains, one reaching nearly 230 and the other 118 non-linear differential equations. The process was examined up to the phase when 10% of the initial surface was covered with nano-clusters. Even at earlier stages, coalescence of nano-clusters becomes essential, and this brings a new and fast-growing complexity into the model. It took nearly one week of overnight work by the reliable hard worker M-2 (during the daytime, this grandfather of contemporary computers was busy solving more urgent problems). Many important conclusions resulted from such plots.
On this figure it is noticeable, for example, that at the given conditions the density of defects is relatively high, but they still remain localized as isolated nano-islands. The other noticeable conclusion is the two-modal (bimodal) distribution of cluster sizes (indeed, a third feature on curve 11 is slightly visible as well). The position of the second maximum corresponds to nano-clusters of double the size of those at the first maximum. This phenomenon may remind one of the frequency doubling of an electromagnetic wave, or second-harmonic generation (SHG), as it is usually called in non-linear optics.

The bimodal distribution of nano-clusters could not be directly verified by the experimental techniques of the 1960s, but it was confirmed by different experimental groups in the 1970s.
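The doubled-size second maximum is a generic consequence of pairwise coalescence in a narrow initial size distribution. A toy Smoluchowski coagulation model (constant kernel, simple Euler integration; my illustrative sketch, not the author's original 230-equation non-Markov chains) reproduces the effect:

```python
import math

# Minimal Smoluchowski coagulation sketch: clusters of sizes i and j merge
# into size i+j at a constant rate K. With a narrow initial distribution
# around size 20, coalescence first builds a secondary peak near size 40.
N = 120                      # largest cluster size tracked (truncation)
K = 1.0                      # constant coagulation kernel (assumption)
dt, steps = 0.01, 30         # short integration: early-stage behavior

# initial population: narrow Gaussian around size 20
n = [math.exp(-((k - 20) ** 2) / 8.0) for k in range(N + 1)]
n[0] = 0.0

for _ in range(steps):
    total = sum(n)
    gain = [0.0] * (N + 1)
    for i in range(1, N + 1):
        for j in range(1, N + 1 - i):
            gain[i + j] += 0.5 * K * n[i] * n[j]   # i + j -> merged cluster
    # explicit Euler step: gain by merging, loss by being merged away
    n = [n[k] + dt * (gain[k] - K * n[k] * total) for k in range(N + 1)]

peak1 = max(range(10, 30), key=lambda k: n[k])   # primary maximum
peak2 = max(range(30, 55), key=lambda k: n[k])   # secondary maximum
print(peak1, peak2)
```

The secondary maximum appears at roughly twice the primary size, mirroring the "frequency doubling" feature of the 1968 plots, although this toy model says nothing about the surface kinetics treated in the original work.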


***



This site is concerned with scientific and technological matters only.
But by no means should nanotechnology be perceived as a pure "rose garden".

The negative and alarming sides should be carefully considered at every step of progress.
Readers interested in this aspect are advised to look for specially designated sites, which are abundant on the Internet.


Forecast

2008©B.F.Dorfman

WHAT IS PREDICTABLE – WHAT IS NOT?

From: Presentation for Clarkson University, November 2004

PART 1

TECHNICAL SUCCESS IS PREDICTABLE

COMPUTER DEVELOPMENT

FORECAST FROM THE PAST AND FOR FUTURE

It is known that forecasts for specific innovations are rarely successful. This is not true with regard to ‘metatechnologies’ such as microelectronics and computers. The following diagrams and plots for semiconductors and all major kinds of computers are still accurate, although they were constructed over two decades ago.

The forecast was based on a correlated analysis of the technological and physical parameters of elements, interconnections, and systems (each constituent trend encompasses a semi-linear branch, which for one of them is known as ‘Moore’s law’, and a ‘saturating’ branch). The actual development, shown in colored dots, differs only in that in reality industry often missed the right time for the transition to new ‘sub-paradigms’, such as multiprocessor chips.



Top: the original diagram of computer development with a 20-year forecast from 1986 to 2006 [1,2] (Cray-2 and GF-10 are based on advance publications describing their designed performance). Bottom: an English version published 3 years later [2]. The plots for all shown types of computers are still accurate, including the exact match of the 10^15 milestone reached by ‘Blue Gene’ in a simplified-mode test.

The line for personal computers, in bold, was plotted on the assumption that the semiconductor industry would start multi-processor chip production at the end of the 1990s.

Indeed, industry mistakenly delayed this transition by ~5 years, following the single-processor line, and is only now fixing the situation and returning to the bold line. A similar scenario happened with transistors: the industry was too persistent in its efforts to further increase their frequency.

The computer forecast was based on an analysis of the elements (as well as the entire hardware system) up to the respective critical limits.





As predicted, most major characteristics of transistors asymptotically approach their respective principal limits.
The remaining resource is ~1 order of magnitude, versus the 4 to 6 orders of magnitude of past progress.
Does the proximity of transistor limits mean the end of solid-state electronics and an urgent transition to molecular electronics, single-electron transistors, and quantum computers?

Quite the opposite. Stabilization of the transistor structure will signify just the beginning of the real evolution of solid-state systems.

The transistor is a building atom of solid-state computers. Evolution can only start when the atoms reach stability; it is impossible on a base of radioactive, changeable atoms.

Element and system factors in computer progress

Year       | Elements, frequency (s^-1) | Elements, density (cm^-2) | Most powerful universal computers, F (s^-1) | Personal computers, F (s^-1)
1950       | 10^4 => 10^5               | 10^-2 => 10^-1            | 10^3 => 10^4                                | -
2004       | 10^9 => 10^10              | 10^8 => 10^9              | 10^14 => 10^15                              | 10^8 => 10^9
2004/1950  | 10^5                       | 10^10                     | 10^11                                       | 10^5

Thus, even in supercomputers the past progress is primarily due to the progress of elements, while in personal computers it is still completely due to element progress.

The system resources have so far been only slightly explored.
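The growth ratios quoted above follow from a line of arithmetic on the 1950 and 2004 entries (lower bounds used for both; the dash for 1950 personal computers is left out):

```python
# Growth factors 1950 -> 2004, using the lower bound of each table entry.
rows = {
    "element frequency":       (1e4,  1e9),
    "element density":         (1e-2, 1e8),
    "most powerful computers": (1e3,  1e14),
}
for name, (y1950, y2004) in rows.items():
    print(f"{name}: x{y2004 / y1950:.0e}")
```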



A new forecast from the present into the coming few decades is also shown. The accuracy of the forecast from the past gives reason to believe that the newly suggested forecast of technical progress for the coming decades is also correct. But predictable technical success obliges us to think responsibly about the possible social consequences, which are not as easily predictable.

Forecast for future computer development

suggested at presentation for Clarkson University in 2004


This diagram of the year 1981 (V.F. Dorfman, On the Border of Millenniums, Moscow, 1982, in Russian) locates the brain (the hatched area on the right side) vs. computers in the coordinates: number of elements (processors or neurons), number of connections of each element with other elements, and frequency of elements. Shown are three major directions of computer development: 1. Maximum speed of elements vs. minimum connections, in specialized computers; 2. Medium speed and connectivity, in standard computers; 3. Limited speed vs. maximum connectivity, in an eventual envisioned trend directed toward (but not reaching) brain performance.





Brain vs. Computer
per ‘quantum of time’

A direct comparison of brain vs. computer performance would be improper due to the great differences in the respective frequencies of a neuron vs. a solid-state microprocessor.

Instead, we will introduce a ‘quantum of time’, the time interval reciprocal to frequency, and compare brain vs. computer performance during the respective quantum of time.





The best contemporary personal computer achieves

P ~ 0.1 op per quantum of time vs. the brain’s P > ~1 billion.

Top contemporary supercomputers:

P ~ 1,000-10,000 op per quantum of time vs. the brain’s P > ~1 billion.

This is still far from brain performance, but supercomputer performance per “quantum of time” is progressing very fast in this direction.
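These per-quantum figures follow from dividing throughput by element frequency. The sketch below uses round 2004-era numbers consistent with the table in Part 1; all three throughput and frequency figures, including the brain's, are rough assumptions for illustration only:

```python
# Operations per 'quantum of time', where one quantum = 1/frequency.
# Round, assumed figures: (operations per second, element frequency in Hz).
cases = {
    "personal computer": (1e8,  1e9),   # ~10^8 op/s at ~1 GHz elements
    "supercomputer":     (1e14, 1e10),  # ~10^14 op/s at ~10 GHz elements
    "brain":             (1e11, 1e2),   # ~10^11 'op'/s at ~100 Hz neurons
}
for name, (ops_per_s, freq_hz) in cases.items():
    per_quantum = ops_per_s / freq_hz   # work done within one element cycle
    print(f"{name}: ~{per_quantum:g} op per quantum of time")
```

With these inputs the sketch reproduces the ~0.1, ~10^3-10^4, and ~10^9 figures quoted above: the brain's advantage comes entirely from massive parallelism within each slow neuron cycle.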

Computer technology from 1964 to 2004 was transistor technology.

Beginning in 2005, micro/nano-electronics is microprocessor technology.

The microprocessor is invariant with respect to the physics of its elements (such as transistors).

This will create the base for an eventual shift from solid-state "statistical" bit processing to single-quantum phenomena in data processing.

Further development = development of system hierarchy in a single chip

Achieving every new level of system hierarchy requires about a quarter of a century (compare with the 2-year-paced Moore's law).

But every new level of hierarchy in data processing will strongly change the phase of civilization.


Source of data

1. V.F. Dorfman, L.V. Ivanov, Computer and its Elements: Development and Optimization, Moscow, 1988.
2. V.F. Dorfman, L.V. Ivanov, Problems of the System Technology of Computers, J. New Gener. Comput. Syst., Berlin, 2 (1989) 1, 3-23.
3. V.F. Dorfman, On the Border of Millenniums, "Znanie", Moscow, 1982.
4. B.F. Dorfman, Presentation to Clarkson University, 2004.



The talk presented at Clarkson University in November 2004 is available in 2 parts:

http://www.clarkson.edu/camp/reports_publications/dorfman/Dorfman_Computers_Part1.pdf

http://www.clarkson.edu/camp/reports_publications/dorfman/Dorfman_Computers_Part2.pdf


"Computers: technology, performance and impact on our life: What is predictable, what is not?"