Tuesday, November 4, 2008

Nanotechnology in Nature

Direct Impact and Synergetic Synthesis on Micro/Nano-Particles:

Applications in Today’s Technology

and possibly –

Origin of Primordial Organics in Cold Dark Dust


Benjamin F. DORFMAN

Direct impact by reacting energetic molecules may generate organic matter and a variety of solid carbon forms on a cold substrate. Even more effective is synergetic thermal-impact activation in a moderate substrate temperature range. Over the past four decades, such an energizing mechanism has been the subject of systematic research, and for almost three decades this technology has been used industrially in the USA and many other countries [1].

The major technique currently employed in the technology is remote vacuum plasma. The resulting materials are silica-stabilized diamond-like amorphous carbon, synergetic graphene-diamond quasi-amorphous carbon, and carbon-metal diamond-like composites of atomic scale and nano-composites.

But such cold impact synthesis can also proceed in cosmic space, in dark dust clouds (on the surfaces of micro- and nano-particles), producing primordial organics in the universe over billions of years, while synergetic synthesis of more complex forms is plausible in relatively short-lived warm clouds.

The vacuum UV spectrum of synergetic carbon stabilized by silica shows an absorption maximum located in the variable range of 1400 Å to 2600 Å, depending on the conditions of synthesis, and under certain conditions carbon-silicon matter with the famous '2175 Å' astronomical feature [2] may be synthesized as well.

While the artificial synthesis of SSC is based on acceleration of charged particles (atomic and molecular ions) by a bias voltage, in cosmic conditions the charge may be due to radioactive sources, in particular β-decay, i.e. 14C, 36Cl, 26Al, 60Fe, 40K, and in certain space environments possibly the shorter-lived 42Ar, 32Si, 39Ar.

Although β-decay is commonly regarded as a destructive force for organics, in pre-biotic organic synthesis it would be more proper to consider it, like temperature, as a dual {synthesizing ↔ decomposing} source of energy. Indeed, there are indications that low doses of radiation may even enhance the thermal stability of DNA [3].

There is a variety of other energizing mechanisms revealed in space [4-10]. For instance, possible sources of energy are stellar "winds". Thus, p+ and CO in the solar wind (a well-known example) carry energies up to 4-5 eV; an SiO wind, where it reaches this energy range, could be especially effective.

Direct Impact and Synergetic Synthesis on Micro/Nano-Particles may thus be significant both for contemporary technology and for the primordial origin of the organic world.

Polycyclic aromatic molecules were detected in interstellar dust as early as 1968, and more recently both aromatic and aliphatic organics have been detected in proto-planetary and planetary nebulae [4, 11-13]. Still, the basic mechanisms responsible for the formation of organics in interstellar space remain unknown. It was recently found that the ratio of methyl formate (C2H4O2) to hydrogen in parts of our galaxy is over two orders of magnitude higher than can be explained even by "the best models" of interstellar chemistry. None of the concepts developed so far, whether based on irradiation from active stars or on a shock wave caused by the in-fall or outflow of material in the star-formation process, suggests a self-consistent energizing mechanism functioning on a feasible time scale and at temperatures not exceeding the upper limit of organic stability. Furthermore, the nature of the strongest ultraviolet spectral signature of interstellar dust, the astronomical 2175 Å feature, remains unknown, although over the past 40 years a variety of materials have been proposed, not excluding nano-diamonds and fullerenes. Recently, however, Lawrence Livermore National Laboratory found that amorphous carbonaceous-silicate matter abundant in interstellar and interplanetary dust may be responsible for the 2175 Å feature [14].

Unavoidably, one must assume a continuous generation of organic and silicon-carbonaceous matter in cold, dark interstellar space. Indeed, a frigid (~8 K) reservoir of simple sugar molecules was very recently discovered in an interstellar dust cloud.

In this web publication we briefly discuss the principles of direct impact and synergetic synthesis in association with their possible role in interstellar "reactors" and the earliest phases of the origin of life. The complete report was submitted to and accepted by international conferences in 2005 and 2008, but for various reasons the author could not present it there. This publication gives a simplified version of that report. More details about the technology, the underlying physics and chemistry, and practical applications on Earth may be found in [1].

***

Introduction

Considering technology as a continuation of evolution, and evolution as natural technology, one finds numerous similarities. The most important of them is the correlation between independently developing islands of evolution or culture. In Nature this is observed, theorized and accepted as an obvious consequence of natural laws, in the evolution of stars, planets and, if not of life itself (only one example is known so far), then at least of ecosystems and species. In technology it is especially evident before the era of great discoveries, when the American and Australian cultures were completely isolated, and even earlier, at the very beginning of humanity, when the Middle Eastern-European cultural nuclei and the Chinese nucleus were completely separated. One of the major complementary lines of technological evolution starts from simple impact technology (stone against stone, sharp stone against wood or bone, and so on); next comes the "taming" of fire, followed by multiple discoveries of thermally activated chemistry in food preparation, artificial stones (e.g., ceramics) and metallurgy, and then by the discovery of the combined thermal-impact technologies of forging and glass-vessel manufacturing.

Few technologies have served civilization as greatly as metal forging and glass blowing. However, they were never considered as a synergy of thermal and mechanical energy, but rather as thermal preconditioning for mechanical treatment. Material synthesis, whether realized with metals, glasses or contemporary plastics, is basically a chemical technology, while formation of the useful shape is predominantly mechanical treatment, let us say a "hammer technology". But what would happen if the hammer were miniaturized down to the atomic scale?

Physical-Chemical Background

Indeed, any contemporary material synthesis technology is based on one of three major approaches to the activation of chemical reactions and the formation of interatomic bonds: thermal activation, electrochemical activation, or impact activation (by incident ions, neutral atoms or sub-molecular particles). The thermal activation approach continues the traditional technology developed over millennia, while its basic kinetic law was formulated in 1889 by S. Arrhenius. The electrochemical approach originated with M. Faraday's work, and its theory was formulated by Arrhenius (1887), Milner (1912), Debye and Hückel (1923), and Onsager (1926). The impact activation approach was developed in the 20th century, and its kinetic law was not considered theoretically because the incident particle energy typically exceeds the chemical barrier by a few orders of magnitude.

But let us consider a relatively low-energy incident atom or small molecular radical, ionized or neutral, colliding with a solid surface. More specifically, consider an incident particle possessing an average kinetic energy Ei at about or above the level of the activation barrier for typical chemical reactions, but not exceeding the elastic threshold of the substrate. Such a particle would not produce direct structural damage to the lattice, but rather a spike of excitation above the average thermal energy kTs in the substrate.

While a singular spike of thermal excitation in a lattice is effectively treated as a quasi-particle, or phonon, let us inversely consider the mechanically induced excitation as a quasi-thermal fluctuation. Thus, we can define an equivalent quasi-temperature T* corresponding to such impact excitations. On the one hand, based on the statistical-physics law underlying the Arrhenius formula, one may estimate the frequency of fluctuations with E ≥ Ei as
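a standard Arrhenius-type expression (a sketch, assuming the incident energy Ei is shared over the N* coordinated atoms, with ν the characteristic lattice vibration frequency and k the Boltzmann constant):

f = ν exp( −Ei / (N* k T*) )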

The coordination factor N* ≥ 1 reflects dissipation of the incident particle's energy within the coordination proximity of the collision point during the characteristic time of the event.

On the other hand, the real frequency of such events is simply the incident flux density expressed as the number of atomic layers per second, w.

Assuming w = f, one would find:
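equating the layer-growth rate w to the Arrhenius-type frequency sketched above and solving for the quasi-temperature (under the same assumptions):

T* = Ei / (N* k ln(ν / w))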

Because this spike corresponds to additional energy above the average thermal oscillations (we assume that below the elastic threshold the harmonic approximation is justified), the effective temperature in the spike proximity, during the characteristic time of the event of about ν⁻¹ s, would be

Teff = Ts + T*
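For orientation, a minimal numerical sketch of the T* estimate above (the parameter values are illustrative assumptions, not data from the text):

```python
import math

K_B = 8.617e-5  # Boltzmann constant, eV/K

def quasi_temperature(E_i, N_star, nu, w):
    """Quasi-temperature from the sketch T* = Ei / (N* k ln(nu/w)).

    E_i    : incident particle energy, eV
    N_star : coordination factor (atoms sharing the spike)
    nu     : characteristic lattice vibration frequency, 1/s
    w      : incident flux, atomic layers per second
    """
    return E_i / (N_star * K_B * math.log(nu / w))

# Illustrative inputs: a 20 eV radical, spike shared over ~10 atoms,
# nu ~ 1e13 s^-1, growth of ~1 atomic layer per second.
T_star = quasi_temperature(E_i=20.0, N_star=10, nu=1e13, w=1.0)
T_s = 500.0                       # substrate temperature, K
print(f"T*   ~ {T_star:.0f} K")   # ~ 775 K for these inputs
print(f"Teff ~ {T_s + T_star:.0f} K")
```

With these illustrative inputs the spike contribution falls in a range comparable to the ~600-800 K synergetic synthesis temperatures mentioned later in the text.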


Thus, depending on the ratio between the substrate temperature and the incident particle energy, one of three major mechanisms may be predominant:

The current technology

The real, present-day technology may be considered an "on-the-land" laboratory.

Figures 1-3 show schematics of the remote-plasma synthesis of diamond-like and synergetic carbon matter, as well as of atomic-scale and nano-scale metal-carbon composites. The semi-insulated, high-density chemical plasma reactor generates an energetic "wind" of the simplest carbon and silica radicals (typically positively charged CH+, SiO+ and the like). The radicals are extracted by an electric field and directed to the substrate holder. On colliding with the substrates, the energetic radicals overcome the chemical activation barriers and form diamond-like or synergetic forms of carbon.


Figure 1. A schematic of the synthesis.

Figure 2. A simplified schematic of internal design.

Figure 3. View of three identical reactors (pilot versions).

Figures 4 and 5 show schematic model representations of the two major carbonaceous-silicate materials produced under such conditions.


Atomic-scale composite structure, known as DLN (and also known as Dylyn™), with a density range of 1.9 to 2.25 g/cm³, of which 2.1 to 2.23 g/cm³ is the most typical. In DLN the diamond-like network, the graphite-like network and the silica network are only partially bonded to one another, and the entire structure is completely amorphous.

Synergetic diamond-graphene quasi-amorphous carbon (QUASAM™).
These ultra-lightweight (1.3 to ~1.7 g/cm³) materials have a self-organized hierarchical structure from the atomic, to the nano-, to the micrometer level, approaching in atomic arrangement the utmost physical limit of composite solids. The diamond-like 3D framework interpenetrates and bonds together graphene meso-planes. QUASAM™ is not only superior to conventional DLC, but also exploits the strongest features of the graphene mesophase (otherwise known only in nanotubes) at the macroscopic level.

Usually, such forms of carbon, being far from equilibrium, are unstable, extremely stressed and short-lived. Under the synthesis conditions shown, the silica component of the "wind" radically changes the situation, resulting in the formation of carbon-silica composites of atomic scale: interpenetrating atomic-scale networks of carbon and silica. These materials combine many of the best features of both diamond-like and graphite-like ('graphene') carbon and may survive for a nearly unlimited time (as estimated theoretically; tested in practice at "room temperature" for over a quarter of a century, while high-temperature tests allow extrapolation of the low-temperature stability over geological time).

Figure 6a shows absorption spectra for the three major families of these carbon forms [the optical spectra were obtained by James N. Hilfiker (J.A. Woollam Co., Inc.)].

Figure 6b suggests an interpolation for intermediate states and the plausible location of the temperature corresponding to the 2175 Å astronomical feature, presumably due to the amorphous carbonaceous-silicate matter abundant in interstellar and interplanetary dust. There are no sharp specific features on the plots, only some blunt maxima; but this is natural for an amorphous solid, where any structural feature may be surrounded by different chemical or coordination neighbors. The position of the absorption maximum is plausibly in a linear relation to the deposition temperature, and 2175 Å may be anticipated at T ≈ 460 °C.
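A minimal sketch of such a linear interpolation (the two anchor points below are illustrative placeholders, not the measured spectra of Figure 6):

```python
# Linear relation between deposition temperature and the position of the
# absorption maximum; solve for the temperature giving 2175 Angstrom.
# The two anchor points are hypothetical values chosen for illustration only.
anchors = [(200.0, 2600.0), (600.0, 1945.0)]   # (T in deg C, lambda_max in Angstrom)

(t1, l1), (t2, l2) = anchors
slope = (l2 - l1) / (t2 - t1)                  # Angstrom per degree C

def temperature_for(lambda_target):
    """Invert the linear fit lambda_max(T) = l1 + slope * (T - t1)."""
    return t1 + (lambda_target - l1) / slope

print(f"T(2175 A) ~ {temperature_for(2175.0):.0f} C")   # ~460 C for these placeholders
```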



Possible implications for astrochemistry

Life is not a "technology", but it is based on a balance between continuous processes of synthesis (a natural "technology") and degradation. Synthesis needs a sufficiently high environmental temperature to overcome chemical barriers, while to withstand permanent attacks by degradation it demands a limiting temperature from above. The suitable temperature range is narrower for more complex life forms. But the critical level of complexity from which life may start cannot be reached by a direct transition from non-organic matter. A rather complex and abundant pre-biotic organic environment must precede the initiation of life. This is shown on the diagram in a highly schematic depiction.

Among the greatest challenges to evolution theory is the extremely fast appearance of life as soon as the Earth had cooled enough, at least locally. This implies a much longer (billions of years) pre-biotic synthesis, which is possible only in relatively cool interstellar space. But the mechanism activating chemical reactions in such conditions is another challenge. Direct-impact and synergetic synthesis by itself may not reach the complexity of life, but it may suggest a true mechanism for the generation of the pre-biotic cosmic "soil", starting from the age of the young universe when the early stars had produced enough carbon.

On the one hand, to establish the basic scale, consider three principal scenarios:








Hence, an atomic-scale "hammer" working in synergy with thermal activation appears to be an effective mechanism both in technology and in nature.
The ratio between the T* values (in K) for direct reactions, surface reactions and superficial restructuring is also important: it shows that there are ranges of the synergetic parameters {Ei, Ts} where direct reactions are activated effectively without damaging the surface or destroying previously synthesized films, as well as ranges of {Ei, Ts} where both direct and surface reactions may be effectively activated without destroying the previously synthesized or pre-existing structure.
On the other hand, one may consider three possible scenarios in cosmic space:
Scenario 1.
Dark dust cloud.
The only source of charge is mechanical collisions.
The collisions are relatively rare, and the electrical charge depends only weakly on the particle size.
The most active are relatively fine particles.
Scenario 2.
Intensive irradiation (from a nearby star or from intrinsic β-radioactivity).
The surface charge density is in quasi-equilibrium with respect to the medium.
The most active are large bodies.
Scenario 3, intermediate.
Combined source from irradiation and random collisions.
Relatively weak irradiation produces a background charge, while relatively rare collisions result in a randomly distributed additional charge.
The most active range of particle sizes gradually shifts from fine particles to larger bodies as the collision frequency decreases and the irradiation increases.
To reach realistic estimates in more quantitative terms, one should consider the plausible density of dust particles, their distribution in size and charge, and the resulting capture radius and collision energy of incident ions with a particle, versus the density of active gas in the same clouds.

Omitting the details here, we give some numbers for orientation:

The smallest particles, of about 1 nm radius, even though they may have the highest relative surface density of electrical charge, may generate collision energies not exceeding 10 eV, with a capture radius at the level of a few nanometers. Such particles may play an essential role under special conditions of high particle density combined with extensive charge generation.

The most active generators of organic matter in space should be particles in the few-micrometer range, which are able to capture ions from up to a few centimeters, and even from nearly 1 m away, while providing incident collisions with energies up to and above 100 eV.

This corresponds well to typical dense interstellar molecular clouds: density > 100 cm⁻³, T = 10-50 K, radius > 1 pc, gas-to-dust mass ratio ~100, dust-to-gas number density ratio ~10⁻¹², grain radii usually in the range 0.001-3 μm [15].

On the other hand, larger bodies in the range of 1 cm to about 1 m are able to capture ions from a few kilometers away and may routinely generate incident collisions up to the 1000 eV level. Combined with a low temperature preserving the produced molecules, both in composition and on the substrate, such celestial bodies may generate rather complex organics, including nitrogen derivatives like those in the famous Australian (Murchison) meteorite [16], as well as silicon-carbonaceous solid matter; over millions of years of exposure they may become completely covered with the secondarily synthesized material and thus self-converted into carbonaceous chondrites, like the recently fallen Canadian meteorite [17].
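As a toy illustration of the collision energies involved, consider the energy a singly charged ion gains when it falls onto a charged grain (the grain radius and net charge below are placeholder assumptions; the capture range additionally depends on screening and on grain and gas dynamics, which such a simple estimate does not address):

```python
E2_OVER_4PI_EPS0 = 1.44  # e^2/(4*pi*eps0) in eV*nm, for unit charges

def impact_energy_eV(n_charges, radius_nm):
    """Energy (eV) gained by a singly charged ion reaching the surface of a
    grain of the given radius carrying n_charges opposite elementary charges."""
    return n_charges * E2_OVER_4PI_EPS0 / radius_nm

# Placeholder example: a ~1 micrometer grain carrying a hypothetical
# net charge of ~7e4 elementary charges.
print(f"{impact_energy_eV(70_000, 1000.0):.0f} eV")   # ~100 eV
```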

Most importantly for technology, depending on the available technique, these "quasi-fluctuations" may be precisely calculated and delivered to the front of new solid growth to overcome specific chemical barriers. Thus, appropriately controlled synergetic conditions allow selective chemical synthesis on a solid surface that is hardly possible in conventional technology. Another important feature of synergetic synthesis is the possibility of controlling the growth of nano-crystals or mono-layers while simultaneously, and virtually independently, controlling the surface chemical reactions. This is due to the predominantly thermal activation of the cooperative mechanisms of first-order phase transitions and the local nature of the chemical reactions, which may be effectively activated by the impact of the incident particles. Thus, synergetic activation is particularly promising for the synthesis of predetermined nano-structured materials. While the ancient thermal technology is restricted by the reverse processes, and ion/plasma technology, inversely, is usually realized in virtually irreversible conditions, synergetic synthesis for the first time allows control of the contribution of the reverse processes and the synthesis of synergetic solids that are not achievable by the prior techniques. Synergetic synthesis is also more efficient and requires a lower temperature (typically ~600-800 K) and lower voltage or particle energy (typically 10 to 100 eV), which is crucial for micro- and nano-electronics.

A kinetic energy of incident particles as low as Ei ~ 10 eV is sufficient to activate the growth of relatively soft, "polymer-like" forms of carbon on a cold substrate (substrate temperature Ts ≤ 300 K); at Ts ≥ 500 K, an energy Ei ~ 20-30 eV is ample for synergetic synthesis of relatively hard carbon forms. Normally, synthesis is conducted on flat substrates; however, in a high-frequency or pulsed accelerating field, deposition of carbon films is realized on particles as well.

While the artificial synthesis of SSC is based on acceleration of charged particles (atomic and molecular ions) by a bias voltage, as mentioned above, in cosmic conditions the charge may be due to radioactive sources, in particular β-decay. One act of decay may destroy a single bond in the adsorbed organic layer while generating multiple electrons emitted from the substrate. Not only will the resulting electrostatic field energize the surface reactions, it will also cause the generation of negative ions and the accretion of such ions from the surrounding space onto the particles of cosmic dust. In relation to synergetic thermal-impact synthesis, it is important to note that, according to recent research, the thermal stability of relatively complex bio-organics may increase substantially (see, for instance, [18]), and for relatively simple bio-organics it may even approach 600 K [19].

A rough estimate based on the suggested mechanism and general astronomical data indicates a very modest productivity of organics generation in the contemporary Solar system, but a very high productivity of interstellar dust clouds and a convincingly high "domestic" productivity of proto-planetary nebulae.

Depending on the ratio between the intensity of the β-active background, the particle density in the cloud and the carbon-containing gas pressure, the most active particles may vary from ~10¹-10² nm to relatively large macroscopic sizes. Such mechanisms may be significant both for contemporary technology and for the primordial origin of the organic world, in interstellar space as well as in the earliest phase of the formation of the Earth and other planets.

Note in conclusion:

After this article was first presented in 2004, the existence of a new class of astrophysical objects was shown, in which the self-gravity of the dust is balanced by the force arising from shielded electric fields on the charged dust [20]. Such an object could be an effective astronomical reactor of pre-biotic organic matter.

***


References:

1. B.F. Dorfman, Stabilized sp2/sp3 Carbon and Metal-Carbon Composites of Atomic Scale as Interface and Surface-Controlling Dielectric and Conducting Materials. In: Handbook of Surfaces and Interfaces of Materials (H. S. Nalwa Ed.), v.1, Academic Press, San Diego, 2001, pp. 447-508.

Also a brief summary at:
http://www.clarkson.edu/camp/reports_publications/dorfman/Dorfman_SynergeticMatter_2005.pdf

2. John Bradley, Zu Rong Dai, Rolf Erni, Nigel Browning, Giles Graham, Peter Weber, Julie Smith, Ian Hutcheon, Hope Ishii, Sasa Bajt, Christine Floss, Frank Stadermann, Scott Sandford. An Astronomical 2175 Å Feature in Interplanetary Dust Particles. Science 14 January 2005: Vol. 307. no. 5707, pp. 244 – 247.
3. Georgakilas AG, Sideris EG, Sakelliou L, Kalfas CA, Low doses of alpha- and gamma-radiation enhance DNA thermal stability. Biophys Chem. 1999 Aug 9;80(2):103-18.
4. Grun, E.; Gustafson, B. A.; Dermott, S.; Fechtig, H., eds. Interplanetary Dust. 2001, Springer: New York.
5. Hansen, D. O., Mass analysis of ions produced by hypervelocity impact. Applied Physics Letters,1968. 13(3): p. 89-91.
6. Abramov, V. I.; Bandura, D. R.; Ivanov, V. P.; Sysoev, A. A., Energy and angular characteristics of ions emitted in the impact of accelerated dust particles on a target. Sov. Tech. Phys. Lett., 1991. 17(3): p. 194-195.
7. Hornung, K. and Kissel, J., On shock wave impact ionization of dust particles. Astronomy and Astrophysics, 1994. 291: p. 324-336.
8. Hornung, K.; Malama, Y. G.; Kestenboim, K. S., Impact vaporization and ionization of cosmic dust particles. Astrophysics and Space Science, 2000. 274: p. 355-363.
9. A. Abergel, J. P. Bernard, F. Boulanger, D. Cesarsky, E. Falgarone, A. Jones, M.-A. Miville-Deschenes, M. Perault, J.-L. Puget, M. Huldtgren, A. A. Kaas, L. Nordh, G. Olofsson, P. Andre, S. Bontemps, M. M. Casali, C. J. Cesarsky, M. E. Copet, J. Davies, T. Montmerle, P. Persi, and F. Sibille. Evolution of very small particles in the southern part of Orion B observed by ISOCAM. Astronomy & Astrophysics, 389, 239-251 (2002).
10. Kimura, H. and Mann, I., The electric charging of interstellar dust in the solar system and consequences for its dynamics. Astrophysical Journal, 1998, 499: p. 454-462.
11. Zinner, E., Stellar nucleosynthesis and the isotopic composition of presolar grains from primitive
meteorites. Annual Review of Earth and Planetary Sciences, 1998. 26: p. 147-188.
12. Sun Kwok, "The synthesis of organic and inorganic compounds in evolved stars", p 985-991 v 430, Nature, 26 Aug 2004.
13. C. S. Contreras, J.-F. Desmurs, V. Bujarrabal, F. Colomer, J. Alcolea, 2002, Astronomy & Astrophysics., 385, L1-L4.
14. John Bradley, Zu Rong Dai, Rolf Erni, Nigel Browning, Giles Graham, Peter Weber, Julie Smith, Ian Hutcheon, Hope Ishii, Sasa Bajt, Christine Floss, Frank Stadermann, Scott Sandford. An Astronomical 2175 Å Feature in Interplanetary Dust Particles. Science, 14 January 2005: Vol. 307. no. 5707, pp. 244 – 247.
15. Yeghikyan A. G.; Fahr H. J., Annales Geophysicae, 2003, vol. 21, no. 6.
16. Shock E.L., and Schulte M.D., Summary and implications of reported amino acid concentrations in the Murchison Meteorite. Geochimica et Cosmochimica Acta, 1990, vol. 54, pp. 3159-3173.
17. Science Daily (Aug. 27, 2001).
18. Ueda, Tadashi; Masumoto, Kiyonari; Ishibashi, Ryoji; So, Takanori. Remarkable thermal stability of doubly intramolecularly cross-linked hen lysozyme. Protein Engineering, Volume 13, Number 3, March 2000 , pp. 193-196(4), Oxford University Press.
19. Michael C. Adams, Joseph N. Moore, Laszlo G. Fabry, and Jong-Hong Ahn, Thermal stabilities of aromatic acids as geothermal tracers. University of Utah Research Institute. Salt Lake City.
20. K. Avinash, and P. K. Shukla, Gravitational equilibrium and the mass limit for dust clouds. New J. Phys. 8 (2006) 002.


***

Tuesday, October 21, 2008

NANOTECHNOLOGY - from Nano-Dream to Nano-Realm

2008©B.F.Dorfman Extended and updated on November 02, 2008
NANOTECHNOLOGY TODAY
FROM A BIRD'S EYE VIEW

Nanotechnology is not a specific technology
- it is a specific time

There are many reasonable definitions of technology in general, and of nanotechnology in particular. Depending on the definition, we may perceive nanotechnology either as a dream about a relatively remote future or as the actual state of today's hi-tech.

1st Definition: Technology becomes "nano" when it produces a new kind of artifact due to nanometer-range resolution in at least one of three dimensions. By this definition, 'nano' first started at the time of WWII with thin optical coatings, and definitively in the mid-1950s with the introduction of the 'double-diffused' transistor.

2nd Definition: Technology becomes "nano" when it produces a new kind of artifact due to nanometer-range resolution in all three dimensions. By this definition, 'nano' was born at the border of the 1980s and 1990s, when the design rule in the semiconductor industry was firmly established below the 1.0 micrometer 'waterline'. It matured in the early 2000s while submerging below 100 nm.

3rd Definition: Technology becomes "nano" when it produces intelligent products without the direct intervention of an external intellect (human or digital), based on complete self-organization, self-repair, and even self-development.

By this definition, we are now almost exactly in the middle of the road to Nanotechnology from its conception in the 1950s.

'Football field' on an Intel chip: up to ~2,000,000,000 transistors at a 45 nanometer 'design rule'. To make 45 nm visible, we would have to magnify a ~1 cm chip to a real football field (and the heads of the players might be hit by satellites). This is the current cutting edge of the "top-down" approach: the steady progress of hi-tech resolution toward the atomic scale.

Two players fight for a 540-atom buckyball, a cutting edge in one direction of progress of the "bottom-up" approach: from the atomic scale to products. Even at such magnification the buckyball is almost unnoticeable; two more orders of magnitude of magnification would be needed to distinguish its details.


Time of the Great Unification of the Technologies

True nanotechnology will only start at the point where the "design rules" of the "top-down" and "bottom-up" approaches become equalized and synergized, roughly 3-4 decades from now, as was shown two decades ago: Pictures 1 and 2 below are from the book 'Evolution of Technologies, or A New History of Time' (B.F. Dorfman, Moscow, 1990, in Russian).

Nanotechnology is not a specific technology, or a set of specific technologies. It is a specific time.

A time when all the major branches of current development will be united and synergized.

The time of the Great Unification of the Technologies. Plausibly, just starting at ~2040.

1. Nanotechnology ≡ Great Unification of the Technologies.

Progress in "Design Rules" from 1cm to 1mm was taking ~10,000 years, from 1mm to 100mm – a century, from 100mm to 100nm ~½ century. It would take another half a century to reach the atomic scale.

2. As geometric dimensions approach the atomic scale, most physical phenomena nearly equalize in the time (i.e., speed) dimension.

This may change our 'common sense'. For instance, electromechanical relays, which were once the starting point for both the theory and the first practically working computers, may return as the best-known digital element, but on a molecular basis. (To make a relay equivalent of a PC of 2010, one would need megatons of relays, gigatons of wire, gigawatts of power, a millennium for its realization and astronomical time for system operation.)

On diagram "Great Unification of Technologies", the historical time scale is not linear because the real time of progress was not linear. New universe was in a latent state up to the Middle-Age mechanics (clocks and automats), it became visible with the first Industrial Revolution, exploded with the Second one.



3. Some milestones of the "top-down" approach to 'nano', from its conception to the current moment.

The real 'Big Bang' was the transistor. From that moment, and for over half a century, the time scale of progress has been almost strictly linear: see diagram 3 on the following screen. (NOTE: This is a strongly simplified diagram. The complete diagram, showing all the milestones of the "top-down" approach to 'nano' from its conception to the current moment and envisioned beyond that, will be published in a book.)
The term transistor itself denotes two basically different solid-state devices:
1. Bipolar semiconductor triode where three electrodes are mutually separated by two p-n-junctions.
2. Unipolar triode, or 'field-effect' transistor (FET): a three-terminal solid-state device in which two electrodes ('source' and 'drain') are directly connected to the path of flow of free electrical charges (the 'channel'), while the third electrode (the gate) is insulated from the channel either by a thin dielectric layer (MOSFET, or metal-oxide-semiconductor FET) or by a Schottky barrier (MESFET, or metal-semiconductor FET).
While the bipolar transistor is basically a semiconductor device, the field-effect triode (at least in principle) may be realized with different kinds of channel matter, even without any semiconductor, for instance with a "poor metal" or a "poor dielectric", or in a molecular structure, or even without any matter at all, in vacuum, similarly to the original vacuum triode prototype, once the device structure advances into the deep "nano" range. But the field-effect triode was originally conceived long before the "nano" era: in a series of three patents of 1925-1928, Julius Edgar Lilienfeld disclosed the solid-state analogs of the vacuum triode: US patent 1745175 "Method and apparatus for controlling electric current", first filed in Canada on 22.10.1925 (similar to a MESFET); US patent 1900018 "Device for controlling electric current", filed on 28.03.1928 (a thin-film MOSFET); and US patent 1877140 "Amplifier for electric currents", filed on 08.12.1928 (where the current flow is controlled by a porous metal layer). No silicon or germanium had yet been explored as workable semiconductors, and p-n junctions were not known either. Only in 1939 did Russell Shoemaker Ohl discover the key role of impurities in semiconductors, the significance of their ultra-purification (at that time, for germanium), the p-n junction barrier and the semiconductor diode.
The practical and essentially reciprocal history of unipolar and bipolar transistors started after the Second World War [1-5]. The transistor first conceived by William Shockley* soon after WWII was still a FET; however, Shockley's FET did not work, in spite of tremendous and diversified efforts by Shockley himself, John Bardeen, Walter Brattain and their colleagues. Then, working on their own, Brattain and Bardeen created the first [germanium] bipolar transistor.
* Shockley finally filed his patent for a FET with a p-n junction and mesa channel in 1951 (granted in 1958).
Amazingly, Bardeen provided the theoretical explanation of why Shockley's FET did not work (an imperfect surface "killing" the free charge carriers), while Shockley developed the fundamental theory of Electrons and Holes in Semiconductors (1950), first employed for bipolar transistors. The next two decades, which laid down the foundations of silicon transistor electronics, integrated circuits, planar technology and microelectronics, were due exclusively to bipolar transistors.
The next two ideas were crucial.
Jean Hoerni proposed to keep the silicon oxide layer in place on the silicon substrate (instead of etching it away after using the oxide as a diffusion mask) in order to protect the p-n junctions. The "given name" planar technology originally reflected the intention to distinguish the newly born flat device structure (in Hoerni's idea in December 1957, in industry in 1961) from the preceding one, the entrenched "mesa" transistor design (developed by M. Tanenbaum and D. Thomas). However, the current meaning of this term is essentially broader and deeper.
As soon as the concept of the planar transistor was established, Robert Noyce suggested forming the interconnects on the same silicon substrate. Independently, the idea of forming wires connecting undivided transistors on a substrate was suggested at Texas Instruments by Jack Kilby (actually, a few months earlier, but not yet as planar technology).
By the middle of the 1960s, the planar technology of silicon integrated circuits, including surface treatment, had matured enough to return to the FET, which better matches the basic principles of planar technology but, most importantly, consumes less energy, especially in the complementary pairs of CMOS (p-channel and n-channel transistors). In the 1990s, when the formerly steady increase of FET frequency with the scaling down of design rules could no longer continue at the same pace, there were some efforts to revitalize bipolar transistors, naturally without a chance of success: low-energy CMOS retains its critical advantage.

***

So far, the world-altering ~800,000,000 PCs, >3,000,000,000 cell phones, the Internet, libraries on disk, the iPod, digital cameras, medical sensors, flat TVs, more accurate weather forecasts, and computer-assisted design of the technology itself are all based on 'top-down' nano-progress.

Technology may safely continue along this line for ~7 years. Then keeping to this "schedule" will become increasingly challenging, and technology may diversify.

The "top-down" development in the domain of physical technologies is predominantly unified, and the major past milestones of this decisive development, only some of which are shown on diagram 3, follow the linear course well in the 'Time - Resolution' coordinates.

The "bottom-up" approaches are still diversified, and the 'bottom-up' diagrams shown below are just some examples.


From PICO to NANO

1. SUPRAMOLECULAR CHEMISTRY

and

SELF-ASSEMBLING MOLECULAR SYSTEMS


SUPRAMOLECULAR CHEMISTRY is one of the most important fields in the "bottom-up" approach: creating pre-designed hierarchical structures from molecules (instead of atoms) as building blocks, binding them with relatively weak bonds (≤ ~2 eV instead of ~3.5 eV in C-C covalent bonds) or even without chemical bonds, just "topologically" (two of Stoddart's supramolecules above). Self-assembly, but in the desired direction, is the engine driving this research toward the true "nano". All specific results achieved so far are just examples, road signs indicating the progress. Unlike the semiconductor "nano", where the world leader Intel alone has ~90,000 employees, supramolecular chemistry is still a more academic field, and every step of progress is associated with the individual name of its discoverer.

SUPRAMOLECULAR SYNTHESIS & SELF-ASSEMBLING MOLECULAR SYSTEMS are marked by a few complementary lines of progress:

  • Structural hierarchy => intrinsic rotation and axial movement
  • 2D "molecular architecture" => 3D => complex topology => pre-designed deformation =>
  • Increasing complexity leads to a growing number of steps in synthesis and of building constituents =>
  • Another side and line of progress: more effective and complex selectivity of the synthesized supramolecules toward chemical and biochemical reactions => more complex functionality =>

It is actually a "from pico to nano" approach: in the first 'crown ethers', the ancestors of the entire field (shown in the bottom left corner of the diagram), the size of their most remarkable feature, the cavity, is in the range of 120 to 320 picometers.

Hierarchical supramolecules synthesized in the 2000s preserve this pico-scale feature while also adding the next floors, or 'nano' levels.

Structural Hierarchy => Intrinsic rotating and axial moving=>
2D=>3D=>Complex topology => Pre-designed deformation=>
Steps of synthesis and number of building constituents =>
Selectivity => Functionality => Self- functioning =>

Indeed, the length of chemical bonds, ~100-200 picometers, is the basis of all technologies, starting from ancient metallurgy. 'Nano' is growing complexity. Pico-technology is the predecessor rather than the successor of the "nano" world of stable atomic structures.

Perhaps, in the meta-stable world of intra-atomic electronic states, lasers and future quantum computers may be considered pico-technologies.

Starting even earlier, inorganic materials explored both ways, "up" and "down" (see an example below, after the references).

***

Some useful links for the further reading:
The following articles are available free:
1. Michael Riordan, From Bell Labs to Silicon Valley, The Electrochemical Society, Interface, Fall 2007
The full article, extending the story up to very recent times, is available free at:
http://www.electrochem.org/dl/Interface/fal/fal07/fall07_p36-41.pdf
2. Howard R. Huff. From The Lab to The Fab: Transistors to Integrated Circuits
https://www.chiphistory.org/exhibits/ex_howard_huff_transitors_integrated_circuits/howard_huff_section1.pdf
For further reading:
3. Riordan and Hoddeson, Crystal Fire, W. W. Norton & Co., New York, (1997)
4. Christophe Lécuyer, Making Silicon Valley: Innovation and the Growth of High Tech, 1930-1970. The MIT Press, Cambridge, MA (2006).
To read the history at first hand:
5. Ian M. Ross, President of Bell Labs from 1979 to 1991,
The Foundation of the Silicon Age, Bell Labs Technical Journal, Autumn 1997, pp. 3-14
This article gives the principal narrative at first hand. Access to the full article requires a subscription.
6. The early history of magnetic disk storage in IBM:
http://www.magneticdiskheritagecenter.org.
Latest news:
7. Nanotech Breakthroughs! Intel Unveils Industry's First 32-nm Chips
ED Online ID #16875, September 18, 2007
http://electronicdesign.com/Articles/Index.cfm?AD=1&ArticleID=16875
8. IBM shines light on 22 nm chip manufacturing . Trendwatch. By Rick C. Hodgin
Thursday, September 18, 2008
http://www.tgdaily.com/content/view/39378/113/
9. UMC Announces Foundry Industry's First 28nm SRAMs
HSINCHU, Taiwan, October 27, 2008 -- UMC (NYSE: UMC; TSE: 2303),
a leading global semiconductor foundry, today announced that it has manufactured the foundry industry's first fully functional 28nm SRAM chips, using advanced double-patterning immersion lithography and strained silicon technology to produce the chips, which feature very small six-transistor SRAM cell sizes of approximately 0.122 square microns. http://www.umc.com/English/news/20081027.asp
(note: this is approximately 100,000,000 times smaller than the earliest IC - BD)
11. Interconnect Metrology Confidently Looks at 32 nm
Alexander E. Braun, Senior Editor -- Semiconductor International, 10/1/2007
http://www.semiconductor.net/article/CA6482818.html
12. ISMI Outlines 450 mm Wafer, NGF Roadmaps
David Lammers, News Editor -- Semiconductor International, 10/27/2008
http://www.semiconductor.net/article/CA6608829.html?industryid=47301
The newest trends in magnetic disks’ physics and technology:
13. Spintronics: Poised for Next Great Memory Breakthrough? Spin-polarized current revolutionized digital storage on disk drives. The next step for spintronics may mean replacing flash memory with magnetic tunnel junction MRAMs. Stuart Parkin, IBM Almaden Research Center, San Jose -- Semiconductor International, 10/1/2008
http://www.semiconductor.net/article/CA6602518.html?industryid=47573
14. Relationship of the solid-state technology, progress in underlying physics with computer characteristics from the very beginning up to the current moment and into the future: http://secondbang.blogspot.com/search/label/Forecast

http://www.clarkson.edu/camp/reports_publications/dorfman/Dorfman_Computers_Part1.pdf

http://www.clarkson.edu/camp/reports_publications/dorfman/Dorfman_Computers_Part2.pdf

A very early example of self-organized functional (Giant Magneto-Resistance) nano-structured materials:

http://www.clarkson.edu/camp/reports_publications/dorfman/To%20%20History%20of%20Discovery%20of%20GiantMagRes_1976-2009.pdf

Utmost physical limit for nano-structured composite solids:

http://www.clarkson.edu/camp/reports_publications/dorfman/Dorfman_SynergeticMatter_2005.pdf

Example of progress in relatively simple non-organic technology from "micro" to "nano":

http://www.clarkson.edu/camp/reports_publications/dorfman/Dorfman_Rohring_AWJ_May2006.pdf

For the “bottom-up” Supramolecular chemistry:

1. Charles J. Pedersen nobelprize.org/nobel_prizes/chemistry/laureates/1987/pedersen-lecture.pdf

2. J-M. Lehn, "Supramolecular Chemistry," VCH, 1995. The link for copyrighted book (you may read but not download) is:
http://books.google.com/books?id=PKWFOei609kC&dq=supramolecular+chemistry&printsec=frontcover&source=bl&ots=IfPGkr_Ui2&sig=MK_PVCTuryw1NPqxZNkaSarue0w&hl=en&sa=X&oi=book_result&resnum=
3. Supramolecular chemistry: Functional structures on the mesoscale
SonBinh T. Nguyen, Douglas L. Gin, Joseph T. Hupp, and Xi Zhang
http://www.pnas.org/content/98/21/11849.full.pdf+html
PNAS October 9, 2001 vol. 98 no. 21 11849-11850
4. This site gives a brief review based on Lehn's book, with numerous continually updated links:
Supramolecular chemistry. Nanotechnology Encyclopedia
http://www.edinformatics.com/nanotechnology/supramolecular_chemistry.htm

General “nano”-topics:
1. This site gives numerous links to recent applications of nanotechnology:
Nanularity Nanotech Breakthroughs!
http://nanularity.com/Breakthroughs.aspx
2. The recent NIST study of nanomaterials' trek through the food chain is described in an article by Alexander E. Braun, Senior Editor, Semiconductor International, 9/16/2008: http://www.semiconductor.net/article/CA6596520.html

***

NANO-CLUSTERS

Examination of nano-clusters is an important and, possibly, the earliest frontier of Nanotechnology.

The following picture represents the results of theoretical research conducted over 40 years ago.



1. V.F.Dorfman, M.B.Galina, To The Theory of Nucleus Formation and Growth from Molecular Beams and Gas Phase, Sov. Acad.Science Reports,182(1968),n.2, p.372-375.

From as early as the second half of the 1960s, nano-cluster formation was systematically examined by the author and co-workers. We used analytical solutions and computer simulation of four different mathematical models: so-called non-Markov chains of non-linear differential equations; a 3D anisotropic poly-nuclear multi-phase statistical kinetic model; a 2D contour-representation model; and a Monte Carlo model. The first three models had been specifically developed for the solution of this problem, which was especially challenging at a time when experimental observation of the smallest nano-clusters was not yet available, while computers were in an early phase of development and their performance was limited. Although non-Markov chains of non-linear differential equations were well known, this model was substantially advanced as well, to two or three parallel chains reflecting the competitive growth of clusters differentiated by structure and/or compositional variation. This approach allows one to analyze free cluster formation as well as the origination of structural defects during the growth of single-crystalline films.

The plots shown above were obtained on the M-2 computer, the first large universal computer in Russia, created at the end of the 1950s (chief designer M.A. Kartzev; I.S. Brook's Institute of Electronic Controlling Machines). The odd-numbered and the even-numbered plots were produced by the solution of parallel non-Markov chains, one reaching nearly 230, and the other 118, non-linear differential equations. The process was examined up to the phase when 10% of the initial surface was covered with nano-clusters. Even at earlier stages, coalescence of nano-clusters becomes essential, and this brings a new and fast-growing complexity into the model. It took nearly one week of overnight work by the reliable hard-worker M-2 (during the daytime, this grandfather of contemporary computers was busy solving more urgent problems).

Many important conclusions resulted from such plots. On this figure it is noticeable, for example, that under the given conditions the density of defects is relatively high, but they still remain localized as insulated nano-islands. The other noticeable conclusion is the two-modal distribution of cluster sizes (indeed, a third feature on curve 11 is slightly visible as well). The position of the second maximum corresponds to nano-clusters of double the size with respect to the first maximum. This phenomenon is reminiscent of the frequency doubling of an electromagnetic wave, or second harmonic generation (SHG), as it is usually called in non-linear optics.
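For readers who would like to experiment, here is a minimal modern sketch of the kind of kinetic chain involved: a single chain of coupled non-linear rate equations in which clusters grow only by monomer attachment. It is a deliberately stripped-down illustration; the historical models were far richer (parallel chains, anisotropy, multi-phase statistics, coalescence):

```python
import numpy as np
from scipy.integrate import solve_ivp

K_MAX = 50      # largest cluster size tracked; bigger clusters are lumped here
FLUX = 1.0      # monomer arrival rate (arbitrary units), illustrative value
ATTACH = 0.05   # monomer attachment rate constant, illustrative value

def rhs(t, n):
    """n[k] is the surface density of clusters of (k+1) atoms.

    Monomers arrive at rate FLUX; a cluster of size k+1 grows to size k+2
    by capturing a monomer at rate ATTACH * n[0] * n[k] (non-linear coupling).
    """
    growth = ATTACH * n[0] * n               # k -> k+1 transition rates
    dn = np.zeros_like(n)
    # Monomers: source term, loss to attachment on all clusters, and the
    # extra monomer consumed when two monomers form a dimer.
    dn[0] = FLUX - ATTACH * n[0] * n.sum() - growth[0]
    dn[1:] = growth[:-1] - growth[1:]        # gain from below, loss upward
    dn[-1] = growth[-2]                      # largest tracked size only accumulates
    return dn

sol = solve_ivp(rhs, (0.0, 200.0), np.zeros(K_MAX), method="LSODA")
print("final densities for the first 10 cluster sizes:")
print(np.round(sol.y[:10, -1], 4))
```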

The two-modal distribution of nano-clusters could not be directly verified by the experimental techniques available in the 1960s, but it was confirmed by different experimental groups in the 1970s.


***


2008©B.F.Dorfman

This site is concerned with scientific and technological matters only.
But by no means should Nanotechnology be perceived as a pure "rose garden".

The negative and alarming sides should be carefully considered at every step of the progress.
Readers interested in this aspect are advised to look for specially dedicated sites, which are abundant on the Internet.


Forecast

2008©B.F.Dorfman

WHAT IS PREDICTABLE – WHAT IS NOT?

From: Presentation for Clarkson University, November 2004

PART 1

TECHNICAL SUCCESS IS PREDICTABLE

COMPUTER DEVELOPMENT

FORECAST FROM THE PAST AND FOR FUTURE

It is known that forecasts for specific innovations are rarely successful. This is not true with regard to 'meta-technologies' such as microelectronics and computers. The following diagrams and plots for semiconductors and all major kinds of computers are still accurate, although they were constructed over two decades ago.

The forecast was based on a correlated analysis of the technological and physical parameters of the elements, interconnections and systems (each constituent trend encompasses a semi-linear branch, which for one of them is known as 'Moore's law', and a 'saturating' branch). The actual development, shown as colored dots, differs only in that in reality the industry often missed the right time for the transition to new 'sub-paradigms', such as multiprocessor chips.
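As a generic illustration (not the actual forecasting model used here) of a trend that combines a semi-linear branch with a saturating one, the logarithm of a characteristic parameter can be sketched as a Moore-like linear growth that bends toward a physical limit; all parameter values below are illustrative placeholders, not fitted data:

```python
import numpy as np

def smooth_min(a, b, k=2.0):
    """Smooth minimum, used to blend the two branches of the trend."""
    return -np.log(np.exp(-k * a) + np.exp(-k * b)) / k

def log10_performance(year, log_start=3.0, slope=0.2, year0=1950, log_limit=15.0):
    """Toy trend for log10(performance): a semi-linear (Moore-like) branch
    bending into a saturating branch near an assumed physical limit."""
    linear_branch = log_start + slope * (year - year0)
    return smooth_min(linear_branch, log_limit)

for y in (1950, 1970, 1990, 2005, 2020, 2040):
    print(y, f"~10^{log10_performance(y):.1f} op/s")
```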



Top: the original diagram of computer development with a 20-year forecast from 1986 to 2006 [1,2] (Cray-2 and GF-10 are based on advance publications describing their designed performance). Bottom: an English version published 3 years later [2]. The plots for all the types of computers shown are still accurate, including an exact match for the 10^15 milestone reached by 'Blue Gene' in a simplified-mode test.

The bold line for personal computers was plotted on the assumption that the semiconductor industry would start production of multi-processor chips at the end of the 1990s.

Indeed, the industry mistakenly delayed this transition by ~5 years, following a single-processor line, and is only now fixing the situation and returning to the bold line. A similar scenario happened with transistors: the industry was too persistent in its efforts to further increase their frequency.

The computer forecast was conducted based on an analysis of the elements (as well as of the entire hardware system) up to their respective critical limits.





As predicted, most major characteristics of transistors asymptotically approach their respective principal limits.
The remaining resource is ~1 order of magnitude, versus the 4 to 6 orders of magnitude of past progress.
Does the proximity of the transistor limits mean the end of solid-state electronics and an urgent transition to molecular electronics, single-electron transistors and quantum computers?

Quite the opposite. Stabilization of the transistor structure will signify just the beginning of the real evolution of solid-state systems.

The transistor is the building atom of solid-state computers. Evolution can only start when the atoms reach stability; it is impossible on the basis of radioactive, changeable atoms.

Element and system factors in computer progress


Year       | Elements: frequency, s^-1 | Elements: density, cm^-2 | Most powerful universal computers, F, s^-1 | Personal computers, F, s^-1
1950       | 10^4 => 10^5              | 10^-2 => 10^-1           | 10^3 => 10^4                               | -
2004       | 10^9 => 10^10             | 10^8 => 10^9             | 10^14 => 10^15                             | 10^8 => 10^9
2004/1950  | 10^5                      | 10^10                    | 10^11                                      | 10^5

Thus, even in supercomputers the past progress has been primarily due to the progress of the elements, while in personal computers it is still entirely due to the elements' progress.

The system-level resources have so far been only slightly explored.



A new forecast from the present into the coming few decades is also shown. The accuracy of the forecast from the past gives reason to believe that the newly suggested forecast of successful technical progress in the coming decades is also correct. But predictable technical success obliges us to think responsibly about the possible social consequences, which are not as easily predictable.

Forecast for future computer development

suggested at presentation for Clarkson University in 2004


This diagram of the year 1981 (V. F. Dorfman, On the Border of Millenniums, Moscow, 1982, in Russian) locates the brain (hatched area on the right side) versus computers in the coordinates: number of elements (processors or neurons), number of connections of each element with the other elements, and frequency of the elements. Three major directions of computer development trends are shown: 1. Maximum speed of elements versus minimum connections, in specialized computers; 2. Medium speed and connectivity, in standard computers; 3. Limited speed versus maximum connectivity, in the eventual envisioned trend directed toward (but not reaching) brain performance.





Brain vs. Computer
per ‘quantum of time’

A direct comparison of Brain vs. Computer performance would be improper due to the great difference in the respective frequencies of the neuron and the solid-state microprocessor.

Instead, we introduce the 'quantum of time', the time interval reciprocal to the frequency, and compare the Brain vs. Computer performance during the respective quantum of time.





The best contemporary personal computer processes

P ~ 0.1 op/quantum of time, vs. the brain's P > ~1 billion.

Top contemporary supercomputers:

P ~ 1,000-10,000 op/quantum of time, vs. the brain's P > ~1 billion.

This is still far from brain performance, but supercomputer performance per "quantum of time" is progressing very fast in this direction.
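A minimal sketch of this 'per quantum of time' comparison (all the throughput and frequency figures below are rough illustrative orders of magnitude, circa 2008, not measurements):

```python
def ops_per_quantum(throughput_ops_per_s, element_frequency_hz):
    """Operations performed during one 'quantum of time' = 1 / element frequency."""
    return throughput_ops_per_s / element_frequency_hz

# Rough illustrative orders of magnitude (assumptions):
systems = {
    "personal computer": (3e8,  3e9),   # ~3e8 op/s sustained, ~3 GHz elements
    "top supercomputer": (1e13, 1e9),   # ~1e13 op/s, ~1 GHz elements
    "brain":             (1e11, 1e2),   # ~1e11 'op'/s, neurons firing at ~100 Hz
}

for name, (throughput, freq) in systems.items():
    print(f"{name:18s} ~ {ops_per_quantum(throughput, freq):.0e} op/quantum")
```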

Computer technology from 1964 to 2004 was transistor technology.

Beginning from 2005, micro/nano-electronics is microprocessor technology.

The microprocessor is invariant with respect to the physics of its elements (such as transistors).

This will create the basis for an eventual shift from solid-state "statistical" bit processing to single quantum phenomena in data processing.

Further development = development of system hierarchy in a single chip

Achievement of every new level of system hierarchy requires about a quarter of a century (compare with the 2-year-stratified Moore's law).

But every new level of hierarchy in data processing will strongly change the phase of civilization.

2008©B.F.Dorfman

Source of data

1. V.F.Dorfman, L.V.Ivanov, Computer and its Elements: Development and Optimization, Moscow, 1988.
2. V.F.Dorfman,L.V.Ivanov, Problems of the System Technology of Computers, J. New Gener. Comput. Syst. Berlin, 2(1989)1, 3-23.
3. V.F.Dorfman, On the Border of Millenium, "Znanie", Moscow, 1982.
4. B.F.Dorfman, Presentation to Clarkson University, 2004



The talk presented at Clarkson University in November 2004 is available in 2 parts:

http://www.clarkson.edu/camp/reports_publications/dorfman/Dorfman_Computers_Part1.pdf

http://www.clarkson.edu/camp/reports_publications/dorfman/Dorfman_Computers_Part2.pdf


"Computers: technology, performance and impact on our life: What is predictable, what is not?"