History of Nanotechnology

By: Jerry Flattum, Performer/Songwriter & Writer/Editor

The history of nanotechnology, in some sense, dates back to prehistoric times, when early humans made use of naturally occurring nanoscale materials. As depicted in the header image above, nano-sized carbon particles integrate with the porous rock surface of cave walls, remaining embedded for thousands of years.

The process and appreciation of miniaturization, the underlying principle of nanotechnology, is also not new to the electronic or even the post-industrial era. For thousands of years, Eastern cultures have regarded the small as superior and aesthetically pleasing.

In a classic of Japanese literature, The Pillow Book, Sei Shonagon, court lady to an Empress in the 10th century, wrote that small things are to be regarded as beautiful.


An excerpt from The Pillow Book, 10th-century Japanese literature praising miniaturization.

Contemporary: Richard Feynman

In 1959, theoretical physicist Richard Feynman posed a couple of questions: “Why cannot we write the entire 24 volumes of the Encyclopedia Britannica on the head of a pin?” He also asked, “I put this out as a challenge: Is there no way to make the electron microscope more powerful?”

No doubt the members of the American Physical Society, gathered at the California Institute of Technology (Caltech), were not only puzzled but also intrigued. Feynman then upped the ante on his head-of-a-pin question by asking why we couldn't simply store every book ever written in the same amount of space.
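Feynman's claim rests on simple scaling arithmetic. As a back-of-envelope sketch (the specific figures here, a pin head about 1/16 inch across, Feynman's 25,000x linear magnification, roughly 24,000 encyclopedia pages of about 0.06 m² each, are assumptions for illustration, not numbers from this article):

```python
import math

# A pin head ~1/16 inch across, magnified 25,000 diameters,
# compared against the total page area of ~24 volumes x ~1000 pages.
pin_diameter_m = (1 / 16) * 0.0254   # ~1.6 mm
magnification = 25_000               # Feynman's linear scale factor
pages = 24 * 1000                    # assumed page count
page_area_m2 = 0.06                  # assumed area of one large page

scaled_diameter = pin_diameter_m * magnification
scaled_area = math.pi * (scaled_diameter / 2) ** 2
pages_area = pages * page_area_m2

# The two areas land in the same ballpark, which is the point:
# shrink print by 25,000x linearly and the Britannica fits on the pin.
ratio = scaled_area / pages_area
print(f"pin (magnified): {scaled_area:.0f} m^2, "
      f"pages: {pages_area:.0f} m^2, ratio {ratio:.2f}")
```

The magnified pin head and the stack of pages come out within a factor of two of each other, so reversing the magnification shrinks the whole encyclopedia onto the pin.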

The talk Feynman gave was later published in the February 1960 issue of Caltech's Engineering and Science, which many say represents the first introduction to nanotechnology. Feynman never used the word nanotechnology.

Feynman further speculated that since cells were already known to carry out manufacturing processes, humans should be able to manufacture things at the same scale. He took the question further and asked why we couldn't manufacture not only at the cellular level but, better yet, at the atomic level.

Computers were so large they filled entire rooms. Was there a way to miniaturize them? Well, a few microchips and a few decades later, we did just that.

But it was in 1974 that Tokyo Science University professor Norio Taniguchi coined the term nanotechnology. Nano was already well on its way to competing with micro.

Taniguchi was concerned with manufacturing materials to nanometer tolerances. At the nanoscale, gravity becomes less of an issue, while the strength of materials (tolerance) becomes a greater one. There is very little information on Taniguchi beyond cited references to two of his works, Nanotechnology: Integrated Processing Systems for Ultra-Precision and Ultra-Fine Products and On the Basic Concept of Nano-Technology.

Back to Feynman: he further asked what the limitations are on how small a thing must be before it can no longer be molded. Essentially, he saw the manufacturing processes of the late 1950s (cutting, soldering, stamping, and drilling) as clumsy, with end results that took up too much space and required vast amounts of energy to operate. Feynman wasn't just being theoretical; he was being practical. What Feynman envisioned were machines building at the molecular level.

Feynman was aware of the problems surrounding his ideas for atomic- and subatomic-scale factories. Things do not simply scale down in proportion, and there is the problem that materials stick together through molecular attractions (van der Waals forces). The laws of quantum mechanics, chemistry, and physics would come into question. New kinds of forces, effects, and processes would be discovered, altering outcomes, if not the very paradigm through which scientists interpreted the world.

Atomic Theory

Miniaturization was certainly nothing new in 1959; after all, the electron microscope was in full use by then. But as far back as the 5th century B.C., Democritus and Leucippus proposed that matter was made up of tiny, indivisible particles in constant motion. Apparently Aristotle disagreed. Aristotle must've had a lot of pull, since the idea of indivisible particles was shelved until the 1600s, when Sir Isaac Newton proposed an atomic model in The Mathematical Principles of Natural Philosophy, or The Principia, as it came to be commonly known.

English chemist and physicist John Dalton (1766-1844) developed the first useful atomic theory of matter around 1803. Some of the details of Dalton's atomic theory have since been refuted, but the core concepts (that chemical reactions can be explained by the union and separation of atoms, and that these atoms have characteristic properties) became the foundations of modern physical science.

Another British physicist, J.J. Thomson (1856-1940), made a new discovery about the atom. At the Cavendish Laboratory at Cambridge University, Thomson was experimenting with electrical discharges inside evacuated glass tubes, which produced mysterious glowing beams known as “cathode rays.” He proposed that the cathode rays were streams of particles much smaller than atoms. He called these particles “corpuscles,” which later became known as electrons, and suggested that these electrons made up all the matter in atoms. Before Thomson, scientists thought the atom was indivisible, being the most fundamental unit of matter.

Cathode Rays

Thomson also produced the first evidence of isotopes. Isotopes are atoms of a chemical element whose nuclei have the same atomic number but different atomic masses. The atomic number corresponds to the number of protons in an atom, so isotopes of a given element contain the same number of protons; the differences in mass are the result of differing numbers of neutrons in the nuclei.
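The relationship is simple bookkeeping: the mass number A is the proton count Z plus the neutron count N. A minimal sketch using carbon's familiar isotopes (my own illustrative example, not from the text above):

```python
# Isotopes of one element share a proton count (atomic number Z)
# and differ only in neutron count N; the mass number is A = Z + N.
Z = 6  # every carbon atom has exactly 6 protons
carbon_neutron_counts = {"C-12": 6, "C-13": 7, "C-14": 8}

for name, n in carbon_neutron_counts.items():
    mass_number = Z + n
    print(f"{name}: Z={Z}, N={n}, A={mass_number}")
```

Each isotope's label (C-12, C-13, C-14) is just its mass number, which is why they behave identically in chemistry (same Z, same electrons) while weighing differently.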

In 1911, Thomson's student Ernest Rutherford suggested a new model for the atom. Through his scattering experiments with positive and negative charges within the atom, he determined that the atom has a center, now known as the nucleus, around which the electrons revolve. Rutherford later discovered the proton and predicted the neutron, which was discovered in 1932 by his student James Chadwick.

Around 1913, Danish physicist Niels Bohr advanced atomic theory further by proposing that electrons travel around the nucleus in fixed energy levels.


Nuclear Regulatory Commission

Shortly after World War II, atomic theory became mainstream, so to speak. Before the Nuclear Regulatory Commission was created, nuclear regulation was the responsibility of the Atomic Energy Commission, first established by the Atomic Energy Act of 1946. That law was replaced by the Atomic Energy Act of 1954, which made the development of commercial nuclear power possible. The Energy Reorganization Act of 1974 abolished the AEC and replaced it with the Nuclear Regulatory Commission.

Through the Commission History Program, the origins and evolution of NRC regulatory policies are documented in four volumes of nuclear regulatory history published by the University of California Press. These volumes are:

  • Controlling the Atom: The Beginnings of Nuclear Regulation 1946-1962 (1984) (NUREG-1610).
  • Containing the Atom: Nuclear Regulation in a Changing Environment, 1963-1971 (1992)
  • Permissible Dose: A History of Radiation Protection in the Twentieth Century (2000)
  • Three Mile Island: A Nuclear Crisis in Historical Perspective (2004)

United States Nuclear Regulatory Commission: http://www.nrc.gov



The Higgs boson was confirmed at CERN in 2012, validating the structure of the Standard Model.

CERN (Conseil Européen pour la Recherche Nucléaire) is the European Organization for Nuclear Research, the world's largest particle physics center, concerned with what matter is made of and what forces hold it together. At CERN, scientists' primary tools are accelerators, which push particles to almost the speed of light, and detectors, which make the particles visible.

CERN is also responsible for the development and creation of the World Wide Web. The WWW was designed to improve and speed up the sharing of information between physicists working at different universities and institutes all over the world.

CERN conducts research into the fundamental particles that bind together to form structures on all scales, from the proton built from three quarks, through atoms and molecules, liquids and solids, to the huge conglomerations of matter in stars and galaxies.

There are four basic forces involved.

  1. The most familiar basic force is gravity. Gravity plays little role in particle physics: it is far weaker than the other forces at particle scales, and it has not yet been reconciled with quantum theory, so it is conveniently left out of the discussion and the models.
  2. Far stronger than gravity is electromagnetism. The electromagnetic force binds negative electrons to the positive nuclei in atoms, and underlies the interactions between atoms that give rise to molecules, solids, and liquids. Inside atomic nuclei, and at even smaller scales (within nucleons), two further forces come into play: the weak force and the strong force.
  3. The weak force leads to the decay of neutrons (which underlies many natural occurrences of radioactivity) and allows the conversion of a proton into a neutron (responsible for hydrogen burning in the centre of stars).
  4. The strong force holds quarks together within protons, neutrons and other particles.

Experiments at CERN helped establish the Standard Model of particles and forces. The model involves 12 matter particles (6 quarks and 6 leptons) and 4 force-carrier particles, summarizing all that we currently know about the most fundamental constituents of matter and their interactions. CERN scientists are also working toward what is called a Grand Unified Theory (GUT), an attempt to unify all the forces of nature into a single super force.
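The tally above can be spelled out concretely. A minimal sketch listing the standard names of the 12 matter particles and 4 force carriers (the names are standard physics, not drawn from this article; the Higgs boson sits outside this 12 + 4 count):

```python
# The Standard Model's particle tally: 6 quarks + 6 leptons = 12 matter
# particles, plus 4 force-carrier particles.
quarks = ["up", "down", "charm", "strange", "top", "bottom"]
leptons = ["electron", "electron neutrino",
           "muon", "muon neutrino",
           "tau", "tau neutrino"]
force_carriers = ["photon",   # electromagnetism
                  "gluon",    # strong force
                  "W boson",  # weak force
                  "Z boson"]  # weak force

matter_particles = quarks + leptons
print(f"{len(matter_particles)} matter particles, "
      f"{len(force_carriers)} force carriers")
```

Note how the force carriers line up with the four basic forces listed above, minus gravity, whose hypothetical carrier (the graviton) is not part of the Standard Model.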



New spin on the 1980s self-replicating lunar factory: 3D printing paving the way to the future of human evolution.

In 1980, NASA conducted a study called “Advanced Automation for Space Missions,” at the request of then-President Jimmy Carter, at a cost of nearly 12 million dollars.

The study set forth a realistic proposal for a self-replicating automated lunar factory system. The proposal never gained much attention, but it served as a prelude to factory building and replication at the nanotechnological level. What was speculation in the 1980s has become near-reality some 30 years later.

From the study: “…designing the factory as an automated, multiproduct, remotely controlled, reprogrammable Lunar Manufacturing Facility (LMF) capable of constructing duplicates of itself which would themselves be capable of further replication. Successive new systems need not be exact copies of the original, but could, by remote design and control, be improved, reorganized, or enlarged so as to reflect changing human requirements.”

In the study, NASA gave Hungarian-American mathematician John von Neumann most of the credit for the conception of machine reproduction. Von Neumann began studying the replication of automata (self-operating machines) in the 1940s; his Theory of Self-Reproducing Automata was published posthumously in 1966. Von Neumann also contributed to quantum theory, co-invented the theory of games, and worked on the Manhattan Project (where he was involved in the design of the implosion mechanism for the plutonium bomb).

Later, he became involved in the ENIAC computer project at the Moore School of Electrical Engineering, University of Pennsylvania. The ENIAC was a military project completed in 1945-1946 and is considered the forerunner of today's modern computer. Before the ENIAC, whose successors led to the UNIVAC, there were only simple relay machines and analog devices, such as the differential analyzer. Von Neumann was interested not only in processing speed, but also in how large machines were analogous to the complex behavior of living systems.

The NASA study did not ignore the risks and dangers of self-replicating machines to human evolution. The combination of artificial intelligence and self-replicating machines, NASA warned, could become adversarial. Without further research, there was no telling what kinds of behaviors such a machine might exhibit or if it would develop a sort of machine sociobiology. Benefits for the future of human evolution depended on a mission for automation and machine replicative techniques to improve, protect, and increase the productivity of human society, the study reported.

In an unusual merger of science fiction and scientific speculation, the study quoted science fiction author Isaac Asimov's Three Laws of Robotics, which state: 1) A robot may not injure a human being or, through inaction, allow a human being to come to harm; 2) A robot must obey the orders given it by human beings except where such orders would conflict with the First Law; 3) A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws. Asimov formulated these laws circa 1950.

However, these rules did not reflect future developments in artificial intelligence, such as the ability of robots or evolving machines to understand the concept of God or creation, or even to possess a soul. Could such a machine learn to view humans the way humans view monkeys: as a lower species on the evolutionary journey?

Behind the NASA study were major developments not only in atomic and nuclear theory, but also in the understanding of biological reproduction at the molecular, or biochemical, level. The study acknowledged the similarities between biological replication and machine replication: (1) follow instructions to make machinery, (2) copy the instructions, (3) divide the machinery, providing a sufficient set in each half, (4) assign a set of instructions to each half, and (5) complete the physical separation.
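The five steps above can be sketched as a toy model. Everything here (the `Machine` class, the "assembler" parts, the instruction string) is my own illustrative invention, not machinery from the NASA study; the point is only that the five-step recipe, executed literally, yields a child that carries the same instructions as its parent:

```python
from dataclasses import dataclass, field

@dataclass
class Machine:
    instructions: str
    machinery: list = field(default_factory=lambda: ["assembler", "assembler"])

    def replicate(self) -> "Machine":
        # (1) follow instructions to make machinery: double the parts
        built = self.machinery + ["assembler"] * len(self.machinery)
        # (2) copy the instructions
        instruction_copy = str(self.instructions)
        # (3) divide the machinery, a sufficient set in each half
        half = len(built) // 2
        mine, theirs = built[:half], built[half:]
        # (4) assign a set of instructions to each half
        self.machinery = mine
        child = Machine(instruction_copy, theirs)
        # (5) complete the physical separation: return the child
        return child

parent = Machine("build a full set of assemblers, then split")
child = parent.replicate()
print(child.instructions == parent.instructions)  # the copy is faithful
```

This is exactly the structure von Neumann identified: the instructions are used twice, once interpreted (step 1) and once blindly copied (step 2), the same dual role DNA plays in biological replication.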

Eric Drexler

A few years later, Eric Drexler borrowed Norio Taniguchi's term nanotechnology and explored the subject in much greater depth. In 1986 he wrote the book Engines of Creation: The Coming Era of Nanotechnology; Drexler did not yet have a PhD when he wrote it. In 1991, Drexler completed his MIT doctoral dissertation, Nanosystems: Molecular Machinery, Manufacturing, and Computation. MIT awarded Drexler a PhD in molecular nanotechnology, the first degree of its kind.

Drexler established the field of molecular nanotechnology.

His work showed quantum chemists and synthetic chemists how knowledge of bonds and molecules could serve as the basis for developing nanotechnology manufacturing systems, and showed physicists and engineers how to scale their concepts of macroscopic systems down to the level of molecules.

Richard E. Smalley

Richard E. Smalley was the founding director of the Center for Nanoscale Science and Technology at Rice University in Houston, Texas, from 1996 to 2002, and then became director of Rice's new Carbon Nanotechnology Laboratory.

He is widely known for the discovery and characterization of C60 (buckminsterfullerene, a.k.a. the buckyball), a soccer-ball-shaped molecule that, together with other fullerenes such as C70, now constitutes the third elemental form of carbon (after graphite and diamond).

(Fullerene: Any of various cage-like, hollow molecules composed of hexagonal and pentagonal groups of atoms, and especially those formed from carbon, that constitute the third form of carbon after diamond and graphite).

He researched and popularized buckytubes, elongated fullerenes that are essentially a new high-tech polymer, following on from nylon, polypropylene, and Kevlar. Unlike any of those previous wonder polymers, buckytubes conduct electricity, and they are finding applications in nearly every technology where electrons flow.



Category: Nanotechnology