Early in the investigation of what makes up atoms, the physicist Max Planck noticed a logical problem with the structure of the atom, a fatal flaw in our physics: an electron in orbit around the nucleus is accelerating. Acceleration means a changing electric field (the electron has charge), which means photons should be emitted. But then the electron would lose energy and spiral into the nucleus. Therefore, atoms shouldn't exist!
To resolve this problem, Planck made the wild assumption that energy, at the sub-atomic level, can only be transferred in small units, called quanta. Due to his insight, we call this unit Planck's constant (h). The word quantum derives from quantity and refers to a small packet of action or process, the smallest unit of either that can be associated with a single event in the microscopic world.
Changes of energy, such as the transition of an electron from one orbit to another around the nucleus of an atom, occur in discrete quanta. Quanta are not divisible. The term quantum leap refers to the abrupt movement from one discrete energy level to another, with no smooth transition. There is no ``in between''.
The quantization, or ``jumpiness'', of action as depicted in quantum physics differs sharply from classical physics, which represented motion as smooth, continuous change. Quantization limits the energy that can be transferred to photons and resolves the UV catastrophe problem.
The wave-like nature of light explains most of its properties: reflection, refraction, diffraction, and interference.
But light also has a particle-like side, a dualism best demonstrated by the photoelectric effect, where a weak UV light produces a current flow (releases electrons) but a strong red light does not release electrons no matter how intense the red light is.
Einstein explained that light exists in a particle-like state as packets of energy (quanta) called photons. The photoelectric effect occurs because each individual red photon carries too little energy to knock an electron off an atom, no matter how many red photons are beamed onto the cathode. But each individual UV photon is energetic enough to release an electron and cause a current flow.
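This threshold behavior can be sketched with a short calculation comparing single-photon energies to a metal's work function. The 2.3 eV work function below is an assumed, roughly sodium-like value, and the function names are mine:

```python
# Photon energy E = h*c/lambda, compared against a metal's work function.
H = 6.626e-34      # Planck's constant (J*s)
C = 2.998e8        # speed of light (m/s)
EV = 1.602e-19     # joules per electron-volt

def photon_energy_ev(wavelength_m):
    """Energy of a single photon of the given wavelength, in eV."""
    return H * C / wavelength_m / EV

def ejects_electron(wavelength_m, work_function_ev):
    """One photon releases an electron only if it exceeds the work function."""
    return photon_energy_ev(wavelength_m) > work_function_ev

work_function = 2.3  # eV, roughly the value for sodium (assumed for illustration)

print(ejects_electron(700e-9, work_function))  # red light: False, however intense
print(ejects_electron(250e-9, work_function))  # UV light: True, however faint
```

A red photon carries about 1.8 eV and a UV photon about 5 eV, which is why intensity (photon count) cannot compensate for wavelength.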
It is one of the strange, but fundamental, concepts of modern physics that light has both a wave and a particle state (but never both at the same time), called wave-particle dualism.
Perhaps the foremost scientist of the 20th century was Niels Bohr, the first to apply Planck's quantum idea to problems in atomic physics. In 1913, Bohr proposed a quantum mechanical description of the atom to replace the earlier model of Rutherford.
The Bohr model basically assigned discrete orbits to the electron, with angular momenta in whole-number multiples of Planck's constant, rather than allowing the continuum of energies permitted by classical physics.
The power of the Bohr model was its ability to predict the spectra of light emitted by atoms: in particular, it explained the spectral lines of atoms as the absorption and emission of photons by electrons jumping between quantized orbits.
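A minimal sketch of this predictive power, using the standard hydrogen result E_n = -13.6 eV / n^2 (the function names are mine):

```python
# Bohr model: discrete energy levels E_n = -13.6 eV / n^2 for hydrogen,
# and the photon emitted when an electron drops between two orbits.
HC_EV_NM = 1239.84  # h*c expressed in eV*nm

def level_energy_ev(n):
    """Energy of the n-th allowed orbit in hydrogen (Bohr model)."""
    return -13.6 / n**2

def emitted_wavelength_nm(n_hi, n_lo):
    """Wavelength of the photon emitted in the n_hi -> n_lo quantum leap."""
    photon_ev = level_energy_ev(n_hi) - level_energy_ev(n_lo)
    return HC_EV_NM / photon_ev

print(emitted_wavelength_nm(3, 2))  # ~656 nm: the red H-alpha Balmer line
```

Each allowed transition gives one sharp wavelength, which is exactly the fingerprint pattern of spectral lines.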
Our current understanding of atomic structure was formalized by Heisenberg and Schroedinger in the mid-1920's, where the discreteness of the allowed energy states emerges from more general principles rather than being imposed by hand, as in Bohr's model. The Heisenberg/Schroedinger quantum mechanics rests on consistent fundamental principles, such as the wave character of matter and the incorporation of the uncertainty principle.
In principle, all of atomic and molecular physics, including the structure of atoms and their dynamics, the periodic table of elements and their chemical behavior, as well as the spectroscopic, electrical, and other physical properties of atoms and molecules, can be accounted for by quantum mechanics, which makes it a truly fundamental science.
de Broglie Matter Waves:
Perhaps the key question when Bohr offered his quantized orbits as an explanation of atomic spectral lines was: why does an electron follow quantized orbits? The answer arrived in the Ph.D. thesis of Louis de Broglie in 1923. de Broglie argued that since light can display both wave and particle properties, perhaps matter can also be both a particle and a wave.
One way of thinking of a matter wave (or a photon) is to think of a wave packet. Normal waves look like this:
having no beginning and no end. A composition of several waves of different wavelength can produce a wave packet that looks like this:
So a photon, or a free moving electron, can be thought of as a wave packet, having both wave-like properties and also the single position and size we associate with a particle. There are some slight problems, such as the fact that the wave packet doesn't really stop at a finite distance from its peak; it goes on for ever and ever. Does this mean an electron exists at all places in its trajectory?
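The packet picture can be sketched numerically: averaging many cosine waves whose wavenumbers are spread around a central value yields a disturbance that is large near one point and small (but never exactly zero) elsewhere. The parameters below are arbitrary illustrations:

```python
import math

# Summing many waves with slightly different wavenumbers produces a wave
# packet: large near one point, dying away (but never vanishing) at a distance.
def packet(x, k_center=1.0, k_spread=0.2, n_waves=41):
    """Average of n_waves cosines with wavenumbers spread around k_center."""
    total = 0.0
    for i in range(n_waves):
        k = k_center - k_spread / 2 + k_spread * i / (n_waves - 1)
        total += math.cos(k * x)
    return total / n_waves

print(packet(0.0))    # all waves in phase at the center: amplitude 1.0
print(packet(60.0))   # far from the center the waves nearly cancel
```

A wider spread of wavelengths makes a narrower packet, a hint of the uncertainty principle discussed later.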
de Broglie also produced a simple formula stating that the wavelength of a matter particle is inversely related to the momentum of the particle. So energy is also connected to the wave property of matter.
Lastly, the wave nature of the electron makes for an elegant explanation to quantized orbits around the atom. Consider what a wave looks like around an orbit, as shown below.
The electron matter wave is both finite and unbounded (remember the 1st lecture on math). But only certain wavelengths will `fit' into an orbit. If the wavelength is longer or shorter, then the ends do not connect. Thus, de Broglie explained the Bohr atom: only those orbits can exist that match the natural wavelength of the electron. If an electron is in some sense a wave, then in order to fit into an orbit around a nucleus, the size of the orbit must correspond to a whole number of wavelengths.
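The fit condition, circumference = whole number of wavelengths (2*pi*r = n*lambda), can be checked with standard Bohr-model numbers for the orbit radius and electron speed:

```python
import math

# de Broglie's picture: an orbit is allowed only when its circumference
# holds a whole number of electron wavelengths: 2*pi*r = n * lambda.
H = 6.626e-34        # Planck's constant (J*s)
M_E = 9.109e-31      # electron mass (kg)
A0 = 5.292e-11       # Bohr radius (m)

def wavelengths_in_orbit(n):
    """How many electron wavelengths fit around the n-th Bohr orbit."""
    r = n**2 * A0                  # Bohr-model orbit radius
    v = 2.188e6 / n                # Bohr-model electron speed in orbit n (m/s)
    wavelength = H / (M_E * v)     # de Broglie: lambda = h / p
    return 2 * math.pi * r / wavelength

for n in (1, 2, 3):
    print(n, round(wavelengths_in_orbit(n), 3))  # ~1.0, 2.0, 3.0
```

Exactly n wavelengths fit the n-th orbit, which is why the orbits are quantized.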
Notice also that this means the electron does not exist at one single spot in its orbit, it has a wave nature and exists at all places in the allowed orbit. Thus, a physicist speaks of allowed orbits and allowed transitions to produce particular photons (that make up the fingerprint pattern of spectral lines). And the Bohr atom really looks like the following diagram:
While de Broglie waves were difficult to accept after centuries of thinking of particles as solid things with definite sizes and positions, electron waves were confirmed in the laboratory by running electron beams through slits and demonstrating that interference patterns formed.
How does the de Broglie idea fit into the macroscopic world? The wavelength diminishes in inverse proportion to the momentum of the object, so the greater the mass of the object involved, the shorter the waves. The wavelength of a walking person, for example, is around 10^-35 meters, far too short to be measured. This is why people don't `tunnel' through chairs when they sit down.
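The contrast in scales follows directly from de Broglie's formula lambda = h/(m*v); the masses and speeds below are illustrative choices:

```python
H = 6.626e-34  # Planck's constant (J*s)

def de_broglie_wavelength(mass_kg, speed_m_s):
    """de Broglie wavelength: lambda = h / (m * v)."""
    return H / (mass_kg * speed_m_s)

# An electron at one percent of light speed: comparable to atomic sizes.
print(de_broglie_wavelength(9.109e-31, 3.0e6))   # ~2.4e-10 m

# A 70 kg person strolling at 1 m/s: hopelessly unmeasurable.
print(de_broglie_wavelength(70.0, 1.0))          # ~9.5e-36 m
```

The electron's wavelength is about the size of an atom, which is why wave effects dominate its behavior; the person's is some 25 orders of magnitude smaller still.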
Young Two-Slit Experiment:
The wave-like properties of light were demonstrated by the famous experiment first performed by Thomas Young in the early nineteenth century. In the original experiment, a point source of light illuminates two narrow adjacent slits in a screen, and the image of the light that passes through the slits is observed on a second screen.
However, notice that electrons do act as particles, as do photons. For example, they make a single strike on a cathode ray tube screen. So if we lower the number of electrons in the beam to, say, one per second, does the interference pattern disappear?
The answer is no, we do see the individual electrons (and photons) strike the screen, and with time the interference pattern builds up. Notice that with such a slow rate, each photon (or electron) is not interacting with other photons to produce the interference pattern. In fact, the photons are interacting with themselves, within their own wave packets to produce interference.
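This build-up can be mimicked with a toy simulation: each simulated photon lands at one definite point, drawn from an idealized two-slit probability pattern (the cos^2 form and the fringe spacing are illustrative assumptions), yet fringes appear once many arrivals accumulate:

```python
import math, random

random.seed(1)

# Each photon lands at a single point, drawn from the two-slit probability
# pattern I(x) ~ cos^2(...); fringes emerge only after many arrivals.
def intensity(x):
    """Idealized two-slit pattern (arbitrary units, fringe spacing ~ 1)."""
    return math.cos(math.pi * x) ** 2

def detect_one_photon():
    """Rejection-sample a single landing position on the screen."""
    while True:
        x = random.uniform(-3, 3)
        if random.random() < intensity(x):
            return x

hits = [detect_one_photon() for _ in range(20000)]
# Count arrivals near a bright fringe (x ~ 0) vs a dark fringe (x ~ 0.5).
bright = sum(1 for x in hits if abs(x) < 0.1)
dark = sum(1 for x in hits if abs(x - 0.5) < 0.1)
print(bright, dark)  # many more photons land on the bright fringe
```

No photon ever lands in two places, yet the statistics of single strikes reproduce the wave pattern.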
But wait, if we do this so slowly that only one electron or one photon passes through the slits at a time, then what is interfering with what? i.e. there are not two waves to destructively and constructively interfere. It appears, in some strange way, that each photon or electron is interfering with itself; its wave nature is interfering with its own wave (!).
The formation of the interference pattern requires the existence of two slits, but how can a single photon passing through one slit `know' about the existence of the other slit? We are stuck going back to thinking of each photon as a wave that hits both slits. Or we have to think of the photon as splitting and going through each slit separately (but how does the photon know a pair of slits is coming?). The only solution is to give up the idea of a photon or an electron having a location. The location of a subatomic particle is not defined until it is observed (such as by striking a screen).
Role of the Observer:
The quantum world cannot be perceived directly, but only through the use of instruments. And so there is a problem with the fact that the act of measuring disturbs the energy and position of subatomic particles. This is called the measurement problem.
Thus, we begin to see a strong coupling between the properties of a quantum object and the act of measuring those properties. The question of the reality of quantum properties remains unsolved. All quantum mechanical principles must reduce to Newtonian principles at the macroscopic level (there is a continuity between quantum and Newtonian mechanics).
How does the role of the observer affect the wave and particle nature of the quantum world? One test is to return to the two-slit experiment and try to determine which slit the photon goes through. If the photon is a particle, then it has to go through one or the other slit. Doing this experiment results in wiping out the interference pattern. The wave nature of the light is eliminated; only the particle nature remains, and particles cannot make interference patterns. Clearly the two-slit experiment, for the first time in physics, indicates that there is a much deeper relationship between the observer and the phenomenon, at least at the subatomic level. This is an extreme break from the idea of an objective reality, or one where the laws of Nature have a special, Platonic existence.
If the physicist looks for a particle (uses particle detectors), then a particle is found. If the physicist looks for a wave (uses a wave detector), then a wave pattern is found. A quantum entity has a dual potential nature, but its actual (observed) nature is one or the other.
Quantum Wave Function:
The wave nature of the microscopic world makes the concept of `position' difficult for subatomic particles. Even a wave packet has some `fuzziness' associated with it. An electron in orbit has no position to speak of, other than it is somewhere in its orbit.
To deal with this problem, quantum physics developed the tool of the quantum wave function as a mathematical description of the superpositions associated with a quantum entity at any particular moment.
The key point to the wave function is that the position of a particle is only expressed as a likelihood or probability until a measurement is made. For example, striking an electron with a photon results in a position measurement and we say that the wave function has `collapsed' (i.e. the wave nature of the electron converted to a particle nature).
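A toy illustration of collapse, using a made-up two-position wave function (the amplitudes are chosen so the probabilities sum to one; the Born rule, probability = |amplitude|^2, is the standard quantum prescription):

```python
import random

random.seed(42)

# A toy wave function: complex amplitudes over two positions.  The chance
# of finding the particle at a position is |amplitude|^2 (the Born rule).
wave_function = {0: 0.6, 1: 0.8j}   # |0.6|^2 + |0.8|^2 = 1

def measure(psi):
    """Collapse: pick one position with probability |amplitude|^2, then
    return the outcome and the post-measurement (collapsed) wave function."""
    positions = list(psi)
    weights = [abs(psi[x]) ** 2 for x in positions]
    outcome = random.choices(positions, weights=weights)[0]
    return outcome, {outcome: 1.0}   # all probability now at one spot

counts = {0: 0, 1: 0}
for _ in range(10000):
    outcome, collapsed = measure(wave_function)
    counts[outcome] += 1

print(counts)  # roughly 36% at position 0 and 64% at position 1
```

Before measurement only the probabilities exist; after each measurement the particle has one definite position.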
The fact that quantum systems, such as electrons and protons, have indeterminate aspects means they exist as possibilities rather than actualities. This gives them the property of being things that might be or might happen, rather than things that are. This is in sharp contrast to Newtonian physics, where things are or are not, and there is no uncertainty except that imposed by poor data or limitations of the data gathering equipment.
Further experimentation showed that reality at the quantum (microscopic) level consists of two kinds of reality, actual and potential. The actual is what we get when we see or measure a quantum entity, the potential is the state in which the object existed before it was measured. The result is that a quantum entity (a photon, electron, neutron, etc) exists in multiple possibilities of realities known as superpositions.
The superposition of possible positions for an electron can be demonstrated by the observed phenomenon called quantum tunneling.
Notice that the only explanation for quantum tunneling is that the position of the electron is truly spread out, not just hidden or unmeasured. Its raw uncertainty allows the wave function to penetrate the barrier. This is genuine indeterminism, not simply an unknown quantity until someone measures it.
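The odds of barrier penetration can be estimated with the standard thin-barrier approximation T ~ e^(-2*kappa*L), where kappa measures how far the electron's energy falls below the barrier top; the 5 eV and 10 eV figures below are illustrative choices:

```python
import math

# Tunneling: a classically forbidden barrier (V > E) does not stop the wave
# function; it decays inside the barrier, giving transmission T ~ e^(-2*k*L).
HBAR = 1.055e-34   # reduced Planck constant (J*s)
M_E = 9.109e-31    # electron mass (kg)
EV = 1.602e-19     # joules per electron-volt

def transmission(energy_ev, barrier_ev, width_m):
    """Approximate tunneling probability through a square barrier."""
    kappa = math.sqrt(2 * M_E * (barrier_ev - energy_ev) * EV) / HBAR
    return math.exp(-2 * kappa * width_m)

# A 5 eV electron meeting a 10 eV barrier:
print(transmission(5, 10, 1e-10))   # atom-thin barrier: appreciable chance
print(transmission(5, 10, 1e-9))    # 10x thicker: the probability collapses
```

The exponential sensitivity to width is why tunneling matters for atoms but never for chairs.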
It is important to note that the superposition of possibilities only occurs before the entity is observed. Once an observation is made (a position is measured, a mass is determined, a velocity is detected) then the superposition converts to an actual. Or, in quantum language, we say the wave function has collapsed.
The collapse of the wave function by observation is a transition from the many to the one, from possibility to actuality. The identity and existence of a quantum entity are bound up with its overall environment (this is called contextualism). Like homonyms, words that depend on the context in which they are used, quantum reality shifts its nature according to its surroundings.
In the macroscopic world ruled by classical physics, things are what they are. In the microscopic world ruled by quantum physics, there is an existential dialogue among the particle, its surroundings and the person studying it.
Macroscopic/Microscopic World Interface:
The macroscopic world is Newtonian and deterministic for local events (note, however, that even the macroscopic world suffers from chaos). In the microscopic quantum world, on the other hand, radical indeterminacy limits any certainty surrounding the unfolding of physical events. Many things in the Newtonian world are unpredictable since we can never obtain all the factors affecting a physical system. But quantum theory is much more unsettling in that events often happen without cause (e.g. radioactive decay).
Note that the indeterminacy of the microscopic world has little effect on macroscopic objects. This is due to the fact that the wavelength of a large object is extremely small compared to the size of the macroscopic world. Your personal wavelength is much smaller than any currently measurable size. And the indeterminacy of the quantum world is not complete, because it is possible to assign probabilities to the wave function.
But, as the Schrodinger's Cat paradox shows us, the probability rules of the microscopic world can leak into the macroscopic world. The paradox of Schrodinger's cat has provoked a great deal of debate among theoretical physicists and philosophers. Although some thinkers have argued that the cat actually does exist in two superposed states, most contend that superposition only occurs when a quantum system is isolated from the rest of its environment. Various explanations have been advanced to account for this paradox--including the idea that the cat, or simply the animal's physical environment (such as the photons in the box), can act as an observer.
The question is, at what point, or scale, do the probabilistic rules of the quantum realm give way to the deterministic laws that govern the macroscopic world? This question has been brought into vivid relief by recent work in which an NIST group confined a charged beryllium atom in a tiny electromagnetic cage and then cooled it with a laser to its lowest energy state. In this state the position of the atom and its "spin" (a quantum property that is only metaphorically analogous to spin in the ordinary sense) could be ascertained to within a very high degree of accuracy, limited by Heisenberg's uncertainty principle.
The workers then stimulated the atom with a laser just enough to change its wave function; according to the new wave function of the atom, it now had a 50 percent probability of being in a "spin-up" state in its initial position and an equal probability of being in a "spin-down" state in a position as much as 80 nanometers away, a vast distance indeed for the atomic realm. In effect, the atom was in two different places, as well as two different spin states, at the same time--an atomic analog of a cat both living and dead.
The clinching evidence that the NIST researchers had achieved their goal came from their observation of an interference pattern; that phenomenon is a telltale sign that a single beryllium atom produced two distinct wave functions that interfered with each other.
Many-Worlds Hypothesis:
The many possibilities carried by quantum superpositions are spread out over space and time. However, Newtonian physics is an accurate description of ordinary experience. What is the relationship between the strange quantum world and the classical world of common sense? Clearly a difference arises when we measure or observe a quantum system; whatever the collapse process is, it occurs at that moment. The ``how and why'' of this process is unsolved, and many believe modern physics will be incomplete until it is resolved.
By the 1950's, the ongoing parade of successes had made it abundantly clear that quantum theory was far more than a short-lived temporary fix. And so, in the mid 1950's, a Princeton graduate student named Hugh Everett III decided to revisit the collapse postulate in his Ph.D. thesis. Everett's idea is known as the relative-state, many-histories, or many-universes interpretation (or metatheory) of quantum theory. Everett himself called it the "relative-state metatheory" or the "theory of the universal wavefunction", but it is generally called "many-worlds".
Many-worlds is a re-formulation of quantum theory which treats the process of observation or measurement entirely within the wave-mechanics of quantum theory, rather than introducing it as an additional assumption, as in the Copenhagen interpretation. Everett considered the wavefunction a real object. Many-worlds is a return to the classical, pre-quantum view of the universe in which all the mathematical entities of a physical theory are real. For example, the electromagnetic fields of James Clerk Maxwell or the atoms of Dalton were considered real objects in classical physics. Everett treats the wavefunction in a similar fashion. Everett also assumed that the wavefunction obeyed the same wave equation during observation or measurement as at all other times. This is the central assumption of many-worlds: that the wave equation is obeyed universally and at all times.
Quantum systems, like particles, that interact become entangled. If one of the systems is an observer and the interaction an observation, then the effect of the observation is to split the observer into a number of copies, each copy observing just one of the possible results of a measurement and unaware of the other results and all its observer copies. Interactions between systems and their environments, including communication between different observers in the same world, transmit the correlations that induce local splitting or decoherence into non-interfering branches of the universal wavefunction. Thus the entire world is split, quite rapidly, into a host of mutually unobservable but equally real worlds.
According to many-worlds all the possible outcomes of a quantum interaction are realised. The wavefunction, instead of collapsing at the moment of observation, carries on evolving in a deterministic fashion, embracing all possibilities embedded within it. All outcomes exist simultaneously but do not interfere further with each other, each single prior world having split into mutually unobservable but equally real worlds.
Many-worlds, as stated above, seems quite crazy as a physical theory, and it appears to propose a number of very un-scientific ideas, like parallel realities that we can never see or test. If the parallel worlds cannot be tested, they do not count as a proper scientific hypothesis. The modern formulation of many-worlds, however, is not that the parallel realities exist as separate entities, but rather that they all exist in our reality. In other words, what is "real" is the wave function, and all these possibilities exist inside the wave function; you simply experience only one path through that wave function. The best way to think about this is to follow the reality of Ms. Kitty.