Materials Science and Engineering/List of Topics/Quantum Mechanics/Origins of Quantum Physics

From Wikiversity

Origins of Quantum Physics

Particle Aspect of Electromagnetic Radiation

Blackbody Radiation

In physics, a black body is an object that absorbs all electromagnetic radiation that falls onto it. No radiation passes through it and none is reflected. It is this lack of both transmission and reflection to which the name refers. These properties make black bodies ideal sources of thermal radiation; that is, the amount and spectrum of electromagnetic radiation they emit is directly related to their temperature. Black bodies below around 700 K (430 °C) produce very little radiation at visible wavelengths and appear black. Black bodies above this temperature, however, produce radiation at visible wavelengths, starting at red and going through orange, yellow, and white before ending up at blue as the temperature increases.
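The temperature dependence described above follows from Planck's black-body law; its corollary, Wien's displacement law, gives the wavelength of peak emission directly. A minimal sketch, using the CODATA value of Wien's constant:

```python
# Wien's displacement law: the wavelength at which a black body's
# spectral radiance peaks is inversely proportional to its temperature.
WIEN_B = 2.897771955e-3  # Wien's displacement constant, m*K (CODATA)

def peak_wavelength_m(temperature_k):
    """Peak emission wavelength (m) of a black body at temperature_k kelvin."""
    return WIEN_B / temperature_k

# A 700 K body peaks deep in the infrared (~4.1 um), so it looks black to
# the eye; the Sun's surface (~5800 K) peaks near 500 nm, in the visible.
print(peak_wavelength_m(700))   # ~4.14e-6 m
print(peak_wavelength_m(5800))  # ~5.0e-7 m
```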

The term "black body" was introduced by Gustav Kirchhoff in 1860. The light emitted by a black body is called black-body radiation (or cavity radiation), and has a special place in the history of quantum mechanics.

Photoelectric Effect

The photoelectric effect is a quantum electronic phenomenon in which electrons are emitted from matter after the absorption of energy from electromagnetic radiation such as x-rays or visible light.[1] The emitted electrons can be referred to as photoelectrons in this context. The effect is also termed the Hertz Effect, due to its discovery by Heinrich Rudolf Hertz, although the term has generally fallen out of use.

Study of the photoelectric effect led to important steps in understanding the quantum nature of light and electrons and influenced the formation of the concept of wave–particle duality.
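Einstein's quantum explanation of the effect (not stated above, but standard) is that each photon carries energy hf, and an electron escapes only if that exceeds the material's work function φ, leaving K_max = hf − φ of kinetic energy. A sketch, with the work function of cesium taken as an illustrative assumed value of ~2.1 eV:

```python
H = 6.62607015e-34    # Planck's constant, J*s
C = 2.99792458e8      # speed of light, m/s
EV = 1.602176634e-19  # joules per electron-volt

def max_kinetic_energy_ev(wavelength_m, work_function_ev):
    """Maximum photoelectron kinetic energy (eV), or None below threshold."""
    photon_energy_ev = H * C / wavelength_m / EV
    k_max = photon_energy_ev - work_function_ev
    return k_max if k_max > 0 else None

# 400 nm violet light on a cesium surface (work function ~2.1 eV, assumed):
print(max_kinetic_energy_ev(400e-9, 2.1))  # ~1.0 eV
# 700 nm red light carries too little energy per photon -- no electrons are
# emitted regardless of the light's intensity, the classically puzzling result:
print(max_kinetic_energy_ev(700e-9, 2.1))  # None
```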

Compton Effect

In physics, Compton scattering or the Compton effect is the decrease in energy (increase in wavelength) of an X-ray or gamma ray photon when it interacts with matter. Inverse Compton scattering also exists, where the photon gains energy (decreasing in wavelength) upon interaction with matter. The amount by which the wavelength increases is called the Compton shift. Although nuclear Compton scattering exists, Compton scattering usually refers to the interaction involving only the electrons of an atom. The Compton effect was observed by Arthur Holly Compton in 1923 and further verified by his graduate student Y. H. Woo in the years following. Arthur Compton earned the 1927 Nobel Prize in Physics for the discovery.

The effect is important because it demonstrates that light cannot be explained purely as a wave phenomenon. Thomson scattering, the classical theory of an electromagnetic wave scattered by charged particles, cannot explain any shift in wavelength. Light must behave as if it consists of particles in order to explain the Compton scattering. Compton's experiment convinced physicists that light can behave as a stream of particles whose energy is proportional to the frequency.

The interaction between electrons and high energy photons results in the electron being given part of the energy (making it recoil), and a photon containing the remaining energy being emitted in a different direction from the original, so that the overall momentum of the system is conserved. If the photon still has enough energy left, the process may be repeated. If the photon has sufficient energy (in general a few eV, right around the energy of visible light), it can even eject an electron from its host atom entirely (a process known as the Photoelectric effect).
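The Compton shift mentioned above is given by Δλ = (h/mₑc)(1 − cos θ), where θ is the scattering angle; the prefactor h/mₑc is the electron's Compton wavelength, ~2.43 pm. A sketch using CODATA constant values:

```python
import math

H = 6.62607015e-34      # Planck's constant, J*s
M_E = 9.1093837015e-31  # electron rest mass, kg
C = 2.99792458e8        # speed of light, m/s

def compton_shift_m(theta_rad):
    """Wavelength increase of a photon scattered off a free electron at angle theta."""
    return (H / (M_E * C)) * (1.0 - math.cos(theta_rad))

# The shift grows with scattering angle and is largest for back-scattering
# (theta = 180 degrees), where it reaches twice the Compton wavelength:
print(compton_shift_m(math.pi / 2))  # ~2.43e-12 m
print(compton_shift_m(math.pi))      # ~4.85e-12 m
```

Because the shift is a few picometres, it is only a noticeable fraction of the wavelength for X-rays and gamma rays, which is why the effect was first seen there.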

Pair Production

Pair production refers to the creation of an elementary particle and its antiparticle, usually from a photon (or another neutral boson). This is allowed, provided there is enough energy available to create the pair – at least the total rest mass energy of the two particles – and that the situation allows both energy and momentum to be conserved (though not necessarily on shell). All other conserved quantum numbers (angular momentum, electric charge) of the produced particles must sum to zero — thus the created particles shall have opposite values of each (for instance, if one particle has strangeness +1 then another one must have strangeness −1).
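The energy condition above sets a simple threshold: the photon must carry at least twice the rest mass energy mc² of the particle being created. A sketch for the electron–positron case (ignoring the small recoil energy carried off by a nearby nucleus, which is needed to conserve momentum):

```python
M_E = 9.1093837015e-31  # electron rest mass, kg
C = 2.99792458e8        # speed of light, m/s
EV = 1.602176634e-19    # joules per electron-volt

def pair_threshold_mev(particle_mass_kg):
    """Minimum photon energy (MeV) to create a particle-antiparticle pair,
    neglecting the recoil taken up by a nearby nucleus."""
    return 2.0 * particle_mass_kg * C**2 / EV / 1e6

# Electron-positron pair production requires at least ~1.022 MeV
# (twice the electron rest energy of ~0.511 MeV):
print(pair_threshold_mev(M_E))  # ~1.022
```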

Wave Aspect of Particles

de Broglie's Hypothesis: Matter Waves

In physics, the de Broglie hypothesis (pronounced /brœj/, as French breuil, close to "broy") is the statement that all matter (any object) has a wave-like nature (wave-particle duality). The de Broglie relations show that the wavelength is inversely proportional to the momentum of a particle and that the frequency is directly proportional to the particle's kinetic energy. The hypothesis was advanced by Louis de Broglie in 1924 in his PhD thesis; he was awarded the Nobel Prize for Physics in 1929 for this work, making him the first person to receive a Nobel Prize for work presented in a PhD thesis.

Experimental Confirmation of de Broglie's Hypothesis

In 1927 at Bell Labs, Clinton Davisson and Lester Germer fired slow-moving electrons at a crystalline nickel target. The angular dependence of the reflected electron intensity was measured, and was determined to have the same diffraction pattern as that predicted by Bragg for X-rays. Before the acceptance of the de Broglie hypothesis, diffraction was a property thought to be exhibited only by waves. Therefore, the presence of any diffraction effects by matter demonstrated the wave-like nature of matter. When the de Broglie wavelength was inserted into the Bragg condition, the observed diffraction pattern was predicted, thereby experimentally confirming the de Broglie hypothesis for electrons.
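The agreement can be checked numerically: a non-relativistic electron of kinetic energy E has λ = h/√(2mₑE), and Davisson and Germer's 54 eV electrons give ~0.167 nm, matching the Bragg condition for the peak they observed near 50° with nickel's 0.215 nm atomic row spacing. A sketch using CODATA constant values:

```python
import math

H = 6.62607015e-34      # Planck's constant, J*s
M_E = 9.1093837015e-31  # electron mass, kg
EV = 1.602176634e-19    # joules per electron-volt

def electron_wavelength_m(kinetic_energy_ev):
    """Non-relativistic de Broglie wavelength: lambda = h / sqrt(2 m E)."""
    energy_j = kinetic_energy_ev * EV
    return H / math.sqrt(2.0 * M_E * energy_j)

# de Broglie wavelength of the 54 eV electrons used by Davisson and Germer:
print(electron_wavelength_m(54))  # ~1.67e-10 m
# Bragg prediction d*sin(theta) for nickel (d = 0.215 nm, theta = 50 deg):
print(0.215e-9 * math.sin(math.radians(50)))  # ~1.65e-10 m
```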

This was a pivotal result in the development of quantum mechanics. Just as Arthur Compton demonstrated the particle nature of light, the Davisson-Germer experiment showed the wave nature of matter, completing the theory of wave-particle duality. For physicists this idea was important because it means that not only can any particle exhibit wave characteristics, but that one can use wave equations to describe phenomena in matter if one uses the de Broglie wavelength.

Since the original Davisson-Germer experiment for electrons, the de Broglie hypothesis has been confirmed for other elementary particles.

Wave Characteristics of Macroscopic Objects

Experimental

Since the demonstrations of wave-like properties in photons and electrons, similar experiments have been conducted with neutrons and protons. Among the most famous experiments are those of Estermann and Otto Stern in 1929.[source?] Authors of similar recent experiments with atoms and molecules, described below, claim that these larger particles also act like waves.

A dramatic series of experiments emphasizing the action of gravity in relation to wave–particle duality were conducted in the 1970s using the neutron interferometer.[source?] Neutrons, one of the components of the atomic nucleus, provide much of the mass of a nucleus and thus of ordinary matter. In the neutron interferometer, they act as quantum-mechanical waves directly subject to the force of gravity. While the results were not surprising since gravity was known to act on everything, including light (see tests of general relativity and the Pound-Rebka falling photon experiment), the self-interference of the quantum mechanical wave of a massive fermion in a gravitational field had never been experimentally confirmed before.

In 1999, the diffraction of C60 fullerenes by researchers from the University of Vienna was reported.[7] Fullerenes are comparatively large and massive objects, having an atomic mass of about 720 u. The de Broglie wavelength is 2.5 pm, whereas the diameter of the molecule is about 1 nm, i.e. about 400 times larger. As of 2005, this is the largest object for which quantum-mechanical wave-like properties have been directly observed in far-field diffraction.
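The 2.5 pm figure quoted above follows directly from λ = h/(mv). The molecular beam velocity is not stated here; ~220 m/s is assumed below as a typical value for such experiments, and it reproduces the quoted wavelength:

```python
H = 6.62607015e-34     # Planck's constant, J*s
U = 1.66053906660e-27  # atomic mass unit, kg

def de_broglie_wavelength_m(mass_kg, speed_m_s):
    """de Broglie wavelength lambda = h / (m v), valid at non-relativistic speeds."""
    return H / (mass_kg * speed_m_s)

# A C60 fullerene (~720 u) at an assumed beam velocity of ~220 m/s has a
# wavelength of ~2.5 pm -- about 400 times smaller than the molecule itself:
print(de_broglie_wavelength_m(720 * U, 220))  # ~2.5e-12 m
```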

In 2003 the Vienna group also demonstrated the wave nature of tetraphenylporphyrin[8] – a flat biodye with an extension of about 2 nm and a mass of 614 u. For this demonstration they employed a near-field Talbot Lau interferometer.[9][10] In the same interferometer they also found interference fringes for C60F48, a fluorinated buckyball with a mass of about 1600 u, composed of 108 atoms.[8] Large molecules are already so complex that they give experimental access to some aspects of the quantum-classical interface, i.e. to certain decoherence mechanisms.[11][12]

Whether objects heavier than the Planck mass (about the weight of a large bacterium) have a de Broglie wavelength is theoretically unclear and experimentally unreachable; above the Planck mass a particle's Compton wavelength would be smaller than the Planck length and its own Schwarzschild radius, a scale at which current theories of physics may break down or need to be replaced by more general ones.

de Broglie Equation

The first de Broglie equation relates the wavelength λ to the particle momentum p as

λ = h/p = h/(γm₀v) = (h/(m₀v)) √(1 − v²/c²)

where h is Planck's constant, m₀ is the particle's rest mass, v is the particle's velocity, γ is the Lorentz factor, and c is the speed of light in a vacuum.

The greater the energy, the larger the frequency and the shorter (smaller) the wavelength. Given the relationship between wavelength and frequency, it follows that short wavelengths are more energetic than long wavelengths. The second de Broglie equation relates the frequency f of the wave associated with a particle to the total energy E of the particle such that

f = E/h

where f is the frequency and E is the total energy. The two equations are often written as

p = ħk  and  E = ħω

where ħ = h/2π is the reduced Planck's constant (also known as Dirac's constant, pronounced "h-bar"), k = 2π/λ is the wavenumber, and ω = 2πf is the angular frequency.
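The two forms of the relations are equivalent, since ħk = (h/2π)(2π/λ) = h/λ = p and ħω = (h/2π)(2πf) = hf = E. A quick numerical sanity check, using arbitrary example values:

```python
import math

H = 6.62607015e-34          # Planck's constant, J*s
HBAR = H / (2.0 * math.pi)  # reduced Planck constant, J*s

lam, f = 1.0e-10, 5.0e14    # arbitrary example wavelength (m) and frequency (Hz)
k = 2.0 * math.pi / lam     # wavenumber, rad/m
omega = 2.0 * math.pi * f   # angular frequency, rad/s

# h/lambda == hbar*k (momentum) and h*f == hbar*omega (energy):
print(math.isclose(H / lam, HBAR * k))    # True
print(math.isclose(H * f, HBAR * omega))  # True
```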

Quantum View of Particles and Waves

In physics and chemistry, wave–particle duality is the concept that all matter exhibits both wave-like and particle-like properties. A central concept of quantum mechanics, duality addresses the inadequacy of classical concepts like "particle" and "wave" in fully describing the behaviour of objects. Various interpretations of quantum mechanics attempt to explain this ostensible paradox.

The idea of duality is rooted in a debate over the nature of light and matter dating back to the 1600s, when competing theories of light were proposed by Christiaan Huygens and Isaac Newton. Through the work of Albert Einstein, Louis de Broglie and many others, current scientific theory holds that all particles also have a wave nature. This phenomenon has been verified not only for elementary particles, but also for compound particles like atoms and even molecules. In fact, according to traditional formulations of non-relativistic quantum mechanics, wave–particle duality applies to all objects, even macroscopic ones; we can't detect wave properties of macroscopic objects due to their small wavelengths.

Principle of Linear Superposition

Indeterminism in the Microphysical World

Heisenberg Uncertainty Principle

In quantum physics, the outcome of even an ideal measurement of a system is not deterministic, but instead is characterized by a probability distribution, and the larger the associated standard deviation is, the more "uncertain" we might say that that characteristic is for the system. The Heisenberg uncertainty principle, or HUP, gives a lower bound on the product of the standard deviations of position and momentum for a system, implying that it is impossible to have a particle that has an arbitrarily well-defined position and momentum simultaneously. More precisely, the products of the standard deviations in each of the three spatial dimensions are bounded by

Δx Δpx ≥ ħ/2,  Δy Δpy ≥ ħ/2,  Δz Δpz ≥ ħ/2

where ħ is the reduced Planck constant; Δx, Δy, and Δz are the standard deviations of the three coordinates of position; and Δpx, Δpy, and Δpz are the standard deviations of the three components of momentum. The principle generalizes to many other pairs of quantities besides position and momentum (for example, angular momentum about two different axes), and can be derived directly from the axioms of quantum mechanics (particularly the de Broglie relations).

Note that the uncertainties in question are characteristic of the mathematical quantities themselves. In any real-world measurement, there will be additional uncertainties created by the non-ideal and imperfect measurement process. The uncertainty principle holds true regardless of whether the measurements are ideal (sometimes called von Neumann measurements) or non-ideal (Landau measurements). Note also that the product of the uncertainties, of order 10⁻³⁵ joule-seconds, is so small that the uncertainty principle has negligible effect on objects of macroscopic scale, despite its importance for atoms and subatomic particles.
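The contrast between atomic and macroscopic scales can be made concrete by evaluating the bound Δp ≥ ħ/(2Δx) for two cases. A sketch using the CODATA constant values:

```python
HBAR = 1.054571817e-34  # reduced Planck constant, J*s
M_E = 9.1093837015e-31  # electron mass, kg

def min_momentum_uncertainty(delta_x_m):
    """Lower bound on the momentum spread: delta_p >= hbar / (2 delta_x)."""
    return HBAR / (2.0 * delta_x_m)

# An electron confined to an atom-sized region (0.1 nm) must have a velocity
# spread of hundreds of km/s -- the uncertainty principle dominates:
print(min_momentum_uncertainty(1e-10) / M_E)  # ~5.8e5 m/s

# A 1 kg object localized to within 1 micrometre has an utterly negligible
# minimum velocity spread, which is why we never notice the effect:
print(min_momentum_uncertainty(1e-6) / 1.0)   # ~5.3e-29 m/s
```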

The uncertainty principle was an important step in the development of quantum mechanics when it was discovered by Werner Heisenberg in 1927. It is often confused with the observer effect.

Probabilistic Interpretation

  1. A system is completely described by a wave function ψ, which represents an observer's knowledge of the system. (Heisenberg) [source?]
  2. The description of nature is essentially probabilistic. The probability of an event is related to the square of the amplitude of the wave function related to it. (Max Born)
  3. Heisenberg's uncertainty principle states the observed fact that it is not possible to know the values of all of the properties of the system at the same time; those properties that are not known with precision must be described by probabilities.
  4. (Complementarity Principle) Matter exhibits a wave-particle duality. An experiment can show the particle-like properties of matter, or wave-like properties, but not both at the same time. (Niels Bohr)
  5. Measuring devices are essentially classical devices, and measure classical properties such as position and momentum.
  6. The Correspondence Principle of Bohr and Heisenberg: the quantum mechanical description of large systems should closely approximate the classical description.

Atomic Transitions and Spectroscopy

Atomic Transitions

In physics, atomic spectral lines are of two types:

  • An emission line is formed when an electron makes a transition from a particular discrete energy level of an atom, to a lower energy state, emitting a photon of a particular energy and wavelength. A spectrum of many such photons will show an emission spike at the wavelength associated with these photons.
  • An absorption line is formed when an electron makes a transition from a lower to a higher discrete energy state, with a photon being absorbed in the process. These absorbed photons generally come from background continuum radiation and a spectrum will show a drop in the continuum radiation at the wavelength associated with the absorbed photons.

The two states must be bound states in which the electron is bound to the atom, so the transition is sometimes referred to as a "bound–bound" transition, as opposed to a transition in which the electron is ejected out of the atom completely ("bound–free" transition) into a continuum state, leaving an ionized atom, and generating continuum radiation.

A photon with an energy equal to the energy difference between the levels is released or absorbed in the process. The frequency ν at which the spectral line occurs is related to the photon energy E by Planck's law E = hν where h is Planck's constant.
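Planck's relation lets us convert directly between a transition's energy difference and the wavelength of the spectral line. A sketch, using hydrogen's n=3 → n=2 transition (energy gap ~1.89 eV) as the example:

```python
H = 6.62607015e-34    # Planck's constant, J*s
C = 2.99792458e8      # speed of light, m/s
EV = 1.602176634e-19  # joules per electron-volt

def photon_wavelength_nm(energy_ev):
    """Wavelength (nm) of a photon of the given energy, from E = h*nu = h*c/lambda."""
    return H * C / (energy_ev * EV) * 1e9

# The ~1.89 eV gap between hydrogen's n=3 and n=2 levels produces the red
# H-alpha emission line near 656 nm:
print(photon_wavelength_nm(1.89))  # ~656 nm
```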

Spectroscopy

Spectroscopy is the study of the interaction between radiation (electromagnetic radiation, or light, as well as particle radiation) and matter. Spectrometry is the measurement of these interactions and an instrument which performs such measurements is a spectrometer or spectrograph. A plot of the interaction is referred to as a spectrum.

Historically, spectroscopy referred to a branch of science in which visible light was used for the theoretical study of the structure of matter and for qualitative and quantitative analyses. Recently, however, the definition has broadened as new techniques have been developed that utilise not only visible light, but many other forms of radiation.

Spectroscopy is often used in physical and analytical chemistry for the identification of substances through the spectrum emitted from or absorbed by them. Spectroscopy is also heavily used in astronomy and remote sensing. Most large telescopes have spectrometers, which are used either to measure the chemical composition and physical properties of astronomical objects or to measure their velocities from the Doppler shift of their spectral lines.

Rutherford Planetary Model of the Atom

The Rutherford model or planetary model was a model of the atom devised by Ernest Rutherford. Rutherford directed the famous Geiger-Marsden experiment in 1909, whose results led to Rutherford's 1911 analysis showing that the plum pudding model of the atom (due to J. J. Thomson) was incorrect. Rutherford's new model for the atom, based on the experimental results, had a number of essential modern features, including a relatively high central charge concentrated into a very small volume in comparison to the rest of the atom.

Bohr Model of the Hydrogen Atom

In atomic physics, the Bohr model depicts the atom as a small, positively charged nucleus surrounded by electrons that travel in circular orbits around the nucleus — similar in structure to the solar system, but with electrostatic forces providing attraction, rather than gravity. This was an improvement on the earlier cubic model (1902), the plum-pudding model (1904), the Saturnian model (1904), and the Rutherford model (1911). Since the Bohr model is a quantum-physics based modification of the Rutherford model, many sources combine the two, referring to the Rutherford-Bohr model.

Introduced by Niels Bohr in 1913, the model's key success lay in explaining the Rydberg formula for the spectral emission lines of atomic hydrogen; while the Rydberg formula had been known experimentally, it did not gain a theoretical underpinning until the Bohr model was introduced. Not only did the Bohr model explain the reason for the structure of the Rydberg formula, but it provided a justification for its empirical results in terms of fundamental physical constants.

The Bohr model is a primitive model of the hydrogen atom. As a theory, it can be derived as a first-order approximation of the hydrogen atom using the broader and much more accurate quantum mechanics, and thus may be considered to be an obsolete scientific theory. However, because of its simplicity, and its correct results for selected systems (see below for application), the Bohr model is still commonly taught to introduce students to quantum mechanics, before moving on to the more accurate but more complex valence shell model of the atom. A related model was originally proposed by Arthur Erich Haas in 1910, but was rejected.
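The Bohr model's key quantitative result is the level formula Eₙ = −13.6 eV / n², from which the Rydberg-formula line wavelengths follow. A sketch:

```python
RYDBERG_EV = 13.605693  # hydrogen ground-state binding energy, eV
HC_EV_NM = 1239.841984  # h*c expressed in eV*nm

def bohr_level_ev(n):
    """Bohr-model energy of hydrogen level n (eV; negative means bound)."""
    return -RYDBERG_EV / n**2

def transition_wavelength_nm(n_upper, n_lower):
    """Wavelength of the photon emitted in an n_upper -> n_lower transition."""
    return HC_EV_NM / (bohr_level_ev(n_upper) - bohr_level_ev(n_lower))

print(bohr_level_ev(1))                # ~-13.6 eV, the ground state
print(transition_wavelength_nm(3, 2))  # ~656 nm, the Balmer H-alpha line
```

That the computed 3 → 2 wavelength lands on the observed H-alpha line is exactly the agreement with the Rydberg formula that made the model a success.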

Reference

Nouredine Zettili, Quantum Mechanics: Concepts and Applications. John Wiley & Sons, New York, 2001.