
Section 1.2 Fundamental Units

In order to understand a physical phenomenon, we make careful measurements of the relevant physical quantities. To facilitate communication among people and to build devices that work with one another, measurements of physical quantities are expressed in terms of agreed-upon standards. Length, time, and mass are the fundamental quantities in mechanics. Other fundamental quantities will be added to this list when we study heat, electricity, and optics.

Subsection 1.2.1 Length

Length is a measure of the distance between two points in space. Before the French Revolution (1790), different standards of length were used in different countries, and even in different localities of the same country; for instance, a Greek foot was approximately \(1.012\) times the English foot, a Roman foot was approximately \(0.97\) of an English foot, etc. To make matters worse, the multipliers between different units in common use were not uniform; for instance, there are three feet in a yard but twelve inches in a foot.

In 1792, following the Revolution, the French government adopted a new system of weights and measures with the meter as the fundamental unit of length. The name meter comes from the Latin word metrum and the Greek word metron, both meaning “measure”. The meter was defined as \(10^{-7}\), or one ten-millionth, of the distance along the meridian through Paris from the North Pole to the equator. The factor was chosen to give a size close to the “human scale”. It was later found that prototypes based on this earth-based definition were 0.2 mm too short, because of the flattening of the earth caused by its rotation.

In 1889, the first meeting of the General Conference on Weights and Measures (CGPM) replaced the earth-based definition of the meter with the distance between two fine markings on a platinum-iridium rod kept at zero degrees Celsius and standard pressure at the International Bureau of Weights and Measures (BIPM) near Paris, France. Accurate copies of the original rod were made and distributed to other standard-keeping laboratories throughout the world. These secondary standards were used to produce more accessible copies, such as meter rulers, for the general public.

With the advancement of optical technology, it became possible to measure lengths more precisely than the fine markings on the standard meter rod could define. In 1960, a new standard for the meter, based on the wavelength of the orange-red light emitted by krypton-86 in a gas discharge tube, was adopted. The meter was redefined to equal \(1,650,763.73\) wavelengths of this light.

To reduce the uncertainty in measurements further, in 1983 the General Conference on Weights and Measures (CGPM) replaced the krypton-86 definition of the meter with one based on the speed of light. The speed of light is assigned the exact value \(299,792,458\ \text{m/s}\), which is then used to define the meter.

One meter is the length of the path travelled by light in vacuum during a time interval of \(1/299,792,458\) of a second.

Note that this way of defining the meter uses the unit of time (the second) to define the unit of length. With this definition, the precision of length measurements is tied to the precision of time measurements. We will see below that time measurements have become extremely accurate, making the second well suited to serve as a base for other units.
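As a quick consistency check of the definition, the distance travelled by light in a time \(t\) is \(d = ct\). Substituting the defined speed and the time interval in the definition gives
\[
d = c\,t = \left(299,792,458\ \text{m/s}\right)\times\left(\frac{1}{299,792,458}\ \text{s}\right) = 1\ \text{m}
\]
exactly; the conversion factor itself carries no measurement uncertainty.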

Subsection 1.2.2 Time

To define time, we look for either a natural phenomenon that repeats itself or an experiment that can be performed repeatedly. For instance, the oscillations of a pendulum provide a basis for time in an experimental setting, while the earth's rotation provides a naturally occurring phenomenon that has long been used for time measurements.

The SI unit of time is one second. How one second came to be a standard of time is a fascinating story unto itself in the history of science. It is known that ancient civilizations used the apparent motion of celestial bodies across the sky - Sun, Moon, planets and stars - for keeping track of the passage of time and seasons.

It is known that the Egyptians made timekeeping devices such as the sundial and used a system similar to our own; one example is shown in Figure 1.2.1.

Figure 1.2.1. Sundial in the thyme garden at the Minnesota Landscape Arboretum, photographed June 17, 2007 at 12:21 solar time. (Photo credit: S. E. Wilco, via Wikimedia Commons.)

For instance, the current division of a year into 365 days seems to have come from the Egyptian calendar, as far back as 3100 BCE (Before Common Era), based on the rising of the Dog Star in Canis Major, now called Sirius, next to the sun every 365 days, which coincided with the annual flooding of the Nile. The Egyptians had built obelisks (slender, tapering, four-sided monuments) as far back as 3500 BCE, whose shadows were used to tell time during the day. The obelisks were like primitive sundials, with markings at the base to indicate the shadows corresponding to the shortest and longest days of the year.

Around 1500 BCE the Egyptians invented a sundial that divided the day from sunrise to sunset into ten parts plus two “twilight hours”. They similarly divided the night into 12 hours, making a total of 24 hours in a full day. The division of an hour into sixty minutes and of a minute into sixty seconds is said to have come from the Sumerian culture, which had a sexagesimal number system based on 60.

The Egyptians also invented the water clock, or clepsydra, before 1500 BCE; it was the earliest timekeeping device not dependent on the motion of celestial objects. The Greeks started using water clocks around 325 BCE and built even more impressive and elaborate ones, adding complexity to make the flow of water as steady as possible. Despite these efforts, it was very difficult to control the water flow with high accuracy, and mechanical clocks were eventually needed.

Little progress in timekeeping seems to have been made after the Egyptian inventions until Galileo Galilei (1564-1642), who suggested using the natural period of a pendulum. Although Galileo sketched a design for a pendulum clock, he never constructed it. The first pendulum clock was built by Christiaan Huygens of the Netherlands in 1656. It had an error of less than 1 minute a day and was the most accurate clock to date. Huygens also invented the balance wheel and spring assembly in 1675, which led to the construction of even more accurate clocks. The oscillations of the balance wheel, at around 5 cycles per second, provide the time standard for the mechanical watch. In 1889, Sigmund Riefler built a clock that kept time with an error of less than a hundredth of a second a day; it became a standard fixture in astronomical observatories.
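Although the passage above does not quote it, the formula behind Galileo's suggestion is the standard small-angle period of a simple pendulum of length \(L\),
\[
T = 2\pi\sqrt{\frac{L}{g}},
\]
where \(g \approx 9.81\ \text{m/s}^2\). For example, a “seconds pendulum”, whose period is \(T = 2\ \text{s}\) (one second per swing), requires \(L = g\,(T/2\pi)^2 \approx 0.994\ \text{m}\), remarkably close to one meter.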

With the discovery of piezoelectricity in 1880 by the brothers Pierre and Jacques Curie, it was found that quartz crystals vibrate at a definite frequency when a voltage is applied across them. A vibrating quartz crystal generates an oscillating current of constant frequency that can be determined quite accurately with appropriate electrical circuitry.

In 1927 a Canadian-born telecommunications engineer, Warren Marrison (1896-1980), invented the quartz clock. Marrison and others demonstrated that the accuracy of clocks based on quartz crystals far exceeded that of clocks based on the balance wheel and spring assembly. Today, inexpensive electronic clocks based on quartz vibrations are commonplace.

Figure 1.2.2. A quartz watch by Seiko.
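To illustrate how a vibrating crystal is turned into a clock, consider the crystal in a typical modern quartz watch, which is cut to oscillate at \(32,768\ \text{Hz}\) (this particular frequency is standard engineering practice, not a detail from the history above). Since \(32,768 = 2^{15}\), halving the frequency fifteen times with a chain of digital divider circuits gives
\[
\frac{32,768\ \text{Hz}}{2^{15}} = 1\ \text{Hz},
\]
exactly one pulse per second to advance the display.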

Despite the superior performance of quartz crystals, they are no match for the atomic clocks developed in the 1940s and 50s. The possibility of an atomic clock based on atomic beam magnetic resonance was first suggested in 1945 by I. I. Rabi of Columbia University (New York). In 1949 the National Bureau of Standards of the United States (now called the National Institute of Standards and Technology, or NIST) developed the first atomic clock, using the ammonia molecule. However, this clock was not much better than the existing quartz clocks.

In 1955 Louis Essen at the National Physical Laboratory in the United Kingdom constructed the world's first atomic clock based on the atomic transitions of cesium atoms; it had an accuracy of 1 second in 300 years. Time measured with the atomic clock was compared with time based on the rotation of the earth and found to be much more accurate and stable. Therefore, in 1967 the 13th General Conference on Weights and Measures decided to replace the definition of the second with the following.

One second is the duration of \(9,192,631,770\) periods of the radiation corresponding to the transition between the two hyperfine levels of the ground state of the cesium 133 atom.
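Equivalently, the cesium radiation has a frequency of exactly \(9,192,631,770\ \text{Hz}\), so each period lasts
\[
T = \frac{1}{9,192,631,770\ \text{Hz}} \approx 1.088\times 10^{-10}\ \text{s},
\]
about a tenth of a nanosecond. A cesium clock marks off one second by counting \(9,192,631,770\) of these periods.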

Although there have been improvements in atomic clocks, the definition of the second today is the one adopted in 1967. Atomic clocks are getting better every day; in 2010 the reported uncertainty for the NIST-F1 clock shown in Figure 1.2.3 was merely \(3\times10^{-16}\) second in one second, i.e., about 1 second in 100 million years - a fantastic precision!

Figure 1.2.3. Atomic clock NIST-F1 at National Institute of Standards and Technology, USA.
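The quoted fractional uncertainty converts to the headline figure as follows. An error of \(3\times10^{-16}\) second per second accumulates to a full second only after
\[
\frac{1\ \text{s}}{3\times10^{-16}} \approx 3.3\times10^{15}\ \text{s},
\]
and since one year contains about \(3.16\times10^{7}\ \text{s}\), this amounts to roughly \(10^{8}\) years, that is, about 100 million years.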

Subsection 1.2.3 Mass

Mass is a measure of the mechanical response of an object to applied forces. Two objects of equal mass, regardless of their chemical composition, shape, or size, are accelerated equally when subjected to the same force, and two objects of different masses have different accelerations when subjected to the same force.
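Anticipating Newton's second law, which we will study in a later chapter, this behavior is summarized by \(a = F/m\). For example, a \(10\text{-N}\) force gives a \(2\text{-kg}\) object an acceleration of
\[
a = \frac{F}{m} = \frac{10\ \text{N}}{2\ \text{kg}} = 5\ \text{m/s}^2,
\]
while the same force gives a \(4\text{-kg}\) object only \(2.5\ \text{m/s}^2\). Two \(2\text{-kg}\) objects, whatever their shape or composition, would accelerate equally.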

The SI unit of mass, the kilogram (kg), is the only base unit that is still defined by a physical artifact. The original prototype, made in 1879 by George Matthey of Johnson Matthey, is a cylinder 39 mm in height and 39 mm in diameter, composed of an alloy of \(90\%\) platinum and \(10\%\) iridium by mass; it is stored at atmospheric pressure in a special triple bell jar at BIPM, the International Bureau of Weights and Measures near Paris. The alloy was chosen for its non-corrosive properties, and the proportions were chosen because, for a given volume, a cylinder whose height equals its diameter has the least surface area (a claim verified in the short calculation after the definition below); a sphere would expose even less surface, but since spheres roll off easily it was decided that a cylinder would serve better. The definition of the kilogram can be given as follows.

One kilogram is the mass of the prototype of the kilogram kept at the International Bureau of Weights and Measures.
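As promised, here is the short calculation verifying the cylinder's proportions. For a cylinder of radius \(r\) and height \(h\) at fixed volume \(V = \pi r^2 h\), the surface area is
\[
A = 2\pi r^2 + 2\pi r h = 2\pi r^2 + \frac{2V}{r}, \qquad \frac{dA}{dr} = 4\pi r - \frac{2V}{r^2} = 0 \;\Rightarrow\; V = 2\pi r^3,
\]
so \(h = V/(\pi r^2) = 2r\): the height equals the diameter, exactly the proportions (39 mm by 39 mm) of the prototype.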

Forty copies of the original prototype were made in 1882 and distributed to various countries. Copies of these secondary standards were made widely available to tradesmen and the general public. Thus all 1-kg samples are traceable to the international prototype kept at BIPM.

The kilogram defined by an artifact has some intrinsic problems. For instance, the prototype may be damaged, or may corrode or otherwise change from wear and tear due to the environment. Indeed, the prototype is said to be gaining approximately 1 microgram per year, so it is hard to keep the standard kilogram constant to a very high degree of precision. Presently, several new methods for defining the kilogram are being investigated. A particularly attractive possibility is to define the kilogram as the mass of a fixed number of molecules of a substance that can be made with high purity. In another method, developed at the National Institute of Standards and Technology (NIST), the force of gravity on a standard kilogram is balanced by the magnetic force between two coils in a Watt balance (Figure 1.2.4).

Figure 1.2.4. The Watt balance at National Institute of Standards and Technology, USA. (Photo by Richard Steiner)
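A rough sketch of the principle behind the Watt balance (simplified here; the actual NIST instrument involves many refinements): in a “weighing” mode, a current \(I\) through a coil of total wire length \(L\) in a magnetic field \(B\) produces a force that balances gravity, and in a separate “velocity” mode the same coil, moved at speed \(v\), develops an induced voltage \(U\):
\[
mg = BLI \qquad\text{and}\qquad U = BLv \quad\Rightarrow\quad mgv = UI.
\]
The hard-to-measure geometric factor \(BL\) cancels, tying the mass to electrical quantities: \(m = UI/(gv)\). Since the product of a voltage and a current is a power measured in watts, the instrument is called a Watt balance.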