Metrology and Instrumentation - Mod 1
MODULE 1
Metrology (from Ancient Greek metron, "measure", and logos, "study of") is the science of measurement.
INTRODUCTION
Metrology is a very broad field and may be divided into three subfields: scientific (or fundamental) metrology, applied (or industrial) metrology, and legal metrology. Scientific metrology concerns the establishment of measurement units and measurement standards and the transfer of traceability from these standards to users in society. Applied metrology concerns the application of measurement science to manufacturing and other processes and their use in society, ensuring the suitability of measurement instruments and their calibration. Legal metrology concerns regulatory requirements on measurements and measuring instruments for the protection of health, public safety, the environment and fair trade.
A core concept in metrology is metrological traceability: the property of a measurement result whereby it can be related to stated references through an unbroken chain of comparisons, all having stated uncertainties. The level of traceability establishes the level of comparability of a measurement: whether it can be compared to the previous one, to a measurement result a year ago, or to the result of a measurement made anywhere else.
Traceability is most often obtained by calibration, establishing the relation between the indication of a measuring instrument and the value of a measurement standard. These standards are usually coordinated by national laboratories, such as the National Institute of Standards and Technology (NIST) in the United States.
METROLOGY BASICS
Mistakes can make measurements and counts incorrect. If there are no mistakes, all
counts will be exactly correct. Even if there are no mistakes, nearly all measurements are still
inexact. The term 'error' is reserved for that inexactness, also called measurement uncertainty.
The few exceptions include:
• The absence of the quantity being measured: a voltmeter with its leads shorted should read exactly zero.
• Measurement of an accepted constant under qualifying conditions, such as the triple point of pure water: the thermometer should read 273.16 kelvin (0.01 degrees Celsius, 32.018 degrees Fahrenheit).
• Self-checking ratiometric measurements, such as a potentiometer: the ratio between steps can be independently verified.
All other measurements either have to be verified to be sufficiently correct or left to chance.
Metrology is the science that establishes the correctness of specific measurement situations.
This is done by anticipating and allowing for both mistakes and error. The precise distinction
between measurement error and mistakes is not settled and varies by country. Repeatability
and reproducibility studies help quantify the precision; one common method is an ANOVA gauge R&R study, as sketched below.
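A minimal sketch of such a study is given below, assuming Python; the operators (op_A, op_B, op_C), the parts and all readings are invented for illustration, and a full ANOVA gauge R&R would additionally separate the operator-by-part interaction.

# A minimal gauge R&R sketch (hypothetical data, not from this module).
# Three operators each measure the same parts several times; repeatability is
# the within-operator spread, reproducibility the spread between operator averages.
import statistics as st

# measurements[operator][part] -> repeated readings in mm (invented values)
measurements = {
    "op_A": {"part_1": [10.01, 10.02, 10.00], "part_2": [12.05, 12.04, 12.06]},
    "op_B": {"part_1": [10.03, 10.04, 10.03], "part_2": [12.07, 12.08, 12.07]},
    "op_C": {"part_1": [10.00, 10.01, 10.01], "part_2": [12.04, 12.05, 12.04]},
}

# Repeatability: pool the variances of the repeated readings in each cell.
cell_variances = [st.variance(reads)
                  for parts in measurements.values()
                  for reads in parts.values()]
repeatability = (sum(cell_variances) / len(cell_variances)) ** 0.5

# Reproducibility: spread of the operator averages (interaction terms ignored).
operator_means = [st.mean([x for reads in parts.values() for x in reads])
                  for parts in measurements.values()]
reproducibility = st.stdev(operator_means)

print(f"repeatability   (std dev): {repeatability:.4f} mm")
print(f"reproducibility (std dev): {reproducibility:.4f} mm")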
METROLOGY STANDARDS
Standards are objects or ideas that are designated as being authoritative for some
accepted reason. Whatever value they possess is useful for comparison to unknowns for the
purpose of establishing or confirming an assigned value based on the standard. The design of
comparisons for the purpose of establishing the relationship between a standard and some other device or quantity is central to calibration.
The ideal standard is independently reproducible without uncertainty. This is what the
creators of the 'metre' length standard were attempting to do in the 19th century. Later, we
learned that the Earth’s surface is a terrible basis for a standard. The Earth is not spherical and
it is constantly changing in shape. But the special alloy metre/meter bars that were created and
accepted in that time period standardized international length measurement until the 1950s. They could be reproduced in metrology laboratories worldwide, regardless of whether the rest of the metric system was implemented, and in spite of the shortfalls of the metre/meter's original basis.
The inhabitants of the Indus Valley Civilization (c. 3000–1500 BCE, Mature period c. 2600–1900 BCE) developed a sophisticated system of standardized weights and measures, evident from the excavations made at the Indus valley sites. This technical standardization enabled gauging devices to be used effectively in angular measurement and in measurement for construction. Calibration was also found in measuring devices, along with multiple subdivisions in some devices.
Metrology has existed in some form or another since antiquity. The earliest forms of
metrology were simply arbitrary standards set up by regional or local authorities, often based
on practical measures such as the length of an arm. The earliest examples of these
standardized measures are length, time, and weight. These standards were established in order to facilitate trade and the recording of human activity.
Little progress was made with regard to proto-metrology until various scientists,
chemists, and physicists started making headway during the scientific revolution. With the
advances in the sciences, the comparison of experiment to theory required a rational system of
units, and something more closely resembling modern metrology began to come into being.
Scientific principles could be applied to standards of measurement, and many inventions made it easier to measure quantities precisely and consistently.
Metrology was thus one of the precursors to the Industrial Revolution, and was
necessary for the implementation of mass production, equipment commonality, and assembly
lines.
Modern metrology has its roots in the French Revolution, with the political motivation
to harmonize units all over France and the concept of establishing units of measurement based
on constants of nature, and thus making measurement units available "for all people, for all
time". In this case deriving a unit of length from the dimensions of the Earth, and a unit of
mass from a cube of water. The result was platinum standards for the meter and the kilogram
established as the basis of the metric system on June 22, 1799. This further led to the creation
of the Système International d'Unités, or the International System of Units. This system has since become the internationally accepted basis for measurement units. Though not the official system of units of all nations, the definitions and
specifications of SI are globally accepted and recognized. The SI is maintained under the
auspices of the Metre Convention and its institutions, the General Conference on Weights and
Measures, or CGPM, its executive branch the International Committee for Weights and
Measures, or CIPM, and its technical institution the International Bureau of Weights and
Measures, or BIPM.
As the authorities on SI, these organizations establish and promulgate the SI, with the
ambition to be able to service all. This includes introducing new units, such as the relatively
new unit, the mole, to encompass metrology in chemistry. These units are then established
and maintained through various agencies in each country, which in turn maintain a hierarchy of measurement standards that can be traced back to the established standard unit, a concept
known as metrological traceability. The U.S. agencies holding this responsibility are the
National Institute of Standards and Technology (NIST) and the American National Standards
Institute (ANSI).
The development of standards also involves individual and small group achievements.
In 1893, the chemist Edward Weston and his company perfected his Saturated Standard Cell
design, which allowed the volt to be reproduced to 1 part in ten to the fourth power directly.
This advance made a huge practical difference at a critical moment in the development of
modern electrical devices. Groupings of saturated cells, called banks, can still be found in
some metrology and calibration laboratories today. Edward Weston did not pursue patents for
his cell design. As a result, his superior design quickly replaced similar but inferior ones.
MODERN STANDARDS
Currently, only five independent units of measure are internationally recognized. All
measurements of all types are based on one or more of these independent units. For example,
Ohm's law is the most widely understood concept in all of electricity usage. Of the three units
of measure involved, only current (ampere) is an independent unit. Voltage and resistance
units are dependent on current units, per Ohm's law. Two supplemental independent units are the candela and the mole.
It is believed that each of the independent units of measure will eventually be defined in terms of the other four independent units. Length (meter) and time (second) are already connected
this way. If an accurate time base is available, then a length standard can be reproduced
without a meter bar artifact. Less well known is the relationship between luminous intensity (the candela) and current (the ampere). The candela is defined in terms of the watt, which in turn is derived
from the ampere. This difficult-to-recreate standard is supplemented by an incandescent bulb design that is used as a secondary and transfer standard. These bulbs recreate the candela closely enough for routine calibration work.
The development of standards follows the needs of technology. As a result, some units
of measure have much more resolution than others. The second is reproducible to 1 part in 10
to the 14th power. As this resolution capability increased, what was believed to be a constant
proved to be very slightly irregular. See leap second for an explanation and as a case study of the complications this created. At the other extreme, some units can only be reproduced with instruments that are capable of 50 parts per million (0.005%) precision. Reproducibility of the standard is the constraint.
Temperature (kelvin) is defined by accepted fixed points. These points are defined by
the state changes of nearly pure materials, generally as they move from liquid to solid. Platinum resistance thermometers constructed in a very specific way are used to interpolate temperature values between these fixed points. This mosaic of approaches produces uncertainty that is not uniform over the entire range of temperatures.
As the frontiers of science moved forward, they pulled applied science along. Engineering,
manufacturing and ordinary living now routinely challenge the limits of measurement.
For example, most owners of 'atomic clocks,' more correctly known as radio clocks,
know that there are no radioactive materials in their clocks. An unacceptably small percentage
of users know that the clocks are synchronized by internal radio receivers that pick up broadcast time signals from real atomic clocks. There are too many other measurement devices used by
people who don't have adequate comprehension of the basic principles involved. Without this understanding, such devices are easily misused or misinterpreted.
After 40+ centuries of effort, there still are many unanswered questions and a lot of
work remaining to be done. There also are plenty of surname-less units of measure waiting for
new champions. They would join Kelvin, Watt, Ampere, Hertz and, in 1971, Siemens, in the ranks of units named after scientists.
Large and small industrial companies also define metrology standards and procedures to meet
their particular needs for technically and economically competitive manufacturing. These
standards and procedures, while drawing in part upon the national and international standards,
also address the issues of what specific instrument technology will be used to measure each
quantity, how often each quantity will be measured, and which definition of each quantity will
be used as the basis for accomplishing the process control that their manufacturing and
product specifications require. Industrial metrology standards include, among other things, dynamic control plans.
In industrial metrology, several issues beyond accuracy constrain the usability of metrology:
1. The speed with which measurements can be accomplished on parts or surfaces in the
process of manufacturing, which must match the TAKT Time of the production line.
2. The completeness with which the manufactured part can be measured, such as whether all critical features are accessible to the instrument.
4. The ability of the measurement results, as they are presented, to be assimilated by the people controlling the process.
NATIONAL STANDARDS
Every country maintains its own metrology system. In the United States, the National
Institute of Standards and Technology (NIST) plays the dual role of maintaining and
furthering both commercial and scientific metrology. NIST does not enforce measurement
accuracy directly.
The accuracy and traceability of commercial measurements is enforced per the laws of the
individual states. Commercial measurement generally involves any material sold by any unit of measure, with a few traditional exceptions such as cloth sold from a cutting table that has a yardstick fastened to it. All counting-based transactions are generally exempt also. But each state has its own rules, responding to the accumulated experience of commerce of any kind above the pure barter level. Every state maintains its own weights and
measures functionality with traceability to the national standards maintained by NIST. Large
states further divide this effort by county, where a "Sealer" or other appointee is responsible
for the validity of most common commercial measurements such as mass balances (scales) in
grocery stores and gasoline pump measurements of volume. The sealer's staff and agents
make periodic inspections to catch merchant cheaters, maintaining the integrity of commercial
measurements.
Typical State Seal application: even in Las Vegas, people prefer not to leave volumetric measurements to chance.
Depending on the specific state, other state government agencies can be involved. For
example, electricity watt-hour meters and water delivery flow meters are commonly
monitored by the state's public utilities commission, which enforces the measurement tolerances and traceability to NIST through the utility providers. The state highway police and the
State Highway Department generally run the commercial truck mass measurement programs
for safety purposes and to minimize the damage to road surfaces that overloaded trucks cause.
Nearly all states license weighmasters, weighmistresses, scale calibrators and other specialists.
The term "commercial metrology" is also used to describe calibration laboratories that are not
ordinary commerce, such as the test bed pictured at the beginning of the article.
only. They may choose to have their work accredited by voluntary certification
Irresolvable disputes involving scientific metrology are generally settled in the civil
court systems. Some federal government entities, like the Federal Communications Commission (FCC), are the final authority in their domains rather than NIST. Disputes involving only
metrology issues with those organizations probably would not be heard in any courts.
LINE STANDARDS
The meter was originally defined as one ten-millionth of the distance between the
North Pole and the equator at the longitude of Paris. Because of the difficulty of reproducing
this measurement, a platinum bar of that length was constructed in 1799 and housed at
Pavillon de Breteuil near Paris, the headquarters of the International Bureau of Weights and Measures (BIPM).
It was discovered that this bar was a fraction of a millimeter too long. In 1889 the
meter was redefined as the distance between precision marks on a new 'X' shaped 90%
platinum 10% iridium bar at 0 °C. This alloy was used because it does not oxidize, is hard,
can be highly polished, and expands or contracts very little with temperature changes. The bar was kept at the BIPM, and selected metrologists were authorized to travel there to duplicate the marks onto their own bars for regional prototypes. The new bar served as the standard until 1960, when the meter was redefined in terms of the wavelength of a spectral line of krypton-86. The meter was
redefined yet again in 1983 in terms of the speed of light. The speed of light is now fixed at exactly 299,792,458 meters per second, and the meter is derived from it as the distance light travels in vacuum in 1/299,792,458 of a second.
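The sketch below illustrates, under the assumption of an ideal time-of-flight measurement, how a length follows directly from a time once the speed of light is fixed by definition; the 10-nanosecond pulse is a made-up example.

# A small sketch of how a length standard follows from a time standard:
# with the speed of light fixed by definition, a measured time of flight
# converts directly to a distance. Values are illustrative only.
C = 299_792_458  # speed of light in vacuum, m/s (exact by definition)

def length_from_time_of_flight(seconds: float) -> float:
    """Distance travelled by light in vacuum during the given time."""
    return C * seconds

# One meter corresponds to a travel time of 1/299,792,458 s:
t_one_meter = 1 / C
print(length_from_time_of_flight(t_one_meter))   # ~1.0 meter

# A laser pulse timed at 10 nanoseconds corresponds to roughly 3 m:
print(length_from_time_of_flight(10e-9))          # ~2.998 m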
SYSTEMS OF MEASUREMENT
A system of measurement is a set of units which can be used to specify anything which can be measured. Historically, such systems were important, regulated and defined because of trade and internal commerce. Scientifically, when later analyzed, some quantities are designated as fundamental units, meaning all other needed units can be derived from them, whereas in the early and most historic eras, the units were given by fiat (see statutory law) by the ruling authority.
Although we might suggest that the Egyptians had discovered the art of measurement, it is
really only with the Greeks that the science of measurement begins to appear. The Greeks'
knowledge of geometry, and their early experimentation with weights and measures, soon
began to place their measurement system on a more scientific basis. By comparison, Roman measurement practice was oriented more toward administration and trade than toward science.
The French Revolution gave rise to a scientific system, and there has been steady
significant pressure since to convert to a scientific basis from so called customary units of
measure. In most systems, length (distance), weight, and time are fundamental quantities; in modern science and engineering, mass is substituted for weight as the more basic parameter. Some systems have changed to recognize this improved relationship, notably the 1824 legal changes to the imperial system.
Later science developments showed that either electric charge or electric current must
be added to complete the minimum set of fundamental quantities by which all other
metrological units may be defined. Other quantities, such as power, speed, etc. are derived
from the fundamental set; for example, speed is distance divided by time. Historically a wide
range of units were used for the same quantity; for example, in several cultural settings,
length was measured in inches, feet, yards, fathoms, rods, chains, furlongs, miles, nautical
miles, stadia, leagues, with conversion factors which are not simple powers of ten or even convenient whole numbers.
Nor were they necessarily the same units (or equal units) between different members
of similar cultural backgrounds. It must be understood by the modern reader that historically,
measurement systems were perfectly adequate within their own cultural milieu, and the
understanding that a better, more universal system (based on more rational and fundamental
units) only gradually spread with the maturation and appreciation of the rigor characteristic of
Newtonian physics. Moreover, changing one's measurement system has real fiscal and
cultural costs.
Once the analysis tools within that field were appreciated and came into widespread
use in the nascent sciences, especially in the utilitarian subfields of applied science like civil
and mechanical engineering, conversion to a common basis at first had little impetus. It was only after the appreciation of these needs, and of the difficulties of converting between numerous national customary systems, became widespread that there could be any serious spirit for taking the first significant and radical step down that road.
In antiquity, systems of measurement were defined locally: the different units were
defined independently according to the length of a king's thumb or the size of his foot, the
length of stride, the length of arm or per custom like the weight of water in a keg of specific
size, perhaps itself defined in hands and knuckles. The unifying characteristic is that there was
some definition based on some standard, however egocentric or amusing it may now seem
viewed with eyes used to modern precision. Eventually cubits and strides gave way to more uniform customary and, later, scientific units.
In the metric system and other recent systems, a single basic unit is used for each
fundamental quantity. Often secondary units (multiples and submultiples) are used which
convert to the basic units by multiplying by powers of ten, i.e., by simply moving the decimal
point. Thus the basic metric unit of length is the metre or meter; a distance of 1.234 m is 1234 mm or 0.001234 km, obtained simply by shifting the decimal point, as in the sketch below.
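A small sketch of this prefix arithmetic follows; the convert helper and the small prefix table are ad hoc illustrations, not standard library features.

# Prefix conversion by powers of ten (standard SI prefixes: kilo, centi, milli, micro).
PREFIX_FACTORS = {"k": 1e3, "": 1.0, "c": 1e-2, "m": 1e-3, "u": 1e-6}

def convert(value: float, from_prefix: str, to_prefix: str) -> float:
    """Convert a metric value between prefixed versions of the same base unit."""
    return value * PREFIX_FACTORS[from_prefix] / PREFIX_FACTORS[to_prefix]

print(convert(1.234, "", "m"))   # 1.234 m  -> ~1234 mm
print(convert(1.234, "", "k"))   # 1.234 m  -> ~0.001234 km
print(convert(25.0, "m", "u"))   # 25 mm    -> ~25000 µm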
METRIC SYSTEM
Metric systems of units have evolved since the adoption of the first well-defined system in
France in 1791. During this evolution the use of these systems spread throughout the world,
first to the non-English-speaking countries, and more recently to the English speaking
countries.
Multiples and submultiples of metric units are related by powers of ten; the names for
these are formed with prefixes. This relationship is compatible with the decimal system of numbers.
In the early metric system there were two fundamental or base units, the metre and the gram,
for length and mass. The other units of length and mass, and all units of area, volume, and
compound units such as density were derived from these two fundamental units.
Some additional units were later introduced to act as a compromise between the metric system and traditional measurements.
A number of variations on the metric system have been in use. These include
gravitational systems, the centimetre-gram-second systems (cgs) useful in science, the metre-
tonne-second system (mts) once used in the USSR, and the metre-kilogram-second (mks) system.
The current international standard metric system is the International System of Units (Système international d'unités, or SI). It is an mks system based on the metre, kilogram and second.
The SI includes two classes of units which are defined and agreed internationally. The first of
these classes are the seven SI base units for length, mass, time, temperature, electric current,
luminous intensity and amount of substance. The second of these are the SI derived units.
These derived units are defined in terms of the seven base units. All other quantities (e.g. force, energy, power) are expressed in terms of these derived units.
DISTANCE
In all traditional measuring systems, short distance units are based on the dimensions
of the human body. The inch represents the width of a thumb; in fact, in many languages, the
word for "inch" is also the word for "thumb." The foot (12 inches) was originally the length of
a human foot, although it has evolved to be longer than most people's feet. The yard (3 feet)
seems to have gotten its start in England as the name of a 3-foot measuring stick, but it is also
understood to be the distance from the tip of the nose to the end of the middle finger of the
outstretched hand. Finally, if you stretch your arms out to the sides as far as possible, your
total "arm span," from one fingertip to the other, is a fathom (6 feet).
Historically, there are many other "natural units" of the same kind, including the digit
(the width of a finger, 0.75 inch), the nail (length of the last two joints of the middle finger, 3
digits or 2.25 inches), the palm (width of the palm, 3 inches), the hand (4 inches), the
shaftment (width of the hand and outstretched thumb, 2 palms or 6 inches), the span (width of
the outstretched hand, from the tip of the thumb to the tip of the little finger, 3 palms or 9 inches).
In Anglo-Saxon England (before the Norman conquest of 1066), short distances seem to have
been measured in several ways. The inch (ynce) was defined to be the length of 3 barleycorns,
which is very close to its modern length. The shaftment was frequently used, but it was
roughly 6.5 inches long. Several foot units were in use, including a foot equal to 12 inches, a
foot equal to 2 shaftments (13 inches), and the "natural foot" (pes naturalis, an actual foot
length, about 9.8 inches). The fathom was also used, but it did not have a definite relationship to the other units.
When the Normans arrived, they brought back to England the Roman tradition of a
12-inch foot. Although no single document on the subject can be found, it appears that during
the reign of Henry I (1100-1135) the 12-inch foot became official, and the royal government
took steps to make this foot length known. A 12-inch foot was inscribed on the base of a
column of St. Paul's Church in London, and measurements in this unit were said to be "by the
foot of St. Paul's" (de pedibus Sancti Pauli). Henry I also appears to have ordered construction
of 3-foot standards, which were called "yards," thus establishing that unit for the first time in
England. William of Malmesbury wrote that the yard was "the measure of his [the king's] own
arm," thus launching the story that the yard was defined to be the distance from the nose to
the fingertip of Henry I. In fact, both the foot and the yard were established on the basis of the
Saxon ynce, the foot being 36 barleycorns and the yard 108.
Meanwhile, all land in England was traditionally measured by the gyrd or rod, an old
Saxon unit probably equal to 20 "natural feet." The Norman kings had no interest in changing
the length of the rod, since the accuracy of deeds and other land records depended on that
unit. Accordingly, the length of the rod was fixed at 5.5 yards (16.5 feet). This was not very
convenient, but 5.5 yards happened to be the length of the rod as measured by the 12-inch
foot, so nothing could be done about it. In the Saxon land-measuring system, 40 rods make a
furlong (fuhrlang), the length of the traditional furrow (fuhr) as plowed by ox teams on Saxon
farms. These ancient Saxon units, the rod and the furlong, have come down to us today with
essentially no change. The chain, a more recent invention, equals 4 rods or 1/10 furlong in length.
Longer distances in England are traditionally measured in miles. The mile is a Roman
unit, originally defined to be the length of 1000 paces of a Roman legion. A "pace" here
means two steps, right and left, or about 5 feet, so the mile is a unit of roughly 5000 feet. For
a long time no one felt any need to be precise about this, because distances longer than a
furlong did not need to be measured exactly. It just didn't make much difference whether the
next town was 21 or 22 miles away. In medieval England, various mile units seem to have
been used. Eventually, what made the most sense to people was that a mile should equal 8
furlongs, since the furlong was an English unit roughly equivalent to the Roman stadium and
the Romans had set their mile equal to 8 stadia. This correspondence is not exact: the furlong
is 660 English feet and the stadium is only 625 slightly-shorter Roman feet.
In 1592, Parliament settled this question by setting the length of the mile at 8 furlongs,
which works out to 1760 yards or 5280 feet. This decision completed the English distance
system. Since this was just before the settling of the American colonies, British and American distance units have been the same ever since.
AREA
In all the English-speaking countries, land is traditionally measured by the acre, a very
old Saxon unit that is either historic or archaic, depending on your point of view. There are
references to the acre at least as early as the year 732. The word "acre" also meant "field", and
as a unit an acre was originally a field of a size that a farmer could plow in a single day. In
practice, this meant a field that could be plowed in a morning, since the oxen had to be rested
in the afternoon. The French word for the unit is journal, which is derived from jour, meaning
"day"; the corresponding unit in German is called the morgen ("morning") or tagwerk ("day's
work").
Most area units were eventually defined to be the area of a square having sides equal
to some simple multiple of a distance unit, like the square yard. But the acre was never
visualized as a square. An acre is the area of a long and narrow Anglo-Saxon farm field, one
furlong (40 rods) in length but only 4 rods (1 chain) wide. This works out, very awkwardly
indeed, to be exactly 43,560 square feet. If we line up 10 of these 4 x 40 standard acres side by side, we get 10 acres in a square furlong, and since the mile is 8 furlongs there are exactly 640 acres in a square mile.
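The arithmetic above can be checked with a few lines of code; the constant names below are ad hoc.

# Quick check of the acre arithmetic described above (units in feet).
ROD_FT = 16.5                  # 1 rod = 5.5 yards = 16.5 feet
FURLONG_FT = 40 * ROD_FT       # 660 feet
CHAIN_FT = 4 * ROD_FT          # 66 feet
MILE_FT = 8 * FURLONG_FT       # 5280 feet

acre_sqft = FURLONG_FT * CHAIN_FT           # 1 furlong x 1 chain
print(acre_sqft)                             # 43560.0 square feet

print((FURLONG_FT ** 2) / acre_sqft)         # 10.0 acres per square furlong
print((MILE_FT ** 2) / acre_sqft)            # 640.0 acres per square mile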
WEIGHT
The basic traditional unit of weight, the pound, originated as a Roman unit and was
used throughout the Roman Empire. The Roman pound was divided into 12 ounces, but many
European merchants preferred to use a larger pound of 16 ounces, perhaps because a 16-ounce
pound is conveniently divided into halves, quarters, or eighths. During the Middle Ages there
were many different pound standards in use, some of 12 ounces and some of 16. The use of
these weight units naturally followed trade routes, since merchants trading along a certain
route had to be familiar with the units used at both ends of the trip.
In traditional English law the various pound weights are related by stating all of them
as multiples of the grain, which was originally the weight of a single barleycorn. Thus
barleycorns are at the origin of both weight and distance units in the English system.
The oldest English weight system has been used since the time of the Saxon kings. It is based
on the 12-ounce troy pound, which provided the basis on which coins were minted and gold
and silver were weighed. Since Roman coins were still in circulation in Saxon times, the troy
system was designed to model the Roman system directly. The troy pound weighs 5760
grains, and the ounces weigh 480 grains. Twenty pennies weighed an ounce, and therefore a
pennyweight is 480/20 = 24 grains. The troy system continued to be used by jewelers and also
by druggists until the nineteenth century. Even today gold and silver prices are quoted by the troy ounce.
Since the troy pound was smaller than the commercial pound units used in most of
Europe, medieval English merchants often used a larger pound called the "mercantile" pound
(libra mercatoria). This unit contained 15 troy ounces, so it weighed 7200 grains. This unit
seemed about the right size to merchants, but its division into 15 parts, rather than 12 or 16,
was very inconvenient. Around 1300 the mercantile pound was replaced in English commerce
by the 16-ounce avoirdupois pound. This is the pound unit still in common use in the U.S. and
Britain. Modeled on a common Italian pound unit of the late thirteenth century, the
avoirdupois pound weighs exactly 7000 grains. The avoirdupois ounce, 1/16 pound, is divided into 16 drams.
Unfortunately, the two English ounce units don't agree: the avoirdupois ounce is
7000/16 = 437.5 grains while the troy ounce is 5760/12 = 480 grains. Conversion between
troy and avoirdupois units is so awkward that no one wanted to do it. The troy system quickly became highly specialized, used only for precious metals and for pharmaceuticals, while the avoirdupois system came to be used for everything else.
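As a rough illustration, the grain figures quoted above give the conversion factors directly; the helper functions below are hypothetical, not part of any standard library.

# Converting between troy and avoirdupois ounces via the shared grain.
GRAINS_PER_TROY_OZ = 480.0          # 5760 grains / 12 oz
GRAINS_PER_AVOIRDUPOIS_OZ = 437.5   # 7000 grains / 16 oz

def troy_to_avoirdupois_oz(troy_oz: float) -> float:
    return troy_oz * GRAINS_PER_TROY_OZ / GRAINS_PER_AVOIRDUPOIS_OZ

def avoirdupois_to_troy_oz(av_oz: float) -> float:
    return av_oz * GRAINS_PER_AVOIRDUPOIS_OZ / GRAINS_PER_TROY_OZ

print(troy_to_avoirdupois_oz(1.0))   # ~1.097 avoirdupois oz per troy oz
print(avoirdupois_to_troy_oz(1.0))   # ~0.911 troy oz per avoirdupois oz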
Since at least 1400 a standard weight unit in Britain has been the hundredweight,
which is equal to 112 avoirdupois pounds rather than 100. There were very good reasons for
the odd size of this "hundred": 112 pounds made the hundredweight equivalent for most
purposes with competing units of other countries, especially the German zentner and the
French quintal. Furthermore, 112 is a multiple of 16, so the British hundredweight can be divided conveniently into 4 quarters of 28 pounds each. The ton, originally a unit of wine measure, was defined to equal 20 hundredweight.
During the nineteenth century, an unfortunate disagreement arose between British and
Americans concerning the larger weight units. Americans, not very impressed with the history
of the British units, redefined the hundredweight to equal exactly 100 pounds. The definition
of the ton as 20 hundredweight made the disagreement carry over to the size of the ton: the
British "long" ton remained at 2240 pounds while the American "short" ton became exactly
2000 pounds. (The American hundredweight became so popular in commerce that British
merchants decided they needed a name for it; they called it the cental.) Today, most
international shipments are reckoned in metric tons, which, coincidentally, are rather close in size to the British long ton.
VOLUME
The names of the traditional volume units are the names of standard containers. Until
the eighteenth century, it was very difficult to measure the capacity of a container accurately
in cubic units, so the standard containers were defined by specifying the weight of a particular
substance, such as wheat or beer, that they could carry. Thus the gallon, the basic English unit
of volume, was originally the volume of eight pounds of wheat. This custom led to a variety of gallons of different sizes.
Gallons are always divided into 4 quarts, which are further divided into 2 pints each.
For larger volumes of dry commodities, there are 2 gallons in a peck and 4 pecks in a bushel.
Larger volumes of liquids were carried in barrels, hogsheads, or other containers whose size
in gallons tended to vary with the commodity, with wine units being different from beer and ale units.
The situation was still confused during the American colonial period, so the
Americans were actually simplifying things by selecting just two of the many possible
gallons. These two were the gallons that had become most common in British commerce by
around 1700. For dry commodities, the Americans were familiar with the "Winchester
bushel," defined by Parliament in 1696 to be the volume of a cylindrical container 18.5 inches
in diameter and 8 inches deep. The corresponding gallon, 1/8 of this bushel, is usually called
the "corn gallon" in England. This corn gallon holds 268.8 cubic inches.
For liquids Americans preferred to use the traditional British wine gallon, which
Parliament defined to equal exactly 231 cubic inches in 1707. As a result, the U.S. volume
system includes both "dry" and "liquid" units, with the dry units being about 1/6 larger than the corresponding liquid units.
In 1824, the British Parliament abolished all the traditional gallons and established a
new system based on the "Imperial" gallon of 277.42 cubic inches. The Imperial gallon was designed to hold exactly 10 pounds of water under specified conditions.
Unfortunately, Americans were not inclined to adopt this new, larger gallon, so the traditional
English "system" actually includes three different volume measurement systems: U.S. liquid,
On both sides of the Atlantic, smaller volumes of liquid are traditionally measured in
fluid ounces, which are at least roughly equal to the volume of one ounce of water. To
accomplish this in the different systems, the smaller U.S. pint is divided into 16 fluid ounces, while the larger British Imperial pint is divided into 20 fluid ounces.
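For comparison, the sketch below converts the three gallons mentioned in this section to litres, using the exact definition 1 inch = 2.54 cm; the 268.8 cubic inch figure for the corn gallon is the one quoted above.

# Comparing the three gallons, converting cubic inches to litres.
CUBIC_INCH_IN_LITRES = 2.54 ** 3 / 1000.0   # 0.016387064 L per cubic inch

gallons_in_cubic_inches = {
    "US liquid (wine) gallon": 231.0,
    "US dry (corn) gallon": 268.8,
    "British Imperial gallon": 277.42,
}

for name, cubic_inches in gallons_in_cubic_inches.items():
    litres = cubic_inches * CUBIC_INCH_IN_LITRES
    print(f"{name}: {cubic_inches} in^3 = {litres:.3f} L")
# US liquid ~3.785 L, US dry ~4.405 L, Imperial ~4.546 L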
All systems of weights and measures, metric and non-metric, are linked through a network of international agreements supporting the International System of Units. The International System is called the SI, using the first two initials of its French name Système
International d'Unités. The key agreement is the Treaty of the Meter (Convention du Mètre),
signed in Paris on May 20, 1875. 48 nations have now signed this treaty, including all the
major industrialized countries. The United States is a charter member of this metric club, having signed the original treaty in 1875. The SI is maintained by the International Bureau of Weights and Measures (BIPM, for Bureau International des Poids et Mesures), and it is updated every
few years by an international conference, the General Conference on Weights and Measures
(CGPM, for Conférence Générale des Poids et Mesures), attended by representatives of all
the industrial countries and international scientific and engineering organizations. The 23rd
CGPM met in 2007; the next meeting will be in 2011. As BIPM states on its web site, "The SI
is not static but evolves to match the world's increasingly demanding requirements for
measurement."
At the heart of the SI is a short list of base units defined in an absolute way without
referring to any other units. The base units are consistent with the part of the metric system
called the MKS system. In all there are seven SI base units:
• the meter for length;
• the kilogram for mass;
• the second for time;
• the ampere for electric current;
• the kelvin for temperature;
• the mole for amount of substance; and
• the candela for luminous intensity.
Other SI units, called SI derived units, are defined algebraically in terms of these
fundamental units. For example, the SI unit of force, the newton, is defined to be the force
that accelerates a mass of one kilogram at the rate of one meter per second per second. This
means the newton is equal to one kilogram meter per second squared, so the algebraic relationship is N = kg·m/s². Other SI derived units with special names include:
• the radian and steradian for plane and solid angles, respectively;
• units for measurement of electricity: the coulomb (charge), volt (potential), farad (capacitance), ohm (resistance), and siemens (conductance);
• units for measurement of magnetism: the weber (flux), tesla (flux density), and henry
(inductance);
• the lumen for flux of light and the lux for illuminance;
• the hertz for frequency of regular events and the becquerel for rates of radioactivity.
Future meetings of the CGPM may make additions to this list; the katal (a unit of catalytic activity) was added by the 21st CGPM in 1999.
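As an illustration of how derived units reduce to base units (the newton example above), here is a small sketch; the multiply helper and the dictionary-of-exponents representation are ad hoc choices, not an SI convention.

# Derived units expressed as dictionaries of base-unit exponents.
def multiply(u1: dict, u2: dict) -> dict:
    """Combine two units by adding base-unit exponents (e.g. kg * m/s^2)."""
    result = dict(u1)
    for base, exp in u2.items():
        result[base] = result.get(base, 0) + exp
        if result[base] == 0:
            del result[base]
    return result

kilogram = {"kg": 1}
meter_per_s2 = {"m": 1, "s": -2}        # acceleration

newton = multiply(kilogram, meter_per_s2)
print(newton)                            # {'kg': 1, 'm': 1, 's': -2}

joule = multiply(newton, {"m": 1})       # energy = force x distance
print(joule)                             # {'kg': 1, 'm': 2, 's': -2}

watt = multiply(joule, {"s": -1})        # power = energy / time
print(watt)                              # {'kg': 1, 'm': 2, 's': -3}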
In addition to the 29 base and derived units, the SI permits the use of certain additional units,
including:
• the traditional mathematical units for measuring angles (degree, arcminute, and
arcsecond);
• the traditional units of civil time (minute, hour, day, and year);
• two metric units commonly used in ordinary life: the liter for volume and the tonne (metric ton) for large masses;
• the logarithmic units bel and neper (and their multiples, such as the decibel); and
• three non-metric scientific units whose values represent important physical constants:
the astronomical unit, the atomic mass unit or dalton, and the electronvolt.
The SI currently accepts the use of certain other metric and non-metric units traditional in
various fields. These units are supposed to be "defined in relation to the SI in every document
in which they are used," and "their use is not encouraged." These barely-tolerated units include:
• the nautical mile and knot, units traditionally used at sea and in meteorology;
• the bar, a pressure unit, and its commonly-used multiples such as the millibar in meteorology and the kilobar in engineering; and
• the angstrom and the barn, units used in physics and astronomy.
ACCURACY AND PRECISION
A measurement that differs greatly from the accepted or true value is inaccurate.
Precision describes how closely repeated measurements agree with one another; it is often judged from the range of the data or from the spread of the individual values about the average.
Which of the following sets of data is more precise, based on its range?
Set A     Set B
15.32     32.56
15.37     32.55
15.33     32.48
15.38     32.53
15.35     32.55
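A quick calculation answers the question; the data are the two sets above, and the standard deviation is shown alongside the range as an additional, commonly used measure of precision.

# Which set is more precise? The one with the smaller range (and smaller
# standard deviation).
import statistics as st

set_a = [15.32, 15.37, 15.33, 15.38, 15.35]
set_b = [32.56, 32.55, 32.48, 32.53, 32.55]

for name, data in (("Set A", set_a), ("Set B", set_b)):
    print(f"{name}: range = {max(data) - min(data):.2f}, "
          f"std dev = {st.stdev(data):.3f}")
# Set A: range = 0.06, std dev ~0.026  -> Set A is more precise
# Set B: range = 0.08, std dev ~0.032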
Both accuracy and precision affect how many significant digits can be reported.
Manufacturers will usually specify the accuracy and precision to be expected from their
equipment as an uncertainty.
To see the difference between accuracy and precision, consider the chains used to measure the first down in a football game. They are supposed to
be ten yards long. But what if they were only 9 yards, 35 inches? You would certainly get
the same precise measurement each time you used the chains, but you wouldn't be getting the
correct accurate measurement. Neither team would have to go quite ten yards to get a first
down, and the error is so small you probably wouldn't even notice it. However, there
probably have been football games played where one inch would have made a difference to
the outcome of the game. In science and in football, our measurements should be both accurate and precise.
Of course, even precise and accurate equipment can be used incorrectly. If the chains
were the proper ten yards long, it would still be possible to get an imprecise measurement for
first downs. The chains must be stretched tightly, and they must be marked from the proper starting point.
ERRORS OF MEASUREMENT
The true score theory is a good simple model for measurement, but it may not always be an accurate reflection of reality. In particular, it assumes that any observation is composed of the true value plus some random error value. But is that reasonable? What if all error is not
random? Isn't it possible that some errors are systematic, that they hold across most or all of
the members of a group? One way to deal with this notion is to revise the simple true score
model by dividing the error component into two subcomponents, random error and
systematic error. Here, we'll look at the differences between these two types of errors and try to understand how each affects measurement.
Random error is caused by any factors that randomly affect measurement of the
variable across the sample. For instance, each person's mood can inflate or deflate their
performance on any occasion. In a particular testing, some children may be feeling in a good
mood and others may be depressed. If mood affects their performance on the measure, it may
artificially inflate the observed scores for some children and artificially deflate them for
others. The important thing about random error is that it does not have any consistent effects
across the entire sample. Instead, it pushes observed scores up or down randomly. This means
that if we could see all of the random errors in a distribution they would have to sum to 0 --
there would be as many negative errors as positive ones. The important property of random
error is that it adds variability to the data but does not affect average performance for the group.
Systematic error is caused by any factors that systematically affect measurement of the
variable across the sample. For instance, if there is loud traffic going by just outside of a
classroom where students are taking a test, this noise is liable to affect all of the children's
scores -- in this case, systematically lowering them. Unlike random error, systematic errors tend to be consistently either positive or negative; because of this, systematic error is sometimes considered to be bias in measurement. The sketch below contrasts the two.
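A small simulation makes the contrast concrete; the true value of 100, the spread of 0.5 and the bias of 0.8 are arbitrary illustrative numbers.

# Random error scatters readings around the true value without shifting the
# average much; systematic error shifts every reading in the same direction.
import random
import statistics as st

random.seed(1)
TRUE_VALUE = 100.0        # the quantity being measured, arbitrary units
N = 10_000                # number of simulated readings

random_only = [TRUE_VALUE + random.gauss(0, 0.5) for _ in range(N)]
with_bias   = [TRUE_VALUE + random.gauss(0, 0.5) + 0.8 for _ in range(N)]  # +0.8 bias

print(f"random error only:     mean = {st.mean(random_only):.3f}, "
      f"std dev = {st.stdev(random_only):.3f}")
print(f"with systematic error: mean = {st.mean(with_bias):.3f}, "
      f"std dev = {st.stdev(with_bias):.3f}")
# The spread is similar in both cases, but the biased mean sits ~0.8 above the true value.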
So, how can we reduce measurement errors, random or systematic? One thing you can do is to
pilot test your instruments, getting feedback from your respondents regarding how easy or
hard the measure was and information about how the testing environment affected their
performance. Second, if you are gathering measures using people to collect the data (as
interviewers or observers) you should make sure you train them thoroughly so that they aren't
inadvertently introducing error. Third, when you collect the data for your study you should
double-check the data thoroughly. All data entry for computer analysis should be "double-
punched" and verified. This means that you enter the data twice, the second time having your
data entry machine check that you are typing the exact same data you did the first time.
Fourth, you can use statistical procedures to adjust for measurement error. These range from
rather simple formulas you can apply directly to your data to very complex modeling
procedures for modeling the error and its effects. Finally, one of the best things you can do to
deal with measurement errors, especially systematic errors, is to use multiple measures of the
same construct. Especially if the different measures don't share the same systematic errors,
you will be able to triangulate across the multiple measures and get a more accurate sense of what is going on.
BASIC TERMS
It is necessary that the dimensions, shape and mutual position of surfaces of individual
parts of mechanical engineering products are kept within a certain accuracy to achieve their
correct and reliable functioning. Routine production processes do not allow maintenance (or
measurement) of the given geometrical properties with absolute accuracy. Actual surfaces of
the produced parts therefore differ from ideal surfaces prescribed in drawings. Deviations of
actual surfaces are divided into four groups to enable their assessment, prescription and checking:
• Dimensional deviations
• Shape deviations
• Position deviations
• Surface roughness deviations
This tool covers only the first group and can therefore be used to determine dimensional tolerances and deviations of machine parts. No routine production process can guarantee absolute dimensional accuracy. In fact, it is not necessary or useful. It is quite sufficient that
the actual dimension of the part is found between two limit dimensions and a permissible
deviation is kept with production to ensure correct functioning of engineering products. The
required level of accuracy of production of the given part is then given by the dimensional
tolerance which is prescribed in the drawing. The production accuracy is prescribed with
regards to the functionality of the product and to the economy of production as well.
Depending on the mutual position of tolerance zones of the coupled parts, 3 types of fit can be
distinguished:
A. Clearance fit
B. Transition fit
C. Interference fit
This paragraph can be used to choose a fit and determine tolerances and deviations of
machine parts according to the standard ISO 286:1988. This standard is identical with the European standard EN 20286 and defines an internationally recognized system of tolerances, deviations and fits. The standard ISO 286 is used as an international standard for
linear dimension tolerances and has been accepted in most industrially developed countries in
identical or modified wording as a national standard (JIS B 0401, DIN ISO 286, BS EN 20286, etc.).
The system of tolerances and fits ISO can be applied in tolerances and deviations of
smooth parts and for fits created by their coupling. It is used particularly for cylindrical parts
with round sections. Tolerances and deviations in this standard can also be applied in smooth
parts of other sections. Similarly, the system can be used for coupling (fits) of cylindrical
parts and for fits with parts having two parallel surfaces (e.g. fits of keys in grooves). The
term "shaft", used in this standard has a wide meaning and serves for specification of all outer
elements of the part, including those elements which do not have cylindrical shapes. Also, the
term "hole" can be used for specification of all inner elements regardless of their shape.
Note: All numerical values of tolerances and deviations mentioned in this paragraph are given
in the metric system and relate to parts with dimensions specified at 20 °C.
BASIC SIZE
It is the size whose limit dimensions are specified using the upper and lower deviations. In
case of a fit, the basic size of both connected elements must be the same.
Attention: The standard ISO 286 defines the system of tolerances, deviations and fits only for basic sizes up to 3150 mm.
The tolerance of a size is defined as the difference between the upper and lower limit
dimensions of the part. In order to meet the requirements of various production branches for
accuracy of the product, the system ISO implements 20 grades of accuracy. Each of the
tolerances of this system is marked "IT" with attached grade of accuracy (IT01, IT0, IT1 ...
IT18).
IT01 to IT6    For production of gauges and measuring instruments
IT5 to IT12    For fits in precision and general engineering
IT11 to IT16   For production of semi-products
IT16 to IT18   For structures
IT11 to IT18   For specification of limit deviations of non-tolerated dimensions
Note: When choosing a suitable tolerance grade it is also necessary to take into account the machining method used for the surface; the relationship between the tolerance and the surface finish can be found in the table in paragraph [5].
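As a hedged illustration of how the IT grades scale with size, the sketch below uses the ISO 286 tolerance-factor formula for basic sizes up to 500 mm (i = 0.45·∛D + 0.001·D micrometres, with D the geometric mean of the size range) and the usual grade multipliers; the standard's own rounding rules are omitted, so treat the results as approximate.

# Approximate ISO 286 standard tolerances (IT grades) for sizes up to 500 mm.
import math

IT_MULTIPLIERS = {5: 7, 6: 10, 7: 16, 8: 25, 9: 40, 10: 64,
                  11: 100, 12: 160, 13: 250, 14: 400, 15: 640, 16: 1000}

def tolerance_um(size_range_mm: tuple, grade: int) -> float:
    """Approximate IT tolerance in micrometres for a size range (low, high) in mm."""
    d = math.sqrt(size_range_mm[0] * size_range_mm[1])     # geometric mean, mm
    i = 0.45 * d ** (1 / 3) + 0.001 * d                    # tolerance factor, µm
    return IT_MULTIPLIERS[grade] * i

# Example: a 25 mm basic size falls in the 18-30 mm range.
print(round(tolerance_um((18, 30), 7)))   # ~21 µm for IT7
print(round(tolerance_um((18, 30), 6)))   # ~13 µm for IT6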
TOLERANCE ZONES
The tolerance zone is defined as the zone limited by the upper and lower limit
dimensions of the part. The tolerance zone is therefore determined by the amount of the
tolerance and its position related to the basic size. The position of the tolerance zone, related
to the basic size (zero line), is determined in the ISO system by a so-called basic deviation.
The system ISO defines 28 classes of basic deviations for holes. These classes are marked by
capital letters (A, B, C, ... ZC). The tolerance zone for the specified dimensions is prescribed
in the drawing by a tolerance mark, which consists of a letter marking of the basic deviation
and a numerical marking of the tolerance grade (e.g. H7, H8, D5, etc.). This paragraph
includes graphic illustrations of all tolerance zones of a hole which are applicable for the
specified basic size [1.1] and the tolerance grade IT chosen from the pop-up list.
Though the general sets of basic deviations (A ... ZC) and tolerance grades (IT1 ...
IT18) can be used for prescriptions of hole tolerance zones by their mutual combinations, in
practice only a limited range of tolerance zones is used. An overview of tolerance zones for
general use can be found in the following table. The tolerance zones not included in this table
are considered special zones and their use is recommended only in technically well-grounded
cases.
Prescribed hole tolerance zones for routine use (for basic sizes up to 3150 mm):
E5 E6 E7 E8 E9 E10
EF3 EF4 EF5 EF6 EF7 EF8 EF9 EF10
F3 F4 F5 F6 F7 F8 F9 F10
FG3 FG4 FG5 FG6 FG7 FG8 FG9 FG10
G3 G4 G5 G6 G7 G8 G9 G10
H1 H2 H3 H4 H5 H6 H7 H8 H9 H10 H11 H12 H13 H14 H15 H16 H17
JS1 JS2 JS3 JS4 JS5 JS6 JS7 JS8 JS9 JS10 JS11 JS12 JS13 JS14 JS15 JS16 JS17
J6 J7 J8
K3 K4 K5 K6 K7 K8
M3 M4 M5 M6 M7 M8 M9 M10
N3 N4 N5 N6 N7 N8 N9 N10 N11
P3 P4 P5 P6 P7 P8 P9 P10
R3 R4 R5 R6 R7 R8 R9 R10
S3 S4 S5 S6 S7 S8 S9 S10
T5 T6 T7 T8
U5 U6 U7 U8 U9 U10
V5 V6 V7 V8
X5 X6 X7 X8 X9 X10
Y6 Y7 Y8 Y9 Y10
Z6 Z7 Z8 Z9 Z10 Z11
Note: Tolerance zones with thin print are specified only for basic sizes up to 500 mm.
Hint: For hole tolerances, tolerance zones H7, H8, H9 and H11 are used preferably.
The tolerance zone is defined as the zone limited by the upper and lower limit
dimensions of the part. The tolerance zone is therefore determined by the amount of the
tolerance and its position related to the basic size. The position of the tolerance zone, related
to the basic size (zero line), is determined in the ISO system by a so-called basic deviation.
The system ISO defines 28 classes of basic deviations for shafts. These classes are marked by
lower case letters (a, b, c, ... zc). The tolerance zone for the specified dimensions is prescribed
in the drawing by a tolerance mark, which consists of a letter marking of the basic deviation
and a numerical marking of the tolerance grade (e.g. h7, h6, g5, etc.). This paragraph includes
graphic illustrations of all tolerance zones of a shaft which are applicable for the specified
basic size [1.1] and the tolerance grade IT chosen from the pop-up list.
Though the general sets of basic deviations (a ... zc) and tolerance grades (IT1 ...
IT18) can be used for prescriptions of shaft tolerance zones by their mutual combinations, in
practice only a limited range of tolerance zones is used. An overview of tolerance zones for
general use can be found in the following table. The tolerance zones not included in this table
are considered special zones and their use is recommended only in technically well-grounded
cases.
Prescribed shaft tolerance zones for routine use (for basic sizes up to 3150 mm):
n3 n4 n5 n6 n7 n8 n9
p3 p4 p5 p6 p7 p8 p9 p10
r3 r4 r5 r6 r7 r8 r9 r10
s3 s4 s5 s6 s7 s8 s9 s10
t5 t6 t7 t8
u5 u6 u7 u8 u9
v5 v6 v7 v8
x5 x6 x7 x8 x9 x10
y6 y7 y8 y9 y10
z6 z7 z8 z9 z10 z11
Note: Tolerance zones with thin print are specified only for basic sizes up to 500 mm.
Hint: For shaft tolerances, tolerance zones h6, h7, h9 and h11 are used preferably.
SELECTION OF FIT
This paragraph can be used to choose a recommended fit. If you wish to use another fit than
the recommended one, define hole and shaft tolerance zones directly in the respective paragraphs. When choosing a fit, keep the following in mind:
• Tolerances of the hole and shaft should not differ by more than two grades.
Hint: In case you wish to find a suitable standardized fit with regard to its specific properties, review the overview of recommended fits below.
SYSTEM OF FIT
Although parts can in general be coupled using any combination of tolerance zones, only two methods of coupling of holes and shafts are recommended due to constructional, technological and economic reasons.
Hole-basis system: the desired clearances and interferences in the fit are achieved by combining various shaft tolerance zones with the hole tolerance zone "H". In this system of tolerances and fits, the lower deviation of the hole is always equal to zero.
Shaft-basis system: the desired clearances and interferences in the fit are achieved by combining various hole tolerance zones with the shaft tolerance zone "h". In this system of tolerances and fits, the upper deviation of the shaft is always equal to zero.
The choice of system for the specified type of product or production is always influenced by several factors, including:
• Costs for purchase, maintenance and storage of gauges and production tools.
Hint: Although both systems are equivalent in view of functional properties, the hole-basis system is usually preferred for economic reasons.
TYPE OF FIT
Depending on the mutual position of tolerance zones of the coupled parts, 3 types of fit can be
distinguished:
A. Clearance fit
It is a fit that always enables a clearance between the hole and shaft in the coupling. The
lower limit size of the hole is greater or at least equal to the upper limit size of the shaft.
B. Transition fit
It is a fit where (depending on the actual sizes of the hole and shaft) both clearance and
interference may occur in the coupling. Tolerance zones of the hole and shaft partly or
completely interfere.
C. Interference fit
It is a fit always ensuring some interference between the hole and shaft in the coupling.
The upper limit size of the hole is smaller or at least equal to the lower limit size of the
shaft.
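A short sketch of how a fit can be classified from the limit dimensions, following the three definitions above; the 25 mm example values are invented and are not taken from the ISO 286 tables.

# Classifying a fit from the limit dimensions of the hole and shaft (mm).
def classify_fit(hole_min, hole_max, shaft_min, shaft_max):
    """Return the fit type plus the extreme clearances (negative = interference)."""
    max_clearance = hole_max - shaft_min
    min_clearance = hole_min - shaft_max
    if min_clearance >= 0:
        kind = "clearance fit"
    elif max_clearance <= 0:
        kind = "interference fit"
    else:
        kind = "transition fit"
    return kind, round(min_clearance, 4), round(max_clearance, 4)

# Example: basic size 25 mm, hole 25.000..25.021, shaft 24.980..24.993 (invented values)
print(classify_fit(25.000, 25.021, 24.980, 24.993))
# ('clearance fit', 0.007, 0.041)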
RECOMMENDED FITS.
The list of recommended fits given here is for information only and cannot be taken as
a fixed listing. The enumeration of actually used fits may differ depending on the type and
field of production, local standards and national usage and last but not least, depending on the
plant practices. Properties and field of use of some selected fits are described in the following
overview. When selecting a fit it is often necessary to take into account not only
constructional and technological views, but also economic aspects. Selection of a suitable fit
is important particularly in view of those measuring instruments, gauges and tools which are
implemented in the production. Therefore, follow proven plant practices when selecting a fit.
Overview of selected fits (preferred fits are shown in bold):
Clearance fits:
Use: Pivots, latches, fits of parts exposed to corrosive effects, contamination with dust, and thermal or mechanical deformations.
Running fits with greater clearances without any special requirements for accuracy of guiding
shafts.
Use: Multiple fits of shafts of production and piston machines, parts rotating very rarely or
only swinging.
Running fits with greater clearances without any special requirements for fit accuracy.
Use: Fits of long shafts, e.g. in agricultural machines, bearings of pumps, fans and piston
machines.
Running fits with smaller clearances with general requirements for fit accuracy.
Use: Main fits of machine tools. General fits of shafts, regulator bearings, machine tool
Running fits with very small clearances for accurate guiding of shafts, without any noticeable clearance after assembly.
Use: Parts of machine tools, sliding gears and clutch disks, crankshaft journals, pistons of hydraulic machines.
H11/h11, H11/h9
Slipping fits of parts with great tolerances. The parts can easily be slid one into the other and
turn.
Use: Easily demountable parts, distance rings, parts of machines fixed to shafts using pins, bolts, rivets or welds.
Sliding fits with very small clearances for precise guiding and centring of parts. Mounting by
sliding on without use of any great force, after lubrication the parts can be turned and slid by
hand.
Use: Precise guiding of machines and preparations, exchangeable wheels, roller guides.
Transition fits:
Tight fits with small clearances or negligible interference. The parts can be assembled or
disassembled manually.
Use: Easily dismountable fits of hubs of gears, pulleys and bushings, retaining rings, frequently removed bearing bushings.
Similar fits with small clearances or small interferences. The parts can be assembled or disassembled without great force, using a rubber mallet.
Use: Demountable fits of hubs of gears and pulleys, manual wheels, clutches, brake disks.
Fixed fits with negligible clearances or small interferences. Mounting of fits using pressing and light force.
Use: Fixed plugs, driven bushings, armatures of electric motors on shafts, gear rims, flushed
bolts.
Interference fits:
Pressed fits with guaranteed interference. Assembly of the parts can be carried out using cold
pressing.
Pressed fits with medium interference. Assembly of parts using hot pressing; assembly using cold pressing only with large forces.
Pressed fits with big interferences. Assembly using pressing and great forces under different temperatures of the parts.
Hint: Where there is no special requirement, use some of the preferred fits. Preferred fits are marked by asterisk "*" in the list.
Note: Preferred fits designed for preferred use in the USA are defined in ANSI B4.2. This standard includes, among others, the following preferred fits:
- Clearance fits: H11/c11, H9/d9, H8/f7, H7/g6, H7/h6, C11/h11, D9/h9, F8/h7,
G7/h6
Limit deviations of the hole tolerance zone are calculated in this paragraph for the specified basic size.
The respective hole tolerance zone is automatically set up in the listing during
selection of any of the recommended fits from the list in row [1.8]. If you wish to use another
tolerance zone for the hole, select the corresponding combination of a basic deviation (A ...
ZC) and a tolerance zone (1 ... 18) in pop-up lists in this row.
Though the general sets of basic deviations (A ... ZC) and tolerance grades (IT1 ...
IT18) can be used for prescriptions of hole tolerance zones by their mutual combinations, in
practice only a limited range of tolerance zones is used. An overview of tolerance zones
specified for general use can be found in the table in paragraph [1.3]. The tolerance zones
which are not included in the selection are considered special zones and their use is
Attention: In case you select a hole tolerance zone which is not defined in the ISO system for
the specified basic size, limit deviations will be equal to zero and the tolerance zone cannot be used.
Hint: For hole tolerances, tolerance zones H7, H8, H9 and H11 are used preferably.
Limit deviations of the shaft tolerance zone are calculated in this paragraph for the specified basic size.
The respective shaft tolerance zone is automatically set up in the listing during
selection of any of the recommended fits from the list in row [1.8]. If you wish to use another
tolerance zone for the shaft, select the corresponding combination of a basic deviation (a ...
zc) and a tolerance zone (1 ... 18) in pop-up lists in this row.
Though the general sets of basic deviations (a ... zc) and tolerance grades (IT1 ...
IT18) can be used for prescriptions of shaft tolerance zones by their mutual combinations, in
practice only a limited range of tolerance zones is used. An overview of tolerance zones
specified for general use can be found in the table in paragraph [1.3]. The tolerance zones
which are not included in the selection are considered special zones and their use is
Attention: In case you select a shaft tolerance zone which is not defined in the ISO system
for the specified basic size, limit deviations will be equal to zero and the tolerance zone cannot be used.
Hint: For shaft tolerances, tolerance zones h6, h7, h9 and h11 are used preferably.
Parameters of the selected fit are calculated, and the mutual positions of the tolerance zones of the coupled parts are displayed.
ANSI B4.1: Preferred limits and fits for cylindrical parts. [2]
This paragraph can be used for selection of a preferred fit of cylindrical parts according to
ANSI B4.1. This standard defines a system of dimensional tolerances and prescribes a series
of those preferred fits of cylindrical part, which are specified for preferred use.
Note: All numerical values of tolerances and deviations given in this paragraph are in inches and relate to parts with dimensions specified at 68 °F (20 °C).
BASIC SIZE
It is the size whose limit dimensions are specified using the upper and lower deviations. In
case of a fit, the basic size of both connected elements must be the same.
Note: Standard ANSI B4.1 defines a system of preferred fits only for basic sizes up
to 16.69 in.
The tolerance of a size is defined as the difference between the upper and lower limit
dimensions of the part. The standard ANSI B4.1 implements 10 tolerance grades to meet the requirements of various production branches for accuracy of the product.
Note: When choosing a suitable tolerance it is also necessary to take into account the machining method used for the surface.
SYSTEM OF FITS.
The standard ANSI B4.1 defines two basic methods of coupling of holes and shafts for the required fits: the hole-basis system and the shaft-basis system.
In the hole-basis system of tolerances and fits, the lower deviation of the hole is always equal to zero.
In the shaft-basis system of tolerances and fits, the upper deviation of the shaft is always equal to zero.
The choice of system for the specified type of product or production is always influenced by several factors, including:
• Costs for purchase, maintenance and storage of gauges and production tools.
Hint: Although both systems are equivalent in view of functional properties, the hole-basis system is usually preferred for economic reasons.