All spectrophotometers rely on the 150-year-old observation that a material, even an invisible gas or a clear liquid, absorbs certain wavelengths of light while passing others. One might dismiss this observation as, at best, an interesting phenomenon, but the technology developed from it has given research and clinical laboratories one of the most important and versatile laboratory instruments ever devised. Between the 1950s and the 1970s, the spectrophotometer was a fundamental instrument found in every hospital-based clinical laboratory in the country.
Today, they are found primarily in research facilities and reference laboratories. This technology did not disappear from clinical laboratories, however, since nearly every hospital laboratory contains automated analyzers employing the same basic principles embodied in these legacy manual instruments. At the heart of many of these modern instruments lies a spectrophotometer, albeit enveloped by some complex automated sample preparation hardware, sample flow plumbing, and, often, computer-based flow and output controls. Likewise, dry chemistry analyzers utilizing light reflectance technology, a cousin of spectrophotometry, utilize these same optical principles.
Before describing the spectrophotometer itself, it helps to understand how the instrument is used in performing basic laboratory tests. As the spectrophotometer evolved, beyond merely measuring transmittance (the fraction of incident light that passes through the sample, usually expressed as a percentage) or absorbance (the amount of light absorbed, which is the negative logarithm of transmittance, not its reciprocal), two important observations were made about the sample. First, the amount of light absorbed at a particular wavelength is proportional to the concentration of a material. That is, the more of a particular material that exists in the sample, the greater the light absorbance. If the material does not absorb light itself, one or more reagents can be mixed into the solution to form a product whose absorbance or transmittance is proportional to the concentration of the material.
Second, through prior testing with titrated aliquots of a material of known concentration, a correlation can be drawn between the absorbance or transmittance and the concentration of the material. What is “the material”? It can be any of a plethora of substances: iron in the blood, alkaline phosphatase in the serum, or copper in tissue. The “secret,” if there is one, lies in knowing which reagent or reagents (when necessary) to use; the process to employ (agitation, heat, cold, or simply allowing time for the reaction to occur); the wavelength of light that yields the best results; and the conversion from the absorbance or transmittance reading to the concentration of the material.
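These relationships are usually written as the Beer-Lambert law, A = εlc, together with A = 2 − log₁₀(%T). A minimal sketch in Python follows; the function and parameter names are illustrative, not taken from any instrument's software:

```python
import math

def absorbance_from_transmittance(t_percent: float) -> float:
    """Convert percent transmittance to absorbance: A = 2 - log10(%T)."""
    return 2.0 - math.log10(t_percent)

def concentration(absorbance: float, molar_absorptivity: float, path_cm: float) -> float:
    """Beer-Lambert law: A = e * l * c, so c = A / (e * l)."""
    return absorbance / (molar_absorptivity * path_cm)

# A sample passing 10% of the incident light has an absorbance of 1.0.
a = absorbance_from_transmittance(10.0)  # 1.0
```

Note that absorbance, unlike transmittance, is unbounded: a sample passing only 1% of the light reads 2.0 absorbance units.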
For manual lab tests performed with the spectrophotometer, this conversion information could be a graph in a textbook containing a smooth curve showing the instrument reading on one axis and the copper value, for example, on the other axis. If the lab technician is using a pre-packaged test kit, it will contain the reagents and a package insert with step-by-step instructions for performing the test and the aforementioned graph.
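Such a conversion graph can be modeled as a lookup table with linear interpolation between standards. The absorbance/concentration pairs below are hypothetical values for illustration, not from any real assay kit:

```python
# Hypothetical standard curve for a copper assay: (absorbance, ug/dL) pairs
# measured beforehand from titrated aliquots of known concentration.
STANDARDS = [(0.00, 0.0), (0.12, 50.0), (0.25, 100.0), (0.51, 200.0)]

def concentration_from_curve(absorbance: float) -> float:
    """Linearly interpolate between the two standards bracketing the reading."""
    pts = sorted(STANDARDS)
    if not pts[0][0] <= absorbance <= pts[-1][0]:
        raise ValueError("reading outside the calibrated range; dilute and re-run")
    for (a0, c0), (a1, c1) in zip(pts, pts[1:]):
        if a0 <= absorbance <= a1:
            return c0 + (absorbance - a0) * (c1 - c0) / (a1 - a0)

concentration_from_curve(0.25)  # 100.0 ug/dL, landing on a standard point
```

Readings above the highest standard are rejected rather than extrapolated, mirroring the usual laboratory practice of diluting the sample and repeating the test.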
To measure the light absorbed or transmitted by a material and quantify the results, a basic single-beam spectrophotometer relies on a collection of basic components: a light source, a light filter, one or more focusing devices, an absorption cell or sample cuvette, a photodetector, and a display device.
The first component is the light source, which must provide both the correct wavelengths of light and a constant intensity. Typically, a tungsten-filament bulb is used because it provides light in the wavelength range of 330 to about 900 nanometers (nm), covering the visible region quite well. Hydrogen or deuterium lamps are frequently used for ultraviolet light, producing wavelengths from about 200 to 450 nm. Between the two lamps, the wavelengths of interest are covered.
The full-spectrum light from the bulb passes through a light filter, called a “monochromator,” which allows only a narrow slice (typically 0.05–20 nm wide) of the overall light spectrum, centered on the wavelength of interest, to pass through to the sample. The monochromator typically employs colored filters or a variable-width narrow slit to isolate the desired wavelengths. It can be adjusted manually by the lab technician or, on highly automated spectrophotometer models, automatically swept through a range of wavelengths while recording continuous successive readings.
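The automated sweep amounts to a simple control loop: set a wavelength, take a reading, step, and repeat. A sketch follows, in which the `read_absorbance` callback is hypothetical, standing in for the instrument's set-wavelength-and-read cycle:

```python
def sweep_spectrum(read_absorbance, start_nm=330.0, stop_nm=900.0, step_nm=1.0):
    """Step the monochromator through a wavelength range, recording one
    (wavelength, absorbance) pair per step."""
    spectrum = []
    wavelength = start_nm
    while wavelength <= stop_nm:
        spectrum.append((wavelength, read_absorbance(wavelength)))
        wavelength += step_nm
    return spectrum
```

Plotting the resulting pairs gives the familiar absorbance-versus-wavelength curve from which peak wavelengths are chosen for an assay.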
To ensure the light rays are parallel as they pass through the sample, which reduces scattering in and around it, most spectrophotometers place a combination of lenses, slits, and/or mirrors in the light path. These focusing devices may include components of the monochromator as a first stage of the focusing process.
Next, the light passes through an absorption cell, or cuvette, containing the sample being analyzed. Typically, cuvettes are round or rectangular and are constructed of glass, quartz, fused silica, or plastic. The key requirement is that the material of which the cuvette is made be completely transparent to the wavelength being used for the analysis. This is not easy to achieve, since optical glass absorbs light below 350 nm. Quartz cuvettes do not suffer this limitation, but they are both more expensive and more fragile than optical glass cuvettes. Cuvettes are always placed in a dark analysis chamber while being read; however, some designs employ a rotating chamber so that a sample-filled cuvette can be changed while another is being read. One variant of the basic design uses a special flow-through cuvette to aspirate the sample into the cuvette, measure the absorbance, then dispose of the sample and rinse the cuvette in preparation for the next sample. A similar variant uses a continuous-flow cuvette to provide constant absorbance readings of a liquid stream.
When the light passes through the cuvette and the sample contained therein, some of the spectrum is absorbed by the sample. The remaining light exits the cuvette and strikes a photodetector (a photomultiplier tube, photocell, or phototransistor), which provides an electronic output proportional to the intensity of the light leaving the sample. The simplest legacy units, still found in daily use in developing countries, pass the electronic output to a sensitive galvanometer—a device similar to an analog meter movement—containing a mirror in place of the customary needle or pen. Light bouncing off the mirror illuminates a spot on a translucent scale calibrated to provide a direct reading of the absorbance or transmittance. More sophisticated models amplify the photodetector's electronic output and pass it to integral calculator or computer electronics that provide this and other operator-selected information via a digital readout on the unit. High-end units provide their output to a personal computer that provides this and other valuable functionality.
In 1859, a scientist studying the properties of various gases constructed the first spectrophotometer to measure the absorptive powers of gases such as carbon dioxide, ozone, various hydrocarbons, and water vapor. Surprisingly, to the scientist at least, these “perfectly colorless and invisible gases and vapors” absorbed some wavelengths of light while passing others. On Jan. 8, 1935, Professor Arthur C. Hardy of the Massachusetts Institute of Technology was granted a patent for the spectrophotometer. Five months later, General Electric introduced the first commercial recording spectrophotometer based on his work. In 1940, Arnold O. Beckman, the founder of Beckman Instruments, assembled a spectrophotometer using a glass prism, a vacuum tube photocell, and an amplifier from his company's pH meter. The following year, Beckman introduced an improved version that increased throughput and raised accuracy from about 25% to around 99.99%, setting a new standard in chemical analysis. This version was marketed, essentially unchanged, until it was discontinued in 1976. After some 35 years of sales, more than 30,000 of these instruments remained in use in chemistry, biochemistry, and both clinical and industrial laboratories.
Bausch & Lomb introduced the first highly accurate, low-cost spectrophotometer in 1954, dubbing it the “Spectronic 20.” Due to its accuracy and low cost, it quickly became an industry-standard instrument and probably the world's most widely used spectrophotometer. The early 1980s saw the first spectrophotometers coupled with microprocessor controls, both automating the analysis and reducing analysis time, again increasing throughput. Incremental improvements continue today, especially in the area of computer technology. Bruce Merrifield, a Nobel laureate and author, called the spectrophotometer “probably the most important instrument ever developed towards the advancement of bioscience.”
Single-beam spectrophotometers must first be zeroed with a reagent blank in the cuvette; that is, the reagent blank is read and the instrument adjusted to read 0.00 absorbance (100% transmittance). The lab technician then replaces the blank with the sample, and the sample's absorbance is read. To eliminate this extra step and increase lab throughput, a number of variant designs are available in the marketplace. One design uses a special mirror to split the light beam, passing one part through the cuvette to the sample detector and the other part to a reference detector. Another variant uses one or more rotating mirrors, each driven by a high-speed stepper motor, to provide an alternate light path around the sample cuvette. This provides alternating readings through and around the sample cuvette, interpreted by the electronics as nearly simultaneous zero-absorbance and sample-absorbance readings. The control electronics both synchronize the stepper motors and mathematically determine sample absorbance. This design automatically accounts for lamp aging and momentary changes in lamp output caused by voltage fluctuations, transient dust particles, etc. It also automatically corrects for the change in light output as the spectrophotometer automatically sweeps through a range of wavelengths.
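The drift cancellation in a dual-beam design falls out of simple arithmetic: because a sag in lamp output scales the reference and sample readings equally, their ratio is unaffected. A minimal sketch, with illustrative detector values:

```python
import math

def dual_beam_absorbance(ref_reading: float, sample_reading: float) -> float:
    """Absorbance from near-simultaneous reference (around the cuvette) and
    sample (through the cuvette) detector readings: A = log10(I_ref / I_sample).
    Lamp drift scales both readings equally, so the ratio cancels it out."""
    return math.log10(ref_reading / sample_reading)

# A 20% sag in lamp output leaves the computed absorbance unchanged:
dual_beam_absorbance(100.0, 10.0) == dual_beam_absorbance(80.0, 8.0)  # True
```

The same ratio logic corrects for the lamp's uneven output across wavelengths during an automatic sweep, since each sample reading is paired with a reference reading taken at the same wavelength.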
A contemporary alternate design (Figure 2) employs most of the same basic components, but passes full-spectrum light through the sample and then filters the light through a monochromator before it reaches a diode array instead of a photodetector. This design has the advantage of eliminating the moving parts associated with the monochromator, but the disadvantage of higher cost.
How to Manage the Device
The College of American Pathologists (CAP) has established standards for clinical laboratories and the maintenance of their equipment. Although CAP is not a regulatory body, laboratories desiring CAP accreditation must follow these standards. Federal agencies, such as the U.S. Centers for Medicare and Medicaid Services (CMS), have established standards through the Clinical Laboratory Improvement Amendments (CLIA) that must be followed to qualify the hospital, medical center, or stand-alone laboratory for Medicare and Medicaid payments.
To comply with these standards, which do not specifically address maintenance management, the manager should schedule recurring services, such as preventive maintenance and electrical safety testing, tailored to each spectrophotometer in their area of responsibility. Additionally, the manager should maintain a detailed history covering initial installation, scheduled services, remedial maintenance, and any modifications made to the spectrophotometer.
There is no specific regulation of spectrophotometers, but a number of organizations have established standards and guidance for the use and maintenance of this and other laboratory instruments. If the laboratory seeks certification by one of these organizations, its standards and guidance must be followed as if they were regulatory in nature.
Risk Management Issues
Spectrophotometry presents two major risk management issues. The first is the inability to perform certain critical laboratory tests should the spectrophotometer succumb to a hard failure. The second deals with inaccurate results. Obviously, whenever test results drive treatments and diagnoses, a great potential exists for medical misadventures.
If a test result incorrectly indicates the presence of a particular treatable condition, the patient will experience undue stress and anxiety as well as face unnecessary treatment costs. In some cases, the physician may inadvertently harm the patient by treating for a nonexistent condition. The other extreme, not confirming a problem that does in fact exist, can be equally or even more devastating—the disease will progress untreated while physicians search elsewhere for the cause of the presenting symptoms. In either case, inaccurate lab results are detrimental for both the patient and the healthcare facility.
Tungsten vapor, deposited on the interior of old lamps, or dust and deposits on the exterior can cause inaccurate results. Under these circumstances, the lamp must be replaced. It will not be burnt out or appear to be “bad” in the conventional sense, but the lamp is no longer usable. Single-beam spectrophotometers are particularly prone to inaccurate results because of old lamps and the accumulation of dust on the interior optics. Dual-beam instruments automatically compensate for some optical deterioration. Otherwise, due to the simplicity of their design, spectrophotometers are quite failure free and not maintenance intensive.
One oddity about the basic legacy spectrophotometer, which a biomed may still find in use in developing nations, is a problem rarely observed in modern equipment. Since these units bounce a beam of light off the galvanometer-connected mirror to an analog scale on the face of the instrument, a good hard shock to the instrument will cause the mirror to oscillate back and forth. While it is oscillating, the laboratory technician will not see the light on the analog scale, leading the technician to believe the bulb is burnt out, the instrument cannot be zeroed, the galvanometer is not working, or some other perceived problem. It can take as long as an hour—more commonly one-half hour—for the galvanometer to settle down and again reflect the light onto the translucent analog scale. This phenomenon does not occur in modern spectrophotometers.
Training and Equipment
Again, due to their simple design, legacy manual spectrophotometers can be maintained with nothing more than a voltmeter and common hand tools. As the design becomes more complex, additional service aids, such as the manufacturer's literature and an oscilloscope, become essential items for the biomedical technician maintaining spectrophotometers.
Spectrophotometers have been common in hospital laboratories since the early 1950s, so the basic technology is well established. Recent advances in computer technology coupled to spectrophotometers, together with improved detectors, diffraction gratings, and better lamps, have allowed industry to produce instruments of previously unimagined accuracy, stability, resolution, and reproducibility. Additionally, the integration of spectrophotometry with computer technology has provided features such as automatic baseline correction, improved monochromator performance, system controllers, and data analysis and storage. Further instrument refinements are expected as computer technology advances.
Robert Dondelinger, CBET-E, MS, is the senior medical logistician at the U.S. Military Entrance Processing Command in North Chicago, IL. E-mail: email@example.com