Comparing Calibration Technologies for Liquid Handling Quality Assurance
By Curtis, R., Rodrigues, G.
The use of high-quality precision liquid handling instruments throughout the drug discovery, testing and production processes has tended to give scientists a sense of confidence in their data. However, the large amount of resources dedicated to drug development, the long FDA approval process, and the numerous recalls and legal actions plaguing several well-known drug companies suggest that more attention be paid to quality assurance. In particular, liquid handling processes, core to pharmaceutical laboratory operations, demand the application of robust, rigorous, science-based methods and tools to ensure data quality.
In life science laboratories, where technological breakthroughs are common, scientists often have a variety of tools available to complete everyday tasks, including liquid handling quality assurance. There are several options available to laboratories to calibrate liquid handling instrumentation and measure the efficacy of liquid handling processes, each with its own applications, benefits, and drawbacks. The optimal technology for each laboratory application depends on a variety of factors, from the volume of liquids to be quantified to the type of instrumentation used and the applicable regulatory and quality standards. Also to be considered are the laboratory environment, tolerance for risk, required calibration frequency, and the demands of the laboratory’s processes.
This article will compare gravimetry, fluorometry, single-dye photometry and ratiometric photometry – all common means for verifying liquid handling instrumentation – and will provide data and guidance regarding best applications of each.
Traditionally, laboratories have relied on gravimetry to measure the performance of liquid handling devices. This method uses a balance to weigh liquid volumes. The balance reports a weight and that weight is converted to mass and then to volume using conversion factors, which may be found in tables, calculated from formulas, or produced by software packages.
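The conversion chain can be sketched in a few lines. The values below are excerpted from the published ISO 8655 tables; the function name and table structure are illustrative, not part of any standard software:

```python
# Sketch of the gravimetric weight-to-volume conversion. The Z factor
# (microliters per milligram) folds the density of water and an air
# buoyancy correction into a single multiplier; these values are from
# the ISO 8655 tables for water at 101.3 kPa barometric pressure.
Z_FACTOR_UL_PER_MG = {
    20.0: 1.0029,
    21.5: 1.0032,
    25.0: 1.0040,
}

def mass_to_volume_ul(mass_mg, temperature_c=21.5):
    """Convert a balance reading (mg of water) to dispensed volume (uL)."""
    return mass_mg * Z_FACTOR_UL_PER_MG[temperature_c]
```

At 21.5 °C, for example, a balance reading of 996.8 mg of water corresponds to roughly 1,000 microliters once the Z factor is applied.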
Gravimetry has several advantages, including the wide availability of weighing devices in most laboratories. In addition, gravimetry is a well-accepted technology. It is recognized by national and international regulatory agencies, including the International Organization for Standardization (ISO), the College of American Pathologists (CAP), and ASTM International. Published standard methods of gravimetry include ASTM E1154 and ISO 8655-6. Gravimetric calibration can also be traced to national standards, facilitating regulatory compliance and standardization.
Gravimetry is frequently the method of choice for measuring device performance when handling larger volumes. For example, a 1,000 microliter aliquot weighs approximately one gram and can be weighed reliably on a modern laboratory analytical balance. However, the current trend in laboratories toward handling smaller liquid volumes with automated devices is illustrating one major drawback of this method – as volumes decrease, weighing becomes more challenging for several reasons.
First, measuring smaller liquid volumes requires more specialized balances (producing measurement results to five or six decimal places on the gram scale). Such balances are delicate, require a stable platform to limit vibration, and are not as portable as the less sensitive models used for measuring larger liquid volumes. These requirements often make microgram balances poorly suited for use on the deck of automated liquid handlers. Illustrating the need for sensitivity, ISO 8655-6 requires that volumes of 10 microliters or smaller be measured on a six-place (microgram) balance.
Because microgram balances take some time to settle, gravimetric calibration can also be time consuming. In addition, gravimetry is affected by a variety of environmental conditions, including evaporation, static electricity, and vibration. And as volumes become smaller, these error sources become more significant.
For example, modern dispensing equipment can deliver volumes so small that they can evaporate in a matter of seconds. Obtaining adequate resolution for small volumes requires a highly sensitive balance with complicated evaporation traps, static eliminators, and vibration dampeners. Other methods for controlling for evaporation can be complicated. One method is to measure the evaporation rate and correct for the resulting volume variation. Alternatively, the humidity in the room can be increased or a draft shield built to prevent air from flowing over the testing area. These steps add time and complexity to the measurement process.
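The evaporation-rate correction mentioned above amounts to simple bookkeeping. The sketch below assumes the rate has already been measured by watching an undisturbed sample lose weight on the same balance; the function name and figures are illustrative:

```python
def evaporation_corrected_mass_mg(observed_mg, elapsed_s, evap_rate_mg_per_s):
    """Add back the mass lost to evaporation between dispense and reading.

    evap_rate_mg_per_s is determined beforehand by timing the weight
    loss of an undisturbed sample under the same conditions.
    """
    return observed_mg + evap_rate_mg_per_s * elapsed_s
```

For instance, a nominal 1 mg (about 1 microliter) aliquot read 10 seconds after dispensing, with a measured evaporation rate of 0.005 mg/s, would be corrected from 0.95 mg back up to 1.0 mg, a 5 percent effect that would otherwise be attributed to the pipette.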
Electrostatic effects also cause some uncertainty with gravimetric methods because plastic pipette tips are typically used to transfer liquids. Static electricity imparted to the balance pan or the draft shield induces a force that affects measurement accuracy and, when working with small volumes, the error can be significant. Vibration must also be controlled, which often requires calibrating in a controlled environment on a solid marble bench.
Because gravimetric measurements calculate volume by converting weight to mass and then to volume, accurate calibration is contingent on knowing the density of the fluid being pipetted. Many laboratory technicians assume the fluid being measured has a density of one gram per milliliter, which is approximately the density of water. Although common solutions do have published density values, the densities are not always known to a high degree of accuracy.
To illustrate the possible uncertainty, consider DMSO, whose published density is 1.1 grams per milliliter. Note that the density is published with limited resolution, using only two significant figures. In addition, the density of DMSO changes depending on its water content, which depends on the starting water content and the time of exposure to ambient temperature and relative humidity. Even the density of water varies with temperature and, at room temperature, is always less than one gram per milliliter, its commonly accepted value.
These details need to be accounted for if very precise measurements are required. Consider a device with accuracy specifications of better than 0.6 percent, which is a typical specification for high-accuracy pipetting of 1,000 microliters. Failure to correct for density errors, even when pipetting water, can lead to error in the 0.3 to 0.5 percent range, which is nearly as large as the acceptable error for the entire piece of equipment. However, when acceptable tolerances are in the five to ten percent range, density considerations are much less important.
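The magnitude of the density assumption can be checked with a few lines of arithmetic. The water densities below are standard handbook values; the function is an illustrative sketch, and it models the density error only (air buoyancy, which contributes additional error, is not included):

```python
# True density of pure water (g/mL) at selected temperatures,
# standard handbook values.
WATER_DENSITY_G_PER_ML = {20.0: 0.99821, 22.0: 0.99777, 25.0: 0.99705}

def density_assumption_error_pct(temperature_c):
    """Relative volume error (%) from assuming a density of exactly 1 g/mL.

    For a 1 g sample: the assumed volume is 1.0 mL, while the actual
    volume is 1 / true_density mL. Negative result = volume understated.
    """
    true_density = WATER_DENSITY_G_PER_ML[temperature_c]
    assumed_volume = 1.0 / 1.0
    actual_volume = 1.0 / true_density
    return (assumed_volume - actual_volume) / actual_volume * 100.0
```

At 25 °C this assumption alone understates the delivered volume by roughly 0.3 percent, consistent with the error range cited above for uncorrected measurements.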
An alternative to referencing published density values is to measure density with a commercial densitometer or pycnometer. For reliable results, these instruments require calibration just like other laboratory instruments and require care to avoid measurement error.
One last drawback of gravimetry is the inability to simultaneously measure each individual channel in multichannel liquid handling devices. With gravimetry, individual aliquots can be measured, or multiple dispenses may be made and the total weight then used to calculate the average volume. However, to measure the performance of single channels, each channel has to be tested one tip at a time, and this is time consuming and tedious. Testing each channel one time in a 96-channel device, for example, would require 96 dispenses.
In summary, gravimetric calibration is best suited for measuring the performance of single-channel devices handling larger liquid volumes, usually above 200 to 1000 microliters (the precise lower limit for effective use of gravimetry depends on the tightness of the tolerance to be met and the quality of the measuring equipment and procedure employed).
During fluorometric calibration, a beam of ultraviolet light is shone on a sample at one wavelength, called the excitation wavelength. This causes the molecules to absorb light and enter an excited electronic state. Release of this excess energy results in the emission of light at a different, longer wavelength, called the emission wavelength. A detector is used to measure how much light is emitted at the emission wavelength. Precision is measured by comparing relative fluorescence levels in different samples.
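When fluorometry is used this way, the usual figure of merit is the coefficient of variation across replicate wells. A minimal sketch, using only the Python standard library (the readings shown are illustrative, not real data):

```python
import statistics

def cv_percent(fluorescence_readings):
    """Coefficient of variation (%) across replicate fluorescence readings.

    Because only relative signal levels are compared, this quantifies
    precision without requiring any absolute volume standard.
    """
    mean = statistics.mean(fluorescence_readings)
    return statistics.stdev(fluorescence_readings) / mean * 100.0
```

For example, replicate readings of 100, 102, 98, and 100 relative fluorescence units give a CV of about 1.6 percent, regardless of what absolute volume produced them.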
Fluorescent dyes are very photoactive and are capable of generating a strong signal at very low volumes. Small samples can thereby generate large signals at low concentrations and this facilitates fluorometry’s use in measuring small volumes, with measurements as low as five nanoliters possible.
A major drawback of fluorometry is the difficulty of achieving robust traceability, which often prevents its use in regulated laboratories. This deficiency stems from the fact that the strength of the fluorescent signal varies with the local chemical environment: factors such as solvent composition, pH, ionic strength, redox potential, and elapsed time can all alter the signal strength. This means that during a given measurement, the volume in one well can be compared to the volume in another well, provided all wells have very similar chemical composition. However, it is difficult to compare measurement readings day-to-day, assay-to-assay, or location-to-location unless traceability is established, typically by developing a standard response curve using a calibrated pipette or other traceable liquid delivery device.
Yet, the accuracy and traceability of this standardization depend on many factors, and at small volumes (where fluorometry is most often used) this sort of standardization can be difficult. For this reason, fluorometry is most often used to determine precision only and not accuracy, leaving the user to estimate how close the actual dispense is to the desired volume. Work is currently in progress to develop better traceability for fluorometric calibration methods.
Fluorescence methods are also affected by quenching and photobleaching. Fluorescent dyes can chemically degrade over time and are sensitive to temperature and pH. Some dyes are buffered, meaning they contain chemicals to prevent the pH from changing. However, unbuffered dyes suffer pH shifts as they absorb carbon dioxide from the air and become more acidic, which can affect the accuracy of the measurement reading. And because the properties of fluorescent dyes can shift in hours, standard curves should only be relied on for short periods of time. In addition, there are no commercially available fluorometric calibration technologies, although some methods have been published in the scientific literature.
In summary, fluorescent calibration is best suited for demonstrating precision across nearly identical conditions when testing small liquid volumes and when accuracy and traceable measurements are not required.
Photometric calibration requires a photometer and stable dyes that absorb light in the visible or ultra-violet range. To use single-dye absorbance photometry to measure volumes, a dye solution is delivered into a cuvette, a measuring cell, or a clear-bottomed microtiter plate. A beam of light at a specified wavelength is passed through the solution and the photometer measures the quantity of light that passes through. The amount of light that is absorbed is proportional to the amount of dye present, permitting a volume determination to be made.
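That proportionality can be made concrete with a minimal Beer's-law sketch. The model below is illustrative, not a vendor method: it assumes a test volume v of dye stock (concentration c0) is dispensed into a known diluent volume V, so the cell concentration is c0·v/(V + v) and the measured absorbance is A = k·c, where k lumps together the molar absorptivity and the optical path length:

```python
def volume_from_absorbance(absorbance, k, c0, diluent_volume):
    """Invert A = k * c0 * v / (V + v) to recover the dispensed volume v.

    absorbance     -- photometer reading A (dimensionless)
    k              -- lumped response constant (absorptivity * path length)
    c0             -- dye concentration of the stock being dispensed
    diluent_volume -- known volume V already in the cell
    """
    fraction = absorbance / (k * c0)          # equals v / (V + v)
    return fraction * diluent_volume / (1.0 - fraction)
```

With illustrative parameters k = 10, c0 = 1, and V = 200 microliters, an absorbance of about 0.476 corresponds to a dispensed volume of about 10 microliters.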
The photometric method produces good precision measurements and is less sensitive to environmental conditions than gravimetric and fluorometric calibration technologies. In addition, although photometric dyes do change due to temperature and pH, they tend to be more stable than fluorescent dyes. This means that the response from the photometric reader will be more consistent. In addition, photometry is typically immune to the presence of other chemicals that can have a large impact on a fluorescence signal. Therefore, photometry is better suited than fluorometry for making accuracy determinations.
Another benefit of photometric calibration methods is the ability to provide information about each channel in a multichannel device. Readily available and commonly used absorbance dyes include tartrazine and potassium dichromate. There is also a commercially available single-dye method for single-channel pipettes that is commonly used in the clinical laboratory industry.
ISO 8655-7 recognizes the use of single-dye photometry for liquid handling device calibration. However, according to this standard, photometric methods should be accompanied by an uncertainty analysis that describes the measurement uncertainty. This analysis may include error contributions such as accuracy of the photometer and reagents, dye instability, deviation from ideal Beer’s Law behavior and the like.
To account for the dyes as a source of error, data on the stability of the dye, either from the manufacturer or developed in-house through a stability or validation study, is important. Because light is passed through the sample and an optical wall, the optical quality of the microtiter plate or cuvette used in the method can affect the accuracy and precision of the measurement, and laboratories must also account for this.
Like all dye-based methods, photometric methods must be properly standardized to obtain quantitative results for accuracy measurements. The traceability of the method depends on many factors, including how carefully the standardization is carried out. For traceable photometric readings, a standard curve must be developed by using a known liquid delivery device (calibrated pipette) or by weighing volumes. This process can be time consuming and tedious. In addition, it assumes that the liquid handling device used to develop the standard curve is reliable, and this adds a level of uncertainty.
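The standardization step amounts to fitting and inverting a calibration line. A sketch using only the standard library, with illustrative data points standing in for absorbances measured against a calibrated pipette:

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit y = slope * x + intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Standard curve: absorbances read for volumes delivered by a
# calibrated pipette (illustrative numbers, not real data).
calibrated_volumes_ul = [10.0, 20.0, 30.0, 40.0]
measured_absorbances = [0.101, 0.199, 0.302, 0.398]
slope, intercept = fit_line(calibrated_volumes_ul, measured_absorbances)

def absorbance_to_volume_ul(absorbance):
    """Read an unknown dispense off the inverted standard curve."""
    return (absorbance - intercept) / slope
```

Note that every volume reported this way inherits the uncertainty of the pipette used to build the curve, which is exactly the traceability caveat described above.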
In summary, single-dye photometric calibration is well-suited for measuring precision, particularly when handling volumes too small to be weighed on a balance. Accuracy measurements can also be made; however, their robustness is limited by the difficulty of ensuring that the method is properly standardized and that an uncertainty analysis yields acceptable performance.
The ratiometric photometric calibration method is a refinement of photometry designed to overcome the accuracy limitations of traditional single-dye photometric volume measurements. Ratiometric photometry employs two standardized dyes and its measurement process produces absorbance readings in pairs that can be combined into absorbance ratio readings.
The primary benefit of this approach is its ability to improve the accuracy and robustness of measurement in comparison to non-ratiometric methods. Absorbance ratios can be measured more accurately than individual absorbances, leading to a higher degree of accuracy and precision in ratiometric methods versus traditional single-dye photometric methods. The underlying reason for this is that the absorbance of photometric calibration standards drifts over time, while ratios exhibit greater stability.
Compared to gravimetry, this method offers greater speed, ease-of-use and enhanced accuracy in small-volume measurements. Compared to fluorometry, ratiometric photometry provides accuracy as well as precision measurements and can do so to a traceable standard because the dyes function as an internal standard. Measuring the second dye in comparison to the first dye provides a nearly automatic compensation for the most common photometric error sources.
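The cancellation behind that internal standard can be illustrated with a simple two-dye model (a sketch under assumed conditions, not the actual commercial algorithm): the diluent pre-filled in the well carries a reference dye, the test volume carries a second dye read at a different wavelength, and because path length and photometer gain multiply both absorbances equally, they drop out of the ratio:

```python
def volume_from_ratio(a_test, a_ref, k_test, k_ref, diluent_volume):
    """Infer dispensed volume v from a paired two-dye absorbance reading.

    a_test, a_ref -- absorbances of the test dye and the reference dye
    k_test, k_ref -- per-dye response constants (absorbance per unit
                     stock concentration, at unit path length)
    diluent_volume -- known reference-dye diluent volume V in the well

    With c_test = v/(V + v) and c_ref = V/(V + v) in stock units,
    (a_test/k_test) / (a_ref/k_ref) = v / V, so v = ratio * V. The
    optical path length appears in both numerator and denominator
    and cancels.
    """
    ratio = (a_test / k_test) / (a_ref / k_ref)
    return ratio * diluent_volume
```

Errors that scale both readings together, such as meniscus-driven path-length variation or lamp intensity drift, leave the ratio unchanged, which is the "nearly automatic compensation" described above.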
Systems based on ratiometric photometry provide information about each individual channel in multichannel devices and good reproducibility plate to plate. However, for ratiometric photometry to produce benefits, it must use well-characterized plates and carefully calibrated solutions of good stability.
In addition, to function properly, ratiometric photometric methods require use of specially formulated dyes in order to produce accurate absorbance ratios. Lastly, this technology is not always preferred when measuring only larger volumes, as other technologies may produce adequate measurements more cost effectively.
In summary, ratiometric photometry calibrations provide strong benefits when measuring small liquid volumes for protocols requiring traceability and a high degree of accuracy per channel as well as precision.
Pharmaceutical laboratories have varying protocols, processes and requirements, and these can affect the choice of calibration technologies for liquid handling devices. Gravimetry, fluorometry, single-dye photometry and ratiometric photometry are common means for verifying liquid handling instrumentation, each with its own advantages and disadvantages. Understanding the assay and laboratory quality requirements, traceability needs, and tolerance for error, as well as the level of accuracy and precision required, can help laboratories make the right decision.
Table 1: Comparison of Calibration Technologies

| Technology | Advantages | Disadvantages |
| --- | --- | --- |
| Gravimetry | Good accuracy at high volumes; balances are usually readily available; offers traceability via weight sets | Problematic at low volumes; precision depends on environment |
| Fluorometry | Good precision; capable of low-volume measurement | Limited accuracy; poor traceability |
| Single-Dye Photometry | Good precision; insensitive to environment | Limited accuracy; traceability depends on preparation |
| Ratiometric Photometry | Good accuracy at all volumes; good precision at all volumes; insensitive to environment; traceability facilitated by dual-dye approach | Requires accurate photometer; requires accurate reagents |
About the Authors:
Richard Curtis, PhD, is Technical Director at Artel. As well as overseeing the company’s strategic direction, Dr. Curtis manages Research, Development and Engineering activities, directing the advancement of Artel’s core technology through new platforms, evolution of current products, and continued introduction of new applications. He leads the Artel technical team in the development of proprietary technology and in securing patents in photometric analytical systems, electronic circuitry, optics, and engineering physics. Dr. Curtis earned a BA cum laude in Physics at Harvard and a PhD in Nuclear Physics at Brown University.
George Rodrigues, PhD, is Senior Scientific Manager at Artel. Dr. Rodrigues is responsible for developing and delivering communications and consulting programs designed to maximize laboratory quality and productivity through science-based management of liquid handling. In his role as Artel’s leading consultant, he has helped numerous leading life science firms ensure laboratory data integrity while improving process efficiency. He participates in a number of international and national standards and quasi-regulatory bodies in the fields of metrology and liquid handling. Dr. Rodrigues earned a BS in Chemical Engineering at U.C. Berkeley and a PhD in Chemical Engineering at the University of Wisconsin.