Liquid handling automated performance monitoring
Liquid handling is the most common laboratory unit operation (LUO), so it stands to reason that real-time monitoring of liquid handling performance is the most common form of performance feedback in automated laboratory systems. The importance of monitoring this LUO has been shown in detail in the LabAutopedia article Importance_of_Integrating_a_Volume_Verification_Method_for_Liquid_Handlers. This article outlines the various methodologies commonly used, with links to more detailed articles.
Capacitive liquid level sensing
Detailed article: Capacitive liquid level sensing
Capacitive sensing was the original form of performance monitoring to appear on automated liquid handling workstations, and is still commonly used. Capacitive sensors use the electrical property of capacitance to make measurements. Capacitance is a property that exists between any two conductive surfaces within some reasonable proximity. A capacitor consists of two conductors (plates) that are electrically isolated from one another by a nonconductor (dielectric). When the two conductors are at different potentials (voltages), the system is capable of storing an electric charge; the storage capability of a capacitor is measured in farads. In a capacitive sensor, the sensor surface consists of these two conductive capacitor plates. The sensing action is based on the difference in dielectric constant between the sensor surface and the material being detected. Capacitive sensors can detect the presence of a wide range of materials, but require relatively close range. The sensor contains no moving parts and is rugged, simple to use, and easy to clean.
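The relationship just described can be made concrete with the standard parallel-plate formula, C = ε₀εᵣA/d. This is a minimal sketch; the plate area and gap below are arbitrary example values, not the geometry of any real sensor.

```python
# Parallel-plate capacitance: C = eps0 * eps_r * A / d
# (illustrative values only; not a real sensor geometry)

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def capacitance(eps_r, area_m2, gap_m):
    """Capacitance in farads for plates of the given area, gap, and dielectric."""
    return EPS0 * eps_r * area_m2 / gap_m

# The same geometry stores roughly 80x more charge with water
# (eps_r ~ 80) between the plates than with air (eps_r ~ 1.00059):
c_air = capacitance(1.00059, 1e-4, 1e-3)
c_water = capacitance(80.0, 1e-4, 1e-3)
print(c_air, c_water)
```

The large ratio between the two results is exactly why a tip entering an aqueous liquid produces such a distinct capacitance change.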
The change in capacitive signal can be used to indicate the presence of a material or change of environment, such as liquid meeting a pipette tip or cannula. Dual-probe capacitance level sensors can also be used to sense the interface between two immiscible liquids with substantially different dielectric constants. Calibration of the output states is highly dependent on configuration and materials and is very sensitive to changes. Liquid level sensing with a pipette tip or cannula is a prime example.
A weak electrical potential is created between the pipetting channel and the labware carrier. The liquid detection measurement is based on the difference in measured capacitance (picofarads) in the reference environment (air) vs. the process environment (liquid), which is directly related to the difference in the dielectric constant of the two environments. Most liquid handling software allows the user to set a threshold value for the delta in measured capacitance that will be equated to the event of the dispensing tip contacting the liquid surface. The higher the difference between the dielectric constants of the two environments, the easier the measurement. The dielectric constant of a material can change due to variations in temperature, moisture, humidity, material bulk density, and particle size. Polar compounds will have higher dielectric constants. The following are some common dielectric constants at 20°C, 1 Atmosphere (unless otherwise stated).
|Material||Dielectric Constant|
|(20°C, 1 ATM unless stated)||
|Air||1.00059|
|Hexane||1.9|
|Mineral oil||2.1|
|Teflon||2.1|
|Polyethylene||2.25|
|Polystyrene||2.6|
|Glass||3.7|
|Liquid ammonia||25 (-78°C)|
|Water||80.1|
Because capacitive sensors will react to their nearby non-contact environment as well as the contact environment, the dielectric constant of the liquid being sensed should not only differ measurably from air (1.00059), but also from common surrounding labware such as glass (3.7), polyethylene (2.25), polystyrene (2.6) or Teflon (2.1). In other words, the readout of the capacitive sensor will change somewhat as the dispensing tip approaches labware, change more as the tip enters the lip of the labware, and change even more as it nears but is not yet touching the liquid surface. Thus, the capacitive change upon the tip touching the liquid surface must be significantly greater than the net of all these non-contact changes for reliable liquid level sensing. Obviously, this eliminates using capacitive sensing to determine the presence of non-polar solvents, such as hexane (1.9) or mineral oil (2.1), which will be difficult or impossible to distinguish from the capacitive change caused by the dielectric constant of the surrounding labware.
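The threshold scheme described above can be sketched as follows. The trace values and threshold are invented examples; real systems calibrate these per labware and liquid class.

```python
# Threshold-based capacitive liquid level detection (sketch).
# Contact is declared when the measured capacitance rises more than
# a user-set delta above the in-air reference reading.

def detect_surface(readings_pf, reference_pf, threshold_pf):
    """Return the index of the first reading whose delta vs. the air
    reference exceeds the threshold, or None if no contact is seen."""
    for i, c in enumerate(readings_pf):
        if c - reference_pf > threshold_pf:
            return i
    return None

# Simulated trace (pF): small drift as the tip nears labware, then a
# jump when it touches the liquid surface.
trace = [2.00, 2.01, 2.03, 2.08, 2.60, 2.62]
print(detect_surface(trace, reference_pf=2.00, threshold_pf=0.3))  # 4
```

Setting the threshold well above the drift caused by nearby labware, but below the jump at liquid contact, is the calibration task the article describes.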
Pressure-based performance monitoring
The calculation of fluid flow rate by reading the pressure loss across a flow channel restriction is perhaps the most commonly used flow measurement technique in industrial applications. In the 18th century, Bernoulli first established the relationship between static and kinetic energy in a flowing stream. As a fluid passes through a restriction, it accelerates, and the energy for this acceleration is obtained from the fluid's static pressure. Consequently, the line pressure drops at the point of constriction. Part of the pressure drop is recovered as the flow returns to the unrestricted channel. The pressure differential (h) developed by the flow element is measured, and the velocity (V), the volumetric flow (Q) and the mass flow (W) can all be calculated using the following generalized formulas:
V = k (h/D)^0.5
Q = kA (h/D)^0.5
W = kA (hD)^0.5
where k is the discharge coefficient of the element (which also reflects the units of measurement), A is the cross-sectional area of the channel opening, and D is the density of the flowing fluid. The discharge coefficient k is influenced by the Reynolds number and by the "beta ratio," the ratio between the bore diameter of the flow restriction and the inside diameter of the channel.
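The three formulas above can be checked numerically. This is an illustrative sketch; the k, A, h, and D values are arbitrary example numbers, not calibration data for any real flow element.

```python
# Generalized differential-pressure flow formulas (sketch).
import math

def velocity(k, h, D):
    return k * math.sqrt(h / D)        # V = k (h/D)^0.5

def volumetric_flow(k, A, h, D):
    return k * A * math.sqrt(h / D)    # Q = kA (h/D)^0.5

def mass_flow(k, A, h, D):
    return k * A * math.sqrt(h * D)    # W = kA (hD)^0.5

# Example: water-like density, small channel, arbitrary discharge coefficient
k, A, h, D = 0.62, 1.0e-5, 500.0, 1000.0
Q = volumetric_flow(k, A, h, D)
W = mass_flow(k, A, h, D)
print(Q, W)
```

Note the internal consistency of the three formulas: Q = V·A, and W = Q·D (mass flow is volumetric flow times density), which falls directly out of the algebra.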
MEMS-scale pressure transducers inside the liquid dispensing channel of an automated liquid handling workstation can measure the pressure inside the channel during pipetting. Such sensors can be placed in multiple liquid channels, but they tend to be expensive and are typically not employed in configurations beyond 8 channels.
Liquid level detection
The signal from embedded pressure sensors changes as a pipette tip approaches a liquid surface, touches the surface, and is driven below it. This data can be used to control pipetting in real time. Pressure level sensing is independent of the polarity of the liquid environment.
Pressure data can be continuously recorded during pipetting. When the pressure goes outside the limits prescribed for different times during the aspiration or dispense cycles, an error event is registered and communicated. Macro programs can be written to handle these events in a variety of ways: they can ignore them, stop the method, ask for user intervention, or automatically attempt to deal with the error without user intervention. For example, in the event of a clog or unusually low pressure during aspiration, the macro could evacuate the tip or needle and try to aspirate again. It could even be programmed to move the tip slightly to avoid a swab in a tube, to make several aspiration attempts before requesting user intervention, or to flag the sample as an error and automatically move on to the next sample.
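The retry-style error handling described above might be sketched as follows. The aspirate, evacuate, and tip-move primitives are hypothetical stand-ins for a workstation's macro commands, not a real vendor API.

```python
# Retry-with-recovery handling of a pressure error event (sketch).

def aspirate_with_retry(aspirate, evacuate, nudge_tip, max_attempts=3):
    """Try to aspirate; on a pressure error, evacuate the tip, nudge it
    slightly, and retry. Returns True on success, False if the sample
    should be flagged and skipped."""
    for _ in range(max_attempts):
        if aspirate():          # True if pressure stayed within limits
            return True
        evacuate()              # clear a possible clog
        nudge_tip()             # move slightly to avoid an obstruction
    return False                # flag sample, move on to the next

# Demo: a transient clog that clears on the second attempt
attempts = []
def demo_aspirate():
    attempts.append(1)
    return len(attempts) >= 2   # fails once, then succeeds

print(aspirate_with_retry(demo_aspirate, lambda: None, lambda: None))  # True
```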
Aspiration error detection
- Tip blockage is detected when the pressure drops below the minimum set values. The minimum set values vary with aspiration/dispense time and are predetermined by the results of test runs. Limits can be widened or tightened to minimize false errors.
- Insufficient sample or incorrect aspiration is detected when the pressure rises above the maximum limit during aspiration.
Dispense error detection
- A blocked tip is detected when pressure rises above the maximum pressure limit during dispense.
- Leaking seals are detected when the pressure declines below the minimum pressure limit during dispense.
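The aspiration and dispense checks listed above amount to a simple limit-based classifier. The limit values below are placeholders; in practice they are predetermined from test runs, as the article notes.

```python
# Pressure-limit error classification for aspirate/dispense cycles (sketch).

def classify_aspirate(pressure, p_min, p_max):
    if pressure < p_min:
        return "tip blockage"
    if pressure > p_max:
        return "insufficient sample / incorrect aspiration"
    return "ok"

def classify_dispense(pressure, p_min, p_max):
    if pressure > p_max:
        return "blocked tip"
    if pressure < p_min:
        return "leaking seal"
    return "ok"

# Example placeholder limits (arbitrary units)
print(classify_aspirate(-120, p_min=-100, p_max=-20))  # tip blockage
print(classify_dispense(45, p_min=5, p_max=40))        # blocked tip
```

Widening or tightening p_min and p_max is the knob the article mentions for trading off sensitivity against false errors.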
Gravimetric-based performance monitoring
A long-standing and accurate method for real-time monitoring of liquid handling is gravimetric measurement using a balance or load cell. To translate a gravimetric measurement precisely to volume, the temperature and pressure must be known along with the specific gravity of the liquid. This method is not always practical, because the liquid transfer must take place on the load cell to register the change in mass and realize real-time control. Alternatively, the empty vessel can be tared on the load cell, moved to a dispense location, and returned to the load cell after dispensing to evaluate the mass of the dispensed liquid. This approach is time-consuming and does not afford real-time control, but may be suitable when dispensing tolerances are not stringent. Gravimetric methods are also less suitable for monitoring multi-channel dispensing, because each channel must be dispensed separately in order to record the mass change corresponding to each channel. Consequently, the gravimetric technique is most often used to monitor single-channel dispensing on a workstation that has a primary, fixed location for liquid dispensing, such as a dissolution workstation.
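The mass-to-volume translation is straightforward once the density at the measured temperature is known. A minimal sketch, using published densities of water (the tiny lookup table here is illustrative, not a full temperature/pressure correction):

```python
# Gravimetric volume verification: V = m / rho (sketch).

# Density of pure water, g/mL (well-known handbook values)
WATER_DENSITY_G_PER_ML = {20.0: 0.99821, 25.0: 0.99705}

def dispensed_volume_ul(mass_before_g, mass_after_g, temp_c):
    """Dispensed volume in microliters from the balance delta and the
    liquid density at the measured temperature."""
    rho = WATER_DENSITY_G_PER_ML[temp_c]
    return (mass_after_g - mass_before_g) / rho * 1000.0

# A 0.0998 g gain of water at 20 C corresponds to roughly 100 uL:
print(dispensed_volume_ul(10.0000, 10.0998, 20.0))
```

For liquids other than water, the same calculation applies with the specific gravity of that liquid substituted for the water density.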
Gravimetric methods can also be used to monitor the status of bulk reagents, solvents and carrier fluids, by placing the bulk container on a load cell (such as a "weigh pad"). Minimum mass set points can be defined to trigger a refill reminder or alarm, either during the course of an operation or at the time of system initialization.
Ultrasonic-based performance monitoring
Ultrasonic-based monitoring has been used in industrial bulk environments for some time, but has only recently appeared in the laboratory workstation environment (e.g. the Caliper Life Sciences “PING!”) as advances in piezoelectric crystal technology, driven primarily by the inkjet printing industry, have enabled miniaturization. This approach uses a series of brief high-voltage pulses delivered to a piezoelectric crystal to cause the crystal to oscillate (at 800 kHz in the case of “PING!”) and create ultrasonic vibrations that travel through the surrounding air. As these vibrations travel outward from the crystal and are reflected back off the nearest surface, the voltage to the crystal is turned off. The reflected vibrations impact the crystal, causing it to oscillate and generate a voltage that is monitored. The distance to the encountered surface is calculated from the signal transmission and return time together with the known velocity of the signals in the surrounding medium (air). In a laboratory workstation environment, the liquid level in a vessel can be determined by measuring the absolute distance from the piezoelectric sensor, or by measuring the location of the liquid meniscus relative to the top of the vessel or the bottom of an empty vessel.
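The time-of-flight calculation is simple: the pulse travels to the surface and back, so the distance is half of speed times round-trip time. A minimal sketch, assuming the nominal speed of sound in dry air at about 20 °C:

```python
# Ultrasonic time-of-flight distance measurement (sketch).

SPEED_OF_SOUND_AIR_M_S = 343.0  # dry air at ~20 C

def distance_mm(round_trip_time_s, speed_m_s=SPEED_OF_SOUND_AIR_M_S):
    """Distance to the reflecting surface in mm; the factor of 2
    accounts for the out-and-back path of the echo."""
    return speed_m_s * round_trip_time_s / 2.0 * 1000.0

# A meniscus ~34 mm below the sensor returns an echo in ~200 microseconds:
print(distance_mm(200e-6))  # ~34.3 mm
```

Because the speed of sound varies with temperature, absolute distances drift with ambient conditions, which is why relative measurements (discussed below) are often preferred.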
Ultrasonic monitoring has the advantage of being a non-contact technique, thus enabling a large number of measurements to be executed in a very short time and avoiding any concerns about tip washing and drying, both strong advantages in a multi-vessel environment such as a microplate. The technique is based upon acoustic reflections at points of density change from air, and so is not affected by the polarity, ionic strength or density of the liquid media nor by the media of the surrounding vessel. The measurement has a resolution of 0.1 to 0.3mm, and is not subject to the somewhat “fuzzy” signal state change that characterizes contact sensing techniques near and at the air/liquid interface. Ultrasonic monitoring can also be used to determine when a vessel contains no liquid or has been drawn down to contain no liquid – a measurement that would risk a tip collision using a contact technique.
Measurements (the speed of the ultrasonic signal) can be slightly affected by the ambient temperature and to a lesser degree by humidity, but these factors may be accounted for by measuring one surface, such as the surface of a liquid, relative to another surface, such as the top of the microplate. Irregular liquid surfaces (bubbles, froth) can affect ultrasonic distance measurements (contact techniques are also hampered by these conditions), but multiple ultrasonic measurements can be quickly made across a liquid surface (at a rate of 2 milliseconds / measurement) and the median value calculated to compensate for surface irregularities.
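The median-based compensation described above is a one-liner with Python's standard library. The readings here are invented examples of a trace containing bubble/froth outliers.

```python
# Median filtering of rapid ultrasonic readings to suppress outliers
# caused by bubbles or froth on the liquid surface (sketch).
from statistics import median

readings_mm = [12.1, 12.2, 12.1, 9.8, 12.3, 12.2, 15.0]  # two outliers
level = median(readings_mm)
print(level)  # 12.2
```

At roughly 2 ms per measurement, even a dozen readings across the surface add negligible time while making the reported level robust to a few bad echoes.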
Ultrasonic liquid level sensing is currently configured in a 1x mode, but could be configured as an 8x linear array. Smaller well configurations (384 and 1536 well microplates) may be addressed with smaller and more focused ultrasonic devices than would typically be used for 96-well plates. Such devices generally have a shorter range.
Imaginative use of ultrasonic sensing may go beyond just liquid level sensing, as it can be used just like radar or sonar to “map” the contour of a surface – in this case the working deck of a workstation. Such capability could be used to confirm the proper setup of a workstation deck prior to starting a run. It could also be used to eliminate manual mechanical teaching of deck fixture locations and/or for fine tuning positions over time.
- The Hamilton Company: Liquid Handling Technology
- Omega.com: Flow and level measurement
- Capacitive sensors
- Practical pharmaceutical laboratory automation, Brian Bissett, CRC Press, 2003 ISBN 0849318149
- Daniel Bernoulli, "Hydrodynamica", 1738. Britannica Online Encyclopedia.
- McNeil, A., Bitko, G., and Frank, R., "Improving the MEMS Pressure Sensor", Sensors Magazine, July 1, 2000.
- Provisional U.S. Patent Document: 700-02600_Provisional.pdf