Machines of all kinds vary in performance. Blood testing machines have precise calibration needs because their samples are delicate. Multiple variables affect blood test result ranges, including the machine itself, its manufacturer, environmental temperature, and of course human error in operating the machine. Sample freshness is also critical: a blood sample tested one day after it is drawn may give different results than the same sample tested two days after drawing. Microscopic examination of blood cells is less costly than automated methods but more labor intensive. Improved accuracy and finer precision are the reasons blood measurement instruments are so widely used; however, many variables in instrument use can affect the reliability of results.
One study of 4 samples and 3 machines concluded that temperature plays a major role in blood test results. Researchers found small but statistically significant differences between the samples on a single machine, and major differences between machines from different manufacturers. They recommended that machines and samples be kept at the temperature recommended by the manufacturer.
The same study also found significant differences in results recorded by different machine operators. As in the temperature experiment, operators produced small but statistically significant differences in the samples on a single machine, and the differences were again larger across different manufacturers' instrument models.
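The kind of between-machine comparison the study describes can be sketched with Welch's t-statistic. The readings below are hypothetical illustrations, not the study's data, and the rough |t| > 2 rule of thumb stands in for a proper critical-value lookup.

```python
from math import sqrt
from statistics import mean, stdev

def welch_t(a, b):
    """Welch's t-statistic for two independent samples of readings."""
    va, vb = stdev(a) ** 2, stdev(b) ** 2
    return (mean(a) - mean(b)) / sqrt(va / len(a) + vb / len(b))

# Hypothetical hemoglobin readings (g/dL) for one sample on two machines
machine_a = [13.4, 13.5, 13.3, 13.6, 13.5]
machine_b = [13.9, 14.0, 13.8, 14.1, 13.9]

t = welch_t(machine_a, machine_b)
# |t| well above ~2 suggests a statistically significant difference
```

With these illustrative numbers the difference between machines is far larger than the scatter within either machine, which is the pattern the study reported across manufacturers.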
Instrument Properties and Functions
Blood measurement instruments have a variety of working parts, processors, solutions, light sources, and electrical sources. Manufacturers use similar principles for measuring blood; however, many variables can affect blood test result ranges: flow velocity, turbulence, impedance or resistance to current flow, aspiration of cells into the chamber, dilution, lysing (destroying) cells, electrical current, solutions, voltage pulse thresholds, computer processors, optical sensors, frequency conductance, forward and angular light scattering, and fluorescent staining, to name a few.
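One of these principles, impedance counting with a voltage pulse threshold, can be sketched in a few lines: each cell passing the sensing aperture produces a voltage pulse, and pulses above the detection threshold are counted as cells. The signal values and threshold below are hypothetical, not taken from any particular instrument.

```python
def count_cells(pulses, threshold):
    """Count voltage pulses exceeding the detection threshold."""
    return sum(1 for v in pulses if v > threshold)

# Hypothetical digitized sensor signal (volts); small values are noise,
# large spikes are cells transiting the aperture
signal = [0.02, 0.81, 0.03, 1.20, 0.05, 0.04, 0.95, 0.01]

cells = count_cells(signal, threshold=0.5)  # → 3
```

This also shows why the threshold is one of the listed variables: set it too low and noise is counted as cells, too high and small cells are missed.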
Accurate blood test results rely on a measure of production quality. Machine calibration, or fine adjustment, is the way to assess and maintain machine accuracy. Calibration standards and technology are provided by the manufacturer, along with guidelines for proper machine use and contraindications (warnings against improper use). Calibration techniques must be learned by the operator or contracted out to a technician, and calibration skill varies significantly among researchers, technicians, and contractors. Dynamic calibration, rather than static (resting) calibration, allows data to be captured automatically and calibration to be performed continuously. Dynamic calibration is quick and gives precise information on machine performance.
Dynamic calibration is the method of choice for calibrating machines, reducing both machine and operator error. Generally, when an instrument is turned on, set-up and data collection procedures engage, and the collected data is automatically fed into the system. Correction data and error compensation are transferred to a measurement processor for subsequent use, compensating for errors throughout the instrument's entire life cycle of blood measurements. Samples run on a particular machine should therefore give similar results, and should give statistically similar results to the same instrument models in other laboratories. Blood sample integrity would then be the only other significant variable affecting blood test result ranges.
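The correction step can be sketched minimally by assuming a simple linear (gain/offset) model: the instrument reads known reference standards, a gain and offset are fitted by least squares, and every subsequent raw reading is corrected with them. The function names and values are illustrative assumptions, not any manufacturer's actual procedure.

```python
def fit_linear_correction(raw, reference):
    """Least-squares fit of reference = gain * raw + offset."""
    n = len(raw)
    mx = sum(raw) / n
    my = sum(reference) / n
    gain = (sum((x - mx) * (y - my) for x, y in zip(raw, reference))
            / sum((x - mx) ** 2 for x in raw))
    offset = my - gain * mx
    return gain, offset

# Known calibration standards vs. what the machine actually reported
reference = [5.0, 10.0, 15.0]   # true concentrations of the standards
raw       = [5.4, 10.9, 16.4]   # raw instrument readings (drifted)

gain, offset = fit_linear_correction(raw, reference)

def corrected(reading):
    """Apply the stored correction to every subsequent raw reading."""
    return gain * reading + offset
```

In a dynamic scheme this fit would be refreshed continuously as new reference data is captured, so the stored gain and offset track the instrument's drift over its life cycle.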
[Aultman Health Foundation, Canton, OH, 6/01, Equipment QC, Calibrations, Backbones for Blood Bank Citations](https://www.thefreelibrary.com/Equipment+QC,+calibration,+backbones+for+blood+bank+citations.(Brief...-a082573516)