Lead exposure often occurs with no obvious symptoms and frequently goes unrecognized. A blood-lead test is the best available way to measure lead exposure.1 The Centers for Disease Control and Prevention (CDC) monitors blood-lead levels (BLLs) in the United States through the National Health and Nutrition Examination Survey.2 National Health and Nutrition Examination Survey data from 1976 to 2016 confirm a decline3 in the BLL of the noninstitutionalized US population2 and of children (aged 1-5 years) in particular.
While the trend toward lower BLL is encouraging for public health, it is apparent that there is no identifiable safe BLL in children1,4–6 or adults.7 We previously outlined some steps laboratories may need to consider to achieve accurate testing of BLLs at lower concentrations, potentially including alteration of current analytical procedures.5
To further assist laboratories in meeting new challenges they may encounter while measuring lead at lower BLLs, the CDC offers the Lead and Multielement Proficiency (LAMP) program, a voluntary, free-of-charge external quality assurance program designed to promote high-quality whole-blood lead measurements in laboratories worldwide.8 Approximately 200 laboratories are enrolled in LAMP, of which approximately 50 are currently active US laboratories. The LAMP program aims to assist in the prevention, diagnosis, and treatment of the adverse health effects of lead exposure, both in the United States and abroad, by helping laboratories produce high-quality blood-lead measurements. The program is open for enrollment to any laboratory; current participants comprise a mix of government, academic, and private laboratories. The duration of participation has ranged from the full 47 rounds to only a few months. In any event, participation allows laboratories to troubleshoot, gain information about their accuracy and precision, and evaluate their own internal application of good laboratory practices. Laboratories may not use LAMP participation for accreditation or certification; however, the program does provide participants with detailed reports on how well they perform blood-lead analyses compared with the CDC and with their method peer group.
Laboratories that measure and report BLLs in people in the United States using methods categorized as moderate or high complexity under the Clinical Laboratory Improvement Amendments of 1988 (CLIA) are required to participate in proficiency testing,6 for which the criterion for acceptable performance is currently ±4 μg/dL or ±10%, whichever is greater.9 In 2010, the CDC Advisory Committee on Childhood Lead Poisoning Prevention recommended that this criterion be tightened to ±2 μg/dL or ±10%, whichever is greater.10 In 2016, the Lead Poisoning Prevention Subcommittee of the National Center for Environmental Health/Agency for Toxic Substances and Disease Registry Board of Scientific Counselors reiterated the importance of the 2010 Advisory Committee on Childhood Lead Poisoning Prevention recommendation.11 In 2010, the Advisory Committee on Childhood Lead Poisoning Prevention noted that when proficiency testing data were evaluated at the tighter criteria, laboratories using anodic stripping voltammetry were the most likely to be unsuccessful (eg, 25% of 3010B laboratories, 22% of LeadCare II laboratories, and 9% of LeadCare I laboratories), but the majority of laboratories measuring blood lead were successful at the tighter criteria. In January 2017, the CDC LAMP program changed its criteria for successful participation in blood-lead measurement from ±4 μg/dL or ±10% to ±2 μg/dL or ±10%, whichever is greater. This change was made to encourage laboratories to maintain the accuracy and precision that may be required under CLIA in the future and that is more appropriate for reporting blood-lead levels less than 5 μg/dL. Because CLIA applies to US laboratories, this report summarizes US LAMP participants' performance over the latest 5 challenges using these more stringent evaluation criteria.
The CDC's LAMP laboratories are divided into 4 method groups based on the technology used to measure blood lead: (1) inductively coupled plasma mass spectrometry (ICP-MS), (2) graphite furnace atomic absorption spectroscopy (GFAAS), (3) LeadCare II, and (4) LeadCare Ultra/Plus. ICP-MS is used in laboratory-developed tests categorized as high complexity under CLIA and is becoming a common tool in many public health and private clinical laboratories because of its high degree of specificity, sensitivity, and selectivity, while also offering multielement analysis from a small sample size. Many clinical laboratories today use GFAAS in their high-complexity laboratory-developed tests to measure blood lead; GFAAS is precise and dependable as a single-element laboratory-based technique. In addition, approximately 26% of LAMP's US participants are LeadCare II or LeadCare Ultra/Plus users. LeadCare II is a point-of-care blood-lead analyzer that requires blood to be stored at 10°C to 32°C and mixed with treatment reagent within 24 hours of collection. The LeadCare II test is categorized as a CLIA-waived test, for which participation in proficiency testing is recommended but not required. LeadCare Ultra and LeadCare Plus are newer blood-lead techniques, also based on anodic stripping voltammetry, categorized as moderately complex tests under CLIA; they require blood to be stored at 1°C to 25°C and mixed with treatment reagent within 72 hours of collection. LeadCare Ultra and LeadCare Plus use the same test kit materials, but LeadCare Ultra can test up to 6 blood samples simultaneously while LeadCare Plus can test only 1 blood sample at a time.
The preparation of LAMP samples begins with the collection of blood from lead-dosed cows: cows receive an oral dose of lead nitrate in gelatin capsules and are maintained at the US Department of Agriculture's US Dairy Forage Research Center in Sauk City, Wisconsin (University of Wisconsin-Madison IACUC-approved protocol A005901). After collection, the blood is stored frozen (−25°C) until it is ready to be used. In addition to lead, LAMP also provides participating laboratories with target concentrations for cadmium, mercury, selenium, and manganese. The target concentrations of the pools are set more than a year in advance by the LAMP program coordinator to reflect the evolving needs of blood-lead testing. If needed, samples are spiked with certified cadmium and mercury standards (1000 mg/L), traceable to the National Institute of Standards and Technology. The leaded bovine blood pools are mixed to the appropriate concentration and then dispensed into 2-mL cryovials. Before dispensing, the CDC screens the cryovials to ensure that no background contamination is present. After dispensing, we test the cryovials for homogeneity and assign target values for the ICP-MS and GFAAS method groups using ICP-MS measurements performed at the CDC.12 The LeadCare II and LeadCare Ultra/Plus results are used to set the target values for the LeadCare users. Samples are stored frozen (−25°C) until shipped to participants. Each participant receives 2 vials of each pool for testing in duplicate on 2 separate runs (n = 8 results reported per pool). This testing scheme allows LAMP program staff to evaluate within-run, as well as between-run, accuracy and precision for each participating laboratory.
We evaluated LAMP participant performance data for the last 5 challenges, distributed in the 5 quarters between January 2017 and March 2018. As previously described, during a challenge each participating laboratory analyzes a set of 3 unknown blood samples and reports its results to the CDC. The CDC compiles the results by analytical method and reports both the laboratory group summary statistics and the individual laboratory summary results compared with the CDC target value and the laboratory group or consensus means.4 The CDC assigns the blood-lead target values for ICP-MS and GFAAS using CDC ICP-MS data, traceable to the National Institute of Standards and Technology, for the 15 pools included in this study (range: 0.7-47.5 μg/dL). Because of the logistics of obtaining, mixing, dispensing, checking homogeneity, setting the target values, and shipping the blood samples, the LAMP program is not able to provide fresh or unrefrigerated blood per the requirements of the LeadCare blood-lead test systems. We therefore derived method-specific target values for LeadCare II and LeadCare Ultra/Plus using in-house CDC LeadCare II and LeadCare Ultra/Plus data on all program challenge pools. For blood-lead target value computation for both LeadCare instrument classes, if results were reported as less than the limit of detection (LOD), we imputed the actual LOD for that particular instrument (3.3 μg/dL for LeadCare II or 1.9 μg/dL for LeadCare Ultra/Plus) and averaged 8 results for the LeadCare LAMP target values. We applied the more stringent acceptability criteria of ±2 μg/dL or ±10%, whichever is greater. When a blood-lead test result was submitted as less than the LOD, it was treated as the LOD reported by the laboratory (for laboratory-developed tests) or by the test manufacturer (for the LeadCare tests).
The result was treated as acceptable if the LOD plus 2 μg/dL was within the acceptable range for the method group, or if the target value was less than the reported LOD.
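The evaluation logic described above can be sketched in a few lines of Python. This is a simplified illustration with hypothetical function names, not the CDC's code; the special-case handling of less-than-LOD submissions is condensed to the imputation step used for target-value computation.

```python
def acceptable(reported, target, abs_tol):
    """Pass if the result is within max(abs_tol, 10% of target) of the target.

    abs_tol is 2.0 for the LAMP criterion and 4.0 for the current CLIA criterion.
    """
    return abs(reported - target) <= max(abs_tol, 0.10 * target)


def lamp_target(results, lod):
    """Target value for a LeadCare method group: results reported as
    less than the LOD (here represented as None) are imputed at the LOD,
    then the 8 results are averaged."""
    imputed = [lod if r is None else r for r in results]
    return sum(imputed) / len(imputed)
```

For example, a result of 8.0 μg/dL against a 5.0 μg/dL target fails the ±2 μg/dL or 10% criterion but passes the ±4 μg/dL or 10% criterion, since the absolute tolerance dominates at low concentrations.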
Geometric means and 95th percentiles
We calculated geometric means and 95th percentiles for blood lead using SUDAAN version 11.0.0 (Research Triangle Institute, Research Triangle Park, North Carolina). The SUDAAN uses sample weights and calculates variance estimates that account for the complex survey design.
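For illustration, the point estimates themselves (setting aside the complex-survey variance estimation, which is the reason SUDAAN is used) amount to weighted computations on the log scale. A minimal sketch, with hypothetical function names and a simple lower-bound rule for the weighted percentile:

```python
import math


def weighted_geometric_mean(values, weights):
    """Exponential of the weighted average of log-transformed values."""
    total = sum(weights)
    return math.exp(sum(w * math.log(v) for v, w in zip(values, weights)) / total)


def weighted_percentile(values, weights, p):
    """Smallest value at which the cumulative weight reaches p of the total."""
    pairs = sorted(zip(values, weights))
    total = sum(weights)
    cum = 0.0
    for v, w in pairs:
        cum += w
        if cum >= p * total:
            return v
    return pairs[-1][0]
```

With equal weights these reduce to the ordinary geometric mean and empirical percentile; the sample weights shift both estimates toward the over- or under-sampled subgroups they represent.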
Least squares means and variances
We fit the measurement results for each sample with a general linear model (analysis of variance). The model includes “method” as a fixed effect and both “laboratory” and “run” as random effects (each laboratory has 2 runs for each sample, and each run has 2 vials for measurement). We obtained least squares means for each method from the model estimation. We used a separate general linear model with only “laboratory” and “run” as random effects (without the “method” variable) to estimate the variance components for “laboratory,” “run,” and measurement error. We used SAS version 9.4 (SAS Institute, Inc, Cary, North Carolina) to run PROC GLM with a RANDOM statement.
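For readers without SAS, the balanced design described above (2 runs per laboratory, 2 vials per run) admits a method-of-moments sketch of the variance-component estimation from expected mean squares. The simulated data and variance values below are illustrative, not LAMP results:

```python
import numpy as np

rng = np.random.default_rng(0)

# Balanced design as in the text: laboratories (random), 2 runs per
# laboratory, 2 vials measured per run.  True components are illustrative.
n_lab, n_run, n_vial = 500, 2, 2
var_lab, var_run, var_err = 1.0, 0.25, 0.09
mu = 5.0  # illustrative true blood-lead concentration, ug/dL

lab_eff = rng.normal(0.0, np.sqrt(var_lab), n_lab)
run_eff = rng.normal(0.0, np.sqrt(var_run), (n_lab, n_run))
noise = rng.normal(0.0, np.sqrt(var_err), (n_lab, n_run, n_vial))
y = mu + lab_eff[:, None, None] + run_eff[:, :, None] + noise

# ANOVA mean squares; expected values are:
#   MS_error     = var_err
#   MS_run(lab)  = var_err + n_vial * var_run
#   MS_lab       = var_err + n_vial * var_run + n_run * n_vial * var_lab
run_means = y.mean(axis=2)
lab_means = run_means.mean(axis=1)
grand = lab_means.mean()

ms_err = ((y - run_means[:, :, None]) ** 2).sum() / (n_lab * n_run * (n_vial - 1))
ms_run = n_vial * ((run_means - lab_means[:, None]) ** 2).sum() / (n_lab * (n_run - 1))
ms_lab = n_run * n_vial * ((lab_means - grand) ** 2).sum() / (n_lab - 1)

# Method-of-moments estimates of the three variance components.
est_err = ms_err
est_run = (ms_run - ms_err) / n_vial
est_lab = (ms_lab - ms_run) / (n_run * n_vial)
```

With 500 simulated laboratories the recovered components land close to the true values; a mixed-model fit (as PROC GLM with a RANDOM statement reports) gives the same answer for a balanced design.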
We summarized comparative results for lead concentrations in blood reported by our US LAMP laboratories in Table 1. The LeadCare II method group (LOD: 3.3 μg/dL) does not have summary statistics for the 4 samples with blood-lead concentrations less than 3.3 μg/dL (samples 1-4); in all instances, results for these samples were appropriately reported as less than the LOD by all LeadCare II respondents. In 13 of the 15 samples, the variance of ICP-MS and GFAAS results is smaller than that of both LeadCare II and LeadCare Ultra/Plus. In addition, in pools greater than 3.0 μg/dL, the LeadCare II variance is less than that found for LeadCare Ultra/Plus as reported by our LAMP participants.
Table 2 presents data for BLLs separated by method, listing the number of reported results that pass under each criterion: within ±2 μg/dL or 10% and within ±4 μg/dL or 10%. Ninety-two percent of ICP-MS results were acceptable under the more stringent ±2 μg/dL or 10% criterion, with 94 reported results falling outside acceptable limits; 61 of these 94 were acceptable under the ±4 μg/dL or 10% criterion, leaving 33 failures. Ninety-three percent of GFAAS-reported results were acceptable using ±2 μg/dL or 10%, while 98% were acceptable using ±4 μg/dL or 10%, with 16 failing results. Seventy percent of LeadCare Ultra/Plus results were within ±2 μg/dL or 10%, while 89% of the submitted results met the current CLIA accuracy requirements. LeadCare II had 92% passing results; however, 30% of the reported LeadCare II results were less than the LOD (Table 3). LeadCare Ultra/Plus, with a lower LOD, reported only 8% of results as less than the LOD across all concentrations. Figure 1A shows all LAMP results from this study that fell outside both ±2 μg/dL or 10% and ±4 μg/dL or 10%.
Three to 5 μg/dL BLL
The imprecision of reported results with target values between 3 and 5 μg/dL was consistently higher with LeadCare technologies, as shown in Table 1. The LAMP participant data for samples with blood-lead concentrations in the range of 3 to 5 μg/dL, stratified by instrument type, had the following variances: 0.26 for ICP-MS, 0.35 for GFAAS, 0.43 for LeadCare II, and 1.50 for LeadCare Ultra/Plus (Table 4). Only 2 of 1110 (0.2%) reported results were outside the acceptable range for challenge samples with lead concentrations between 3 and 5 μg/dL using the ±4 μg/dL criterion. Similarly, ICP-MS, GFAAS, and LeadCare II had 99% acceptable reported results using ±2 μg/dL. The results for LeadCare Ultra/Plus were within the target range based on ±2 μg/dL only 72% of the time (Table 4).
Using current CLIA-reporting requirements, the US LAMP participants between January 2017 and March 2018 accurately measured and reported BLLs in the range of 3 to 5 μg/dL 99.8% of the time, and all participating laboratories successfully reported BLLs in this range using ±4 μg/dL. However, only 95% of reported results fell within ±2 μg/dL; LeadCare Ultra/Plus data outside ±2 μg/dL accounted for 4% of the failing results.
Our data indicate that no ICP-MS or GFAAS laboratory reported data less than the LOD on samples in the 3 to 5 μg/dL range; all US LAMP ICP-MS and GFAAS laboratories participating in these challenges reported LODs less than 3 μg/dL. Thirty-three percent of LeadCare II results were reported as less than the LOD (Table 3) for blood samples with lead concentrations between 3 and 5 μg/dL (Table 1, samples 4-9). None of the LeadCare Ultra/Plus results were reported as less than the LOD. The average imprecision for LeadCare Ultra/Plus was about 5 times higher than that of GFAAS and ICP-MS, even though the LOD for LeadCare Ultra/Plus (1.9 μg/dL) is about 40% lower than that of LeadCare II (3.3 μg/dL).
Figure 1B shows the pattern of false-negative results by method (see Supplemental Digital Content Table 1, available at http://links.lww.com/JPHMP/A530) for blood-lead values in the range of 3 to 5 μg/dL. Overall, 2 results were outside the ±4 μg/dL reporting limits and 59 results were outside the ±2 μg/dL reporting limits.
Discussion and Conclusion
Our data for US LAMP laboratories reporting blood-lead concentrations on unknown samples show that even with the enhanced reporting criteria of ±2 μg/dL or 10%, 89% of reported results are accurate and fall within these acceptability limits. The pass rate improves when current CLIA-required limits are applied to submitted results, with 96% of all submitted results acceptable. We received 2748 reported results for these 5 challenges. Using the current criteria, 106 results were outside the acceptability criteria; narrowing the criteria to those currently used by the LAMP program, 288 reported results fall outside the acceptability range. If each submitted result represented a child's blood-lead test result, then under the enhanced ±2 μg/dL or 10% criteria, 11 of 100 results would be reported as false positives or false negatives. Focusing on the blood-lead pools with BLLs between 3 and 5 μg/dL, 5% (59/1110) of the submitted results fall outside acceptable limits.
One weakness of the LAMP data is the requirement that LeadCare II use fresh, unrefrigerated blood within 24 hours and that LeadCare Plus and LeadCare Ultra use fresh blood within 72 hours. Our LAMP blood is older than 72 hours because of program logistical necessities, such as homogeneity testing and characterization requirements along with packaging and shipping. The LAMP program compensates for this issue by providing separate LeadCare target values.
Analytical methods such as GFAAS, ICP-MS, and electrochemical methods are available for the routine measurement of BLLs less than 5 μg/dL; however, if the blood-lead reference level drops, laboratories wishing to engage in routine biomonitoring of blood lead could consider evaluating their performance, continuing to lower their levels of detection, and paying attention to preanalytical variables. As public health focuses on lower levels of blood lead in children and tries to correlate the possible health effects, laboratories may need to place a greater emphasis on accuracy and precision.
Implications for Policy & Practice
- Laboratories in the United States can accurately measure BLLs using current CLIA-reporting guidelines.
- Imposing tighter reporting limits decreases the proportion of acceptable results: narrowing acceptance criteria to ±2 μg/dL in the 3 to 5 μg/dL range yielded 5% failing BLL results reported by LAMP laboratories in this investigation.
- Of the blood-lead results in the 3 to 5 μg/dL range reported to LAMP, 99.8% were accurate using acceptability criteria of ±4 μg/dL.
- Methods requiring fresh blood and limitations on maximum delay in analysis times present an evaluation challenge for LAMP and other external quality assurance programs.
1. Centers for Disease Control and Prevention. Low Level Lead Exposure Harms Children: A Renewed Call for Primary Prevention. Atlanta, GA: Centers for Disease Control and Prevention, US Department of Health & Human Services; 2012. http://www.cdc.gov/nceh/lead/ACCLPP/Final_Document_030712.pdf. Accessed June 1, 2018.
2. National Center for Health Statistics. The National Health and Nutrition Examination Survey. Hyattsville, MD: US Department of Health & Human Services, Centers for Disease Control and Prevention; 2014. http://www.cdc.gov/nchs/nhanes/about_nhanes.htm. Accessed May 28, 2018.
3. Centers for Disease Control and Prevention. Fourth Report on Human Exposure to Environmental Chemicals, Updated Tables. Atlanta, GA: US Department of Health & Human Services, Centers for Disease Control and Prevention; 2015. http://www.cdc.gov/exposurereport/. Accessed May 28, 2018.
5. Caldwell KL, Cheng P-Y, Jarrett JM, et al. Measurement challenges at low blood lead levels. Pediatrics. 2017;140(2):e20170272.
6. Navas-Acien A, Guallar E, Silbergeld EK, Rothenberg SJ. Lead exposure and cardiovascular disease—a systematic review. Environ Health Perspect. 2007;115(3):472–482.
7. National Toxicology Program. NTP monograph on health effects of low-level lead. NTP Monogr. 2012;(1):xiii, xv–148.
8. Centers for Disease Control and Prevention. Laboratory Quality Assurance and Standardization Programs. Lead and Multielement Proficiency Program (LAMP). https://www.cdc.gov/labstandards/lamp.html. Published 2016. Accessed June 1, 2018.
9. Clinical Laboratory Improvement Amendments (CLIA) of 1988. Pub L No. 100-578, §353, 102 Stat. 2903 (1988).
12. Jones DR, Jarrett JM, Tevis DS, et al. Analysis of whole human blood for Pb, Cd, Hg, Se, and Mn by ICP-DRC-MS for biomonitoring and acute exposures. Talanta. 2017;162:114–122.