Agreement of Refractive Error Measures by Cycloplegic Refractive Error from GSE
The effect of the magnitude of refractive error (based on cycloplegic refraction from the GSE) on agreement was analyzed using the presence/absence of significant refractive error and by categorizing refractive error into four groups. Significant refractive error was defined as hyperopia >3.25 D, myopia >2.0 D, astigmatism >1.5 D, or anisometropia >1.0 D. For the Retinomax, the mean intertester differences of sphere and SE were similar between eyes with and without significant refractive error, within 0.05 D for both (p > 0.05; Table 2). The mean intertester difference of cylinder was slightly, but statistically significantly, larger in eyes with significant refractive error than in eyes without (0.04 D vs. −0.02 D, p = 0.001; Table 2). For the SureSight, the mean intertester differences of sphere, cylinder, and SE were similar between eyes with and without significant refractive error (within 0.05 D for sphere, 0.01 D for cylinder, and 0.06 D for SE; p > 0.05; Table 3).
For the Retinomax, the width of the 95% limits of agreement was greater for sphere, cylinder, and SE among eyes with significant refractive error than among eyes without (all p < 0.01; Table 2). For the SureSight, the width of the 95% limits of agreement was greater among eyes with significant refractive error for cylinder (p < 0.0001) but not for sphere or SE (p > 0.05; Table 3).
When cycloplegic refractive error measured during the GSE was analyzed as SE and grouped into four categories (myopia, ≤ −0.5 D; emmetropia, −0.5 to 1 D; mild hyperopia, 1 to 2 D; moderate to severe hyperopia, >2 D), the mean intertester difference of refractive error from the Retinomax did not differ by level of refractive error, but the 95% limits of agreement were significantly wider when the eye was either myopic or hyperopic (p < 0.0001; Table 2). Similarly, for the SureSight, the mean intertester difference did not differ by level of refractive error, but the 95% limits of agreement were significantly wider when the eye was either myopic or hyperopic (p < 0.0001; Table 3).
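The subgroup comparison above follows the standard Bland-Altman approach: within each cycloplegic SE category, the mean intertester difference and the 95% limits of agreement (mean difference ± 1.96 SD of the differences) are computed. A minimal sketch of that per-group computation, using hypothetical arrays of lay- and nurse-screener SE readings (`se_lay`, `se_nurse`) and gold-standard cycloplegic SE (`cyclo_se`), might look like:

```python
import numpy as np

def agreement_by_group(se_lay, se_nurse, cyclo_se):
    """Bland-Altman-style summary of intertester agreement by SE group.

    se_lay / se_nurse: spherical equivalents (D) from lay and nurse
    screeners; cyclo_se: cycloplegic SE (D) from the gold standard exam.
    All inputs are hypothetical, paired by eye.
    """
    diffs = np.asarray(se_lay) - np.asarray(se_nurse)
    cyclo = np.asarray(cyclo_se)
    # The four SE categories used in the study (boundaries in diopters).
    groups = {
        "myopia (<= -0.5 D)": cyclo <= -0.5,
        "emmetropia (-0.5 to 1 D)": (cyclo > -0.5) & (cyclo <= 1.0),
        "mild hyperopia (1 to 2 D)": (cyclo > 1.0) & (cyclo <= 2.0),
        "moderate/severe hyperopia (> 2 D)": cyclo > 2.0,
    }
    out = {}
    for name, mask in groups.items():
        d = diffs[mask]
        if d.size < 2:  # need at least 2 eyes for a sample SD
            continue
        mean, sd = d.mean(), d.std(ddof=1)
        # 95% limits of agreement: mean difference +/- 1.96 SD.
        out[name] = (mean, mean - 1.96 * sd, mean + 1.96 * sd)
    return out
```

This sketch reproduces only the per-group summaries; the study's significance tests comparing the widths of the limits across groups (robust tests for equality of variances18) are not shown, and a full analysis would also account for the correlation between the two eyes of a child.19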
Agreement of Refractive Error Measures by the Confidence Number of the Reading
The confidence number of the reading by either nurse or lay screeners was below the manufacturer’s recommended value in 123 (4.3%) eyes measured with the Retinomax and in 185 (6.8%) eyes measured with the SureSight.
For both instruments, the mean intertester differences were small whether or not readings from lay or nurse screeners had a confidence number below the manufacturer’s recommended value (p > 0.05; Tables 2, 3). However, the 95% limits of agreement for sphere, cylinder, and SE were wider when the confidence numbers from either nurse or lay screeners were below the manufacturer’s recommended value (p < 0.05; Tables 2, 3). This was particularly true for the Retinomax, for which the widths of the 95% limits of agreement for sphere and SE more than doubled when the confidence number of the reading was below the manufacturer’s recommended value (p < 0.01; Table 2).
This study evaluated the intertester agreement between trained lay and nurse screeners for measuring refractive error using the Retinomax and the SureSight. Based on data from large numbers of Head Start preschoolers across five clinical centers, this study found that lay and nurse screeners agreed well in measuring refractive error in a screening setting. The child’s age, refractive error, and the confidence number of the autorefractor reading had little impact on the mean intertester difference. However, the variation of intertester differences was larger (i.e., wider limits of agreement) when children had significant refractive error or when the confidence number of the reading was below the manufacturer’s recommended value.
The Retinomax and the SureSight are handheld autorefractors that have been widely used in vision screening and clinical practice for measuring refractive error by personnel with various levels of training. Previous VIP articles have reported that both the Retinomax and the SureSight correlated well with gold standard eye examinations for detecting vision disorders, whether tests were administered by licensed eye care professionals (sensitivity of 63 to 64% at a specificity of 90% and 51 to 52% at 94% specificity)1,20 or by trained nurse or lay screeners (sensitivity of 62 to 68% for the Retinomax and 61 to 64% for the SureSight at 90% specificity).2 Other studies have also indicated that the Retinomax and the SureSight provide valid measures of refractive error in young children.4–8 However, data on their intratester or intertester agreement are scarce. A few small-sample studies evaluated the intratester (i.e., test-retest) agreement of the two instruments in a variety of age groups,9–14 and their results suggested good intratester agreement for both instruments in the preschool age group (Table 4). The mean intratester differences from these studies were all within 0.15 D for the Retinomax and within 0.20 D for the SureSight, except for one study in very young children (aged 2 to 12 months) that showed large mean differences of 1.8 D for sphere and 1.3 D for cylinder.12 Our study evaluated intertester agreement between two types of testers (trained lay screeners vs. trained pediatric nurse screeners) for measuring sphere, cylinder, and SE in a very large sample of 3- to 5-year-old preschoolers (N = 1452). The mean intertester differences from our study are comparable to the mean intratester differences reported in the literature. However, the 95% limits of intertester agreement tend to be wider than those for intratester differences, likely because of the additional variation introduced by the second tester.
These findings on intertester agreement provide valuable information on the difference to be expected when refractive error is measured by two different screeners using the Retinomax or the SureSight. Considering the findings on intratester agreement from previous studies together with the intertester agreement found in the current study, both instruments seem to provide very consistent measures of refractive error in preschoolers whether the test is administered by the same screener or by a different one, supporting their use for detecting changes in refractive error over time.
The ranges of refractive error measures from the Retinomax and the SureSight are very different. The possible range of sphere is −18 to +23 D for the Retinomax versus −5.0 to +6.0 D for the SureSight, and the possible range of cylinder is −12 to +12 D for the Retinomax versus −4 to +4 D for the SureSight. Because of the SureSight’s narrower ranges, an out-of-range SureSight reading occurred in approximately 1% of eyes for sphere and 2% of eyes for cylinder, whereas no Retinomax measurement was out of range. We excluded out-of-range readings from the assessment of intertester agreement because the true value of sphere or cylinder is unknown in these cases; this exclusion may lead to underestimation of the limits of intertester agreement for the SureSight. Thus, any direct comparison of intertester agreement between the Retinomax and the SureSight needs to consider these differences.
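The exclusion step described above can be made concrete with a small helper. The numeric limits below are the instrument ranges quoted in the text; the helper itself (`in_range`) and its interface are hypothetical, for illustration only — in practice the instruments flag readings that clip at their measurement limits.

```python
# Instrument measurement limits (in diopters) as quoted in the text.
MEASUREMENT_RANGES = {
    "retinomax": {"sphere": (-18.0, 23.0), "cylinder": (-12.0, 12.0)},
    "suresight": {"sphere": (-5.0, 6.0), "cylinder": (-4.0, 4.0)},
}

def in_range(instrument, sphere, cylinder):
    """Return True if both sphere and cylinder (in diopters) fall within
    the named instrument's measurable range; readings failing this check
    would be excluded from the agreement analysis."""
    r = MEASUREMENT_RANGES[instrument]
    return (r["sphere"][0] <= sphere <= r["sphere"][1]
            and r["cylinder"][0] <= cylinder <= r["cylinder"][1])
```

For example, a −6.0 D sphere is measurable on the Retinomax but falls outside the SureSight’s range, which is why out-of-range exclusions arose only for the SureSight.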
In the subgroup analyses, we examined the impact of children’s age, cycloplegic refractive error, and the confidence number of readings on the intertester agreement of refractive error measurements by comparing the mean intertester differences and the 95% limits of agreement. Among 3- to 5-year-old preschoolers, the child’s age had no substantial impact on the mean intertester difference for either instrument. However, both cycloplegic refractive error (either myopic or hyperopic) and confidence numbers below the manufacturer’s recommended value were significantly associated with wider limits of intertester agreement for both the Retinomax and the SureSight. Our previous work also demonstrated that lower confidence numbers were associated with worse sensitivity and specificity for detecting vision disorders using the Retinomax.21 These data suggest that, when the confidence number is below the manufacturer’s recommended value, repeated testing is warranted to obtain a higher confidence number and a more reliable measure of refractive error.
In the VIP Study, the lay and nurse screeners received the same training, provided by a team of VIP Study personnel, for performing vision screening using the Retinomax and the SureSight. How screeners are trained may affect their ability to obtain reliable measures of refractive error with screening instruments, and it is reasonable to assume that intertester differences would increase if testers were untrained or trained differently. It is also important to note that the Retinomax has three test modes (normal, quick, and auto) and the SureSight has two (child and adult). The VIP Study used the auto mode for the Retinomax and the child mode for the SureSight; the findings obtained with these modes may not be generalizable to the other modes.
The strengths of this study include the large sample size; the inclusion of preschool children with various vision disorders (amblyopia, strabismus, astigmatism, significant refractive error); and the standardized training of screeners and application of the same screening protocol to both lay and nurse screeners. In addition, because the Retinomax and the SureSight measurements were taken without cycloplegia, the results more accurately represent what can be expected when conducting preschool vision screening with these instruments.
In conclusion, this evaluation of intertester agreement in a large sample of VIP participants demonstrated that trained lay and nurse screeners agree well in measuring refractive error with either the Retinomax or the SureSight on preschool children in a screening setting. These results are also consistent with the main findings of the VIP Study that the Retinomax and the SureSight are similarly effective when used by nurse and lay screeners. Although the preschooler’s age, refractive error, and the confidence number of the autorefractor reading have little impact on the mean intertester differences, the 95% limits of agreement are wider in eyes with significant refractive error or when the confidence number is below the manufacturer’s recommended value.
The Vision In Preschoolers Study Group
Executive Committee: Paulette Schmidt, OD, MS (Chair); Agnieshka Baumritter, MA; Elise Ciner, OD; Lynn Cyert, PhD, OD; Velma Dobson, PhD; Beth Haas; Marjean Taylor Kulp, OD, MS; Maureen Maguire, PhD; Bruce Moore, OD; Deborah Orel-Bixler, PhD, OD; Ellen Peskin, MA; Graham Quinn, MD, MSCE; Maryann Redford, DDS, MPH; Janet Schultz, RN, MA, CPNP; Gui-shuang Ying, PhD.
Participating Centers: (AA)=Administrative Assistant; (BPC)=Back-up Project Coordinator; (GSE)=Gold Standard Examiner; (LS)=Lay Screener; (NS)=Nurse Screener; (PI)=Principal Investigator; (PC)=Project Coordinator; (PL)=Parent Liaison; (PR)=Programmer; (VD)=Van Driver; (NHC)=Nurse/Health Coordinator.
Berkeley, CA: University of California Berkeley School of Optometry. Deborah Orel-Bixler, PhD, OD (PI/GSE); Pamela Qualley, MA (PC); Dru Howard (BPC/PL); Lempi Miller Suzuki (BPC); Sarah Fisher, PhD, OD (GSE); Darlene Fong, OD (GSE); Sara Frane, OD (GSE); Cindy Hsiao-Threlkeld, OD (GSE); Selim Koseoglu, MD (GSE); A. Mika Moy, OD (GSE); Sharyn Shapiro, OD (GSE); Lisa Verdon, OD (GSE); Tonya Watson, OD (GSE); Sean McDonnell (LS/VD); Erika Paez (LS); Darlene Sloan (LS); Evelyn Smith (LS); Leticia Soto (LS); Robert Prinz (LS); Joan Edelstein, RN (NS); Beatrice Moe, RN (NS).
Boston, MA: New England College of Optometry. Bruce Moore, OD (PI/GSE); Joanne Bolden (PC); Sandra Umaña (PC/LS/PL); Amy Silbert (BPC); Nicole Quinn, OD (GSE); Heather Bordeau, OD (GSE); Nancy Carlson, OD (GSE); Amy Croteau, OD (GSE); Micki Flynn, OD (GSE); Barry Kran, OD (GSE); Jean Ramsey, MD (GSE); Melissa Suckow, OD (GSE); Erik Weissberg, OD (GSE); Marthedala Chery (LS/PL); Maria Diaz (LS); Leticia Gonzalez (LS/PL); Edward Braverman (LS/VD); Rosalyn Johnson (LS/PL); Charlene Henderson (LS/PL); Maria Bonila (PL); Cathy Doherty, RN (NS); Cynthia Peace-Pierre, RN (NS); Ann Saxbe, RN (NS); Vadra Tabb, RN (NS).
Columbus, OH: The Ohio State University College of Optometry. Paulette Schmidt, OD, MS (PI); Marjean Taylor Kulp, OD, MS (coinvestigator/GSE); Molly Biddle, MA (PC); Jason Hudson (BPC); Melanie Ackerman, OD (GSE); Sandra Anderson, OD (GSE); Michael Earley, OD, PhD (GSE); Kristyne Edwards, OD, MS (GSE); Nancy Evans, OD (GSE); Heather Gebhart, OD (GSE); Jay Henry, OD, MS (GSE); Richard Hertle, MD (GSE); Jeffrey Hutchinson, DO (GSE); LeVelle Jenkins, OD (GSE); Andrew Toole, OD, MS (GSE); Keith Johnson (LS/VD); Richard Shoemaker (VD); Rita Atkinson (LS); Fran Hochstedler (LS); Tonya James (LS); Tasha Jones (LS); June Kellum (LS); Denise Martin (LS); Christina Dunagan, RN (NS); Joy Cline, RN (NS); Sue Rund, RN (NS).
Philadelphia, PA: Pennsylvania College of Optometry: Elise Ciner, OD (PI/GSE); Angela Duson (PC/LS); Lydia Parke (BPC); Mark Boas, OD (GSE); Shannon Burgess, OD (GSE); Penelope Copenhaven, OD (GSE); Ellie Francis, PhD, OD (GSE); Michael Gallaway, OD (GSE); Sheryl Menacker, MD (GSE); Graham Quinn, MD, MSCE (GSE); Janet Schwartz, OD (GSE); Brandy Scombordi-Raghu, OD (GSE); Janet Swiatocha, OD (GSE); Edward Zikoski, OD (GSE); Leslie Kennedy (LS/PL); Rosemary Little (LS/PL); Geneva Moss (LS/PL); Latricia Rorie (LS); Shirley Stokes (LS/PL); Jose Figueroa (LS/VD); Eric Nesmith (LS); Gwen Gold (BPC/NHC/PL); Ashanti Carter (PL); David Harvey (LS/VD); Sandra Hall, RN (NS); Lisa Hildebrand, RN (NS); Margaret Lapsley, RN (NS); Cecilia Quenzer, RN (NS); Lynn Rosenbach, RN (NHC/NS).
Tahlequah, OK: Northeastern State University College of Optometry. Lynn Cyert, PhD, OD (PI/GSE); Linda Cheatham (PC/VD); Anna Chambless (BPC/PL); Colby Beats, OD (GSE); Jerry Carter, OD (GSE); Debbie Coy, OD (GSE); Jeffrey Long, OD (GSE); Shelly Rice, OD (GSE); Shelly Dreadfulwater (LS/PL); Cindy McCully (LS/PL); Rod Wyers (LS/VD); Ramona Blake (LS/PL); Jamey Boswell (LS/PL); Anna Brown (LS/PL); Jeff Fisher, RN (NS); Jody Larrison, RN (NS).
Study Center: The Ohio State University College of Optometry, Columbus, OH: Paulette Schmidt, OD, MS (PI); Beth Haas (study coordinator).
Coordinating Center: University of Pennsylvania, Department of Ophthalmology, Philadelphia, PA: Maureen Maguire, PhD (PI); Agnieshka Baumritter, MA (project director); Mary Brightwell-Arnold (systems analyst); Christine Holmes (AA); Andrew James (PR); Aleksandr Khvatov (PR); Lori O’Brien (AA); Ellen Peskin, MA (project director); Claressa Whearry (AA); Gui-shuang Ying, PhD (biostatistician).
National Eye Institute, Bethesda, MD: Maryann Redford, DDS, MPH
This study is supported by grants from the National Eye Institute, National Institutes of Health, Department of Health and Human Services, Bethesda, MD: U10EY12534; U10EY12545; U10EY12547; U10EY12550; U10EY12644; U10EY12647; U10EY12648; R21EY018908.
Presented in part at the Annual Meeting of the Association for Research in Vision and Ophthalmology, Fort Lauderdale, FL, May 8, 2012.
Received February 21, 2013; accepted May 16, 2013.
1. Schmidt P, Maguire M, Dobson V, Quinn G, Ciner E, Cyert L, Kulp MT, Moore B, Orel-Bixler D, Redford M, Ying GS; Vision In Preschoolers (VIP) Study Group. Comparison of preschool vision screening tests as administered by licensed eye care professionals in the Vision In Preschoolers Study. Ophthalmology 2004; 111: 637–50.
2. Vision In Preschoolers (VIP) Study Group. Preschool vision screening tests administered by nurse screeners compared to lay screeners in the Vision In Preschoolers Study. Invest Ophthalmol Vis Sci 2005; 46: 2639–48.
3. Cordonnier M, De Maertelaer V. Comparison between two hand-held autorefractors: the SureSight and the Retinomax. Strabismus 2004; 12: 261–74.
4. Ying GS, Maguire M, Quinn G, Kulp MT, Cyert L. ROC analysis of the accuracy of noncycloplegic retinoscopy, Retinomax Autorefractor, and SureSight Vision Screener for preschool vision screening. Invest Ophthalmol Vis Sci 2011; 52: 9658–64.
5. Paff T, Oudesluys-Murphy AM, Wolterbeek R, Swart-van den Berg M, de Nie JM, Tijssen E, Schalij-Delfos NE. Screening for refractive errors in children: the PlusoptiX S08 and the Retinomax K-plus2 performed by a lay screener compared to cycloplegic retinoscopy. J AAPOS 2010; 14: 478–83.
6. Rowatt AJ, Donahue SP, Crosby C, Hudson AC, Simon S, Emmons K. Field evaluation of the Welch Allyn SureSight vision screener: incorporating the vision in preschoolers study recommendations. J AAPOS 2007; 11: 243–8.
7. Lang D, Leman R, Arnold AW, Arnold RW. Validated portable pediatric vision screening in the Alaska Bush. A VIPS-like study in the Koyukon. Alaska Med 2007; 49: 2–15.
8. Harvey EM, Dobson V, Miller JM, Clifford-Donaldson CE, Green TK, Messer DH, Garvey KA. Accuracy of the Welch Allyn SureSight for measurement of magnitude of astigmatism in 3- to 7-year-old children. J AAPOS 2009; 13: 466–71.
9. Harvey EM, Miller JM, Wagner LK, Dobson V. Reproducibility and accuracy of measurements with a handheld autorefractor in children. Br J Ophthalmol 1997; 81: 941–8.
10. Harvey EM, Miller JM, Dobson V, Tyszko R, Davis AL. Measurement of refractive error in Native American preschoolers: validity and reproducibility of autorefraction. Optom Vis Sci 2000; 77: 140–9.
11. Cordonnier M, Dramaix M. Screening for refractive errors in children: accuracy of the hand held refractor Retinomax to screen for astigmatism. Br J Ophthalmol 1999; 83: 157–61.
12. Adams RJ, Dalton SM, Murphy AM, Hall HL, Courage ML. Testing young infants with the Welch Allyn SureSight noncycloplegic autorefractor. Ophthalmic Physiol Opt 2002; 22: 546–51.
13. Rosenfield M, Chiu NN. Repeatability of subjective and objective refraction. Optom Vis Sci 1995; 72: 577–9.
14. Shoemaker JA. Repeatability of Welch Allyn SureSight autorefraction in adults. Invest Ophthalmol Vis Sci 2000; 41 (Suppl.): S301.
15. The Vision in Preschoolers (VIP) Study Group. Development and implementation of a preschool vision screening program in a mobile setting. NHSA Dialog 2005; 1: 16–24.
16. Holmes JM, Beck RW, Repka MX, Leske DA, Kraker RT, Blair RC, Moke PS, Birch EE, Saunders RA, Hertle RW, Quinn GE, Simons KA, Miller JM. The Amblyopia Treatment Study Visual Acuity Testing protocol. Arch Ophthalmol 2001; 119: 1345–53.
17. Bland JM, Altman DG. Measuring agreement in method comparison studies. Stat Methods Med Res 1999; 8: 135–60.
18. Brown MB, Forsythe AB. Robust tests for the equality of variances. J Am Stat Assoc 1974; 69: 364–7.
19. Liang KY, Zeger SL. Longitudinal data analysis using generalized linear models. Biometrika 1986; 73: 13–22.
20. Ying GS, Kulp MT, Maguire M, Ciner E, Cyert L, Schmidt P. Sensitivity of screening tests for detecting vision in preschoolers-targeted vision disorders when specificity is 94%. The Vision In Preschoolers Study Group. Optom Vis Sci 2005; 82: 432–8.
21. Vision In Preschoolers (VIP) Study Group. Impact of confidence number on the screening accuracy of the Retinomax Autorefractor. Optom Vis Sci 2007; 84: 181–8.
Keywords: refractive error; intertester agreement; preschool vision screening

© 2013 American Academy of Optometry