Default versus SetCCT Monitor Settings
Using the CCVT as a screening test only (i.e., analyzing the proportion passing the test), the pass rates under the default (78% passed; 95% CI, 72 to 83%) and SetCCT (87% passed; 95% CI, 82 to 91%) conditions differed significantly (p = 0.017). The same analysis restricted to subjects with CVD (default condition, 6.1% passed; 95% CI, 1.3 to 16.9%; SetCCT condition, 5.9% passed; 95% CI, 0.7 to 19.7%) did not show a significant difference (p > 0.05), but this analysis was underpowered.
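As a sketch of the statistics described above, the comparison of pass proportions and their 95% confidence intervals can be computed with a two-proportion z-test and Wilson score intervals. The counts below are hypothetical placeholders (the per-condition denominators are not restated in this passage), so the outputs will not reproduce the paper's exact values.

```python
import math

def wilson_ci(successes, n, z=1.96):
    """Wilson score 95% confidence interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n))
    return centre - half, centre + half

def two_proportion_z_test(x1, n1, x2, n2):
    """Two-sided z-test for equality of two independent proportions."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # equals 2 * (1 - Phi(|z|))
    return z, p_value

# Hypothetical counts (n = 100 per condition is an assumption for illustration):
lo, hi = wilson_ci(78, 100)
z, p = two_proportion_z_test(78, 100, 87, 100)
```

With larger samples than these invented n = 100 groups, the same 9-percentage-point difference would yield a smaller p value, which is why the underpowered CVD-only subanalysis could not detect a difference.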
Subset of Subjects Performing CCVT under Both Monitor Settings
Although there was not enough statistical power to analyze the paired data, we note that the 7 CVN subjects and the 24 CVD subjects who repeated the CCVT had screening test results that were the same for both test presentations. Of the 23 CVD subjects who failed, 6 had different diagnostic results on the two runs of the CCVT (Table 7).
One subject with a tritan deficiency was enrolled in the study, but we excluded these data from the aforementioned analyses. The subject’s aggregate results on all clinical tests, including a measured Moreland matching range (using the HMC II with the 4-degree attachment), supported the diagnosis. The subject passed the screening portion of the CCVT (which included tritan plates) but failed the tritan diagnostic section; however, that section would not normally be presented unless the screening test was failed.
DISCUSSION
We have evaluated the CCVT in a population of color normal and color vision–deficient subjects. In general, the screening performance of the CCVT was very similar to that of the other PIP tests, whereas the diagnostic performance was not.
The sensitivity and specificity of the CCVT were high and not statistically significantly different from the Ishihara and HRR tests. The point estimate of sensitivity for the HRR in this study was slightly lower than that found by Cole et al.,17 who used a more stringent fail criterion.
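For context, the sensitivity and specificity discussed above follow directly from a screening test's 2 × 2 table of outcomes against the reference diagnosis. The counts below are invented purely to illustrate the calculation and are not taken from the study.

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Invented counts: 45 of 50 CVD subjects fail the screen (true positives),
# and 90 of 100 color-normal subjects pass it (true negatives).
sens, spec = sensitivity_specificity(tp=45, fn=5, tn=90, fp=10)
```

Note that a more stringent fail criterion (as used by Cole et al. for the HRR) shifts cases between the pass and fail columns, raising sensitivity at the cost of specificity or vice versa.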
Diagnostically, the CCVT was relatively good at categorizing subjects as either having a deutan or protan deficiency (90% correctly classified compared with the anomaloscope). Cole et al.17 reported a point estimate of 86% correctly classified by the HRR and thus the CCVT appears to perform similarly to the HRR in this respect and better than the Ishihara test.2
Severity categorization by the CCVT was not as useful as that of the HRR test. The CCVT generally appears to categorize subjects with mild CVD appropriately, whereas the HRR shows somewhat finer discrimination among the severity categories relative to the anomaloscope range. Some of the subjects with dichromacy included in the study were labeled as “moderate” severity by both the CCVT and the HRR; Cole et al.17 also found this occurrence with the HRR test.
In comparing results between the default and SetCCT monitor conditions, the specificities were essentially the same, but the sensitivities could not be rigorously assessed because of the lack of statistical power. However, descriptive statistics showed that 18 of 24 (75%) subjects had the same diagnosis (type and extent) with either monitor condition. Of the six subjects with disparate results (Table 7), the SetCCT condition appeared to have somewhat better agreement with the anomaloscope than the default condition.
With the widespread availability of software applications across varied platforms (e.g., tablets and smartphones), vision testing software is now commonplace. Although presenting such software on multiple platforms is generally straightforward, for color vision testing it is not trivial. In this study, we used relatively common desktop computers and monitors; however, our results may not extend to other hardware/software configurations. Once printed, pigment-based tests should have fixed spectral reflectance profiles and, when used with a standard illuminator, should always present the same spectral distribution to the eye (note, however, that many printed pigments can fade with time and light exposure, although the extent is unknown). Software applications have their own inherent advantages, but those involving color stimuli are susceptible to variable testing conditions arising from changes in hardware (computers, displays, and display settings), software (display drivers and color management within applications), and the viewing environment. Thus, the generalizability of any computerized color vision software is suspect. Nonetheless, the screening performance of the CCVT found in this study is quite good.
Although the CCVT has definite advantages over many traditional clinical color vision tests, its use would still benefit from oversight. The CCVT improves upon most traditional tests in that it can be self-administered, strictly controls presentation time per plate, can automatically randomize the order of plate presentation, and has automated scoring. Conventional cautions are still needed in its use, however. It would still be possible to cheat on the test, perhaps by viewing it with a software application on a smartphone, using a colored filter,18,19 or simply having another individual supply the answers during the test. Thus, the CCVT does not eliminate the need for oversight by an examiner, although with the CCVT the examiner need not be a color vision specialist or even a health care provider. Additionally, a single examiner could perhaps administer multiple tests simultaneously, a difficult task with most traditional tests.
When used carefully and appropriately, the CCVT may be sufficient as a screening color vision test. It may also provide a diagnosis of type that is as accurate as that of the Richmond HRR. However, the CCVT tends to classify subjects as more severe than does the HRR. As with other PIP tests, a CCVT result would likely require confirmation with additional color vision tests conducted by an experienced clinician, especially with regard to the severity of the CVD.
Jason S. Ng
Southern California College of Optometry
Marshall B. Ketchum University
2575 Yorba Linda Blvd
Fullerton, CA 92831
The authors thank Christy Guenther, OD, Brian Shih, BS, and Sophia Liem, BS, for their contributions to data entry and data management. Additionally, we thank the creators of the test for supplying it for our research use. The Waggoner CCVT is now also marketed as ColorDx. The software has been ported to other device platforms and changed in some ways, which were not evaluated in this study.
None of the authors has any conflicts of interest to disclose.
Received July 16, 2014; accepted February 5, 2015.
1. Dain SJ. Clinical colour vision tests. Clin Exp Optom 2004; 87: 276–93.
2. Cole BL. Assessment of inherited colour vision defects in clinical practice. Clin Exp Optom 2007; 90: 157–75.
3. Rabin J, Gooch J, Ivan D. Rapid quantification of color vision: the cone contrast test. Invest Ophthalmol Vis Sci 2011; 52: 816–20.
4. Regan BC, Reffin JP, Mollon JD. Luminance noise and the rapid determination of discrimination ellipses in colour deficiency. Vision Res 1994; 34: 1279–99.
5. Goulart PR, Bandeira ML, Tsubota D, Oiwa NN, Costa MF, Ventura DF. A computer-controlled color vision test for children based on the Cambridge Colour Test. Vis Neurosci 2008; 25: 445–50.
6. Mancuso K, Neitz M, Neitz J. An adaptation of the Cambridge Colour Test for use with animals. Vis Neurosci 2006; 23: 695–701.
7. Barbur JL, Harlow AJ, Plant GT. Insights into the different exploits of colour in the visual cortex. Proc Biol Sci 1994; 258: 327–34.
8. Miyahara E, Pokorny J, Smith VC, Szewczyk E, McCartin J, Caldwell K, Klerer A. Computerized color-vision test based upon postreceptoral channel sensitivities. Vis Neurosci 2004; 21: 465–9.
9. Regan BC, Freudenthaler N, Kolle R, Mollon JD, Paulus W. Colour discrimination thresholds in Parkinson’s disease: results obtained with a rapid computer-controlled colour vision test. Vision Res 1998; 38: 3427–31.
10. Rodriguez-Carmona M, O’Neill-Biba M, Barbur JL. Assessing the severity of color vision loss with implications for aviation and other occupational environments. Aviat Space Environ Med 2012; 83: 19–29.
11. Berger A, Strocka D. Quantitative assessment of artificial light sources for the best fit to standard illuminant D65. Appl Opt 1973; 12: 338–48.
12. International Lighting Vocabulary Standard CIE S 017/E:2011. Vienna, Austria: Commission Internationale de l’Eclairage; 2011.
14. Alpern M, Wake T. Cone pigments in human deutan colour vision defects. J Physiol 1977; 266: 595–612.
15. Pokorny J (Ed). Congenital and Acquired Color Vision Defects. New York, NY: Grune & Stratton; 1979.
16. R Core Team. R: A Language and Environment for Statistical Computing. Vienna, Austria: R Foundation for Statistical Computing; 2013. Available at: http://web.mit.edu/r_v3.0.1/fullrefman.pdf. Accessed January 30, 2015.
17. Cole BL, Lian KY, Lakkis C. The new Richmond HRR pseudoisochromatic test for colour vision is better than the Ishihara test. Clin Exp Optom 2006; 89: 73–80.
18. Swarbrick HA, Nguyen P, Nguyen T, Pham P. The ChromaGen contact lens system: colour vision test results and subjective responses. Ophthalmic Physiol Opt 2001; 21: 182–96.
19. Hovis JK. Long wavelength pass filters designed for the management of color vision deficiencies. Optom Vis Sci 1997; 74: 222–30.
Keywords: color vision; color vision testing; color vision deficiency

© 2015 American Academy of Optometry