Lighting designers commonly rely on two metrics to evaluate, compare, and predict the color performance of white light sources in applications where color is important: correlated color temperature (CCT) and color-rendering index (CRI). While CCT describes the color appearance of light, whether it is visually warm, neutral, or cool, a light source's CRI rating describes how well it renders colors "naturally" compared to an ideal reference light source.
Originally developed more than 40 years ago to address fluorescent lamp technology, CRI has proved a durable, if somewhat limited, metric for evaluating color-rendering ability. The advent of solid-state lighting (SSL), however, has highlighted the shortcomings of CRI, and created demand for a new way to measure color rendering that not only addresses conventional sources, but accurately depicts SSL product performance.
The CRI value for a given light source is derived from how closely it renders a set of eight standard color samples compared to a reference source with the same CCT. Higher CRI generally indicates better color rendering.
The reference source is a blackbody radiator for light sources with a CCT up to 5000K and a mathematical model of daylight for light sources with a CCT over 5000K, both assumed to have a CRI rating of 100, the scale maximum. The greater the deviation of the tested source from the applicable reference source, the lower the CRI value for the given test color. The resulting CRI value for the light source is the average of the CRI numbers for the eight test colors.
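The procedure above can be sketched as follows. This is a simplified illustration, not a full CIE implementation: the 4.6 scaling factor is the CIE-defined constant relating a sample's color shift to its special index, and the ΔE values below are invented placeholders rather than measured data.

```python
def reference_type(cct_kelvin):
    """Pick the reference source class for a given CCT:
    a blackbody radiator up to 5000K, a daylight model above."""
    return "blackbody" if cct_kelvin <= 5000 else "daylight model"

def special_index(delta_e):
    """Special color-rendering index for one test sample:
    R_i = 100 - 4.6 * dE_i, where dE_i is the color shift of the
    sample under the test source versus the reference source."""
    return 100 - 4.6 * delta_e

def general_cri(delta_es):
    """General CRI (Ra): the average of the special indices
    for the eight standard test color samples."""
    assert len(delta_es) == 8, "Ra averages exactly eight samples"
    return sum(special_index(de) for de in delta_es) / 8

# Illustrative color shifts for the eight samples (not real data):
shifts = [1.2, 0.8, 1.5, 2.0, 1.1, 0.9, 1.8, 1.4]
print(reference_type(2800))          # blackbody
print(round(general_cri(shifts), 1)) # 93.8
```

Larger color shifts lower the per-sample indices, which in turn drag down the average, matching the rule that greater deviation from the reference source means a lower CRI.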
CRI uses only eight medium-saturation test samples, and because the rating is an average of these eight colors, it provides little certainty that other colors will be rendered as well as the overall rating suggests. Six additional test colors (R9 to R14) cover saturated colors (R9 to R12) as well as a Caucasian flesh tone and foliage; most notable is R9, a saturated red commonly found in prints, fabrics, food, flowers, and the like. These additional colors are optional for special purposes and are generally not included in the calculation of the CRI rating. The metric also loses accuracy at very warm and very cool color temperatures, where the spectral power distribution is imbalanced. For example, while a 2800K incandescent lamp is typically rated with a CRI close to 100, it may not render blues and greens as well because its output is relatively deficient at those wavelengths.
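To see why the eight-sample average can mislead, consider a hypothetical source whose numbers are invented purely for illustration: it scores well on the eight standard samples yet renders saturated red poorly, and the general rating never registers the problem.

```python
# Hypothetical special indices for one source; not measurements of any real lamp.
r1_to_r8 = [92, 95, 90, 94, 91, 93, 96, 89]  # the eight standard samples
r9_saturated_red = 20                        # saturated red rendered poorly

# Only the eight standard samples enter the general rating.
ra = sum(r1_to_r8) / 8
print(f"Ra = {ra:.1f}")           # Ra = 92.5 (looks excellent)
print(f"R9 = {r9_saturated_red}") # R9 = 20 (reds will look distorted)
```

Because R9 is excluded from the average, two sources with identical CRI ratings can render saturated reds very differently, which is why some specifiers ask for R9 separately.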
“The phenomenon of color rendering is complex and there are several aspects in color quality of light sources,” says Yoshi Ohno, leader of the Lighting and Color Group in the Sensor Science Division of the National Institute of Standards and Technology (NIST). “CRI measures accuracy of color with respect to the reference source (fidelity). There are other aspects of color quality—visual preferences, color-discrimination ability, and visual clarity—which are also important for daily life. It is well known that color fidelity cannot serve as a sole indicator of color quality of illumination sources.” Nonetheless, CRI's shortcomings were accepted by the lighting industry for decades, until SSL products became widely available for general lighting.
CRI and SSL
Solid-state sources produce white light either from blue- or ultraviolet-emitting LEDs with a phosphor coating, or by mixing red, green, and blue (RGB) or additional colored emitters, producing spectral emissions distinct from those of conventional technologies. These spectra often interact poorly with CRI, accentuating its limitations.
“In some cases, the CRI score does not agree well with what we see,” Ohno says. “In particular, some RGB combinations get low CRI scores although they visually look pretty good. There are other cases where CRI scores are good although the source visually looks very poor.”
“Recent developments in SSL have simply brought attention to the fact that high CRI does not accurately predict preference of natural objects—such as skin, fruits, vegetables, wood—or hue discrimination—seeing the difference between subtle shades of purple and maroon,” says Mark S. Rea, professor and director of the Lighting Research Center (LRC), a division of Rensselaer Polytechnic Institute. “Specifically, several studies have been published now showing that some SSL sources with low CRI can provide better color rendering than traditional sources with high CRI.”
Ohno points to neodymium incandescent lamps as an example of a technology in a similar position to SSL. These lamps filter out yellow emissions to enhance color contrast, an effect that energy-efficient SSL sources can readily replicate. Yet CRI penalizes such lamps, which rate around 77. As a result, SSL product development in this direction is discouraged: the continuing emphasis on high CRI is shaping the direction of SSL technology, sometimes at the cost of good products that combine high efficiency and good color rendering with a low CRI score.
To address these concerns, researchers have been working on alternatives to CRI. I will discuss two of these: the Color Quality Scale, championed by NIST, and the Gamut Area Index, championed by the LRC.