Wednesday 23 July 2008


I have recently been implementing a highly parallel bitmap grayscale kernel as part of another project (more on this later).

Interestingly, almost all sources on the web, including Wikipedia, still quote the following:

Y = (r * 0.3) + (g * 0.59) + (b * 0.11)

This was the correct technique for older CRT displays. Improvements in display technology (LCD, plasma, etc.) mean this is no longer the case.

The more correct form for modern displays is:

Y = (r * 0.2125) + (g * 0.7154) + (b * 0.0721)
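As a sketch of how the two weightings compare in practice (assuming 8-bit RGB input and NumPy, neither of which is specified in the post), the conversion is just a weighted sum over the channels:

```python
import numpy as np

# Older CRT-era weights quoted by most sources
OLD_WEIGHTS = np.array([0.3, 0.59, 0.11])
# Weights quoted above for modern (HDTV-style) displays
NEW_WEIGHTS = np.array([0.2125, 0.7154, 0.0721])

def to_grayscale(rgb, weights):
    """Weighted sum of R, G, B channels -> luminance (0-255)."""
    # rgb: array of shape (..., 3), values 0-255;
    # the dot product applies the per-channel weights.
    return np.clip(rgb @ weights, 0, 255).astype(np.uint8)

# Example pixel: the two formulas give noticeably different results
pixel = np.array([200, 100, 50], dtype=np.float64)
old_y = to_grayscale(pixel, OLD_WEIGHTS)  # 0.3*200 + 0.59*100 + 0.11*50
new_y = to_grayscale(pixel, NEW_WEIGHTS)
```

Because the operation is a per-pixel dot product with no dependencies between pixels, it vectorizes trivially, which is what makes a highly parallel kernel for it straightforward.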

As soon as I find the link to the paper / author that conducted this research I will post it here, as it is not immediately findable on the web.

The two equations produce markedly different results, with the second producing a much fuller, smoother luminance range on my monitors.

I wonder how many legacy (or new) video and image editing suites still use the older version? With the proliferation of LCD displays, it's well worth a quick update.


  1. Interestingly Wikipedia currently does cite these figures in its coverage of the Luma Coefficients of HDTV.

  2. Well spotted - perhaps they should amend the entry on grayscaling to show this is the "correct" version for HD.

    Of course "correct" is a very subjective term.

    The source cited there makes for an interesting read.