I have recently been implementing a highly parallel bitmap grayscale kernel as part of another project (more on this later).
Interestingly, almost all sources on the web, including Wikipedia (http://en.wikipedia.org/wiki/Grayscale), still quote the classic formula:
Y = (r * 0.3) + (g * 0.59) + (b * 0.11)
This was the correct formula for older CRT displays: these are essentially the ITU-R BT.601 luma weights, derived around the phosphors of that era. Improvements in display technology (LCD, plasma, etc.) mean it is no longer the best fit.
The more accurate form for modern displays is:
Y = (r * 0.2125) + (g * 0.7154) + (b * 0.0721)
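To make this concrete, here is a minimal sketch of the sort of per-pixel kernel I am talking about, written in CUDA and applying the newer weights to an interleaved 8-bit RGB buffer. The buffer layout, function name, and launch configuration are illustrative assumptions, not the actual kernel from my project.

    #include <stdint.h>

    // Illustrative per-pixel grayscale kernel (a sketch, not my project's code).
    // Reads an interleaved 8-bit RGB buffer and writes one gray byte per pixel,
    // using the newer weights quoted above.
    __global__ void rgb_to_gray(const uint8_t *rgb, uint8_t *gray, int n_pixels)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i >= n_pixels)
            return;

        float r = (float)rgb[3 * i + 0];
        float g = (float)rgb[3 * i + 1];
        float b = (float)rgb[3 * i + 2];

        // Y = (r * 0.2125) + (g * 0.7154) + (b * 0.0721)
        float y = 0.2125f * r + 0.7154f * g + 0.0721f * b;

        gray[i] = (uint8_t)(y + 0.5f); // round to the nearest 8-bit level
    }

Launched with something like rgb_to_gray<<<(n_pixels + 255) / 256, 256>>>(d_rgb, d_gray, n_pixels), every pixel converts independently of its neighbours, which is exactly what makes this operation so pleasant to parallelise.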
As soon as I find the link to the paper / author behind this research I will post it here; it is not immediately findable on Google, though the weights look like the ITU-R BT.709 (HDTV) luma coefficients.
The two equations produce markedly different results, with the second one producing a much fuller / smoother luminance range on my monitors.
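To put a number on the difference: for a pure green pixel (r = 0, g = 255, b = 0), the old weights give Y = 255 * 0.59 ≈ 150, while the new ones give Y = 255 * 0.7154 ≈ 182, a shift of more than 30 levels out of 256.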
I wonder how many legacy (or even new) video and image editing suites still use the older version? With the proliferation of LCD displays, it's well worth a quick update.