lee scratchy wrote: well yeah, D65 has a slightly reddish white point
Klaus Karcher wrote:in comparison to what?
lee scratchy wrote:
D50 ? D75 ? :) AFAIK all movies/photos are meant to be seen at D65
I hope my (rhetorical) question hit its target: it's meaningless to speak of the color cast of an emissive source without referring to a reference white.
D65, for example, is more yellowish than the native white point of a CRT (approx. 9300 K), but more bluish than D50, the standard light source in the graphic industry. When you switch between different monitor white points in a darkened room, you will notice a strong color cast at first, but after a few minutes your eyes adapt completely and your brain accepts the new setting as white. When you switch back to the old setting, you will notice a color cast in the opposite direction.
D65 is the standard white point for home, office, web, and video applications; D50 is the standard white point for soft proofing (when you have to compare images on the display with prints or proofs in a D50 viewing booth). Both are close to "average" natural daylight.
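For the curious: both D illuminants lie on the CIE daylight locus, so their chromaticities follow from the correlated color temperature alone. A minimal Python sketch using the standard CIE daylight-locus polynomials (the 1.4388/1.4380 factor is the revised radiation constant c2, which is why "D65" sits at ~6504 K rather than 6500 K):

```python
def daylight_xy(T):
    """CIE daylight-locus chromaticity (x, y) for a CCT T in kelvin."""
    if 4000 <= T <= 7000:
        x = -4.6070e9 / T**3 + 2.9678e6 / T**2 + 0.09911e3 / T + 0.244063
    elif 7000 < T <= 25000:
        x = -2.0064e9 / T**3 + 1.9018e6 / T**2 + 0.24748e3 / T + 0.237040
    else:
        raise ValueError("CCT outside the daylight-locus range")
    y = -3.000 * x**2 + 2.870 * x - 0.275
    return x, y

# Nominal CCTs are scaled by 1.4388/1.4380 (revision of c2):
for name, cct in (("D50", 5000 * 1.4388 / 1.4380),
                  ("D65", 6500 * 1.4388 / 1.4380)):
    x, y = daylight_xy(cct)
    print(f"{name}: x={x:.4f} y={y:.4f}")
```

This lands within about 0.0005 of the tabulated chromaticities, D50 at (0.3457, 0.3585) and D65 at (0.3127, 0.3290) -- the small residue comes from the tabulated values being integrated from the actual spectra rather than the locus fit.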
Generally speaking, it's important that the color and luminance of the monitor match the room illumination if you want to assess color reliably and without fatigue.
This goal is hard to attain with a CRT in an office or prepress environment: as mentioned, its native white point is close to 9300 K, so you have to reduce the blue and green gain to hit the target color temperature. The luminance drops accordingly, and you often end up below 80 cd/m2 at D50 -- definitely not enough for soft proofing purposes.
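To put a rough number on that luminance loss, here is a sketch of the calculation. The phosphor chromaticities (sRGB-like) and the 9300 K native white are illustrative assumptions, not measurements of any particular CRT:

```python
import numpy as np

def rgb_to_xyz_matrix(primaries_xy, white_xy):
    """RGB->XYZ matrix for the given primary/white chromaticities,
    normalized so RGB=(1,1,1) maps to the white point with Y=1."""
    P = np.array([[x / y, 1.0, (1 - x - y) / y] for x, y in primaries_xy]).T
    w = np.array([white_xy[0] / white_xy[1], 1.0,
                  (1 - white_xy[0] - white_xy[1]) / white_xy[1]])
    S = np.linalg.solve(P, w)   # per-channel scale factors
    return P * S

# Assumed CRT: sRGB-like phosphors, native white near 9300 K.
primaries = [(0.64, 0.33), (0.30, 0.60), (0.15, 0.06)]
native_white = (0.2831, 0.2971)   # ~9300 K on the daylight locus
d50 = (0.3457, 0.3585)

M = rgb_to_xyz_matrix(primaries, native_white)

# Gun drives needed to show D50 white; red would have to exceed 100%,
# so everything gets scaled down until the largest channel is at 1:
target = np.array([d50[0] / d50[1], 1.0, (1 - d50[0] - d50[1]) / d50[1]])
gains = np.linalg.solve(M, target)
gains /= gains.max()
luminance = (M @ gains)[1]   # Y relative to the native white

print("gains r/g/b:", np.round(gains, 3))
print("remaining luminance:", round(float(luminance), 2))
```

Under these assumptions the green and blue guns end up around 69% and 39% drive, and the white luminance falls to roughly 72% of the native value -- which is exactly how a 110 cd/m2 CRT lands below 80 cd/m2 at D50.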
LCDs, on the other hand, have the opposite problem: some vendors apparently wear "brighter is better" on their sleeves, and lately there are LCDs that can't get much *below* 200 cd/m2 -- far too much to be comfortable in a dim environment.