When computer monitors were first a thing, they used electron guns and cathode ray tubes. Electrons fired out of the electron gun hit the phosphors, causing photons to be emitted, which we perceive as light.

The colour data your computer produces runs from 0 to 255, i.e. 8-bit values. In this scheme, 127 (halfway between 0 and 255) represents half the brightness of 255, and roughly twice the brightness of 63, and so on.
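As a minimal sketch of that linear interpretation (the specific codes below are just illustrative):

```python
# Linear interpretation of 8-bit code values: brightness is taken to be
# proportional to the code itself.
for code in (63, 127, 255):
    relative_brightness = code / 255          # normalize to 0.0-1.0
    print(f"code {code:3d} -> relative brightness {relative_brightness:.2f}")
# code  63 -> relative brightness 0.25
# code 127 -> relative brightness 0.50
# code 255 -> relative brightness 1.00
```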

Older video cards with a VGA output converted these codes into voltages: a code of 255 might map to 1V, 127 to 0.5V, and so on. These signals were then routed, via a few amplifiers, to the monitor's electron guns. The problem is that the electron gun doesn't respond linearly: 0.5V doesn't produce half as many electrons as 1V, it produces far fewer. On top of that, the human eye doesn't perceive half the light as half as bright, because it has a roughly logarithmic response.
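A common way to model that non-linear response is a power law: light output is roughly proportional to the drive voltage raised to a gamma exponent. The exponent and voltage scale below are assumptions for illustration, not measurements of any particular tube:

```python
# Sketch of a CRT-style power-law response:
# light is proportional to (voltage / max_voltage) ** gamma.
CRT_GAMMA = 2.5        # assumed exponent; real tubes vary, roughly 2.2-2.5
MAX_VOLTAGE = 1.0      # assumed full-scale drive voltage (1V, as in the example above)

def crt_light_output(voltage: float) -> float:
    """Relative light output (0.0-1.0) for a given drive voltage."""
    return (voltage / MAX_VOLTAGE) ** CRT_GAMMA

print(crt_light_output(1.0))   # 1.0   -> full brightness
print(crt_light_output(0.5))   # ~0.18 -> far less than half, not 0.5
```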

So, if we want to perceive this system correctly (which also makes graphical data easy to manipulate), the values need to be corrected so that 127 looks about half as bright as 255, and so on. The solution is to apply a gamma correction curve: each value is scaled by a factor that varies with the value itself, in practice by raising the normalized value to the power 1/gamma. It is typically implemented using a "look-up table" or in the graphics card hardware.
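A minimal sketch of such a look-up table, assuming the simple power-law model above (real graphics pipelines use more elaborate curves, such as sRGB):

```python
# Build a 256-entry gamma-correction look-up table: each linear code value is
# raised to the power 1/gamma, so that after the display applies its own gamma,
# mid codes end up looking roughly half as bright as full white.
GAMMA = 2.2  # assumed display gamma

lut = [round(255 * (code / 255) ** (1 / GAMMA)) for code in range(256)]

def gamma_correct(code: int) -> int:
    """Map a linear 8-bit value to its gamma-corrected 8-bit value."""
    return lut[code]

print(gamma_correct(255))  # 255  -> full white stays at full white
print(gamma_correct(127))  # ~186 -> boosted so it *looks* about half as bright
print(gamma_correct(63))   # ~135
```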

This was all 20-25 years ago, when digital processing inside monitors was limited - there was literally no digital circuitry outside of your computer, it was all analog, and implementing this correction with analog electronics would have been quite hard. So it was done on the computer side.

Nowadays we are all used to LCD, plasma and OLED monitors -- all of which include digital processing as a necessity for the display technology, and most of which are connected via digital interfaces like HDMI, DisplayPort or DVI -- so this all sounds a bit strange. In fact, for LCD monitors at least, the panel's digital processing completely reverses the gamma correction your computer does, because an LCD's native response is roughly the inverse of a CRT's, with pixels switching over a small voltage range.
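To see why the two steps cancel, here is a round-trip sketch under the same assumed power-law model: the computer encodes with 1/gamma, and the display (the CRT phosphor, or the LCD's digital processing standing in for it) effectively applies gamma, so linear values come back roughly unchanged:

```python
GAMMA = 2.2  # assumed; encode and decode exponents must match for a clean cancel

def encode(linear: float) -> float:
    """Gamma correction applied on the computer side."""
    return linear ** (1 / GAMMA)

def display(encoded: float) -> float:
    """Display response: CRT phosphor, or an LCD's processing emulating it."""
    return encoded ** GAMMA

for value in (0.25, 0.5, 0.75):
    print(value, "->", round(display(encode(value)), 3))  # ~0.25, ~0.5, ~0.75
```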

So why do we still use this system when it seems that it is not necessary? It is important to remain compatible with old systems. When DVI came out, some CRT monitors supported it, and gamma correction was kept in so that those manufacturers could get by with the minimum amount of hardware. And DisplayPort & HDMI are both backwards compatible with DVI.

This is gamma correction, and gamma is the name of the parameter that controls the curve; a value of around 1.5-2.5 is typical. You can adjust the gamma parameter to your preference.
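As an illustration of how the gamma setting shifts the midtones (this is just the power-law model again, not a claim about any particular monitor):

```python
# How the same mid-grey code looks under different gamma settings:
# higher gamma pushes midtones darker, lower gamma lifts them.
mid_grey = 127 / 255

for gamma in (1.5, 1.8, 2.2, 2.5):
    displayed = mid_grey ** gamma
    print(f"gamma {gamma}: code 127 displays at {displayed:.2f} of full brightness")
```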

Brightness is literally the brightness of the black level. It effectively offsets the lowest levels of the image to make shadow detail visible. On a correctly calibrated monitor with a low black level (plasma or OLED), brightness should be set so that all detail is visible above the black level - so that RGB levels of 0 and 1 can be distinguished. Most people tend to increase brightness in an attempt to bring out detail, but you should really increase the contrast instead: that raises the peak brightness, increasing the range of levels available in the image, up to the point of white clipping at least.
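A rough sketch of how brightness and contrast controls are often modelled (an offset and a gain with clipping); the exact behaviour varies between monitors, so treat this as an assumption for illustration:

```python
def apply_controls(code: int, brightness: int = 0, contrast: float = 1.0) -> int:
    """Brightness offsets the black level; contrast scales towards peak white.
    Results are clipped to 0-255, which is where black crush / white clipping occur."""
    adjusted = contrast * code + brightness
    return max(0, min(255, round(adjusted)))

# Raising brightness lifts the darkest levels so codes 0 and 1 become distinguishable...
print(apply_controls(0, brightness=5), apply_controls(1, brightness=5))      # 5 6
# ...while raising contrast stretches the range, up to the point of white clipping.
print(apply_controls(180, contrast=1.3))   # 234
print(apply_controls(220, contrast=1.3))   # 255 (clipped)
```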
