• Gamma (was: Re: Television cameras: The changeover from lens turret to

    From J. P. Gilliver (John)@21:1/5 to rjfs@escapetime.myzen.co.uk on Sat May 14 16:04:01 2022
    On Sat, 14 May 2022 at 15:48:39, Roderick Stewart
    <rjfs@escapetime.myzen.co.uk> wrote (my responses usually FOLLOW):
    []
    > In the decoder, RGB signals are derived by matrixing the luminance
    > signal with the colour difference signals, which have also been
    > derived from the gamma corrected RGB signals in the encoder. The
    > "un-gamma" process is done to individual colours by the CRT itself.
    []
    I. e. the phosphor.
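
    For what it's worth, the chain Roderick describes can be sketched in a
    few lines of code. This is a minimal illustration only, assuming a pure
    power law with exponent 1/2.2 and Rec. 601 luma weights; real broadcast
    transfer functions also include a linear segment near black.

      GAMMA = 2.2

      def gamma_encode(linear):
          # Camera-side gamma correction: linear scene light -> 0..1 signal.
          return linear ** (1.0 / GAMMA)

      def encode(r, g, b):
          # Gamma-correct each primary, then matrix into luminance and
          # colour-difference signals.
          rp, gp, bp = gamma_encode(r), gamma_encode(g), gamma_encode(b)
          y = 0.299 * rp + 0.587 * gp + 0.114 * bp   # luma Y'
          return y, bp - y, rp - y                   # Y', B'-Y', R'-Y'

      def decode(y, b_y, r_y):
          # Receiver-side matrixing back to R'G'B'. The CRT's own power law
          # (the "un-gamma") then acts on each primary, so the set needs no
          # extra correction circuitry.
          rp, bp = y + r_y, y + b_y
          gp = (y - 0.299 * rp - 0.114 * bp) / 0.587
          return rp ** GAMMA, gp ** GAMMA, bp ** GAMMA

      # Round trip is linear: prints approximately (0.2, 0.5, 0.8).
      print(decode(*encode(0.2, 0.5, 0.8)))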

    Do modern LC displays, and other technologies - Plasma? OLED? - have a
    similar gamma curve to CRT phosphors? If not, presumably modern sets
    have an un-phosphor-gamma followed by gamma correction for whatever they
    _are_ using (assuming it's needed). Do you think we'll ever see a
    different (or no) curve start to be used, with archive material doomed to
    be viewed slightly wrongly, apart from by true enthusiasts? Or has this
    already happened? (I imagine it could be quite subtle so not thought
    worth bothering with by many.)
    --
    J. P. Gilliver. UMRA: 1960/<1985 MB++G()AL-IS-Ch++(p)Ar@T+H+Sh0!:`)DNAf

    They have a saying for it: /Geiz ist geil/, which roughly translates as, "It's sexy to be stingy". - Joe Fattorini, RT insert 2016/9/10-16

  • From NY@21:1/5 to G6JPG@255soft.uk on Sat May 14 18:12:22 2022
    "J. P. Gilliver (John)" <G6JPG@255soft.uk> wrote in message news:R$cU4NThT8fiFwr0@a.a...
    On Sat, 14 May 2022 at 15:48:39, Roderick Stewart <rjfs@escapetime.myzen.co.uk> wrote (my responses usually FOLLOW):
    > []
    >> In the decoder, RGB signals are derived by matrixing the luminance
    >> signal with the colour difference signals, which have also been
    >> derived from the gamma corrected RGB signals in the encoder. The
    >> "un-gamma" process is done to individual colours by the CRT itself.
    > []
    > I. e. the phosphor.

    > Do modern LC displays, and other technologies - Plasma? OLED? - have a
    > similar gamma curve to CRT phosphors? If not, presumably modern sets
    > have an un-phosphor-gamma followed by gamma correction for whatever
    > they _are_ using (assuming it's needed). Do you think we'll ever see a
    > different (or no) curve start to be used, with archive material doomed
    > to be viewed slightly wrongly, apart from by true enthusiasts? Or has
    > this already happened? (I imagine it could be quite subtle so not
    > thought worth bothering with by many.)

    I imagine that gamma is "hardcoded" into the design of the whole camera,
    signal processing, broadcasting, reception and display chain, so that it
    cannot be changed without rendering all current and past TVs "obsolete"
    in the sense that images without gamma (or with a very different value)
    would look "wrong".

    I believe the UK and US use different values of gamma, which is why some
    US made-on-video TV programmes look vaguely wrong when they are shown on
    UK TV. It's either gamma or black level/pedestal that is different. I
    noticed it before I first read about it, and thought the pictures looked
    a bit like the "colour plates" in books from the 1930s/40s/50s - low
    contrast, excessive saturation, too much shadow detail and too little
    highlight detail. It seems to have been an analogue-only thing.


    What was the purpose of gamma? Was it to make the signal more immune to
    noise (a bit like the way LPs apply HF emphasis on the record and the
    record player has inverse de-emphasis)? Or was it because CRTs had an
    inbuilt gamma because of the phosphor and transfer characteristics of a
    CRT, and it was easier/cheaper to correct for this at the broadcast end
    rather than requiring the correction circuitry in every TV?
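
    For reference, the arithmetic behind the second theory: if the tube's
    characteristic is roughly a power law, a single inverse correction at
    the camera gives every receiver a linear end-to-end response for free.
    A rough sketch, assuming a nominal exponent of 2.2 (the exact figure
    varied between standards):

      GAMMA = 2.2  # nominal CRT exponent

      def crt(volts):
          # What the tube's beam-current characteristic does on its own.
          return volts ** GAMMA

      def camera_correction(scene_light):
          # The inverse, applied once at the broadcast end.
          return scene_light ** (1.0 / GAMMA)

      for scene in (0.1, 0.25, 0.5, 1.0):
          displayed = crt(camera_correction(scene))
          # Prints equal pairs: the two power laws cancel exactly.
          print(f"scene {scene:.2f} -> displayed {displayed:.2f}")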

  • From The Other John@21:1/5 to All on Sat May 14 19:02:07 2022
    On Sat, 14 May 2022 18:12:22 +0100, NY wrote:

    > was it because CRTs had an inbuilt gamma because of the phosphor and
    > transfer characteristics of a CRT, and it was easier/cheaper to correct
    > for this at the broadcast end rather than requiring the correction
    > circuitry in every TV?

    Yes! :)

    --
    TOJ.

  • From Paul Ratcliffe@21:1/5 to me@privacy.invalid on Sat May 14 19:35:32 2022
    On Sat, 14 May 2022 18:12:22 +0100, NY <me@privacy.invalid> wrote:

    > What was the purpose of gamma? Was it to make the signal more immune
    > to noise (a bit like the way LPs apply HF emphasis on the record and
    > the record player has inverse de-emphasis)? Or was it because CRTs had
    > an inbuilt gamma because of the phosphor and transfer characteristics
    > of a CRT, and it was easier/cheaper to correct for this at the
    > broadcast end rather than requiring the correction circuitry in every
    > TV?

    The latter.

  • From NY@21:1/5 to Paul Ratcliffe on Sat May 14 21:29:02 2022
    "Paul Ratcliffe" <abuse@orac12.clara34.co56.uk78> wrote in message news:slrnt80143.2jg.abuse@news.pr.network...
    On Sat, 14 May 2022 18:12:22 +0100, NY <me@privacy.invalid> wrote:

    >> What was the purpose of gamma? Was it to make the signal more immune
    >> to noise (a bit like the way LPs apply HF emphasis on the record and
    >> the record player has inverse de-emphasis)? Or was it because CRTs
    >> had an inbuilt gamma because of the phosphor and transfer
    >> characteristics of a CRT, and it was easier/cheaper to correct for
    >> this at the broadcast end rather than requiring the correction
    >> circuitry in every TV?

    > The latter.


    So if (in an alternative universe!) TV had been developed when LCD
    screens were the normal display technology, TV would have been linear,
    rather than following the power law output = input ^ gamma, and life
    would have been a great deal simpler ;-) That's assuming LCD screens
    have linear light output for linear input pixel value and/or voltage.

    As a matter of interest, is digital TV linear (i.e. do pixels with
    values 10, 50 and 100 have brightnesses in the ratio 1:5:10)? Is the
    gamma correction for the SCART output of a set-top-box done locally, or
    is gamma at the prescribed value still applied between the camera and
    the encoded pixel values in the video stream? And is what JP Gilliver
    said true - that an LCD TV applies inverse gamma conversion, followed by
    whatever non-linear characteristic might apply to an LCD screen?

  • From Roderick Stewart@21:1/5 to me@privacy.invalid on Sun May 15 08:56:25 2022
    On Sat, 14 May 2022 21:29:02 +0100, "NY" <me@privacy.invalid> wrote:

    > As a matter of interest, is digital TV linear (i.e. do pixels with
    > values 10, 50 and 100 have brightnesses in the ratio 1:5:10)? Is the
    > gamma correction for the SCART output of a set-top-box done locally,
    > or is gamma at the prescribed value still applied between the camera
    > and the encoded pixel values in the video stream? And is what JP
    > Gilliver said true - that an LCD TV applies inverse gamma conversion,
    > followed by whatever non-linear characteristic might apply to an LCD
    > screen?

    When I got my first digital stills camera (a Nikon Coolpix) way back
    in year 2000, I wondered about this too, and as I was working with
    broadcast television cameras at the time, I was able to photograph the
    various test charts we used for those, and measure the numerical
    outputs using the histogram in graphics software on the computer.

    We had the BBC No 57 greyscale chart, and various other charts that
    included greyscales. I found the greyscale in the Macbeth chart to
    give particularly clear results.

    I also had a laptop with a video capture card that enabled me to save
    still images of the same charts as JPG files like the ones from the
    Nikon camera, and make the same measurements.

    Since then, I've done the same kinds of tests with various other
    digital stills cameras. And the answer... is that digital stills
    cameras and broadcast television cameras generate signals with exactly
    the same gamma characteristics as far as I could measure them, so they
    must all have the same kind of gamma correction circuitry, or its
    digital equivalent. It's only what you would expect if you think about
    it, because we can view images originating from both television and
    stills cameras on the same screens, side by side if you like, and they
    all look similar. Broadcast cameras usually have a means of bypassing
    or switching off the gamma correction circuitry (sometimes used as
    part of lineup), and the effect makes most of the image, notably face
    tones, grotesquely dark and oversaturated, with everything looking very
    contrasty, so it would be immediately obvious if this were missing.
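
    For anyone wanting to repeat the test, it boils down to fitting a power
    law to the greyscale patches. A sketch with made-up illustrative numbers
    (not actual chart readings):

      import math

      # (known patch reflectance relative to white, measured 8-bit code value)
      patches = [(1.00, 235), (0.36, 150), (0.18, 110), (0.09, 80), (0.04, 56)]

      # Least-squares slope of log(code) against log(reflectance) gives the
      # encoding exponent, which is 1/gamma for a pure power law.
      xs = [math.log(refl) for refl, _ in patches]
      ys = [math.log(code / 235) for _, code in patches]
      n = len(patches)
      slope = ((n * sum(x * y for x, y in zip(xs, ys)) - sum(xs) * sum(ys))
               / (n * sum(x * x for x in xs) - sum(xs) ** 2))

      # -> encoding exponent ~ 0.45, i.e. gamma ~ 2.2
      print(f"encoding exponent ~ {slope:.2f}, i.e. gamma ~ {1 / slope:.1f}")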

    Also, electronic images of all sorts that we used to view on CRTs can
    now be viewed on modern flat screen displays without adjusting them,
    so it seems that gamma correction is one thing that has been
    maintained as a universally recognised standard.

    Rod.
