• What shape are pixels?

    From Brian Gaff (Sofa)@21:1/5 to All on Sat Mar 19 10:44:55 2022
    May seem a daft question, but in the old days of tubes there were various patterns of shadow masks on them. Some tended to be noticeable as vertical
    stripes, others as little triads. And when we started to get digital video, as in games, it was not unusual to see circles come out as ovals, because the
    pixels were displayed oblong instead of square, so to speak.
    Brian



    --

    This newsgroup posting comes to you directly from...
    The Sofa of Brian Gaff...
    briang1@blueyonder.co.uk
    Blind user, so no pictures please
    Note this Signature is meaningless.!

  • From R. Mark Clayton@21:1/5 to All on Sun Mar 20 10:29:27 2022
    On Saturday, 19 March 2022 at 10:45:02 UTC, Brian Gaff (Sofa) wrote:
    May seem a daft question, but in the old days of tubes, there were various patterns of shadow masks on them. Some tended to be noticeable as vertical stripes, others as little triads, And when we started to get digital video
    as in games, it was not unusual to see oval circles on things, due to the pixel being displayed oblong instead of square, so to speak.
    Brian



    --

    This newsgroup posting comes to you directly from...
    The Sofa of Brian Gaff...
    bri...@blueyonder.co.uk
    Blind user, so no pictures please
    Note this Signature is meaningless.!

    Old CRTs - dots illuminated by an electron beam through a shadow mask
    Trinitron - lines illuminated by an electron beam through wires
    Plasma - dots illuminated by plasma from "points" behind
    Most LCDs - rectangular, sometimes with extra colours (e.g. Sharp), usually arrayed, but on vertical alignment models they are vertical bars. Great picture, although slow response (so not for gamers)
    OLED - dunno, but probably an array.
    Micro LED - little LED chips, usually arrayed.

    On 4K monitors or TVs at a sensible distance it is quite hard to make out the pixels.
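
    As a rough, back-of-an-envelope illustration of why that is (all the figures below are assumed for the example: a 55-inch 3840x2160 panel viewed from
    2.5 m, with roughly one arcminute taken as the limit of normal visual acuity), a short Python sketch:

        import math

        # Assumed figures for the example: a 55" 3840x2160 panel viewed from 2.5 m.
        DIAG_INCHES = 55
        H_PIXELS = 3840
        VIEW_DISTANCE_M = 2.5

        # Panel width from the diagonal and the 16:9 aspect ratio.
        diag_m = DIAG_INCHES * 0.0254
        width_m = diag_m * 16 / math.hypot(16, 9)
        pixel_pitch_m = width_m / H_PIXELS

        # Angle one pixel subtends at the eye, in arcminutes.
        angle_arcmin = math.degrees(math.atan2(pixel_pitch_m, VIEW_DISTANCE_M)) * 60

        print(f"pixel pitch ~{pixel_pitch_m * 1000:.2f} mm")
        print(f"one pixel subtends ~{angle_arcmin:.2f} arcmin at {VIEW_DISTANCE_M} m")
        # Anything much under ~1 arcmin is below normal visual acuity, so the
        # individual pixels cannot be made out from that distance.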

  • From Brian Gaff (Sofa)@21:1/5 to All on Mon Mar 21 08:02:28 2022
    Well, I cannot see them these days of course, but I guess the circle problem then arose when early display electronics did not all use the same aspect
    ratio, effectively widening or narrowing the pixels. I can well remember that my ZX Spectrum on a standard TV was correct, but when you ran other computers
    of the time with emulation code to run Spectrum games (all were Z80 based), you often found that the display was a different aspect ratio,
    creating the issue. The Memotech and SAM Coupé machines both had this problem, as one used a Texas Instruments chip for the display and the other a ULA made by Fujitsu.
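
    As a rough illustration of that effect (the pixel aspect ratio below is made up for the example): if each stored pixel is displayed wider than it is
    tall, a circle drawn with equal x and y radii in the framebuffer comes out as an oval unless the drawing code compensates. A small Python sketch:

        import math

        # Hypothetical pixel aspect ratio: each stored pixel is displayed 1.2x
        # wider than it is tall (square pixels would be 1.0).
        PIXEL_ASPECT = 1.2

        def circle_points(cx, cy, radius, correct_for_aspect, steps=8):
            """Framebuffer coordinates approximating a circle of the given
            on-screen radius. With correction the x radius is shrunk so the
            displayed shape is round; without it, it comes out as an oval."""
            x_radius = radius / PIXEL_ASPECT if correct_for_aspect else radius
            return [(round(cx + x_radius * math.cos(2 * math.pi * i / steps)),
                     round(cy + radius * math.sin(2 * math.pi * i / steps)))
                    for i in range(steps)]

        print("uncorrected:", circle_points(128, 96, 40, correct_for_aspect=False))
        print("corrected:  ", circle_points(128, 96, 40, correct_for_aspect=True))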
    Brian

    --

    This newsgroup posting comes to you directly from...
    The Sofa of Brian Gaff...
    briang1@blueyonder.co.uk
    Blind user, so no pictures please
    Note this Signature is meaningless.!
    "R. Mark Clayton" <notyalckram@gmail.com> wrote in message news:9b8a1314-4767-4a74-9097-af16b836ed71n@googlegroups.com...
    On Saturday, 19 March 2022 at 10:45:02 UTC, Brian Gaff (Sofa) wrote:
    May seem a daft question, but in the old days of tubes, there were
    various
    patterns of shadow masks on them. Some tended to be noticeable as
    vertical
    stripes, others as little triads, And when we started to get digital
    video
    as in games, it was not unusual to see oval circles on things, due to the
    pixel being displayed oblong instead of square, so to speak.
    Brian



    --

    This newsgroup posting comes to you directly from...
    The Sofa of Brian Gaff...
    bri...@blueyonder.co.uk
    Blind user, so no pictures please
    Note this Signature is meaningless.!

    Old CRT's - dots illuminated by electron beam through a shadow mask
    Trinitron - lines illuminated by electron beam through wires
    Plasma - dots illuminated by plasma from "points" behind"
    Most LCD - rectangular, sometimes with extra colours (e.g. Sharp), usually arrayed, but on vertical alignment models they are vertical bars. Great picture, although slow response (so not for gamers)
    OLED - dunno, but probably an array.
    Micro LED - little LED chips usually arrayed.

    On 4k monitors or TV's at a sensible distance it is quite hard to make out the pixels.

  • From David Woolley@21:1/5 to Richard Tobin on Mon Mar 21 12:49:46 2022
    On 21/03/2022 12:24, Richard Tobin wrote:
    RGB and white, presumably the LEDs themselves.

    LEDs can't be white. What are called "white LEDs" are blue LEDs with a
    yellow phosphor. They generate a relatively narrow band blue, from the
    LED, and a very broadband yellow, from the phosphor. Unpowered I'd
    expect "white" LEDs to look yellow.

  • From Richard Tobin@21:1/5 to R. Mark Clayton on Mon Mar 21 12:24:10 2022
    In article <9b8a1314-4767-4a74-9097-af16b836ed71n@googlegroups.com>,
    R. Mark Clayton <notyalckram@gmail.com> wrote:
    OLED - dunno, but probably an array.

    Looking at mine through a rather poor-quality USB digital microscope,
    each rectangular pixel appears to consist of 4 rectangular
    colours, RGB and white, presumably the LEDs themselves.

    https://www.cogsci.ed.ac.uk/~richard/oled-pixels.jpg

    -- Richard

  • From Richard Tobin@21:1/5 to david@ex.djwhome.demon.invalid on Mon Mar 21 14:31:29 2022
    In article <t19s9b$3r1$1@dont-email.me>,
    David Woolley <david@ex.djwhome.demon.invalid> wrote:

    RGB and white, presumably the LEDs themselves.

    LEDs can't be white. What are called "white LEDs" are blue LEDs with a yellow phosphor. They generate a relatively narrow band blue, from the
    LED, and a very broadband yellow, from the phosphor. Unpowered I'd
    expect "white" LEDs to look yellow.

    Apparently the RGB ones aren't R, G and B either. They are "white" OLEDs
    with filters over them.

    https://www.oled-info.com/lgs-wrgb-oled-tv-sub-pixels-captured-macro-photo

    I guess WRGB is just the additive equivalent of CMYK.
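
    A minimal sketch of that analogy, assuming the simplest possible mapping (the white sub-pixel carries the component common to all three channels, much
    as K carries the common grey in CMYK); real WRGB panel drive algorithms are more sophisticated than this:

        def rgb_to_wrgb(r, g, b):
            """Naive RGB -> WRGB split: the white sub-pixel carries the part
            common to all three channels and the colour sub-pixels carry the
            rest. Purely illustrative; real panel drive schemes differ."""
            w = min(r, g, b)
            return w, r - w, g - w, b - w

        def cmy_to_cmyk(c, m, y):
            """The subtractive counterpart: K carries the common grey."""
            k = min(c, m, y)
            return c - k, m - k, y - k, k

        print(rgb_to_wrgb(200, 180, 120))   # -> (120, 80, 60, 0)
        print(cmy_to_cmyk(0.2, 0.5, 0.9))   # -> roughly (0.0, 0.3, 0.7, 0.2)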

    -- Richard

  • From Dave W@21:1/5 to All on Mon Mar 21 17:41:59 2022
    On Mon, 21 Mar 2022 14:31:29 +0000 (UTC), richard@cogsci.ed.ac.uk
    (Richard Tobin) wrote:

    In article <t19s9b$3r1$1@dont-email.me>,
    David Woolley <david@ex.djwhome.demon.invalid> wrote:

    RGB and white, presumably the LEDs themselves.

    LEDs can't be white. What are called "white LEDs" are blue LEDs with a
    yellow phosphor. They generate a relatively narrow band blue, from the
    LED, and a very broadband yellow, from the phosphor. Unpowered I'd
    expect "white" LEDs to look yellow.

    Apparently the RGB ones are R, G and B either. They are "white" OLEDs
    with filters over them.

    https://www.oled-info.com/lgs-wrgb-oled-tv-sub-pixels-captured-macro-photo

    I guess WRGB is just the additive equivalent of CMYK.

    -- Richard

    My TV has vertical strips, all composed of RGB pixels horizontally
    arranged. When all three are on it looks white.
    --
    Dave W

  • From R. Mark Clayton@21:1/5 to Dave W on Mon Mar 21 11:09:07 2022
    On Monday, 21 March 2022 at 17:42:01 UTC, Dave W wrote:
    On Mon, 21 Mar 2022 14:31:29 +0000 (UTC), ric...@cogsci.ed.ac.uk
    (Richard Tobin) wrote:

    In article <t19s9b$3r1$1...@dont-email.me>,
    David Woolley <da...@ex.djwhome.demon.invalid> wrote:

    RGB and white, presumably the LEDs themselves.

    LEDs can't be white. What are called "white LEDs" are blue LEDs with a yellow phosphor. They generate a relatively narrow band blue, from the LED, and a very broadband yellow, from the phosphor. Unpowered I'd
    expect "white" LEDs to look yellow.

    Apparently the RGB ones are R, G and B either. They are "white" OLEDs
    with filters over them.

    https://www.oled-info.com/lgs-wrgb-oled-tv-sub-pixels-captured-macro-photo

    I guess WRGB is just the additive equivalent of CMYK.

    -- Richard
    My TV has vertical strips, all composed of RGB pixels horizontally
    arranged. When all three are on it looks white.
    --
    Dave W

    On a CRT that was Trinitron, on a flat panel it is VA.

  • From Brian Gaff (Sofa)@21:1/5 to David Woolley on Tue Mar 22 07:58:54 2022
    I do remember it's all to do with colour temperature and lumens, something I never bothered about when I could see. All I remember was that early LEDs
    that looked white to the naked eye seemed to look bluish when they were used with daylight around, and that most fluorescent tubes looked dotty or yellow. If
    you shot pictures with a video camera in such environments, the latter turned everyone's faces green unless you changed the tint on the camera, and the
    former looked a bit like blue light shone through mud, which is the best way I can describe it. It does show up how clever the eye and brain are, when they
    work correctly, at getting the colours to look right, whereas electronic devices show what is really there.
    I remember taking non-flash photos under those orange street lights, with some very alien-looking results. Some cars almost shone, yet others were
    really black even when blue.
    Some headlights were greeny yellow, while others were just whitish.
    Brian

    --

    This newsgroup posting comes to you directly from...
    The Sofa of Brian Gaff...
    briang1@blueyonder.co.uk
    Blind user, so no pictures please
    Note this Signature is meaningless.!
    "David Woolley" <david@ex.djwhome.demon.invalid> wrote in message news:t19s9b$3r1$1@dont-email.me...
    On 21/03/2022 12:24, Richard Tobin wrote:
    RGB and white, presumably the LEDs themselves.

    LEDs can't be white. What are called "white LEDs" are blue LEDs with a yellow phosphor. They generate a relatively narrow band blue, from the
    LED, and a very broadband yellow, from the phosphor. Unpowered I'd expect "white" LEDs to look yellow.

  • From Brian Gaff (Sofa)@21:1/5 to Dave W on Tue Mar 22 08:06:11 2022
    As I recall, some issues with displays used to be that golden things, and things that reflected the sun like the glint from water, never did work well on
    TV; whether the issue was dynamic range in the camera, somewhere in the chain, or maybe in the display is hard to tell.
    The old test of panning a spot so it momentarily shone into the camera used to look really rather flat compared to the real thing seen with your own eye.
    However, somebody I knew who had a Sharp TV some years ago reckoned his TV made a better job of glint than most. I suspect there might be a bit of
    fakery going on with that extra LED or whatever they were using, which was a pale orange fooling the eye, maybe controlled by software. I wish I
    could have seen it.
    The best CRT pictures I ever saw were from the larger Hitachi TVs, but you needed a crane to lift them!

    Brian

    --

    This newsgroup posting comes to you directly from...
    The Sofa of Brian Gaff...
    briang1@blueyonder.co.uk
    Blind user, so no pictures please
    Note this Signature is meaningless.!
    "Dave W" <davewi11@yahoo.co.uk> wrote in message news:l3eh3hlhimlf4234rodd7mhkd9uc06m9q8@4ax.com...
    On Mon, 21 Mar 2022 14:31:29 +0000 (UTC), richard@cogsci.ed.ac.uk
    (Richard Tobin) wrote:

    In article <t19s9b$3r1$1@dont-email.me>,
    David Woolley <david@ex.djwhome.demon.invalid> wrote:

    RGB and white, presumably the LEDs themselves.

    LEDs can't be white. What are called "white LEDs" are blue LEDs with a yellow phosphor. They generate a relatively narrow band blue, from the LED, and a very broadband yellow, from the phosphor. Unpowered I'd expect "white" LEDs to look yellow.

    Apparently the RGB ones are R, G and B either. They are "white" OLEDs
    with filters over them.

    https://www.oled-info.com/lgs-wrgb-oled-tv-sub-pixels-captured-macro-photo

    I guess WRGB is just the additive equivalent of CMYK.

    -- Richard

    My TV has vertical strips, all composed of RGB pixels horizontally
    arranged. When all three are on it looks white.
    --
    Dave W


  • From NY@21:1/5 to briang1@blueyonder.co.uk on Tue Mar 22 09:52:04 2022
    "Brian Gaff (Sofa)" <briang1@blueyonder.co.uk> wrote in message news:t1bvk0$l4s$1@dont-email.me...
    I do remember its all to do with colour temperature and lumins, something
    I never bothered about when I could see. All I remember was early LEDs
    that looked white to the naked eye seemed to look bluish when they were
    used daylight around and that most fluorescent tubes looked dotty or
    yellow. If you shot pictures with a video in such environments, the latter turned everyone's faces green unless you changed the tint on the camera,
    and the former looked a bit like blue light shown through mud is the best
    way I can describe it. It does show up how clever the eye and brain are
    when they work correctly to getting the colours to look right, wheas electronic devices show what is really there.
    I remember taking non flash photos under those orange street lights with
    some very alien looking results. Some cars almost shone, yet others were really black even when blue.
    Some headlights were greeny yellow,while others were just whitish.

    Any lights with discontinuous spectra (as opposed to "black body
    radiation", as taught in physics A level) will appear different depending
    on the way the RGB sensors respond to the light.

    I found that, to the naked eye, "daylight white" CFL bulbs looked
    beautifully white and matched daylight coming in through windows. But some digital cameras rendered them as very yellow - almost as much as tungsten
    bulbs - whereas the same camera, with fixed 5000 K "sunlight" white balance, rendered "warm white" fluorescent tubes as being less yellow and more blue
    even though to the eye they looked warm compared with sunlight.

    Most colour slide film (Kodachrome, Ektachrome) rendered fluorescent tubes (probably warm white) as a horrible sickly green which the naked eye could
    not detect. You could get filters for various types of film which corrected
    for this.

    I found that digital cameras with auto white balance correction, where you point the camera at something white and tell it "this is white", give pretty good rendition of colours in a variety of light sources. The main difference with CFLs and LEDs, compared with tungsten and sun/shade daylight, is that certain shades of red look too dark even though the overall colour balance
    is correct.
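
    A minimal sketch of that kind of custom white balance, assuming the camera simply scales each channel so the reference "this is white" patch comes out
    neutral (real cameras do this in a sensor colour space with rather more elaborate models):

        def white_balance_gains(reference_rgb):
            """Per-channel gains that map the measured 'this is white' patch to
            neutral grey. reference_rgb is the average (R, G, B) of the white
            card photographed under the light source in question."""
            r, g, b = reference_rgb
            # Normalise to the green channel, as many cameras do.
            return (g / r, 1.0, g / b)

        def apply_gains(pixel, gains):
            return tuple(min(255, round(c * k)) for c, k in zip(pixel, gains))

        # Hypothetical white-card reading under a warm, yellowish lamp:
        gains = white_balance_gains((230, 200, 150))
        print(gains)                                 # blue is boosted, red is cut
        print(apply_gains((230, 200, 150), gains))   # the card itself becomes neutral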

    https://i.postimg.cc/c4TW1zT4/daylight.jpg
    https://i.postimg.cc/HLrdhtwW/daylight-CFL.jpg
    https://i.postimg.cc/NfQYct6z/Led.jpg
    https://i.postimg.cc/1RGP2XqF/white-fluor.jpg
    https://i.postimg.cc/br9htVhg/white-fluor-daylight-WB.jpg

    Those are a series of photos of the front cover of an edition of the Radio Times, with a pack of screws that has a saturated red label and a pack of drill
    bits that has a royal blue case - so there's a good range of tones. I illuminated it with various light sources, as described in the filenames, and
    auto-white-balanced the camera from light reflected off a sheet of A4 printer paper, lit with the same light source - apart from one case where I
    used a warm-white fluorescent tube and deliberately set the camera to sunlight WB.

    I realise that Brian can't see this. The general colour balance and colour cast for daylight, daylight CFL, LED and (warm) white fluorescent is pretty
    consistent, though the red panel on the box of screws is much brighter in daylight than under any of the artificial sources. The fluorescent with the
    camera set for daylight has a pale yellow tint, though nowhere near the very strong reddish yellow that you'd get with a tungsten bulb.

    One of the things you need to be careful of with LEDs is that they are
    pulsed at high frequency with a mark:space ratio which varies as you dim the light. This can lead to horizontal bands on photos if the camera sensor
    reads the brightness of each row of pixels in turn, and the LEDs were on for some rows and off for others. I presume when LEDs are used to illuminate a
    TV studio they are driven from DC, with control over the current to dim the light, so they do not turn on and off.
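
    A minimal sketch of that banding mechanism, with made-up figures for the PWM frequency, mark:space ratio and row readout time; it simply checks which
    sensor rows happen to be read while the LED is in the "on" part of its duty cycle:

        # Illustrative figures only: an LED dimmed by PWM and a rolling-shutter
        # sensor that reads one row at a time with a very short per-row exposure.
        PWM_FREQ_HZ = 1000        # assumed PWM frequency of the dimmed LED
        DUTY_CYCLE = 0.3          # mark:space ratio at this dim setting
        ROW_READ_TIME_S = 50e-6   # assumed time between successive row readouts
        ROWS = 40                 # small sensor, just for the printout

        def led_is_on(t):
            """True while the LED is in the 'mark' part of its PWM cycle."""
            return (t * PWM_FREQ_HZ) % 1.0 < DUTY_CYCLE

        for row in range(ROWS):
            state = "bright" if led_is_on(row * ROW_READ_TIME_S) else "dark"
            print(f"row {row:2d}: {state}")
        # The alternating runs of bright and dark rows are the horizontal bands;
        # a longer per-row exposure averages over several PWM cycles and hides them.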

  • From R. Mark Clayton@21:1/5 to All on Tue Mar 22 05:13:29 2022
    On Tuesday, 22 March 2022 at 09:52:31 UTC, NY wrote:
    "Brian Gaff (Sofa)" <bri...@blueyonder.co.uk> wrote in message news:t1bvk0$l4s$1...@dont-email.me...
    I do remember its all to do with colour temperature and lumins, something
    I never bothered about when I could see. All I remember was early LEDs
    that looked white to the naked eye seemed to look bluish when they were used daylight around and that most fluorescent tubes looked dotty or yellow. If you shot pictures with a video in such environments, the latter turned everyone's faces green unless you changed the tint on the camera, and the former looked a bit like blue light shown through mud is the best way I can describe it. It does show up how clever the eye and brain are when they work correctly to getting the colours to look right, wheas electronic devices show what is really there.
    I remember taking non flash photos under those orange street lights with some very alien looking results. Some cars almost shone, yet others were really black even when blue.
    Some headlights were greeny yellow,while others were just whitish.
    Any lights with discontinuous spectrums (as opposed to "black body radiation", as taught in physics A level) will appear differently depending on the ways the RGB sensors respond to light.

    I found that, to the naked eye, "daylight white" CFL bulbs looked
    beautifully white and matched daylight coming in through windows. But some digital cameras rendered them as very yellow - almost as much as tungsten bulbs - whereas the same camera, with fixed 5000 K "sunlight" white balance, rendered "warm white" fluorescent tubes as being less yellow and more blue even though to the eye they looked warm compared with sunlight.

    Most colour slide film (Kodachrome, Ektachrome) rendered fluorescent tubes (probably warm white) as a horrible sickly green which the naked eye could not detect. You could get filters for various types of film which corrected for this.

    I found that digital cameras with auto white balance correction, where you point the camera at something white and tell it "this is white", give pretty good rendition of colours in a variety of light sources. The main difference with CFLs and LEDs, compared with tungsten and sun/shade daylight, is that certain shades of red look too dark even though the overall colour balance
    is correct.

    https://i.postimg.cc/c4TW1zT4/daylight.jpg https://i.postimg.cc/HLrdhtwW/daylight-CFL.jpg https://i.postimg.cc/NfQYct6z/Led.jpg https://i.postimg.cc/1RGP2XqF/white-fluor.jpg https://i.postimg.cc/br9htVhg/white-fluor-daylight-WB.jpg

    Are a series of photos of the front cover of an edition of the Radio Times, with a pack of screw that have a saturated red label and a pack of drill
    bits that have a royal blue case - so there's a good range of tones. I illuminated it by various light sources, as described in the filename. and auto-white-balanced the camera from light reflected off a sheet of A4
    printer paper, lit with the same light source - apart from one case where I used a warm-white fluorescent tube and deliberately set the camera to sunlight WB.

    I realise that Brian can't see this. The general colour balance and colour cast for daylight, daylight CFL, LED and (warm) white fluorescent is pretty consistent, though the red panel on the box of screws is much brighter on daylight than on any of the artificial sources. The fluorescent with the camera set for daylight has a pale yellow tint, though nowhere near as the very strong reddish yellow that you'd get with a tungsten bulb.

    One of the things you need to be careful of with LEDs is that they are
    pulsed at high frequency with a mark:space ratio which varies as you dim the light. This can lead to horizontal bands on photos if the camera sensor
    reads the brightness of each row of pixels in turn, and the LEDs were on for some rows and off for others. I presume when LEDs are used to illuminate a
    TV studio they are driven from DC, with control over the current to dim the light, so they do not turn on and off.

    Depends a bit on the phosphors. Philips do tri-phosphor tubes which give a much more balanced light.

  • From Java Jive@21:1/5 to All on Tue Mar 22 14:34:22 2022
    On 22/03/2022 09:52, NY wrote:

    "Brian Gaff (Sofa)" <briang1@blueyonder.co.uk> wrote in message news:t1bvk0$l4s$1@dont-email.me...

    I do remember its all  to do with colour temperature and lumins,
    something I never bothered about when I could see. All I remember was
    early LEDs that looked white to the naked eye seemed to look bluish
    when they were used daylight around and that most fluorescent tubes
    looked dotty or yellow. If you shot pictures with a video in such
    environments, the latter turned everyone's faces green unless you
    changed the tint on the camera, and the former looked a bit like blue
    light shown through mud is the best way I can describe it. It does
    show up how clever the eye and brain are when they work correctly to
    getting the colours to look right, wheas electronic devices show what
    is really there.
    I remember taking non flash photos under those orange street lights
    with some very alien looking results. Some cars almost shone, yet
    others were really black even when blue.
    Some headlights were greeny yellow,while others were just whitish.

    Any lights with discontinuous spectrums (as opposed to "black body radiation", as taught in physics A level) will appear differently
    depending on the ways the RGB sensors respond to light.

    Yes, while the sodium street lights that Brian refers to are
    predominantly yellow, because the spectrum of sodium has a pair of very
    bright yellow lines, in fact there are other colours in there as well -
    although it's a long time since I looked at the spectrum of one of those
    lamps as part of a lab experiment, I remember some fainter red and green
    lines as well. [Searches] Yes, lovely photo of the spectrum here,
    though I'm not sure why the areas between the radiation lines aren't
    black as they were in the experiment that I ran:

    https://en.wikipedia.org/wiki/Sodium#/media/File:Sodium_spectrum_visible.png

    [snip]

    I found that digital cameras with auto white balance correction, where
    you point the camera at something white and tell it "this is white",
    give pretty good rendition of colours in a variety of light sources. The
    main difference with CFLs and LEDs, compared with tungsten and sun/shade daylight, is that certain shades of red look too dark even though the
    overall colour balance is correct.

    https://i.postimg.cc/c4TW1zT4/daylight.jpg https://i.postimg.cc/HLrdhtwW/daylight-CFL.jpg https://i.postimg.cc/NfQYct6z/Led.jpg https://i.postimg.cc/1RGP2XqF/white-fluor.jpg https://i.postimg.cc/br9htVhg/white-fluor-daylight-WB.jpg

    Interesting. Only the last of those looks truly awful. However, can I
    offer some advice? Next time use a tripod to ensure that the setup of
    each shot is identical. As you suggest most photo/scanning equipment
    uses correction technologies of some sort, even if it's only as
    unsophisticated as 'integrate to grey' in a film-era photo lab's
    printing machine, and therefore, theoretically at least, varying the composition might alter any auto-corrections applied by the equipment
    used, and hence the result would not be a comparison only of the light source
    used, though I admit that such is not apparent in the shots you have linked.

    One of the things you need to be careful of with LEDs is that they are
    pulsed at high frequency with a mark:space ratio which varies as you dim
    the light. This can lead to horizontal bands on photos if the camera
    sensor reads the brightness of each row of pixels in turn, and the LEDs
    were on for some rows and off for others. I presume when LEDs are used
    to illuminate a TV studio they are driven from DC, with control over the current to dim the light, so they do not turn on and off.

    You get similar, but much worse, problems trying to photograph CRT TVs,
    as I found when drawing up the following web-pages on my site. Although
    I haven't illustrated it there, most of the photos of the CRT taken in preparation had to be junked because sections of the picture were very
    dark and others very bright - it was rare that by luck the camera
    captured one complete CRT scan acceptably evenly. Their primary purpose
    was to compare the output of a CRT TV and a flat panel TV displaying the
    same static picture, but with regard to the current thread title, by chance the
    second page of the demo shows quite well the effect of the vertical
    shadow-mask of the CRT, and the actual LEDs of the flat panel.

    www.macfh.co.uk/JavaJive/AudioVisualTV/CRTvsLCD/CRTvsLCD-p1.html

    What you haven't really mentioned, in fact I don't think anyone really
    has in the thread, is the response of the human eye. Although on this
    subject I have found contrary information on some sites that I would
    normally consider to be reliable, my understanding remains as follows.

    There are two types of photo-receptive cell in the human eye, rods and
    cones (there is actually a third type responsive to photo-period, which
    is not relevant here), with the former being sensitive to blue light and
    the latter being further split into two more types being sensitive to
    red and green light respectively; evolutionarily speaking, the
    differentiation between red and green came last. The above explains why
    we are able to mimic so much of the world that we see by using three
    primary colours, the additive primaries above, or their corresponding subtractive primaries yellow (white without blue), magenta (ditto
    green), and cyan (ditto red).

    A point often overlooked is that the cones are less sensitive than the
    rods, hence, as dusk falls, our seeing of red and green starts to fail
    before our seeing of blue, resulting in what the French expressively
    call "L'Heure Bleu", "The Blue Hour", around dusk. However, this also
    affects the colours we see under sodium street lighting, because of the faintness of the other colours in sodium's spectrum.

    You should be able to demonstrate this relative sensitivity to
    yourselves with the following picture. Print it out and leave it on a
    window sill where there is no artificial light as dusk begins to fall.
    You should observe that to begin with the red and perhaps the green
    appear brighter than the blue, but as the daylight fades there'll come a
    point where the blue should appear brighter than the other two colours.
    At this point, the cones are failing, but the rods are still working.

    www.macfh.co.uk/Temp/RedGreenBlue.png

    For completeness, there is also a subtractive primary version of the
    above, but I don't recall ever trying to see what happens to this as
    dusk falls!

    www.macfh.co.uk/Temp/CyanMagentaYellow.png

    --

    Fake news kills!

    I may be contacted via the contact address given on my website:
    www.macfh.co.uk

  • From NY@21:1/5 to Java Jive on Tue Mar 22 18:38:11 2022
    "Java Jive" <java@evij.com.invalid> wrote in message news:t1cmph$u4e$1@dont-email.me...
    On 22/03/2022 09:52, NY wrote:

    "Brian Gaff (Sofa)" <briang1@blueyonder.co.uk> wrote in message
    news:t1bvk0$l4s$1@dont-email.me...

    I do remember its all to do with colour temperature and lumins,
    something I never bothered about when I could see. All I remember was
    early LEDs that looked white to the naked eye seemed to look bluish when they were used daylight around and that most fluorescent tubes looked
    dotty or yellow. If you shot pictures with a video in such environments, the latter turned everyone's faces green unless you changed the tint on
    the camera, and the former looked a bit like blue light shown through
    mud is the best way I can describe it. It does show up how clever the
    eye and brain are when they work correctly to getting the colours to
    look right, wheas electronic devices show what is really there.
    I remember taking non flash photos under those orange street lights with some very alien looking results. Some cars almost shone, yet others were really black even when blue.
    Some headlights were greeny yellow,while others were just whitish.

    Any lights with discontinuous spectrums (as opposed to "black body
    radiation", as taught in physics A level) will appear differently
    depending on the ways the RGB sensors respond to light.

    Yes, while the sodium street lights that Brian refers to are predominantly yellow, because the spectrum of sodium has a pair of very bright yellow lines, in fact there are other colours in there as well - although it's
    long time since I looked at the spectrum of one of those lamps as part of
    a lab experiment, I remember some fainter red and green lines as well. [Searches] Yes, lovely photo of the spectrum here, though I'm not sure
    why the areas between the radiation lines aren't black as they were in the experiment that I ran:

    Also there are two types of sodium lights. Low pressure ones are the sort
    that were used for street lights and were truly horrible as a light source because they are virtually monochromatic and so don't give any perception of colour. But there are also high-pressure ones which give a peach-coloured
    light with a bit of colour perception because they have a broader spectrum -
    ie a greater spread of discrete lines.

    https://i.postimg.cc/c4TW1zT4/daylight.jpg
    https://i.postimg.cc/HLrdhtwW/daylight-CFL.jpg
    https://i.postimg.cc/NfQYct6z/Led.jpg
    https://i.postimg.cc/1RGP2XqF/white-fluor.jpg
    https://i.postimg.cc/br9htVhg/white-fluor-daylight-WB.jpg

    Interesting. Only the last of those looks truly awful. However, can I
    offer some advice? Next time use a tripod to ensure that the setup of
    each shot is identical. As you suggest most photo/scanning equipment uses correction technologies of some sort, even if it's only as unsophisticated
    as 'integrate to grey' in a film-era photo lab's printing machine, and therefore, theoretically at least, varying the composition might alter any auto-corrections applied by the equipment used, and hence the result not
    be a comparison only of the light source used, though I admit that such is not apparent in the shots you have linked.

    I had to take the photos in a variety of locations, depending on where there was a lamp of the desired type. In all cases I white-balanced off a piece of paper which completely filled the frame, so the camera would only "see"
    white of the appropriate colour. There may be slight differences in
    exposure, though I tried to make sure the test image filled almost the whole frame. I should have stuck the loose objects (box of screws and box of
    drills) to the background image so they were in identical locations for all shots and ensured all the images were framed identically.


    One of the things you need to be careful of with LEDs is that they are
    pulsed at high frequency with a mark:space ratio which varies as you dim
    the light. This can lead to horizontal bands on photos if the camera
    sensor reads the brightness of each row of pixels in turn, and the LEDs
    were on for some rows and off for others. I presume when LEDs are used to
    illuminate a TV studio they are driven from DC, with control over the
    current to dim the light, so they do not turn on and off.

    You get similar, but much worse, problems trying to photograph CRT TVs, as
    I found when drawing up the following web-pages on my site. Although I haven't illustrated it there, most of the photos of the CRT taken in preparation had to be junked because sections of the picture were very
    dark and others very bright - it was rare that by luck the camera
    captured one complete CRT scan acceptably evenly. Their primary purpose
    was to compare the output of a CRT TV and a flat panel TV displaying the
    same static picture, but wrt to the current thread title, by chance the second page of the demo shows quite well the effect of the vertical shadow-mask of the CRT, and the actual LEDs of the flat panel.

    The best way to photograph a TV screen apparently is to use a small aperture and/or a neutral density filter to make the exposure several seconds, so the difference between n and n+1 scans in different parts of the frame is
    minimal because n is large. Obviously this only works with a still image on
    the TV screen ;-) ND filter is probably a better way because most lenses
    have an optimum aperture: too large (small f number) and parts of the screen may be out of focus when the centre is in focus, but too small an aperture
    may introduce diffraction effects.
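
    A quick worked example of that "n versus n+1" argument, assuming a 50 Hz field rate: the worst-case brightness step between parts of the frame is
    roughly one field in n, so it shrinks as the exposure lengthens:

        FIELD_RATE_HZ = 50  # assumed field/refresh rate of the screen

        for exposure_s in (1 / 50, 1 / 8, 1.0, 5.0):
            n = exposure_s * FIELD_RATE_HZ              # complete scans captured
            worst_case_step = 1 / n if n >= 1 else 1.0  # one extra scan out of n
            print(f"{exposure_s:6.3f} s exposure: ~{n:5.1f} scans, "
                  f"worst-case brightness difference ~{worst_case_step:.1%}")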

    www.macfh.co.uk/JavaJive/AudioVisualTV/CRTvsLCD/CRTvsLCD-p1.html

    An interesting comparison.

    The other problem that you might encounter is moiré: a wavy-line pattern caused by "beating" between the pixel spacing in the source image and that
    in the photo of the screen, though perhaps it is less of an issue when the pixels of the source image are so much larger than the pixels of your photos of the
    screen.
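
    A back-of-envelope way to see that, with made-up pitches: the moiré pattern repeats at the difference of the two spatial frequencies, so two similar
    pitches give coarse, very visible bands, while widely different pitches give a beat too fine to notice:

        def moire_period_mm(pitch_a_mm, pitch_b_mm):
            """Spatial period of the beat pattern between two regular grids:
            the beat frequency is the difference of the two spatial frequencies."""
            diff = abs(1 / pitch_a_mm - 1 / pitch_b_mm)
            return float("inf") if diff == 0 else 1 / diff

        # Hypothetical pitches: screen pixels at 0.30 mm vs camera sampling at 0.28 mm
        print(f"{moire_period_mm(0.30, 0.28):.1f} mm")   # coarse, obvious bands
        # With the pitches far apart the beat is nearly as fine as the finer grid:
        print(f"{moire_period_mm(0.30, 0.05):.2f} mm")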

    It would be interesting to compare those photos with the original image, if
    you can grab it electronically and scale it the same as the photos of the screen.

    What you haven't really mentioned, in fact I don't think anyone really has
    in the thread, is the response of the human eye. Although on this subject
    I have found contrary information on some sites that I would normally consider to be reliable, my understanding remains as follows.

    There are two types of photo-receptive cell in the human eye, rods and
    cones (there is actually a third type responsive to photo-period, which is not relevant here), with the former being sensitive to blue light and the latter being further split into two more types being sensitive to red and green light respectively; evolutionarily speaking, the differentiation between red and green came last. The above explains why we are able to
    mimic so much of the world that we see by using three primary colours, the additive primaries above, or their corresponding subtractive primaries
    yellow (white without blue), magenta (ditto green), and cyan (ditto red).

    A point often overlooked is that the cones are less sensitive than the
    rods, hence, as dusk falls, our seeing of red and green starts to fail
    before our seeing of blue, resulting in what the French expressively call "L'Heure Bleu", "The Blue Hour", around dusk. However, this also affects
    the colours we see under sodium street lighting, because of the faintness
    of the other colours in sodium's spectrum.

    You should be able to demonstrate this relative sensitivity to yourselves with the following picture. Print it out and leave it on a window sill
    where there is no artificial light as dusk begins to fall. You should
    observe that to begin with the red and perhaps the green appear brighter
    than the blue, but as the daylight fades there'll come a point where the
    blue should appear brighter than the other two colours. At this point, the cones are failing, but the rods are still working.

    www.macfh.co.uk/Temp/RedGreenBlue.png

    For completeness, there is also a subtractive primary version of the
    above, but I don't recall ever trying to see what happens to this as dusk falls!

    www.macfh.co.uk/Temp/CyanMagentaYellow.png

    Amazing what facts the brain retains without you knowing. As soon as I read your description, the term "Purkinje effect" sprang into my brain. I didn't know that I knew about this...

  • From williamwright@21:1/5 to All on Wed Mar 23 02:44:31 2022
    On 22/03/2022 09:52, NY wrote:
    https://i.postimg.cc/c4TW1zT4/daylight.jpg https://i.postimg.cc/HLrdhtwW/daylight-CFL.jpg https://i.postimg.cc/NfQYct6z/Led.jpg https://i.postimg.cc/1RGP2XqF/white-fluor.jpg https://i.postimg.cc/br9htVhg/white-fluor-daylight-WB.jpg

    You should have equalised the exposures and gamma in order to make a comparison.

    Bill

  • From Roderick Stewart@21:1/5 to me@privacy.invalid on Wed Mar 23 09:50:39 2022
    On Tue, 22 Mar 2022 18:38:11 -0000, "NY" <me@privacy.invalid> wrote:

    The best way to photograph a TV screen apparently is to use a small aperture and/or a neutral density filter to make the exposure several seconds, so the difference between n and n+1 scans in different parts of the frame is
    minimal because n is large.

    Once I worked in a university audiovisual department, where we would occasionally be asked - sometimes by professors or other clever people
    with letters after their names - if when photographing a TV screen it
    was best to have the camera's flash switched on or off.

    Apparently it's possible to live in the real world but not pay much
    attention to it.

    Rod.

  • From Roderick Stewart@21:1/5 to wrightsaerials@f2s.com on Wed Mar 23 10:01:53 2022
    On Wed, 23 Mar 2022 02:44:31 +0000, williamwright
    <wrightsaerials@f2s.com> wrote:

    On 22/03/2022 09:52, NY wrote:
    https://i.postimg.cc/c4TW1zT4/daylight.jpg
    https://i.postimg.cc/HLrdhtwW/daylight-CFL.jpg
    https://i.postimg.cc/NfQYct6z/Led.jpg
    https://i.postimg.cc/1RGP2XqF/white-fluor.jpg
    https://i.postimg.cc/br9htVhg/white-fluor-daylight-WB.jpg

    You should have equalised the exposures and gamma in order to make a comparison.

    Bill

    Maybe also use a variety of natural test subjects covering the full
    visible spectrum. A photograph in a printed magazine has already been
    analysed by a filter system and then displayed using only three
    artificial pigments.

    Rod.

  • From charles@21:1/5 to Roderick Stewart on Wed Mar 23 10:17:52 2022
    In article <52rl3hldbj798157dtecv8mqqum3kq25o3@4ax.com>,
    Roderick Stewart <rjfs@escapetime.myzen.co.uk> wrote:
    On Tue, 22 Mar 2022 18:38:11 -0000, "NY" <me@privacy.invalid> wrote:

    The best way to photograph a TV screen apparently is to use a small aperture and/or a neutral density filter to make the exposure several seconds, so the difference between n and n+1 scans in different parts
    of the frame is minimal because n is large.

    Once I worked in a university audiovisual department, where we would occasionally be asked - sometimes by professors or other clever people
    with letters after their names - if when photographing a TV screen it
    was best to have the camera's flash switched on or off.

    Apparently it's possible to live in the real world but not pay much
    attention to it.

    Rod.


    Back in the 1970s, I was showing off Ceefax at the TV exhibition in
    Montreux. Someone wanted a picture of the screen, held up his light meter, remarked that the level was quite low, so turned on his flash. I wonder
    what he thought when the film came back from the lab.

    --
    from KT24 in Surrey, England
    "I'd rather die of exhaustion than die of boredom" Thomas Carlyle

  • From williamwright@21:1/5 to charles on Wed Mar 23 10:58:19 2022
    On 23/03/2022 10:17, charles wrote:
    Back in the 1970s, I was showing off Ceefax at the TV exhibition in Montreaux. Someone wanted a picture of the screen, held up his light meter, remarked that it was quite low levle, so turned on his flash. I wonder
    what he though when thefilm came back from the lab.

    He'd think the telly was faulty.

    Bill

  • From NY@21:1/5 to Roderick Stewart on Wed Mar 23 11:12:12 2022
    "Roderick Stewart" <rjfs@escapetime.myzen.co.uk> wrote in message news:ccrl3htbgpj1lje0ru67fbmjkb9r8t5u7n@4ax.com...
    On Wed, 23 Mar 2022 02:44:31 +0000, williamwright
    <wrightsaerials@f2s.com> wrote:

    On 22/03/2022 09:52, NY wrote:
    https://i.postimg.cc/c4TW1zT4/daylight.jpg
    https://i.postimg.cc/HLrdhtwW/daylight-CFL.jpg
    https://i.postimg.cc/NfQYct6z/Led.jpg
    https://i.postimg.cc/1RGP2XqF/white-fluor.jpg
    https://i.postimg.cc/br9htVhg/white-fluor-daylight-WB.jpg

    You should have equalised the exposures and gamma in order to make a comparison.

    Bill

    Maybe also use a variety of natural test subjects covering the full
    visible spectrum. A photograph in a printed magazine has already been analysed by a filter system and then displayed using only three
    artificial pigments.

    All good suggestions. I was limited in my choice of test subject by needing something which could be moved to the locations where the lights were. The
    only fluorescent tubes we had were lights in the loft, and the only LED
    lights were GU10s in the bathroom ceiling. I remember having to hold the
    camera at a slight angle to the printed page for the LED photo to avoid
    picking up a reflection of the light in the shiny paper. The photos I posted are smaller versions of the originals, and this means the exposure details
    have not been preserved, but I remember choosing long exposures for the artificial lights to avoid any appreciable difference due to persistence of phosphors on fluorescent and LED: an exposure of (for example) 1/80 second would only catch *part* of a mains cycle and so there could be a greater dominance of the discharge (blueish) of a fluorescent and less of the
    phosphor (yellowish). So I was thinking of minimising *some* differences ;-)

    If I was doing the test now I would choose a subject that had a full range
    of colours of real-world objects (not restricting to a printed photo subject
    to CMYK inks). And I'd mark the subject so I knew exactly how to adjust
    camera position (framing, distance from subject) to get identical framing in all photos which would allow the camera's auto exposure to correct for the great differences in light levels between one lights source and another -
    for example, bright sunlight would be many many times brighter than
    artificial lights. Maybe I'd also find somewhere shady outdoors (but free of colour cast reflected from walls, trees etc) to test shade (which is about 10,000 K, as opposed to about 5500 K for direct sunlight). And maybe try various different LED bulbs: we have two types in our kitchen which are both sold as "daylight" but one is slightly more yellow and less blue than the
    other (to the naked eye). And maybe even try Philips Hue (which allow RGB levels to be adjusted) on various standard presets.

    One thing that I'd need to consider: some objects fluoresce slightly under
    UV light which might be present in direct sunlight, though whether that
    would be noticeable in relation to the incident sunlight is very doubtful
    ;-)

    I remember once testing tungsten bulbs on various settings of a thyristor dimmer: as well as the brightness varying, the colour temperature decreases (light becomes more yellow/orange) as you dim. I can't find the photos, but
    I seem to remember that the photos I took showed remarkably little
    difference once the camera had auto-adjusted the white balance as well as
    the exposure. But then that is a black-body source with a continuous
    spectrum, whereas fluorescent and LED have discontinuities.

  • From Java Jive@21:1/5 to All on Wed Mar 23 11:18:57 2022
    On 22/03/2022 18:38, NY wrote:

    "Java Jive" <java@evij.com.invalid> wrote in message news:t1cmph$u4e$1@dont-email.me...

    Yes, while the sodium street lights that Brian refers to are
    predominantly yellow, because the spectrum of sodium has a pair of
    very bright yellow lines, in fact there are other colours in there as
    well  - although it's long time since I looked at the spectrum of one
    of those lamps as part of a lab experiment, I remember some fainter
    red and green lines as well. [Searches]  Yes, lovely photo of the
    spectrum here, though I'm not sure why the areas between the radiation
    lines aren't black as they were in the experiment that I ran:

    www.macfh.co.uk/JavaJive/AudioVisualTV/CRTvsLCD/CRTvsLCD-p1.html

    Also there are two types of sodium lights. Low pressure ones are the
    sort that were used for street lights and were truly horrible as a light source because they are virtually monochromatic and so don't give any perception of colour. But there are also high-pressure ones which give a peach-coloured light with a bit of colour perception because they have a broader spectrum - ie a greater spread of discrete lines.

    Yes, that may explain the difference between the spectrum I obtained in
    my experiment at uni, which was black between the spectral lines, and
    the one I linked above.

    https://i.postimg.cc/c4TW1zT4/daylight.jpg
    https://i.postimg.cc/HLrdhtwW/daylight-CFL.jpg
    https://i.postimg.cc/NfQYct6z/Led.jpg
    https://i.postimg.cc/1RGP2XqF/white-fluor.jpg
    https://i.postimg.cc/br9htVhg/white-fluor-daylight-WB.jpg

    Interesting.  Only the last of those looks truly awful.  However, can
    I offer some advice?  Next time use a tripod to ensure that the setup
    of each shot is identical.  As you suggest most photo/scanning
    equipment uses correction technologies of some sort, even if it's only
    as unsophisticated as 'integrate to grey' in a film-era photo lab's
    printing machine, and therefore, theoretically at least, varying the
    composition might alter any auto-corrections applied by the equipment
    used, and hence the result not be a comparison only of the light
    source used, though I admit that such is not apparent in the shots you
    have linked.

    I had to take the photos in a variety of locations, depending on where
    there was a lamp of the desired type. In all cases I white-balanced off
    a piece of paper which completely filled the frame, so the camera would
    only "see" white of the appropriate colour. There may be slight
    differences in exposure, though I tried to make sure the test image
    filled almost the whole frame. I should have stuck the loose objects
    (box of screws and box of drills) to the background image so they were
    in identical locations for all shots and ensured all the images were
    framed identically.

    Ah I hadn't realised that. Although the minor criticisms raised by
    myself and others are valid, I think it was a good effort given the circumstances.

    One of the things you need to be careful of with LEDs is that they
    are pulsed at high frequency with a mark:space ratio which varies as
    you dim the light. This can lead to horizontal bands on photos if the
    camera sensor reads the brightness of each row of pixels in turn, and
    the LEDs were on for some rows and off for others. I presume when
    LEDs are used to illuminate a TV studio they are driven from DC, with
    control over the current to dim the light, so they do not turn on and
    off.

    You get similar, but much worse, problems trying to photograph CRT
    TVs, as I found when drawing up the following web-pages on my site.
    Although I haven't illustrated it there, most of the photos of the CRT
    taken in preparation had to be junked because sections of the picture
    were very dark and others very bright  -  it was rare that by luck the
    camera captured one complete CRT scan acceptably evenly.  Their
    primary purpose was to compare the output of a CRT TV and a flat panel
    TV displaying the same static picture, but wrt to the current thread
    title, by chance the second page of the demo shows quite well the
    effect of the vertical shadow-mask of the CRT, and the actual LEDs of
    the flat panel.

    The best way to photograph a TV screen apparently is to use a small
    aperture and/or a neutral density filter to make the exposure several seconds, so the difference between n and n+1 scans in different parts of
    the frame is minimal because n is large. Obviously this only works with
    a still image on the TV screen ;-)  ND filter is probably a better way because most lenses have an optimum aperture: too large (small f number)
    and parts of the screen may be out of focus when the centre is in focus,
    but too small an aperture may introduce diffraction effects.

    Not tried that, and I'm not sure it would have been a valid method given
    the direct frame for frame nature of the comparison I was trying to make
    with the limited range and capabilities of the kit that I had available.

    www.macfh.co.uk/JavaJive/AudioVisualTV/CRTvsLCD/CRTvsLCD-p1.html

    An interesting comparison.

    The other problem that you might encounter is moiré: a wavy-line pattern which is "beating" between the pixel spacing in the source image and
    that in the photo of the screen, though perhaps it is less so when the
    pixels of the source image are so much larger than the pixels of your
    photos of the screen.

    Yes, I've seen that often, particularly in JPGs, and also in scans of
    120 film negatives that were too big to fit in the neg holder of my
    scanner, so had to be laid flat on the glass, and actually weren't quite completely flat, resulting in what I call Moiré, others call Newton's
    rings. I still have about 20-30 of these latter that gave this problem,
    even under glass, and that I hope to rescan with a better quality result
    one day.

    It would be interesting to compare those photos with the original image,
    if you can grab it electronically and scale it the same as the photos of
    the screen.

    Not possible, it was a paused recording of a skiing program which I no
    longer have.

    What you haven't really mentioned, in fact I don't think anyone really
    has in the thread, is the response of the human eye.  Although on this
    subject I have found contrary information on some sites that I would
    normally consider to be reliable, my understanding remains as follows.

    There are two types of photo-receptive cell in the human eye, rods and
    cones (there is actually a third type responsive to photo-period,
    which is not relevant here), with the former being sensitive to blue
    light and the latter being further split into two more types being
    sensitive to red and green light respectively; evolutionarily
    speaking, the differentiation between red and green came last.  The
    above explains why we are able to mimic so much of the world that we
    see by using three primary colours, the additive primaries above, or
    their corresponding subtractive primaries yellow (white without blue),
    magenta (ditto green), and cyan (ditto red).

    A point often overlooked is that the cones are less sensitive than the
    rods, hence, as dusk falls, our seeing of red and green starts to fail
    before our seeing of blue, resulting in what the French expressively
    call "L'Heure Bleu", "The Blue Hour", around dusk.  However, this also
    affects the colours we see under sodium street lighting, because of
    the faintness of the other colours in sodium's spectrum.

    You should be able to demonstrate this relative sensitivity to
    yourselves with the following picture.  Print it out and leave it on a
    window sill where there is no artificial light as dusk begins to fall.
    You should observe that to begin with the red and perhaps the green
    appear brighter than the blue, but as the daylight fades there'll come
    a point where the blue should appear brighter than the other two
    colours. At this point, the cones are failing, but the rods are still
    working.

    www.macfh.co.uk/Temp/RedGreenBlue.png

    For completeness, there is also a subtractive primary version of the
    above, but I don't recall ever trying to see what happens to this as
    dusk falls!

    www.macfh.co.uk/Temp/CyanMagentaYellow.png

    Amazing what facts the brain retains without you knowing. As soon as I
    read your description, the term "Purkinje effect" spring into my brain.
    I didn't know that I knew about this...

    I've never heard it called that before, but yes, that's the exact
    phenomenon I was trying to describe above:

    https://en.wikipedia.org/wiki/Purkinje_effect

    --

    Fake news kills!

    I may be contacted via the contact address given on my website:
    www.macfh.co.uk

  • From NY@21:1/5 to williamwright on Wed Mar 23 11:29:54 2022
    "williamwright" <wrightsaerials@f2s.com> wrote in message news:ja0cqbFf5ciU1@mid.individual.net...
    On 23/03/2022 10:17, charles wrote:
    Back in the 1970s, I was showing off Ceefax at the TV exhibition in
    Montreaux. Someone wanted a picture of the screen, held up his light
    meter,
    remarked that it was quite low levle, so turned on his flash. I wonder
    what he though when thefilm came back from the lab.

    He'd think the telly was faulty.

    One of the problems with photographing TV or computer screens in a general
    view of an office etc is to get the exposure of the image and of the office sufficiently similar that the screen is not burnt-out. Screens are usually mid-grey when they are off, so the image on the screen is usually set to be bright enough that a full range of tones is visible, without all the darker tones being lost in the grey ambient reflection.

    When shooting by flash, there are several variables that need to be juggled:

    - shutter speed of maybe half a second to minimise difference between
    different parts of the screen getting n or n+1 scans

    - shutter speed definitely low enough (< 1/125 second) that flash will synchronise (but the first point takes care of that)

    - take a light reading off the screen (with the room lighting low) to
    determine correct aperture

    - set flash power so it is correct for the chosen aperture; probably bounce
    the flash off a white ceiling or at least a card to give softer light than direct flash, and to lessen any reflection of the flash in the screen.
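
    A rough sketch of that last flash-power step, assuming the usual guide-number relation (guide number = f-number x distance, in metres at ISO 100) and
    entirely made-up figures for the screen reading and the flash:

        # Illustrative figures only: a reading off the screen of f/4 at the chosen
        # slow shutter speed, a flash with a full-power guide number of 36
        # (metres, ISO 100) and a flash-to-subject distance of 2 m.
        GUIDE_NUMBER = 36
        CHOSEN_APERTURE = 4.0
        FLASH_DISTANCE_M = 2.0

        # Guide number = f-number x distance, so the guide number actually needed
        # at this aperture and distance is:
        needed_gn = CHOSEN_APERTURE * FLASH_DISTANCE_M

        # Flash output scales with the square of the guide number, so the power
        # fraction to dial in is roughly:
        power_fraction = (needed_gn / GUIDE_NUMBER) ** 2
        print(f"needed GN ~{needed_gn:.0f}, so set the flash to ~1/{round(1 / power_fraction)} power")
        # Bouncing off a ceiling or card costs a couple of stops or more, so in
        # practice the power needs raising again, or finding by trial and error.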


    Apparently when CRTs were included in vision for a film camera or a TV
    camera, the camera was synchronised with the video sync to avoid a rolling
    bar on the screen image: maybe even run the camera without film, checking
    for a visible bar in the viewfinder and adjusting phase to move it
    off-screen.

    Beating can be a problem between 50 Hz video rate, and 60 Hz or 75 or 90 or
    120 or whatever for computer image. When I was filming a short video at work once with a UK video camera, I adjusted the in-vision computer so it used a
    50 Hz (or maybe 100 Hz) refresh rate to minimise rolling bars. I also turned
    the screen brightness down to a level which was a strain on the eye to use,
    so as to reduce the difference in screen and general scene brightness.
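
    For what it's worth, the simple difference-frequency sum behind that choice of
    refresh rate looks like this (my own rough model, ignoring exposure time,
    interlace and shutter phase):

        # A free-running 25 fps camera pointed at a monitor refreshing at f Hz
        # sees a bar that drifts at the difference between f and the nearest
        # whole multiple of the camera's frame rate; refresh rates that are an
        # exact multiple of 25 (50, 75, 100 Hz) leave the bar stationary.

        CAMERA_FPS = 25.0

        def bar_drift_hz(refresh_hz: float) -> float:
            nearest_multiple = round(refresh_hz / CAMERA_FPS) * CAMERA_FPS
            return refresh_hz - nearest_multiple

        for refresh in (50, 60, 75, 85, 100, 120):
            print(f"{refresh:3d} Hz refresh -> bar drift {bar_drift_hz(refresh):+.1f} Hz")
        # 50, 75 and 100 Hz give 0.0 Hz; 60 and 85 Hz give +10 Hz, 120 Hz gives -5 Hz.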

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Max Demian@21:1/5 to Roderick Stewart on Wed Mar 23 11:55:31 2022
    On 23/03/2022 09:50, Roderick Stewart wrote:
    On Tue, 22 Mar 2022 18:38:11 -0000, "NY" <me@privacy.invalid> wrote:

    The best way to photograph a TV screen apparently is to use a small aperture
    and/or a neutral density filter to make the exposure several seconds, so the
    difference between n and n+1 scans in different parts of the frame is
    minimal because n is large.

    Once I worked in a university audiovisual department, where we would occasionally be asked - sometimes by professors or other clever people
    with letters after their names - if when photographing a TV screen it
    was best to have the camera's flash switched on or off.

    Apparently it's possible to live in the real world but not pay much
    attention to it.

    It's not immediately obvious which displays work by reflection and which
    by producing light. LCDs work by reflection, but what if it relies on a backlight? There are shopping coupons which are scanned by a laser in
    the shop, but, if you don't have access to a printer, you are supposed
    to be able to scan the bar/QR code on a smartphone screen; how does that
    work? What if the display is OLED?

    --
    Max Demian

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From NY@21:1/5 to Java Jive on Wed Mar 23 12:32:15 2022
    "Java Jive" <java@evij.com.invalid> wrote in message news:t1evn6$gkk$1@dont-email.me...
    The other problem that you might encounter is moiré: a wavy-line pattern
    which is "beating" between the pixel spacing in the source image and that
    in the photo of the screen, though perhaps it is less so when the pixels
    of the source image are so much larger than the pixels of your photos of
    the screen.

    Yes, I've seen that often, particularly in JPGs, and also in scans of 120 film negatives that were too big to fit in the neg holder of my scanner,
    so had to be laid flat on the glass, and actually weren't quite completely flat, resulting in what I call Moiré, others call Newton's rings. I still have about 20-30 of these latter that gave this problem, even under glass, and that I hope to rescan with a better quality result one day.
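
    As a rough illustration of that pixel-grid beating (my own sketch; real moiré
    also depends on the angle between the grids and on the camera's demosaicing):

        # Two overlaid grids of slightly different pitch beat to produce a much
        # coarser pattern whose pitch is 1 / |1/p1 - 1/p2|.

        def moire_pitch_mm(p1_mm: float, p2_mm: float) -> float:
            return 1.0 / abs(1.0 / p1_mm - 1.0 / p2_mm)

        # e.g. a 0.25 mm screen pixel pitch imaged onto an effective 0.26 mm grid:
        print(f"{moire_pitch_mm(0.25, 0.26):.1f} mm")   # ~6.5 mm between wavy lines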

    Ah! The dreaded Newton's Rings. I encountered that the other day when I was scanning some 35 mm slides of my parents' wedding for their 60th
    anniversary. Some of the slides had been mounted in glass-covered mounts and the Newton's Rings were horrible. The glass was also dirty: it was easy to clean the outside, but the inside looked to be mucky as well. I decided to
    take those mounts apart (I was afraid of the plastic of the mount having
    gone brittle after 60 years) and remount them temporarily in a spare mount
    that I had. There was a very noticeable difference in quality due to there being less (dirty) glass in the way, and no Newton's rings. I took the opportunity to clean both sides of both pieces of glass before remounting the film in the glass frames after scanning.


    When I had to enlarge from some of my dad's 120 negs many years ago I had
    the same problem as you with not having a 120 neg frame. I used a sheet of glass that I cleaned well and then sat on top of the neg on thin shims (I
    think I used a couple of bits of fogged film from the beginning of a film)
    so the neg didn't touch the glass. With a flatbed scanner as opposed to printing onto paper, you can turn the film upside down to compensate for direction of film curl when avoiding the film touching the glass and causing Newton's rings, and then correct for the mirror-image in the computer.

    My scanner tends to produce better results (better tonal range) if I scan as
    a positive and correct in software, rather than scan as a (B&W) negative.
    But negs in general (and especially colour ones) are a lot more difficult to
    get good scans from than positive slides. The software corrects fine for the
    orange cast of a colour neg, but I get a lot of ghosting around dark objects
    against a bright sky (ie object is bright and sky is dark on the neg) and
    horrendous amoeba-like grain. That's both with a flatbed scanner and a
    dedicated film scanner, and VueScan software because the Minolta software
    doesn't install on Win 7 or 10. Results vary from really abominable to
    fantastic, depending on film type. It probably doesn't help that standard
    Kodak colour negative types didn't seem to be in the software's presets,
    either by name (Kodacolor) or code on the film's edge-marking. At best I got
    considerably more detail in shadows and highlights than scanning from a
    processing-shop print, but some photos had a weird effect like you got in
    "colour plates" in books from the 1940s and 50s: colours too strong and
    contrast too flat - all a bit larger-than-life - and standard saturation,
    contrast and gamma controls in software didn't really correct for it.
    Slides, on the other hand, required very little tweaking unless they were
    vastly under- or over-exposed. I was impressed with how much I managed to
    correct for some Ektachrome slides that I'd shot many years ago of
    illuminated buildings at night, where I'd guessed the exposure wrong by about
    3 stops (overexposed). I was able to resolve enough highlight detail to
    improve things quite a bit.
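
    For anyone curious, the scan-as-positive-and-invert step is roughly this (a
    minimal sketch of my own using Pillow and NumPy, with placeholder file names;
    it is not what VueScan or the Minolta software actually do, and proper
    orange-mask removal is rather more involved):

        import numpy as np
        from PIL import Image

        # Invert a colour-negative scan and neutralise the orange mask by
        # dividing each channel by the colour of the unexposed film base
        # (sampled here from a corner assumed to be clear rebate).
        # An 8-bit RGB scan is assumed.

        scan = np.asarray(Image.open("neg_scan.tif")).astype(np.float64) / 255.0
        film_base = scan[0:50, 0:50].mean(axis=(0, 1))   # per-channel base colour

        balanced = np.clip(scan / film_base, 0.0, 1.0)   # mask removed, still negative
        positive = 1.0 - balanced                        # invert to a positive
        positive = positive ** (1 / 2.2)                 # very crude tone/gamma tweak

        Image.fromarray((positive * 255).astype(np.uint8)).save("positive.png")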

    As an aside, I discovered a nasty problem with my film scanner when I was
    scanning the wedding slides. My scanner has a shaft that advances the slide
    past the sensor at a known rate, and this shaft is connected by a plastic
    collar to a stepper motor. And over the years, that collar had split and so
    was sometimes slipping. After getting several scans that were long and thin,
    I managed a repair: wrap a bit of 3M magic tape (not conventional Sellotape which has glue that oozes!) around the motor shaft to make it a bit bigger diameter and then refit the split collar. Problem solved! I'd forgotten just how slow my film scanner is when I'm doing multiple-reading scans, with
    grain reduction, to minimise both grain and sensor noise. It was worth the hassle, though. Thankfully all the slides I chose seem to have been
    something other than Kodachrome so the IR dirt-reduction algorithm worked OK
    (unlike Agfa and Kodak Ektachrome, Kodachrome has an emulsion which is not
    uniformly transparent to IR, so the algorithm which uses IR to look for dirt
    and dust on the film doesn't work because it sees darker parts of the image
    as dirt).
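
    The IR dirt-removal idea itself can be sketched in a few lines (my own
    simplified illustration, not Digital ICE or VueScan's actual algorithm; the
    threshold and array layout are assumptions):

        import numpy as np

        # The scanner makes an extra infrared pass.  Most film dyes are nearly
        # transparent to IR, so anything dark in the IR channel must be dust or
        # a scratch, and those pixels are filled in from elsewhere.  Kodachrome's
        # image itself blocks some IR, so dark picture areas get flagged as
        # "dirt" too, which is why the trick fails on it.

        def dust_mask(ir_channel: np.ndarray, threshold: float = 0.6) -> np.ndarray:
            """True where the IR pass is dark enough to be treated as dirt."""
            return ir_channel < threshold

        def fill_dirt(image: np.ndarray, mask: np.ndarray) -> np.ndarray:
            """Very crude repair: replace flagged pixels with the image median."""
            repaired = image.copy()
            repaired[mask] = np.median(image[~mask], axis=0)
            return repaired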

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From NY@21:1/5 to All on Wed Mar 23 12:37:18 2022
    "Max Demian" <max_demian@bigfoot.com> wrote in message news:E82dnS-guvgpl6b_nZ2dnUU7-c3NnZ2d@brightview.co.uk...
    It's not immediately obvious which displays work by reflection and which
    by producing light. LCDs work by reflection, but what if it relies on a backlight? There are shopping coupons which are scanned by a laser in the shop, but, if you don't have access to a printer, you are supposed to be
    able to scan the bar/QR code on a smartphone screen; how does that work?
    What if the display is OLED?

    Like you, I've always wondered how laser barcode scanners manage to scan a backlit LED screen rather than a reflective paper coupon (or an LCD screen). Scanners in shops seem to have a switchable setting which maybe turns the
    laser off and moves the sensor (or relies on movement of the phone screen
    past the scanner) to sense the horizontal axis of the bar code.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From charles@21:1/5 to Mark Carver on Wed Mar 23 14:11:42 2022
    In article <ja0ktpFgm4bU1@mid.individual.net>,
    Mark Carver <mark.carver@invalid.invalid> wrote:
    On 23/03/2022 10:17, charles wrote:

    Back in the 1970s, I was showing off Ceefax at the TV exhibition in Montreaux.
    Crystal Palace had one hell of a range !

    no, the signal was coming from a VT on the Ampex stand 2 floors down. But
    I could get R2 on my car radio (except under the overhead tram wires).

    --
    from KT24 in Surrey, England
    "I'd rather die of exhaustion than die of boredom" Thomas Carlyle

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Mark Carver@21:1/5 to charles on Wed Mar 23 13:16:41 2022
    On 23/03/2022 10:17, charles wrote:

    Back in the 1970s, I was showing off Ceefax at the TV exhibition in Montreaux.
    Crystal Palace had one hell of a range !

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Mark Carver@21:1/5 to charles on Wed Mar 23 15:35:52 2022
    On 23/03/2022 14:11, charles wrote:
    In article <ja0ktpFgm4bU1@mid.individual.net>,
    Mark Carver <mark.carver@invalid.invalid> wrote:
    On 23/03/2022 10:17, charles wrote:
    Back in the 1970s, I was showing off Ceefax at the TV exhibition in
    Montreaux.
    Crystal Palace had one hell of a range !
    no, the signal was coming from a VT on the Ampex stand 2 floors down.
    Ah, very good. Back in those days even almost-rival manufacturers would
    happily swap feeds between stands.
    Lots of Hum-Buckers everywhere of course. SDI took the fun out of that,
    but by then the 'grown-ups' had put a stop to those sort of things.
    I could get R2 on my car radio (except under the overhead tram wires).
    Yep, good old Droitwich normally made it with the radio near the hotel
    room window

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From charles@21:1/5 to Mark Carver on Wed Mar 23 16:30:03 2022
    In article <ja0t2oFi92cU1@mid.individual.net>,
    Mark Carver <mark.carver@invalid.invalid> wrote:
    On 23/03/2022 14:11, charles wrote:
    In article <ja0ktpFgm4bU1@mid.individual.net>,
    Mark Carver <mark.carver@invalid.invalid> wrote:
    On 23/03/2022 10:17, charles wrote:
    Back in the 1970s, I was showing off Ceefax at the TV exhibition in
    Montreaux.
    Crystal Palace had one hell of a range !
    no, the signal was coming from a VT on the Ampex stand 2 floors down.
    Ah, very good. Back in those days even almost-rival manufacturers would happily swap feeds between stands.

    The main member of the Ampex team had been at Woodnorton with me, as had
    the boss of the PYE stand.

    Lots of Hum-Buckers everywhere of course.

    Indeed. 3 V hum and 1 V video. Probably different phases on each floor.


    SDI took the fun out of that,
    but by then the 'grown-ups' had put a stop to those sort of things.
    I could get R2 on my car radio (except under the overhead tram wires).
    Yep, good old Droitwich normally made it with the radio near the hotel
    room window

    --
    from KT24 in Surrey, England
    "I'd rather die of exhaustion than die of boredom" Thomas Carlyle

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Java Jive@21:1/5 to All on Wed Mar 23 17:26:39 2022
    On 23/03/2022 12:32, NY wrote:

    My scanner tends to produce better results (better tonal range) if I
    scan as a positive and correct in software, rather than scan as a (B&W) negative. But negs in general (and especially colour ones) are a lot
    more difficult to get good scans from than positive slides.

    Actually I disagree, at least as far as my scanner goes. It's an old HP
    5490C (Model C9850A), the bummer being that I can't get 64-bit drivers
    for it, so I have to use it with either XP or Linux. However it does have
    an illuminated neg/slide holder, and I can choose in the menu of its
    associated software which I'm scanning, and it'll apply any necessary corrections automatically. The results from scanning the original
    negatives or slides are almost always far superior to those from
    scanning prints made from them, in terms of surface damage, contrast,
    and colours.

    Thankfully all
    the slides I chose seem to have been something other than Kodachrome so
    the IR dirt-reduction algorithm worked OK (unlike Agfa and Kodak
    Ektachrome, Kodachrome has an emulsion which is not uniformly
    transparent to IR, so the algorithm which uses IR to look for dirt and
    dust on the film doesn't work because it sees darker parts of the image
    as dirt).

    Kodachrome was a favourite film for me; the worst thing about the slides
    made from it is that the cardboard surrounds disintegrate and leave
    dust over the slides themselves. AFAIAA, my scanner doesn't have a dirt removal function as you describe, and it took me the best part of two
    years to go through all the family albums scanning or photographing them
    and cleaning up the digitised results. The results vary, with the best
    being really very good, but some of the material had degraded so far
    that little post-processing could be done to improve it.

    --

    Fake news kills!

    I may be contacted via the contact address given on my website:
    www.macfh.co.uk

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Roderick Stewart@21:1/5 to me@privacy.invalid on Thu Mar 24 09:27:12 2022
    On Wed, 23 Mar 2022 12:37:18 -0000, "NY" <me@privacy.invalid> wrote:

    "Max Demian" <max_demian@bigfoot.com> wrote in message >news:E82dnS-guvgpl6b_nZ2dnUU7-c3NnZ2d@brightview.co.uk...
    It's not immediately obvious which displays work by reflection and which
    by producing light. LCDs work by reflection, but what if it relies on a
    backlight? There are shopping coupons which are scanned by a laser in the
    shop, but, if you don't have access to a printer, you are supposed to be
    able to scan the bar/QR code on a smartphone screen; how does that work?
    What if the display is OLED?

    Like you, I've always wondered how laser barcode scanners manage to scan a
    backlit LED screen rather than a reflective paper coupon (or an LCD screen).
    Scanners in shops seem to have a switchable setting which maybe turns the
    laser off and moves the sensor (or relies on movement of the phone screen
    past the scanner) to sense the horizontal axis of the bar code.

    The first time I saw a checkout scanner cope with the barcode on a
    crumpled bag of frozen peas dripping with moisture, and without the
    operator needing to pause as she whizzed it past the scanning window,
    I was amazed that anyone ever thought such a system could work at all.

    Rod.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Roderick Stewart@21:1/5 to me@privacy.invalid on Thu Mar 24 09:19:56 2022
    On Wed, 23 Mar 2022 11:29:54 -0000, "NY" <me@privacy.invalid> wrote:

    Apparently when CRTs were included in vision for a film camera or a TV
    camera, the camera was synchronised with the video sync to avoid a rolling
    bar on the screen image: maybe even run the camera without film, checking
    for a visible bar in the viewfinder and adjusting phase to move it
    off-screen.

    It's not possible to "overcrank" or "undercrank" a TV camera to a
    different frame rate in the same way as a film camera, so it can't
    actually be synchronised to anything other than system sync pulses.

    Computer CRT scanning rates were unrelated to TV rates and usually
    much higher, so electrically synchronising the cameras would have been
    out of the question.

    It's actually the exposure time of the camera, not synchronisation,
    that was adjusted to accommodate in-vision computer CRT displays, and
    it only became possible with the advent of solid state "chip" cameras.
    The start of the time interval for which the chip was sensitive to
    light could be adjusted, sometimes continuously and sometimes in steps depending on the design of the camera, but the readout time at the end
    of that interval had to be synchronised to television system pulses.
    There would still be randomly positioned horizontal bars on the CRT
    display, but with zero overlap, which would be less visible as long as
    the camera didn't pan.

    It was easy to confirm this by pointing a camera at a television
    monitor displaying something bright with the same scanning rate as the
    camera, while adjusting the electronic shutter control of the camera.
    It would be the upper edge of the bright bar, i.e. the start of the
    exposure time, that would move as the control was adjusted.
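
    A toy model of that experiment (entirely my own sketch with made-up numbers,
    ignoring interlace and wrap-around):

        # The CRT refreshes its lines top-to-bottom once per 20 ms field.  A line
        # looks bright in the captured frame only if it was refreshed while the
        # shutter was open, so the top of the bright band sits at the line being
        # scanned at the instant the shutter opens.

        FIELD_MS = 20.0     # one vertical scan at 50 Hz
        LINES = 576         # visible lines, assumed

        def bright_band(open_ms: float, close_ms: float):
            """Lines refreshed during a shutter window within one field (0 <= open < close <= 20)."""
            first = int(LINES * open_ms / FIELD_MS)
            last = int(LINES * close_ms / FIELD_MS) - 1
            return first, last

        print(bright_band(2.0, 12.0))   # (57, 344)  - top edge near line 57
        print(bright_band(6.0, 16.0))   # (172, 459) - same length, top edge lower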

    Rod.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From NY@21:1/5 to Roderick Stewart on Thu Mar 24 14:12:12 2022
    "Roderick Stewart" <rjfs@escapetime.myzen.co.uk> wrote in message news:4gbo3h5lur8cvdlruqpkfs38ak4bfh2ivl@4ax.com...
    On Wed, 23 Mar 2022 11:29:54 -0000, "NY" <me@privacy.invalid> wrote:

    Apparently when CRTs were included in vision for a film camera or a TV
    camera, the camera was synchronised with the video sync to avoid a rolling
    bar on the screen image: maybe even run the camera without film, checking
    for a visible bar in the viewfinder and adjusting phase to move it
    off-screen.

    It's not possible to "overcrank" or "undercrank" a TV camera to a
    different frame rate in the same way as a film camera, so it can't
    actually be synchronised to anything other than system sync pulses.

    What I described related to film cameras, hence my reference to running
    without film to see the effect through the viewfinder to check for the
    *phase* of the shutter opening, with the expectation that the *frequency*
    would be accurately controlled at 25 fps by the crystal which keeps the
    sepmag sound in sync with the film. Maybe the cable takes the place of the crystal as a source of frame sync. I imagine with a free-running TV camera (when recording locally) you could use the TV's video signal to sync the
    camera to the TV source, so the raster of the camera matched the raster of
    the TV source.

    Computer CRT scanning rates were unrelated to TV rates and usually
    much higher, so electrically synchronising the cameras would have been
    out of the question.

    It's actually the exposure time of the camera, not synchronisation,
    that was adjusted to accommodate in-vision computer CRT displays, and
    it only became possible with the advent of solid state "chip" cameras.
    The start of the time interval for which the chip was sensitive to
    light could be adjusted, sometimes continuously and sometimes in steps depending on the design of the camera, but the readout time at the end
    of that interval had to be synchronised to television system pulses.
    There would still be randomly positioned horizontal bars on the CRT
    display, but with zero overlap, which would be less visible as long as
    the camera didn't pan.

    It was easy to confirm this by pointing a camera at a television
    monitor displaying something bright with the same scanning rate as the camera, while adjusting the electronic shutter control of the camera.
    It would be the upper edge of the bright bar, i.e. the start of the
    exposure time, that would move as the control was adjusted.

    I would have thought that to be able to use a TV camera to view a TV screen, you want the shutter speed to be the full 1/25 second, especially if the TV screen fills the majority of the frame. If the screen fills (for example)
    50 lines of the camera, you can use a correspondingly shorter shutter speed
    as the camera will only see the TV screen for 50/625 * 1/25 second.


    I'm always amazed at how a simple video camera (as long as it is UK-spec 25 fps) with no sync to the TV can give a stable picture with no visible
    flicker or rolling bar. I presume if you wait long enough you may see a bar roll through the picture at a rate of (TV frame rate - camera frame rate) if the camera is free-running and is not running at precisely 25 fps due to
    normal component tolerances. Sadly a lot of simple cameras (GoPro, cameras
    in mobile phones or tablets, etc) are preset to run only at the US 29.97 fps frame rate. Actually, a lot of that sort of camera seem to run at 30 fps, so
    no doubt there would need to be the old 1000/1001 drop-frame correction.
    It's a great shame that all video cameras (including mobile phones, GoPro
    etc) aren't designed to be switchable between the two frame rates, to
    satisfy the European, Australian, African, Arabian etc market which I
    presume is a sizable minority compared with the US/Canada/Japanese market.
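
    Putting rough numbers on that (my own simplified difference-frequency sum):

        # A camera that is nominally 25 fps but runs 0.01 fps fast, pointed at a
        # 50 Hz display: the bar crawls at the difference between the display
        # rate and twice the camera rate.

        display_hz = 50.0
        camera_fps = 25.01

        drift_hz = display_hz - 2 * camera_fps   # -0.02 Hz
        print(f"bar takes about {abs(1 / drift_hz):.0f} s to roll through")   # ~50 s

        # And the "US" rate really is 30 fps slowed by the old 1000/1001 factor:
        print(30 * 1000 / 1001)                  # 29.97002997... fps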

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Max Demian@21:1/5 to All on Thu Mar 24 14:30:40 2022
    On 23/03/2022 12:37, NY wrote:
    "Max Demian" <max_demian@bigfoot.com> wrote in message news:E82dnS-guvgpl6b_nZ2dnUU7-c3NnZ2d@brightview.co.uk...

    It's not immediately obvious which displays work by reflection and
    which by producing light. LCDs work by reflection, but what if it
    relies on a backlight? There are shopping coupons which are scanned by
    a laser in the shop, but, if you don't have access to a printer, you
    are supposed to be able to scan the bar/QR code on a smartphone
    screen; how does that work? What if the display is OLED?

    Like you, I've always wondered how laser barcode scanners manage to scan
    a backlit LED screen rather than a reflective paper coupon (or an LCD screen). Scanners in shops seem to have a switchable setting which maybe turns the laser off and moves the sensor (or relies on movement of the
    phone screen past the scanner) to sense the horizontal axis of the bar
    code.

    I've just scanned an eVoucher on my Motorola E5 Play at the ASDA self
    checkout - actually the assistant held it at the right angle, but I
    don't recall seeing her switch anything.

    --
    Max Demian

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From R. Mark Clayton@21:1/5 to All on Thu Mar 24 10:35:22 2022
    On Saturday, 19 March 2022 at 10:45:02 UTC, Brian Gaff (Sofa) wrote:
    May seem a daft question, but in the old days of tubes, there were various patterns of shadow masks on them. Some tended to be noticeable as vertical stripes, others as little triads, And when we started to get digital video
    as in games, it was not unusual to see oval circles on things, due to the pixel being displayed oblong instead of square, so to speak.
    Brian



    --

    This newsgroup posting comes to you directly from...
    The Sofa of Brian Gaff...
    bri...@blueyonder.co.uk
    Blind user, so no pictures please
    Note this Signature is meaningless.!

    It seems Samsung has opted for a triangular sub-pixel layout on its new OLED TVs, but: -

    https://www.techradar.com/news/samsung-might-have-a-problem-with-its-new-oled-tvs-sub-pixels

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Roderick Stewart@21:1/5 to me@privacy.invalid on Fri Mar 25 10:11:27 2022
    On Thu, 24 Mar 2022 14:12:12 -0000, "NY" <me@privacy.invalid> wrote:

    I would have thought that to be able to use a TV camera to view a TV screen,
    you want the shutter speed to be the full 1/25 second, especially if the TV
    screen fills the majority of the frame. If the screen fills (for example)
    50 lines of the camera, you can use a correspondingly shorter shutter speed
    as the camera will only see the TV screen for 50/625 * 1/25 second.

    Vertical scan is actually 50Hz not 25, but the principle is the same.
    To photograph TV CRT monitors displaying signals with the same frame
    rate as the camera, chip cameras should be set to use no shutter at
    all, so they are sensitive to light 100% of the time.

    There will still be a horizontal split, but if the signals the
    monitors are displaying are derived from the same sync pulses as the
    cameras (which would be usual in a studio setting) then it will be off
    the screen.

    Rod.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From David Woolley@21:1/5 to Roderick Stewart on Fri Mar 25 21:44:43 2022
    On 25/03/2022 10:11, Roderick Stewart wrote:
    Vertical scan is actually 50Hz not 25, but the principle is the same.

    But if interlaced, e.g. standard analogue TV, the whole picture requires
    two vertical scans, one for odd and one for even lines.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From williamwright@21:1/5 to David Woolley on Sat Mar 26 00:10:43 2022
    On 25/03/2022 21:44, David Woolley wrote:
    But if interlaced, e.g. standard analogue TV,

    Analogue TV? Sorry, don't remember...

    Bill

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From NY@21:1/5 to David Woolley on Sat Mar 26 22:07:18 2022
    "David Woolley" <david@ex.djwhome.demon.invalid> wrote in message news:t1ld4c$ce2$1@dont-email.me...
    On 25/03/2022 10:11, Roderick Stewart wrote:
    Vertical scan is actually 50Hz not 25, but the principle is the same.

    But if interlaced, e.g. standard analogue TV, the whole picture requires
    two vertical scans, one for odd and one for even lines.

    I presume that digital tuners and recorders that have an analogue SCART or
    RF output for compatibility with analogue TVs will *always* output
    interlaced, even if the digital source happens to be 25p (SD or HD), because analogue signals are *defined* to be interlaced, and 25p would look very flickery. Obviously in the case of HD, the tuner/PVR also has to downscale
    from 1080 to 576 lines.
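
    A minimal sketch of that 25p-over-50i idea (my own illustration of a
    segmented-frame output, not what any particular tuner or PVR actually does;
    the HD downscale step is omitted):

        import numpy as np

        # A 25p frame can be sent over a 50i link by splitting it into two
        # fields, odd lines in one and even lines in the other, so the display
        # still scans at 50 Hz even though the source is 25p.

        def frame_to_fields(frame: np.ndarray):
            """Split a progressive frame (lines x pixels) into top and bottom fields."""
            top_field = frame[0::2, :]       # lines 0, 2, 4, ...
            bottom_field = frame[1::2, :]    # lines 1, 3, 5, ...
            return top_field, bottom_field

        frame = np.arange(576 * 720).reshape(576, 720)
        top, bottom = frame_to_fields(frame)
        print(top.shape, bottom.shape)       # (288, 720) (288, 720)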

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Brian Gaff (Sofa)@21:1/5 to All on Sun Mar 27 10:00:20 2022
    I was musing on this, and allied to this of course are resolution and aspect ratio as well. It seems these don't all agree exactly, so nowadays software is used to approximate things so they look correct.

    Brian

    --

    This newsgroup posting comes to you directly from...
    The Sofa of Brian Gaff...
    briang1@blueyonder.co.uk
    Blind user, so no pictures please
    Note this Signature is meaningless.!
    "R. Mark Clayton" <notyalckram@gmail.com> wrote in message news:8b1b2163-9dd7-45ac-8891-c633a268e76dn@googlegroups.com...
    On Saturday, 19 March 2022 at 10:45:02 UTC, Brian Gaff (Sofa) wrote:
    May seem a daft question, but in the old days of tubes, there were
    various
    patterns of shadow masks on them. Some tended to be noticeable as
    vertical
    stripes, others as little triads, And when we started to get digital
    video
    as in games, it was not unusual to see oval circles on things, due to the
    pixel being displayed oblong instead of square, so to speak.
    Brian



    --

    This newsgroup posting comes to you directly from...
    The Sofa of Brian Gaff...
    bri...@blueyonder.co.uk
    Blind user, so no pictures please
    Note this Signature is meaningless.!

    It seems Samsung has opted for a triangular sub-pixel layout on its new OLED TVs, but: -

    https://www.techradar.com/news/samsung-might-have-a-problem-with-its-new-oled-tvs-sub-pixels

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Roderick Stewart@21:1/5 to me@privacy.invalid on Sun Mar 27 10:44:42 2022
    On Sat, 26 Mar 2022 22:07:18 -0000, "NY" <me@privacy.invalid> wrote:

    "David Woolley" <david@ex.djwhome.demon.invalid> wrote in message >news:t1ld4c$ce2$1@dont-email.me...
    On 25/03/2022 10:11, Roderick Stewart wrote:
    Vertical scan is actually 50Hz not 25, but the principle is the same.

    But if interlaced, e.g. standard analogue TV, the whole picture requires
    two vertical scans, one for odd and one for even lines.

    I presume that digital tuners and recorders that have an analogue SCART or
    RF output for compatibility with analogue TVs will *always* output
    interlaced, even if the digital source happens to be 25p (SD or HD), because
    analogue signals are *defined* to be interlaced, and 25p would look very
    flickery. Obviously in the case of HD, the tuner/PVR also has to downscale
    from 1080 to 576 lines.

    25p wouldn't work directly with an analogue display, which is designed
    to scan vertically at 50Hz. It wouldn't synchronise and you wouldn't
    get a stable display.

    Rod.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Paul Ratcliffe@21:1/5 to mark.carver@invalid.invalid on Tue Mar 29 18:58:30 2022
    On Wed, 23 Mar 2022 15:35:52 +0000, Mark Carver
    <mark.carver@invalid.invalid> wrote:

    Back in the 1970s, I was showing off Ceefax at the TV exhibition in
    Montreaux.

    Crystal Palace had one hell of a range !

    no, the signal was coming from a VT on the Ampex stand 2 floors down.

    Ah, very good. Back in those days even almost-rival manufacturers would happily swap feeds between stands.
    Lots of Hum-Buckers everywhere of course. SDI took the fun out of that,
    but by then the 'grown-ups' had put a stop to those sort of things.

    I could get R2 on my car radio (except under the overhead tram wires).

    Yep, good old Droitwich normally made it with the radio near the hotel
    room window

    I picked it up (R4 then of course) on higher ground in Majorca on the bog-standard hire car radio, about 25 years ago.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)