• Your eyes only see at 10fps, so are you wasting your money on tech?

    From Dimensional Traveler@21:1/5 to All on Sun Apr 9 16:36:24 2023
    https://www.msn.com/en-us/news/technology/your-eyes-only-see-at-10fps-so-are-you-wasting-your-money-on-tech/ar-AA19DxZV?ocid=winpstoreapp&cvid=5c2b902737724b57f44ed9e0d0c51659&ei=14
    TinyURL version: https://tinyurl.com/msdhwnu7

    Higher frame rates have been topping wish lists for camera gear,
    smartphones and – perhaps most of all – gaming hardware.

    With viewfinders on cameras like the Canon EOS R3 hitting 120fps,
    Apple's ProMotion clocking in at 120Hz, and a growing market for 144Hz
    monitors (or higher) and matching GPUs, it's hard not to fetishize the
    tech – but can people actually perceive the difference? Some, like FilmmakerIQ, say not.

    Gamers also love to cite a test given to potential USAF fighter pilots;
    an image of an aircraft is flashed on a screen for 1/220 sec and the
    pilots can identify the aircraft they've seen. They'll tell you that
    this means the human eye can see at least 220fps (and some invest
    accordingly). Any photographer can see the flaw in this logic, though.
    After all, you can see a xenon flash fire, and that's just a fraction of
    a millisecond.

    Looked at differently, if you were shooting video at 24fps and fired a
    xenon flash, the frame during which the flash fired would be brightened,
    even though the flash is only 'on' for a fraction of the time the shutter
    is open. So even if the human eye were working at 10fps, it would still
    register a plane flashed up for just 1/220 sec.
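
    To put rough numbers on it, here's a quick Python sketch (every value is
    invented purely for illustration) of a sensor integrating light across one
    24fps exposure window:

    # A back-of-the-envelope sketch (all numbers invented) of why a
    # sub-millisecond flash still shows up in a 24fps frame: the sensor
    # integrates every bit of light that arrives while the shutter is open,
    # however briefly the flash fires.

    FRAME_RATE = 24              # frames per second
    EXPOSURE = 1 / FRAME_RATE    # shutter open for ~41.7 ms per frame

    ambient_level = 100          # arbitrary light units per second
    flash_level = 500_000        # a xenon flash is intense but very brief
    flash_duration = 0.0005      # 0.5 ms, a tiny fraction of the exposure

    frame_without_flash = ambient_level * EXPOSURE
    frame_with_flash = ambient_level * EXPOSURE + flash_level * flash_duration

    print(f"frame brightness without flash: {frame_without_flash:.1f}")
    print(f"frame brightness with flash:    {frame_with_flash:.1f}")
    # The flash frame is far brighter even though the flash was 'on' for
    # barely 1% of the time the shutter was open.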

    Alright, but a faster refresh rate like ProMotion does look smoother, so
    that proves it? No. Motion on a phone screen, an old 30Hz monitor, or a super-fast gaming display is still a succession of still images. When
    things 'move' on them, scientists call this 'Apparent Motion,' the basis
    of all animation.

    Move a mouse around fast and you'll see small gaps and multiple mouse
    pointers on the display. The faster the refresh rate, the more pointers
    and the smaller the gaps – but you'll still likely see multiple
    instances, which actually supports the idea that the eye's 'refresh
    rate' is lower than the monitor's.
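
    Here's a similarly rough Python sketch (again, made-up numbers) of why
    those cursor gaps shrink, but never quite vanish, as the refresh rate
    climbs:

    # A rough sketch of the cursor-trail maths: sweep the mouse across the
    # screen in a fixed time and the gap between successive cursor images
    # shrinks as the refresh rate rises, but it never disappears.

    sweep_pixels = 1920      # cursor crosses the full width of the screen...
    sweep_seconds = 0.25     # ...in a quarter of a second

    for hz in (30, 60, 120, 144, 240):
        gap = sweep_pixels / sweep_seconds / hz   # pixels between successive frames
        print(f"{hz:>3} Hz refresh -> ~{gap:.0f} px between cursor images")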

    Okay, so where it really gets interesting is with research from
    Adolphe-Moïse Bloch in 1885, which found that – below a certain duration
    (or 'exposure', let's say) – the eye perceives a light as less bright the
    more briefly it is seen. Above that exposure, perceived brightness is no
    longer affected by duration. Bloch and other scientists put that critical
    duration, below which perception depends on how long the light lasts, at
    – drum roll – 100 milliseconds. Or a tenth of a second.
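
    As a toy model in Python (a simplification of Bloch's finding, not
    anything taken from his paper), the relationship looks something like
    this:

    # A toy model of Bloch's law: below a critical duration of roughly
    # 100 ms, perceived brightness tracks intensity multiplied by duration;
    # above it, duration stops mattering.

    CRITICAL_DURATION = 0.1   # seconds, the ~100 ms figure cited above

    def perceived_brightness(intensity: float, duration: float) -> float:
        """Perceived brightness under this simplified Bloch's-law model."""
        effective_time = min(duration, CRITICAL_DURATION)
        return intensity * effective_time

    for ms in (5, 20, 50, 100, 200, 400):
        value = perceived_brightness(1.0, ms / 1000)
        print(f"{ms:>3} ms flash -> relative brightness {value:.3f}")
    # The figure climbs until 100 ms and then flattens out: looking for
    # longer doesn't make the light look any brighter.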

    Unlike a camera, the eye has no digital clock driving capture and
    readout. It is always active, so there is no need for – and there isn't –
    an actual frame rate. The human eye also has different areas of
    perception: the high-resolution fovea – the middle – sees color better
    but responds more slowly, while peripheral vision is better adapted, for
    evolutionary reasons, to picking up movement.

    Even peripheral vision, though, can't usually detect the flicker of, say,
    a low-energy lightbulb pulsing somewhere around 60-90Hz. Video makers,
    however, will be well aware that strobing is something cameras can easily
    pick up if the shutter speed is wrong.

    The stroboscopic effect, however, can be seen by the eye. You'll know it
    most from videos of a wheel turning in which, at a certain point, the
    wheel appears to be turning the other way. JF Schouten, in 1967, showed
    that humans viewing a rotating subject in continuous light (no flicker)
    nevertheless saw a 'subjective stroboscopic' effect, the first occurring
    at 8-12 cycles per second (so, yes, around 10Hz again).
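
    You can sketch the wheel illusion with a fixed-rate sampler in a few
    lines of Python. This models a camera, or a hypothetical 10Hz sampler,
    rather than the eye itself, which, as above, has no literal frame rate:

    # A small sketch of how sampling at a fixed rate produces the wagon-wheel
    # effect: the wheel's apparent per-sample rotation is the true rotation
    # aliased into the +/-180 degree range.

    SAMPLE_RATE = 10.0   # samples per second, the ~10 Hz figure discussed above

    def apparent_step(rpm: float) -> float:
        """Degrees the wheel appears to turn between samples, wrapped to +/-180."""
        degrees_per_sample = rpm * 360.0 / 60.0 / SAMPLE_RATE
        return (degrees_per_sample + 180.0) % 360.0 - 180.0

    for rpm in (60, 150, 330, 570, 600):
        step = apparent_step(rpm)
        direction = "forwards" if step > 0 else "backwards" if step < 0 else "stationary"
        print(f"{rpm:>3} rpm -> {step:+5.0f} degrees per sample ({direction})")
    # Near whole multiples of the sample rate the wheel looks almost stopped,
    # and just past them it appears to rotate the other way.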

    Since then, different researchers have pursued the idea that this
    reveals a frame rate (some basing their conclusions on LSD users'
    perceptions of their experiences). The most recent research seems clear, though: there is no frame rate. Biology is just more complex.

    All of which is a very long way of explaining why Peter Jackson might
    have been wrong to choose HFR (High Frame Rate) for The Hobbit!


    --
    I've done good in this world. Now I'm tired and just want to be a cranky
    dirty old man.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From JAB@21:1/5 to All on Mon Apr 10 11:17:26 2023
    That's an argument that's been running for a long time and I've seen
    various takes on it, from the more scientific approach (MIT puts it at
    about '75Hz') to gamers who will swear blind it's much higher than that.

    Personally I find that the most important thing is a mostly constant
    frame rate. For me that is particularly noticeable at low frame rates,
    where fluctuations start putting it into the territory where you start
    seeing individual frames.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Spalls Hurgenson@21:1/5 to dtravel@sonic.net on Mon Apr 10 09:08:05 2023
    On Sun, 9 Apr 2023 16:36:24 -0700, Dimensional Traveler
    <dtravel@sonic.net> wrote:

    > https://www.msn.com/en-us/news/technology/your-eyes-only-see-at-10fps-so-are-you-wasting-your-money-on-tech/ar-AA19DxZV?ocid=winpstoreapp&cvid=5c2b902737724b57f44ed9e0d0c51659&ei=14
    > TinyURL version: https://tinyurl.com/msdhwnu7

    I have no idea what point this article is trying to make. It's
    probably because I'm so distracted by that clickbaity headline.

    Of course our eyes don't see in "framerates". Even if you could easily
    determine what 'speed' each component of our visual system runs at, that
    speed isn't constant between each part, or even for each part throughout
    the day (cells get tired too). It's not like there's a centralized
    'clock' inside our head keeping all our neurons (or the cones and rods in
    the eyes) firing in lockstep all the time. Our bodies aren't developing
    an "image" at a constant rate. It's a flow of signals that are mushily
    amalgamated into sight.

    Computers (and other displays) take advantage of this mushiness. A
    flickering sequence of 10 or 22 or 120 pictures moved rapidly in front of
    us appears to move because our own uneven visual and neural processes
    smush all this regularly spaced data together into a smoother stream. But
    you certainly can distinguish between frame rates higher than 10fps,
    because our different cones/rods - even if largely regulated to a ~10Hz
    rate - fire off at staggered moments within that cycle (e.g., within one
    tenth-of-a-second window, some fire at the start, some a little later,
    some later still).
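
    Here's a toy Python sketch of that staggered-firing idea (my numbers are
    invented, and real receptors are far messier, but it shows the
    principle):

    # Lots of receptors, each 'refreshing' at 10Hz but out of phase with one
    # another, collectively sample much more often than any one of them does.

    NUM_RECEPTORS = 5
    RECEPTOR_RATE = 10.0                 # each receptor samples 10 times a second
    period = 1.0 / RECEPTOR_RATE

    # Stagger each receptor's phase evenly across one 0.1 s cycle, then
    # collect one second's worth of samples from the whole ensemble.
    sample_times = sorted(
        offset * period / NUM_RECEPTORS + n * period
        for offset in range(NUM_RECEPTORS)
        for n in range(int(RECEPTOR_RATE))
    )

    gaps = [later - earlier for earlier, later in zip(sample_times, sample_times[1:])]
    print(f"single receptor: one sample every {period * 1000:.0f} ms")
    print(f"ensemble of {NUM_RECEPTORS}: one sample every {max(gaps) * 1000:.0f} ms")
    # No individual receptor exceeds 10Hz, yet the ensemble effectively
    # samples every 20 ms here - about 50Hz.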

    Consistency is more important. Our eyes and brains are wired to see
    irregular movement in a cluttered environment - whether predator,
    prey, or a branch falling down on us - in order to give us time to
    react. Low framerates cause gaps in motion that trigger this effect.
    Move an object across a 20" screen at 100fps, and that object moves
    5mm per frame. Move that same object at the same speed at 10fps, and it
    jumps 50mm per frame; even with our incredible visual processing
    interpolating across the gaps, there is still significant jerkiness.
    More powerful tech allows you to ensure a more consistent framerate.
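
    The arithmetic is easy enough to play with in Python (the speed is my own
    assumption: an object crossing a ~440mm-wide 20" screen in a bit under a
    second):

    # Per-frame jump size for a constant on-screen speed at various framerates.

    object_speed_mm_per_s = 500.0   # assumed constant on-screen speed

    for fps in (10, 24, 30, 60, 100, 144):
        step = object_speed_mm_per_s / fps
        print(f"{fps:>3} fps -> jumps {step:.1f} mm per frame")
    # At 100 fps the object shifts a smooth 5 mm per frame; at 10 fps it
    # lurches 50 mm at a time, exactly the sort of gap our motion-sensitive
    # vision flags.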

    So ultimately the article is making a pedantic apples-to-oranges
    comparison that doesn't help anyone. Our brain doesn't generate
    "frames" of images, so we don't really see "frames per second"... but
    there is a definite and noticeable advantage to higher framerates. What
    constitutes "high enough" will vary from person to person; some might
    be able to notice a difference between 100 and 200fps; I seem to top
    out around 80Hz, but that's probably because I'm getting older (damn
    whippersnappers with their young eyes!) Or maybe it's because - back in
    the day - I played video games where 10fps was a luxury, and trained
    my brain to ignore anything more than an order of magnitude
    beyond that. But even if you can't directly notice the difference
    between framerates, you're probably still benefiting from it ;-)


    TL;DR: Money spent on tech that outputs at 120fps (or higher) isn't
    wasted... but the author of the posted article might be. ;-)

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Xocyll@21:1/5 to All on Tue Apr 11 03:58:05 2023
    JAB <noway@nochance.com> looked up from reading the entrails of the porn spammer to utter "The Augury is good, the signs say:

    > That's an argument that's been running for a long time and I've seen
    > various takes on it, from the more scientific approach (MIT puts it at
    > about '75Hz') to gamers who will swear blind it's much higher than that.
    >
    > Personally I find that the most important thing is a mostly constant
    > frame rate. For me that is particularly noticeable at low frame rates,
    > where fluctuations start putting it into the territory where you start
    > seeing individual frames.

    Different people, different eyes.

    Some people can see higher frequencies, some cannot.
    I personally never saw a point to anything over 30 fps, since I can't
    tell a difference over that.

    Other people do.


    Those new traffic lights that are a grid of pulsed LEDs instead of a
    single bulb are a case in point.

    To most people they appear as a steady light, since the pulsing is at a
    high frequency, but some people see them as flickering.

    Some of us have Ferrari eyes, some Saab eyes, some Trabant eyes.

    Xocyll
    --
    I don't particularly want you to FOAD, myself. You'll be more of
    a cautionary example if you'll FO And Get Chronically, Incurably,
    Painfully, Progressively, Expensively, Debilitatingly Ill. So
    FOAGCIPPEDI. -- Mike Andrews responding to an idiot in asr

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)