• Nikon ditches mechanical shutter in new pro camera

    From RichA@21:1/5 to All on Thu Oct 28 07:35:16 2021
    https://petapixel.com/2021/10/28/nikon-unveils-the-z9-45-7mp-120fps-8k-and-no-mechanical-shutter/

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Incubus@21:1/5 to RichA on Thu Oct 28 14:47:46 2021
    On 2021-10-28, RichA <rander3128@gmail.com> wrote:
    https://petapixel.com/2021/10/28/nikon-unveils-the-z9-45-7mp-120fps-8k-and-no-mechanical-shutter/

    Nikon have just pissed all over Canon and Sony with this release.
    Although I'll be sticking with my D3 where having a pro body is
    concerned, I really hope this puts Nikon back where they belong at the
    top of the food chain.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Alan Browne@21:1/5 to Incubus on Thu Oct 28 13:37:32 2021
    On 2021-10-28 10:47, Incubus wrote:
    On 2021-10-28, RichA <rander3128@gmail.com> wrote:
    https://petapixel.com/2021/10/28/nikon-unveils-the-z9-45-7mp-120fps-8k-and-no-mechanical-shutter/

    Nikon have just pissed all over Canon and Sony with this release.
    Although I'll be sticking with my D3 where having a pro body is
    concerned, I really hope this puts Nikon back where they belong at the
    top of the food chain.

    "Belong"? That's a pretty emotional statement. The only way to lead is
    to innovate and make sales.

    Also your food chain analogy is just wrong.


    --
    "...there are many humorous things in this world; among them the white
    man's notion that he is less savage than the other savages."
    -Samuel Clemens

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From RichA@21:1/5 to Alan Browne on Thu Oct 28 16:42:13 2021
    On Thursday, 28 October 2021 at 13:37:38 UTC-4, Alan Browne wrote:
    On 2021-10-28 10:47, Incubus wrote:
    On 2021-10-28, RichA <rande...@gmail.com> wrote:
    https://petapixel.com/2021/10/28/nikon-unveils-the-z9-45-7mp-120fps-8k-and-no-mechanical-shutter/

    Nikon have just pissed all over Canon and Sony with this release.
    Although I'll be sticking with my D3 where having a pro body is
    concerned, I really hope this puts Nikon back where they belong at the
    top of the food chain.
    "Belong"? That's a pretty emotional statement. The only way to lead is
    to innovate and make sales.

    Also your food chain analogy is just wrong.

    Nikon and Canon HORRIBLY dragged their feet on mirrorless.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Incubus@21:1/5 to Alan Browne on Fri Oct 29 09:12:42 2021
    On 2021-10-28, Alan Browne <bitbucket@blackhole.com> wrote:
    On 2021-10-28 10:47, Incubus wrote:
    On 2021-10-28, RichA <rander3128@gmail.com> wrote:
    https://petapixel.com/2021/10/28/nikon-unveils-the-z9-45-7mp-120fps-8k-and-no-mechanical-shutter/

    Nikon have just pissed all over Canon and Sony with this release.
    Although I'll be sticking with my D3 where having a pro body is
    concerned, I really hope this puts Nikon back where they belong at the
    top of the food chain.

    "Belong"? That's a pretty emotional statement. The only way to lead is
    to innovate and make sales.

    Which they did from 1959 until circa 1986 and then again with the DSLR.

    Also your food chain analogy is just wrong.

    Not really.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Incubus@21:1/5 to RichA on Fri Oct 29 09:13:32 2021
    On 2021-10-28, RichA <rander3128@gmail.com> wrote:
    On Thursday, 28 October 2021 at 13:37:38 UTC-4, Alan Browne wrote:
    On 2021-10-28 10:47, Incubus wrote:
    On 2021-10-28, RichA <rande...@gmail.com> wrote:
    https://petapixel.com/2021/10/28/nikon-unveils-the-z9-45-7mp-120fps-8k-and-no-mechanical-shutter/

    Nikon have just pissed all over Canon and Sony with this release.
    Although I'll be sticking with my D3 where having a pro body is
    concerned, I really hope this puts Nikon back where they belong at the
    top of the food chain.
    "Belong"? That's a pretty emotional statement. The only way to lead is
    to innovate and make sales.

    Also your food chain analogy is just wrong.

    Nikon and Canon HORRIBLY dragged their feet on mirrorless.

    Nikon more so. However, they understand better than Sony what
    photographers need.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Whisky-dave@21:1/5 to Incubus on Fri Oct 29 05:01:22 2021
    On Thursday, 28 October 2021 at 15:47:51 UTC+1, Incubus wrote:
    On 2021-10-28, RichA <rande...@gmail.com> wrote:
    https://petapixel.com/2021/10/28/nikon-unveils-the-z9-45-7mp-120fps-8k-and-no-mechanical-shutter/

    Nikon have just pissed all over Canon and Sony with this release.
    Although I'll be sticking with my D3 where having a pro body is
    concerned, I really hope this puts Nikon back where they belong at the
    top of the food chain.

    My EOS M6 has an option to use an electronic shutter, and it's useful as it doesn't make such
    a loud noise when operated, BUT certain other functions aren't available when it is selected, notably
    not being able to use the high or low frame rates (FPS) when using 'drive', plus a few other things I've forgotten.
    Another was something about distortion of fast-moving objects such as propellers.
    But I'm not sure how this compares to a mechanical shutter.
    I thought about doing a test with my fan heater, but unfortunately it's a Dyson so it is effectively fanless
    from a photography POV. Guess I'll have to wait for a helicopter to fly over.
    Or a drone of course; who do we know with a drone that could pop over to London for half an hour or so for me to run a test ;-)

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Incubus@21:1/5 to Whisky-dave on Fri Oct 29 14:34:21 2021
    On 2021-10-29, Whisky-dave <whisky.dave@gmail.com> wrote:
    On Thursday, 28 October 2021 at 15:47:51 UTC+1, Incubus wrote:
    On 2021-10-28, RichA <rande...@gmail.com> wrote:
    https://petapixel.com/2021/10/28/nikon-unveils-the-z9-45-7mp-120fps-8k-and-no-mechanical-shutter/

    Nikon have just pissed all over Canon and Sony with this release.
    Although I'll be sticking with my D3 where having a pro body is
    concerned, I really hope this puts Nikon back where they belong at the
    top of the food chain.

    My EOS M6 has the an option of using a electronic shutter, and it;s usefull as it doesn't make such
    a loudish noise when opertated, BUT certain other functions aren't availble when this is selected noticably
    not being able to use the Hi or low frame rates (FPS) when using 'drive' , plus a few others things I've forgotten.
    Another was something about distortion of fast moving objects such as propellors.
    But I'm not sure how this compares to a mechanical shutter.
    I thought about doing a test with my fan heater, but unfortunanly it's a dyson so is effectively fanless
    from a photograhy POV. Guess I'll have to wait for a helicopter to fly over Or a drone of course , who do we know with a drone that could pop over to London for 1/2 an hour or so for me to run a test ;-)

    I think newer sensors don't have the issue with propellers, rotor blades
    etc. I don't know how LED lighting will work.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Alfred Molon@21:1/5 to All on Fri Oct 29 18:37:48 2021
    Am 29.10.2021 um 16:34 schrieb Incubus:
    I think newer sensors don't have the issue with propellors, rotor blades
    etc. I don't know how LED lighting will work.

    I think the Z9 doesn't have a global shutter, only a fast readout
    sensor. Fast moving things could be a problem.
    --
    Alfred Molon

    Olympus 4/3 and micro 4/3 cameras forum at
    https://groups.io/g/myolympus
    https://myolympus.org/ photo sharing site

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From RichA@21:1/5 to Whisky-dave on Sun Oct 31 14:27:27 2021
    On Friday, 29 October 2021 at 08:01:26 UTC-4, Whisky-dave wrote:
    On Thursday, 28 October 2021 at 15:47:51 UTC+1, Incubus wrote:
    On 2021-10-28, RichA <rande...@gmail.com> wrote:
    https://petapixel.com/2021/10/28/nikon-unveils-the-z9-45-7mp-120fps-8k-and-no-mechanical-shutter/

    Nikon have just pissed all over Canon and Sony with this release.
    Although I'll be sticking with my D3 where having a pro body is
    concerned, I really hope this puts Nikon back where they belong at the
    top of the food chain.
    My EOS M6 has the an option of using a electronic shutter, and it;s usefull as it doesn't make such
    a loudish noise when opertated, BUT certain other functions aren't availble when this is selected noticably
    not being able to use the Hi or low frame rates (FPS) when using 'drive' , plus a few others things I've forgotten.
    Another was something about distortion of fast moving objects such as propellors.
    But I'm not sure how this compares to a mechanical shutter.
    I thought about doing a test with my fan heater, but unfortunanly it's a dyson so is effectively fanless
    from a photograhy POV. Guess I'll have to wait for a helicopter to fly over Or a drone of course , who do we know with a drone that could pop over to London for 1/2 an hour or so for me to run a test ;-)

    Nikon may have figured out a way to minimize that effect.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From RichA@21:1/5 to Alfred Molon on Sun Oct 31 14:27:53 2021
    On Friday, 29 October 2021 at 12:37:52 UTC-4, Alfred Molon wrote:
    Am 29.10.2021 um 16:34 schrieb Incubus:
    I think newer sensors don't have the issue with propellors, rotor blades etc. I don't know how LED lighting will work.
    I think the Z9 doesn't have a global shutter, only a fast readout
    sensor. Fast moving things could be a problem.
    --
    Alfred Molon


    No global shutters in consumer cameras as of yet.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Fishrrman@21:1/5 to Alfred Molon on Sun Oct 31 19:15:12 2021
    On 10/29/21 12:37 PM, Alfred Molon wrote:

    I think the Z9 doesn't have a global shutter, only a fast
    readout sensor. Fast moving things could be a problem.

    First fully global shutter will come on Canon R1 when it
    gets released...

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Incubus@21:1/5 to Alfred Molon on Mon Nov 1 13:53:26 2021
    On 2021-10-29, Alfred Molon <alfred_molon@yahoo.com> wrote:
    Am 29.10.2021 um 16:34 schrieb Incubus:
    I think newer sensors don't have the issue with propellors, rotor blades
    etc. I don't know how LED lighting will work.

    I think the Z9 doesn't have a global shutter, only a fast readout
    sensor. Fast moving things could be a problem.

    I can't imagine they would release it if fast-moving things are a
    problem, given that it is designed for fast-moving things.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Whisky-dave@21:1/5 to Incubus on Mon Nov 1 06:34:47 2021
    On Friday, 29 October 2021 at 15:34:26 UTC+1, Incubus wrote:
    On 2021-10-29, Whisky-dave <whisk...@gmail.com> wrote:
    On Thursday, 28 October 2021 at 15:47:51 UTC+1, Incubus wrote:
    On 2021-10-28, RichA <rande...@gmail.com> wrote:
    https://petapixel.com/2021/10/28/nikon-unveils-the-z9-45-7mp-120fps-8k-and-no-mechanical-shutter/

    Nikon have just pissed all over Canon and Sony with this release.
    Although I'll be sticking with my D3 where having a pro body is
    concerned, I really hope this puts Nikon back where they belong at the
    top of the food chain.

    My EOS M6 has the an option of using a electronic shutter, and it;s usefull as it doesn't make such
    a loudish noise when opertated, BUT certain other functions aren't availble when this is selected noticably
    not being able to use the Hi or low frame rates (FPS) when using 'drive' , plus a few others things I've forgotten.
    Another was something about distortion of fast moving objects such as propellors.
    But I'm not sure how this compares to a mechanical shutter.
    I thought about doing a test with my fan heater, but unfortunanly it's a dyson so is effectively fanless
    from a photograhy POV. Guess I'll have to wait for a helicopter to fly over Or a drone of course , who do we know with a drone that could pop over to London for 1/2 an hour or so for me to run a test ;-)
    I think newer sensors don't have the issue with propellors, rotor blades
    etc. I don't know how LED lighting will work.

    On TV you can sometimes see a car's headlights flicker on and off.
    There's an anti-flicker option on my EOS 6 MkII but I haven't used it yet; maybe it auto-adjusts.

    I did a small 4K video at home, and I have LED lights, and didn't notice any flicker.

    Students built one of these, so I filmed the result at home; easier than taking my camera to work.
    https://youtu.be/AUE2CH9uZNc

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Whisky-dave@21:1/5 to Incubus on Tue Nov 2 06:22:02 2021
    On Monday, 1 November 2021 at 13:53:31 UTC, Incubus wrote:
    On 2021-10-29, Alfred Molon <alfred...@yahoo.com> wrote:
    Am 29.10.2021 um 16:34 schrieb Incubus:
    I think newer sensors don't have the issue with propellors, rotor blades
    etc. I don't know how LED lighting will work.

    I think the Z9 doesn't have a global shutter, only a fast readout
    sensor. Fast moving things could be a problem.
    I can't imagine they would release if it fast moving things are a
    problem given that it is designed for fast moving things.

    I guess it depends on how fast a thing is actually moving as to whether it affects the final image.
    But it's only 120 fps; my six-year-old iPhone can do 240 fps, at a lower resolution, sure.

    The fastest shutter speed is 1/32000 s, fast but not so incredible that nothing will have motion blur.
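
    As a rough back-of-envelope check of that (Python; the rpm and blade length below are assumed purely for illustration, not taken from any real aircraft):

        import math

        # Hypothetical propeller, figures assumed for illustration only.
        rpm = 2400                 # propeller speed
        radius_m = 1.0             # hub-to-tip blade length
        exposure_s = 1 / 32000     # fastest shutter speed mentioned above

        tip_speed = 2 * math.pi * radius_m * rpm / 60    # tip speed in m/s
        blur_mm = tip_speed * exposure_s * 1000           # distance the tip travels during the exposure

        print(f"tip speed ~ {tip_speed:.0f} m/s, blur ~ {blur_mm:.1f} mm at 1/32000 s")

    That works out to roughly 250 m/s and about 8 mm of smear, so even the fastest electronic shutter speed still leaves some motion blur on a fast prop tip.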

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Savageduck@21:1/5 to All on Tue Nov 2 07:48:47 2021
    On Nov 2, 2021, Whisky-dave wrote
    (in article<8ec0fee5-b272-42a7-98ed-6c522f266726n@googlegroups.com>):

    On Monday, 1 November 2021 at 13:53:31 UTC, Incubus wrote:
    On 2021-10-29, Alfred Molon<alfred...@yahoo.com> wrote:
    Am 29.10.2021 um 16:34 schrieb Incubus:
    I think newer sensors don't have the issue with propellors, rotor blades
    etc. I don't know how LED lighting will work.

    I think the Z9 doesn't have a global shutter, only a fast readout
    sensor. Fast moving things could be a problem.
    I can't imagine they would release if it fast moving things are a
    problem given that it is designed for fast moving things.

    I guess it depends on how fast a thing is actually moving as to whether it affects the final image.
    But it's only 120 fps my 6 yearv old iphone can do 240 fps lower resolution sure.

    fastest shutter speed is 1/32000 fast but not so incredable that nothing will be have motion blur.

    The issue with an electronic shutter isn't motion blur. The problem is the rolling shutter effect, as we have yet to see a global shutter in a pro or consumer camera.
    <https://en.wikipedia.org/wiki/Rolling_shutter>

    As for high frame rates of 120 fps or 240 fps, that is a video concern rather than a stills photography issue.

    ...and here is what rolling shutter can look like in an electronic shutter image. Check the hummingbird wing tip.
    <https://photos.smugmug.com/photos/i-9d3QSvv/0/5bfddd31/O/i-9d3QSvv.jpg>
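
    A minimal sketch of the mechanism behind that photo, with made-up numbers for the readout time and wing-tip speed: each row is sampled slightly later than the one above it, so a fast-moving tip is recorded at a different position on every row, which is the skew in the image.

        # Rolling-shutter skew: rows are exposed one after another, so a moving
        # subject is recorded at a different position on each row.
        rows = 6000                  # sensor rows (assumed)
        frame_readout_s = 1 / 250    # time to scan the whole frame, top to bottom (assumed)
        subject_speed_mps = 20.0     # sideways speed of the wing tip (assumed)

        row_delay_s = frame_readout_s / rows
        for row in (0, rows // 2, rows - 1):          # top, middle and bottom of the frame
            t = row * row_delay_s                     # how much later this row is sampled
            shift_mm = subject_speed_mps * t * 1000   # how far the subject moved by then
            print(f"row {row:4d}: +{t * 1000:5.2f} ms, subject shifted {shift_mm:5.1f} mm")

    A global shutter would make row_delay_s effectively zero for the capture itself, which is what the rest of this thread argues about.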

    --
    Regards,
    Savageduck

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Whisky-dave@21:1/5 to Savageduck on Wed Nov 3 06:21:25 2021
    On Tuesday, 2 November 2021 at 14:48:57 UTC, Savageduck wrote:
    On Nov 2, 2021, Whisky-dave wrote
    (in article<8ec0fee5-b272-42a7...@googlegroups.com>):
    On Monday, 1 November 2021 at 13:53:31 UTC, Incubus wrote:
    On 2021-10-29, Alfred Molon<alfred...@yahoo.com> wrote:
    Am 29.10.2021 um 16:34 schrieb Incubus:
    I think newer sensors don't have the issue with propellors, rotor blades
    etc. I don't know how LED lighting will work.

    I think the Z9 doesn't have a global shutter, only a fast readout sensor. Fast moving things could be a problem.
    I can't imagine they would release if it fast moving things are a problem given that it is designed for fast moving things.

    I guess it depends on how fast a thing is actually moving as to whether it affects the final image.
    But it's only 120 fps my 6 yearv old iphone can do 240 fps lower resolution sure.

    fastest shutter speed is 1/32000 fast but not so incredable that nothing will be have motion blur.
    The issue isn’t motion blur that is the problem with an electronic shutter.

    I think the problem will have the same effect, but the term 'shutter' is at fault here, which is why there is confusion in terms.


    The problem is the rolling shutter effect as we are yet to see a global shutter in a pro, or consumer camera.
    <https://en.wikipedia.org/wiki/Rolling_shutter>

    My 'image' of a global shutter must be different, then.
    I assumed a global shutter would suffer from the Heisenberg principle, just like in Star Trek.
    You just can't have such a sensor; you can't sample ~30 megapixels instantaneously, presently anyway, so I doubt Nikon has achieved it
    even with the fastest processor we have today.


    As for high frame rates of 120 fps, or 240 fps that is a video concern rather than a stills photography issue.

    ...and here is what rolling shutter can look like with an electronic shutter image. Check thehummingbird wing tip.
    <https://photos.smugmug.com/photos/i-9d3QSvv/0/5bfddd31/O/i-9d3QSvv.jpg>

    Your hummingbirds are just deformed ;-)

    My pigeons are perfect specimens of pigeonyness. https://www.flickr.com/photos/whiskydave/51575096534/



    --
    Regards,
    Savageduck

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Alan Browne@21:1/5 to Whisky-dave on Wed Nov 3 11:29:02 2021
    On 2021-11-03 09:21, Whisky-dave wrote:

    My 'image' of a global shutter must be different then.
    I assumed a global shutter would suffer from Heisenberg princible just like in star trek.

    Whatever happens in Star Trek with respect to the Heisenberg principle is likely
    (at best) a misconstrued attempt at putting "science" into space opera.

    You just can;t have such a sensor you can't sample ~30meg pixels instantaneously, presently anyway so I doubt Nikon has achived it
    even with the fastest proccessor we have today.

    An array can theoretically be devised that samples all 30 Mpixels
    simultaneously for an arbitrary shutter period. It would require that
    all pixel sites be triggered by a single sampling signal ("strobe").
    (There may be a trivial propagation delay, on the nanosecond or smaller
    scale, in the sampling signal.)

    This is independent of the time to offload the data, which would define
    the frame rate (or, specifically, the inter-frame delay). Thus it is
    independent of the processor speed.
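
    A toy timing model of that separation (Python; every figure below is an illustrative assumption, not a real sensor spec). The point is only that the strobe skew is negligible against the exposure, and that the achievable frame rate is set by the offload, not by the capture:

        # Hypothetical global-shutter timing; all numbers are illustrative assumptions.
        exposure_s    = 1 / 8000        # how long the strobe holds every pixel sampling
        strobe_skew_s = 1e-9            # propagation delay of the strobe across the die (~1 ns)
        pixels        = 30_000_000
        offload_rate  = 2_000_000_000   # pixels per second the readout path can move (assumed)

        readout_s = pixels / offload_rate            # time to offload one whole frame
        frame_interval_s = exposure_s + readout_s    # capture, then serial offload

        print(f"strobe skew is {strobe_skew_s / exposure_s * 1e6:.0f} ppm of the exposure")
        print(f"frame rate limited to ~{1 / frame_interval_s:.0f} fps by the offload, not the capture")

    In other words, a slow offload only lowers the frame rate; it doesn't smear the exposure the way a rolling readout does.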

    --
    "...there are many humorous things in this world; among them the white
    man's notion that he is less savage than the other savages."
    -Samuel Clemens

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Incubus@21:1/5 to Alan Browne on Wed Nov 3 16:15:06 2021
    On 2021-11-03, Alan Browne <bitbucket@blackhole.com> wrote:
    On 2021-11-03 09:21, Whisky-dave wrote:

    My 'image' of a global shutter must be different then.
    I assumed a global shutter would suffer from Heisenberg princible just like in star trek.

    Whatever happens in Star Trek wrt to the Heisenberg principle is likely
    (at best) a misconstrued attempt at putting "science" into space opera.

    Damn. I thought it was a documentary, like The X Files.

    You just can;t have such a sensor you can't sample ~30meg pixels instantaneously, presently anyway so I doubt Nikon has achived it
    even with the fastest proccessor we have today.

    An array can theoretically be devised that samples all 30Mpixels simultaneously for an arbitrary shutter period. It would require that
    all pixel sites be triggered by a single sampling signal ("strobe")
    (There may be a trivial propagation delay in the nano second or less
    scale of the sampling signal).

    This is independent of the time to offload the data which would define
    the frame rate (or specifically the inter-frame delay). Thus
    independent of the processor speed.

    This must surely have been solved with analogue CCD cameras for live
    broadcast in the 1980s and earlier tube devices.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Whisky-dave@21:1/5 to Alan Browne on Wed Nov 3 09:24:39 2021
    On Wednesday, 3 November 2021 at 15:29:08 UTC, Alan Browne wrote:
    On 2021-11-03 09:21, Whisky-dave wrote:

    My 'image' of a global shutter must be different then.
    I assumed a global shutter would suffer from Heisenberg princible just like in star trek.
    Whatever happens in Star Trek wrt to the Heisenberg principle is likely
    (at best) a misconstrued attempt at putting "science" into space opera.

    No, it's only mentioned regarding the Heisenberg compensator, after the scientists got involved
    in keeping The Next Generation series more credible from a technology POV.


    You just can;t have such a sensor you can't sample ~30meg pixels instantaneously, presently anyway so I doubt Nikon has achived it
    even with the fastest proccessor we have today.
    An array can theoretically be devised that samples all 30Mpixels simultaneously for an arbitrary shutter period.

    Theoretical is not practical, though.
    Why use a shutter anyway?

    Better to sample each pixel for a nanosecond; why have a shutter at all?


    It would require that
    all pixel sites be triggered by a single sampling signal ("strobe")

    That signal would take a different time to reach each pixel, although you could possibly compensate
    in a similar way to the Heisenberg compensator in Star Trek.

    (There may be a trivial propagation delay in the nano second or less
    scale of the sampling signal).

    A nanosecond is no longer that trivial, as light moves a whole foot every nanosecond and electric current somewhat slower.

    Trivial, but you'd need something similar to rotating mirrors to achieve it, as did the ultra-high-FPS camera used for photographing a light beam 'travelling'.


    This is independent of the time to offload the data which would define
    the frame rate (or specifically the inter-frame delay). Thus
    independent of the processor speed.

    The clock speed of the processor normally dictates the speed.
    That's why faster processors are used; we are pretty close to the maximum presently at ~5 GHz,
    and the way we increase so-called speed is by adding more cores.

    --
    "...there are many humorous things in this world; among them the white
    man's notion that he is less savage than the other savages."
    -Samuel Clemens

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Whisky-dave@21:1/5 to Incubus on Wed Nov 3 09:38:14 2021
    On Wednesday, 3 November 2021 at 16:15:11 UTC, Incubus wrote:
    On 2021-11-03, Alan Browne <bitb...@blackhole.com> wrote:
    On 2021-11-03 09:21, Whisky-dave wrote:

    My 'image' of a global shutter must be different then.
    I assumed a global shutter would suffer from Heisenberg princible just like in star trek.

    Whatever happens in Star Trek wrt to the Heisenberg principle is likely
    (at best) a misconstrued attempt at putting "science" into space opera.
    Damn. I thought it was a documentary, like The X Files.

    Or like Fringe. Or Battlestar Galactica.

    You just can;t have such a sensor you can't sample ~30meg pixels instantaneously, presently anyway so I doubt Nikon has achived it
    even with the fastest proccessor we have today.

    An array can theoretically be devised that samples all 30Mpixels simultaneously for an arbitrary shutter period. It would require that
    all pixel sites be triggered by a single sampling signal ("strobe")
    (There may be a trivial propagation delay in the nano second or less
    scale of the sampling signal).

    This is independent of the time to offload the data which would define
    the frame rate (or specifically the inter-frame delay). Thus
    independent of the processor speed.
    This must surely have been solved with analogue CCD cameras for live broadcast in the 1980s and earlier tube devices.

    Offloading the data is not a problem; that was solved in the long, long ago, in the before-time,
    by developing the image with magic chemicals ;-)

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Alan Browne@21:1/5 to Incubus on Wed Nov 3 12:49:23 2021
    On 2021-11-03 12:15, Incubus wrote:
    On 2021-11-03, Alan Browne <bitbucket@blackhole.com> wrote:
    On 2021-11-03 09:21, Whisky-dave wrote:

    My 'image' of a global shutter must be different then.
    I assumed a global shutter would suffer from Heisenberg princible just like in star trek.

    Whatever happens in Star Trek wrt to the Heisenberg principle is likely
    (at best) a misconstrued attempt at putting "science" into space opera.

    Damn. I thought it was a documentary, like The X Files.

    You just can;t have such a sensor you can't sample ~30meg pixels instantaneously, presently anyway so I doubt Nikon has achived it
    even with the fastest proccessor we have today.

    An array can theoretically be devised that samples all 30Mpixels
    simultaneously for an arbitrary shutter period. It would require that
    all pixel sites be triggered by a single sampling signal ("strobe")
    (There may be a trivial propagation delay in the nano second or less
    scale of the sampling signal).

    This is independent of the time to offload the data which would define
    the frame rate (or specifically the inter-frame delay). Thus
    independent of the processor speed.

    This must surely have been solved with analogue CCD cameras for live broadcast in the 1980s and earlier tube devices.


    Frame Transfer CCD. As I describe-ish above, but far coarser with a
    frame rate of ~30 fps.


    --
    "...there are many humorous things in this world; among them the white
    man's notion that he is less savage than the other savages."
    -Samuel Clemens

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Alan Browne@21:1/5 to Whisky-dave on Wed Nov 3 12:39:56 2021
    On 2021-11-03 12:24, Whisky-dave wrote:
    On Wednesday, 3 November 2021 at 15:29:08 UTC, Alan Browne wrote:
    On 2021-11-03 09:21, Whisky-dave wrote:

    My 'image' of a global shutter must be different then.
    I assumed a global shutter would suffer from Heisenberg princible just like in star trek.
    Whatever happens in Star Trek wrt to the Heisenberg principle is likely
    (at best) a misconstrued attempt at putting "science" into space opera.

    No it's only mention regarding the Heisenberg compensator after the scientest got involved

    A Heisenberg compensator violates ....

    in keeping the next generation series more credible from a technology POV.

    When I was a child I knew there was zero technology credibility in ST. Han
    Solo's "in 17 parsecs" line was more credible, which is where heaps of silliness
    comes in (and yes, they "compensated" for that goof in the space opera
    "Solo").

    You just can;t have such a sensor you can't sample ~30meg pixels instantaneously, presently anyway so I doubt Nikon has achived it
    even with the fastest proccessor we have today.
    An array can theoretically be devised that samples all 30Mpixels
    simultaneously for an arbitrary shutter period.

    Theoretically is not practically though.

    It's quite feasible. It's just more transistors, "traces" and passive
    components. About 60..120M more, which is absolutely trivial in
    today's chips, especially a camera sensor.

    Why use a shutter anyway ?

    Better to sample each pixel for a nano second why have a shutter at all?

    Depends on the shutter period. So even if the inter pixel sampling
    interval is 1 ns, the exposure still needs to be much longer for a
    viable (low noise) period.



    It would require that
    all pixel sites be triggered by a single sampling signal ("strobe")

    that signal would take a difernt time to reach each pixel, although you could possibley compensate
    in a similar way to the Heisenberg compensator in star trek.

    (There may be a trivial propagation delay in the nano second or less
    scale of the sampling signal).

    Nano second is no longer that trival as light moves a whole foot every nano second electric current somewhat slower.

    In an electronic circuit the propagation is somewhat slower than that,
    but devised correctly, you would get the sample trigger everywhere
    needed with negligible delay wrt the shutter period (which is in the
    many microseconds and slower domain).

    Trival but you'll need something similar to rotating mirrors to achieve it as did the ultra FPS camera used for photographing a light beam 'traveling'

    You're way out there ...




    This is independent of the time to offload the data which would define
    the frame rate (or specifically the inter-frame delay). Thus
    independent of the processor speed.

    The clock speed of the processor normally dictates the speed.

    Capturing the image is completely independent of processor speed.

    Processor speed goes to offloading, displaying, storing the image after
    the fact.

    That why faster processors are used, we are pretty close to maxium presently at ~5Ghz
    the way we increase so called speed is by adding more cores.

    Camera processors have no need to be very fast at all. Certainly not up
    in the 5GHz range. Because: battery life.

    --
    "...there are many humorous things in this world; among them the white
    man's notion that he is less savage than the other savages."
    -Samuel Clemens

    Could you get a proper news reader?



    --
    "...there are many humorous things in this world; among them the white
    man's notion that he is less savage than the other savages."
    -Samuel Clemens

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Whisky-dave@21:1/5 to Alan Browne on Wed Nov 3 09:57:29 2021
    On Wednesday, 3 November 2021 at 16:40:03 UTC, Alan Browne wrote:
    On 2021-11-03 12:24, Whisky-dave wrote:
    On Wednesday, 3 November 2021 at 15:29:08 UTC, Alan Browne wrote:
    On 2021-11-03 09:21, Whisky-dave wrote:

    My 'image' of a global shutter must be different then.
    I assumed a global shutter would suffer from Heisenberg princible just like in star trek.
    Whatever happens in Star Trek wrt to the Heisenberg principle is likely
    (at best) a misconstrued attempt at putting "science" into space opera.

    No it's only mention regarding the Heisenberg compensator after the scientest got involved
    A Heisenberg compensator violates ....

    There's nothing wrong with a bit of violation between consenting adults.

    in keeping the next generation series more credible from a technology POV.
    When I was a child I knew there was 0 technology credibility in ST. Han Solo's "in 17 parsecs" line was more credible where heaps of silliness
    comes in (and yes, they "compensated" for that goof in the space opera "Solo".

    Han Solo wasn't in Star Trek, and parsecs are a distance not a speed; I knew that too, and it's
    one of the reasons I think Star Wars is science fantasy and Star Trek is science fiction.
    Some people can't tell the difference between the two.

    You just can;t have such a sensor you can't sample ~30meg pixels instantaneously, presently anyway so I doubt Nikon has achived it
    even with the fastest proccessor we have today.
    An array can theoretically be devised that samples all 30Mpixels
    simultaneously for an arbitrary shutter period.

    Theoretically is not practically though.
    It's quite feasible. It's just more transistors, "traces" and passive components. About 60 .. 120M more which is absolutely trivial in
    today's chips, esp, a camera sensor.

    But that won't do the job; the new MacBook's M1 Max processor has 57 billion transistors,
    but it won't be able to sample a sensor fast enough for every pixel to have the same timestamp.

    Why use a shutter anyway ?

    Better to sample each pixel for a nano second why have a shutter at all?
    Depends on the shutter period. So even if the inter pixel sampling
    interval is 1 ns, the exposure still needs to be much longer for a
    viable (low noise) period.

    Which is partially why you get a blur effect.



    It would require that
    all pixel sites be triggered by a single sampling signal ("strobe")

    that signal would take a difernt time to reach each pixel, although you could possibley compensate
    in a similar way to the Heisenberg compensator in star trek.

    (There may be a trivial propagation delay in the nano second or less
    scale of the sampling signal).

    Nano second is no longer that trival as light moves a whole foot every nano second electric current somewhat slower.
    In an electronic circuit the propagation is somewhat slower than that,
    but devised correctly, you would get the sample trigger everywhere
    needed with negligible delay wrt the shutter period (which is in the
    many microseconds and slower domain).

    But not yet.


    Trival but you'll need something similar to rotating mirrors to achieve it as did the ultra FPS camera used for photographing a light beam 'traveling'
    You're way out there ...

    If it's so easy to do with just a few million more transistors, they'd have done it.



    This is independent of the time to offload the data which would define
    the frame rate (or specifically the inter-frame delay). Thus
    independent of the processor speed.

    The clock speed of the processor normally dictates the speed.
    Capturing the image is completely independent of processor speed.

    Processor speed goes to offloading, displaying, storing the image after
    the fact.

    But you need the rest of the circuit to read all the pixels at exactly the same time.
    Film could do that.

    That why faster processors are used, we are pretty close to maxium presently at ~5Ghz
    the way we increase so called speed is by adding more cores.
    Camera processors have no need to be very fast at all. Certainly not up
    in the 5GHz range. Because: battery life.

    They'd need to be fast; processors in cameras are getting faster.

    --
    "...there are many humorous things in this world; among them the white
    man's notion that he is less savage than the other savages."
    -Samuel Clemens
    Could you get a proper news reader?

    No.

    --
    "...there are many humorous things in this world; among them the white
    man's notion that he is less savage than the other savages."
    -Samuel Clemens

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Alan Browne@21:1/5 to Whisky-dave on Wed Nov 3 13:49:41 2021
    On 2021-11-03 12:57, Whisky-dave wrote:
    On Wednesday, 3 November 2021 at 16:40:03 UTC, Alan Browne wrote:
    On 2021-11-03 12:24, Whisky-dave wrote:
    On Wednesday, 3 November 2021 at 15:29:08 UTC, Alan Browne wrote:
    On 2021-11-03 09:21, Whisky-dave wrote:

    My 'image' of a global shutter must be different then.
    I assumed a global shutter would suffer from Heisenberg princible just like in star trek.
    Whatever happens in Star Trek wrt to the Heisenberg principle is likely
    (at best) a misconstrued attempt at putting "science" into space opera.
    No it's only mention regarding the Heisenberg compensator after the scientest got involved
    A Heisenberg compensator violates ....

    There's nothing wrong with a bit of violation between consenting adults.

    in keeping the next generation series more credible from a technology POV.
    When I was a child I knew there was 0 technology credibility in ST. Han
    Solo's "in 17 parsecs" line was more credible where heaps of silliness
    comes in (and yes, they "compensated" for that goof in the space opera
    "Solo".

    Hans solo wasn't in star trek and parsecs is a distanace not a speed I knew that too, and it's
    why of the reaseans I think star wars is science fantasy and star trek is science fiction.
    Some people can't tell the differnce between the two.

    You just can;t have such a sensor you can't sample ~30meg pixels instantaneously, presently anyway so I doubt Nikon has achived it
    even with the fastest proccessor we have today.
    An array can theoretically be devised that samples all 30Mpixels
    simultaneously for an arbitrary shutter period.

    Theoretically is not practically though.
    It's quite feasible. It's just more transistors, "traces" and passive
    components. About 60 .. 120M more which is absolutely trivial in
    today's chips, esp, a camera sensor.

    But that won't do the job, the new macbooks M1 Max processor has 57 billion transistors
    but it won't be able to sample a sensor fast enough for ever pixel to have the same timestamp.

    This is NOT a CPU issue. It is a dedicated device sampling-gate issue.
    As long as you can get the signal to where it is needed, the sample is
    started (and ended when the sampling signal goes to the opposite state).

    A strobe (sampling signal) is akin to the processor timing signals all
    over a CPU chip. So while they won't be perfectly simultaneous, they
    will be at close to zero lag pixel to pixel. The device does not "address"
    each pixel; each pixel receives the sample signal simultaneously (with
    minor variation due to propagation, at the sub-ns level).

    Why use a shutter anyway ?

    Better to sample each pixel for a nano second why have a shutter at all?
    Depends on the shutter period. So even if the inter pixel sampling
    interval is 1 ns, the exposure still needs to be much longer for a
    viable (low noise) period.

    Which is partially while you get a blur effect.

    Not at all. Rolling shutter is due to the CPU sampling one row at a
    time, with each row in turn being offloaded one pixel at a time.

    What I'm describing is triggering the sensor to sample all pixels at once
    (frozen for the same exposure period), and then offloading the entire image.




    It would require that
    all pixel sites be triggered by a single sampling signal ("strobe")

    that signal would take a difernt time to reach each pixel, although you could possibley compensate
    in a similar way to the Heisenberg compensator in star trek.

    (There may be a trivial propagation delay in the nano second or less
    scale of the sampling signal).

    Nano second is no longer that trival as light moves a whole foot every nano second electric current somewhat slower.
    In an electronic circuit the propagation is somewhat slower than that,
    but devised correctly, you would get the sample trigger everywhere
    needed with negligible delay wrt the shutter period (which is in the
    many microseconds and slower domain).

    But not yet.

    Of course "yet". It's just more costly to do.



    Trival but you'll need something similar to rotating mirrors to achieve it as did the ultra FPS camera used for photographing a light beam 'traveling'
    You're way out there ...

    if it;'s so easy to do with justa few million more transistors they;d have done it.

    $




    This is independent of the time to offload the data which would define
    the frame rate (or specifically the inter-frame delay). Thus
    independent of the processor speed.

    The clock speed of the processor normally dictates the speed.
    Capturing the image is completely independent of processor speed.

    Processor speed goes to offloading, displaying, storing the image after
    the fact.

    but you need the rest of the circuit to read all teh pixels at exactly the same time.
    File could do that.

    A DMA transfer of 30M pixels to a memory array of some size would take
    little time.
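
    A rough back-of-envelope for how little (Python; the bit depth and bus bandwidth below are assumed, not taken from any actual camera):

        pixels = 30_000_000
        bits_per_pixel = 14          # typical raw bit depth (assumed)
        bus_gbit_per_s = 20          # assumed DMA / readout bandwidth

        frame_megabytes = pixels * bits_per_pixel / 8 / 1e6
        transfer_ms = pixels * bits_per_pixel / (bus_gbit_per_s * 1e9) * 1000
        print(f"~{frame_megabytes:.0f} MB per frame, ~{transfer_ms:.0f} ms to move at {bus_gbit_per_s} Gbit/s")

    At those assumed figures the whole frame moves in a couple of hundredths of a second, which is why the capture and the offload can be treated as separate problems.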


    That why faster processors are used, we are pretty close to maxium presently at ~5Ghz
    the way we increase so called speed is by adding more cores.
    Camera processors have no need to be very fast at all. Certainly not up
    in the 5GHz range. Because: battery life.

    They;d need to be fast , processor in camera are getting faster.

    Again, image sampling ≠ image transfer / storage. It is independent
    of the CPU operation.

    --
    "...there are many humorous things in this world; among them the white
    man's notion that he is less savage than the other savages."
    -Samuel Clemens

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From geoff@21:1/5 to Whisky-dave on Thu Nov 4 10:57:21 2021
    On 4/11/2021 2:21 am, Whisky-dave wrote:
    On Tuesday, 2 November 2021 at 14:48:57 UTC, Savageduck wrote:
    On Nov 2, 2021, Whisky-dave wrote
    (in article<8ec0fee5-b272-42a7...@googlegroups.com>):
    On Monday, 1 November 2021 at 13:53:31 UTC, Incubus wrote:
    On 2021-10-29, Alfred Molon<alfred...@yahoo.com> wrote:
    Am 29.10.2021 um 16:34 schrieb Incubus:
    I think newer sensors don't have the issue with propellors, rotor blades
    etc. I don't know how LED lighting will work.

    I think the Z9 doesn't have a global shutter, only a fast readout
    sensor. Fast moving things could be a problem.
    I can't imagine they would release if it fast moving things are a
    problem given that it is designed for fast moving things.

    I guess it depends on how fast a thing is actually moving as to whether it affects the final image.
    But it's only 120 fps my 6 yearv old iphone can do 240 fps lower resolution sure.

    fastest shutter speed is 1/32000 fast but not so incredable that nothing will be have motion blur.
    The issue isn’t motion blur that is the problem with an electronic shutter.

    I think the problem will have the same effect but the term shutter is at fault here, which is why there is confusion in terms.

    Agreed.

    Probably has already been stated in this thread, but the rolling or
    global 'scanning' of the CCD is a completely separate issue from whether
    or not there is (also) a mechanical shutter.

    geoff

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Whisky-dave@21:1/5 to Alan Browne on Thu Nov 4 08:07:25 2021
    On Wednesday, 3 November 2021 at 17:49:48 UTC, Alan Browne wrote:
    On 2021-11-03 12:57, Whisky-dave wrote:
    On Wednesday, 3 November 2021 at 16:40:03 UTC, Alan Browne wrote:
    On 2021-11-03 12:24, Whisky-dave wrote:
    On Wednesday, 3 November 2021 at 15:29:08 UTC, Alan Browne wrote:
    On 2021-11-03 09:21, Whisky-dave wrote:

    My 'image' of a global shutter must be different then.
    I assumed a global shutter would suffer from Heisenberg princible just like in star trek.
    Whatever happens in Star Trek wrt to the Heisenberg principle is likely
    (at best) a misconstrued attempt at putting "science" into space opera.
    No it's only mention regarding the Heisenberg compensator after the scientest got involved
    A Heisenberg compensator violates ....

    There's nothing wrong with a bit of violation between consenting adults.

    in keeping the next generation series more credible from a technology POV.
    When I was a child I knew there was 0 technology credibility in ST. Han
    Solo's "in 17 parsecs" line was more credible where heaps of silliness
    comes in (and yes, they "compensated" for that goof in the space opera
    "Solo".

    Hans solo wasn't in star trek and parsecs is a distanace not a speed I knew that too, and it's
    why of the reaseans I think star wars is science fantasy and star trek is science fiction.
    Some people can't tell the differnce between the two.

    You just can;t have such a sensor you can't sample ~30meg pixels instantaneously, presently anyway so I doubt Nikon has achived it
    even with the fastest proccessor we have today.
    An array can theoretically be devised that samples all 30Mpixels
    simultaneously for an arbitrary shutter period.

    Theoretically is not practically though.
    It's quite feasible. It's just more transistors, "traces" and passive
    components. About 60 .. 120M more which is absolutely trivial in
    today's chips, esp, a camera sensor.

    But that won't do the job, the new macbooks M1 Max processor has 57 billion transistors
    but it won't be able to sample a sensor fast enough for ever pixel to have the same timestamp.
    This is NOT a CPU issue. It is a devoted device sampling gate issue.

    And what does that actually mean in real terms? It's something that can sample
    a 'signal' from a pixel, and what is that signal?

    Is it a 'hello world, I'm here'?


    As long as you can get the signal to where it is needed the sample is started (and ended when the sampling signal goes to the opposite state).

    So what is this magic signal ?


    A strobe (sampling signal) is akin to the processor timing signals all
    over a CPU chip. So while they won't be perfectly simultaneous, they
    will be at close to 0 lag pixel to pixel. The device does not "address"
    each pixel, each pixel receives the sample signal simultaneously (with
    minor variation due to propagation - sub ns level).

    But this magic device would need at least 30 million inputs, one for each pixel.


    Why use a shutter anyway ?

    Better to sample each pixel for a nano second why have a shutter at all?
    Depends on the shutter period. So even if the inter pixel sampling
    interval is 1 ns, the exposure still needs to be much longer for a
    viable (low noise) period.

    Which is partially while you get a blur effect.
    Not at all. Rolling shutter is due to the CPU sampling one row at a
    time; in turn each row being off loaded 1 pixel at a time.

    We had something similar with a slit used for a shutter since the 70s, well, since when I bought my first SLR,
    probably longer. In fact the first plate cameras had this effect too, but it wasn't noticeable with exposures in minutes or hours.

    A camera with a vertical-plane shutter had a different effect from a horizontal-plane shutter.


    What I'm describing is trigger the senor to sample all pixels at once (frozen for the same exposure period), and then offloading the entire image.

    But how you do this is where the problem starts, especially at a high enough quality.
    This is why cameras are limited by frame rates.
    Not sure what the best is currently, but 8K at 120 fps isn't that amazing.

    Now 30 MP at 10k fps would actually be impressive, but why isn't it done?




    It would require that
    all pixel sites be triggered by a single sampling signal ("strobe")

    that signal would take a difernt time to reach each pixel, although you could possibley compensate
    in a similar way to the Heisenberg compensator in star trek.

    (There may be a trivial propagation delay in the nano second or less
    scale of the sampling signal).

    Nano second is no longer that trival as light moves a whole foot every nano second electric current somewhat slower.
    In an electronic circuit the propagation is somewhat slower than that,
    but devised correctly, you would get the sample trigger everywhere
    needed with negligible delay wrt the shutter period (which is in the
    many microseconds and slower domain).

    But not yet.
    Of course "yet". It's just more costly to do.

    Yes, well outside the budget of most people and most companies, so there's little point as yet in making such a camera.


    Trival but you'll need something similar to rotating mirrors to achieve it as did the ultra FPS camera used for photographing a light beam 'traveling'
    You're way out there ...

    if it;'s so easy to do with justa few million more transistors they;d have done it.
    $

    Yep, same old story: I'd like a Lamborghini Sián FKP 37 if it wasn't for the $. They'd need to make more than just 63 of them too; they're sold out, so I can't have one.


    This is independent of the time to offload the data which would define
    the frame rate (or specifically the inter-frame delay). Thus
    independent of the processor speed.

    The clock speed of the processor normally dictates the speed.
    Capturing the image is completely independent of processor speed.

    Processor speed goes to offloading, displaying, storing the image after
    the fact.

    but you need the rest of the circuit to read all teh pixels at exactly the same time.
    File could do that.
    A DMA transfer of 30M pixels to a DMA array of some number would take
    little time.

    But you need far more than just 30M; that wouldn't even give you a monochrome image, just B&W.
    Just whether the pixel is on or off.



    That why faster processors are used, we are pretty close to maxium presently at ~5Ghz
    the way we increase so called speed is by adding more cores.
    Camera processors have no need to be very fast at all. Certainly not up
    in the 5GHz range. Because: battery life.

    They;d need to be fast , processor in camera are getting faster.
    Again, image sampling ≠ image transfer / storage. It is an independent
    of the CPU operation.

    No, it's not in the real world.
    But what is image sampling, then?


    --
    "...there are many humorous things in this world; among them the white
    man's notion that he is less savage than the other savages."
    -Samuel Clemens

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Alan Browne@21:1/5 to Whisky-dave on Thu Nov 4 17:30:12 2021
    On 2021-11-04 11:07, Whisky-dave wrote:
    On Wednesday, 3 November 2021 at 17:49:48 UTC, Alan Browne wrote:
    On 2021-11-03 12:57, Whisky-dave wrote:
    On Wednesday, 3 November 2021 at 16:40:03 UTC, Alan Browne wrote:
    On 2021-11-03 12:24, Whisky-dave wrote:
    On Wednesday, 3 November 2021 at 15:29:08 UTC, Alan Browne wrote:
    On 2021-11-03 09:21, Whisky-dave wrote:

    My 'image' of a global shutter must be different then.
    I assumed a global shutter would suffer from Heisenberg princible just like in star trek.
    Whatever happens in Star Trek wrt to the Heisenberg principle is likely
    (at best) a misconstrued attempt at putting "science" into space opera.
    No it's only mention regarding the Heisenberg compensator after the scientest got involved
    A Heisenberg compensator violates ....

    There's nothing wrong with a bit of violation between consenting adults.
    in keeping the next generation series more credible from a technology POV.
    When I was a child I knew there was 0 technology credibility in ST. Han
    Solo's "in 17 parsecs" line was more credible where heaps of silliness
    comes in (and yes, they "compensated" for that goof in the space opera
    "Solo".

    Hans solo wasn't in star trek and parsecs is a distanace not a speed I knew that too, and it's
    why of the reaseans I think star wars is science fantasy and star trek is science fiction.
    Some people can't tell the differnce between the two.

    You just can;t have such a sensor you can't sample ~30meg pixels instantaneously, presently anyway so I doubt Nikon has achived it
    even with the fastest proccessor we have today.
    An array can theoretically be devised that samples all 30Mpixels
    simultaneously for an arbitrary shutter period.

    Theoretically is not practically though.
    It's quite feasible. It's just more transistors, "traces" and passive
    components. About 60 .. 120M more which is absolutely trivial in
    today's chips, esp, a camera sensor.

    But that won't do the job, the new macbooks M1 Max processor has 57 billion transistors
    but it won't be able to sample a sensor fast enough for ever pixel to have the same timestamp.
    This is NOT a CPU issue. It is a devoted device sampling gate issue.

    And what does that actually mean in real terms it's something that can sample a 'signal' from a pixel and what is that signal ?

    The CPU (or some other arbitrary origin signal) tells the device
    (sensor array) to begin sampling. It does: all pixels at once (in this
    one-shot version).

    So each sensor "well" is cleared and begins accumulating charge from
    that moment until the trigger says "stop" (change of state). So all
    wells accumulate over the same period of time.

    When the period ends, the entire array is read by the CPU. This
    takes non-zero time, but the information at each site is from the same
    period of time. No rolling shutter.
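
    A toy model of that clear / integrate / read-out sequence (Python, with a handful of pretend wells and assumed timings). It only illustrates that every well shares the same integration window even though the read-out afterwards is serial:

        import random

        NUM_WELLS = 8              # tiny pretend sensor; a real one has tens of millions of wells
        EXPOSURE_S = 1 / 2000      # time the strobe holds the wells integrating (assumed)
        READ_S_PER_WELL = 1e-6     # serial read-out cost per well (assumed)

        scene = [random.uniform(0, 1e6) for _ in range(NUM_WELLS)]   # photons/s falling on each well

        # Strobe asserted: every well is cleared and integrates over the same window.
        wells = [rate * EXPOSURE_S for rate in scene]

        # Strobe released: charge is frozen, then read out one well at a time.
        for i, charge in enumerate(wells):
            done_at_ms = (EXPOSURE_S + (i + 1) * READ_S_PER_WELL) * 1000
            print(f"well {i}: {charge:7.1f} e- (same 0.5 ms window), read out at t = {done_at_ms:.3f} ms")

    The rolling-shutter case would differ only in that the integration window itself would start at a different time for each row, not just the read-out.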

    As long as you can get the signal to where it is needed the sample is
    started (and ended when the sampling signal goes to the opposite state).

    So what is this magic signal ?

    Trigger, strobe, whatever name you want. No magic.



    A strobe (sampling signal) is akin to the processor timing signals all
    over a CPU chip. So while they won't be perfectly simultaneous, they
    will be at close to 0 lag pixel to pixel. The device does not "address"
    each pixel, each pixel receives the sample signal simultaneously (with
    minor variation due to propagation - sub ns level).

    But this magic device would need at least 30 million inputs one for each pixel.

    No magic.

    What I'm describing is trigger the senor to sample all pixels at once
    (frozen for the same exposure period), and then offloading the entire image.

    but how do you do this is where the problem starts, especailly at a high enough quality.

    It's just a matter of a lot of traces on the silicon. Lots. And each
    pixel well would need 2 .. 4 transistors to take in the signal, clear
    the well, and wait until the signal clears.

    Added complexity. Not "magic".


    This si why cameras are limited by frame rates.
    Not sure what the best is currently but 8k at 120 fps isn't that amazing.

    Now 30MP at 10k fps would actually be impressive, but why isn't it done ?

    That makes for very short exposures per frame. Aka: high noise.

    Another issue : it's one thing for the sensor to capture the image in
    situ. Another thing to read it off.

    But not yet.
    Of course "yet". It's just more costly to do.

    Yes well outside the budget of people and most companies so little point as yet in making such a camera.

    Markets are for testing.



    Trival but you'll need something similar to rotating mirrors to achieve it as did the ultra FPS camera used for photographing a light beam 'traveling'
    You're way out there ...

    if it;'s so easy to do with justa few million more transistors they;d have done it.
    $

    Yep same old story I'd like a lamborghini sian fkp 37 if it wasn't for the $ they'd need to make more than just 63 of them too, so sold out so I can't have one.

    They are crappily built cars that devalue very quickly (for the most
    part) and are extremely costly to operate and maintain. Only a
    rare model will climb in value, especially if you don't actually use it.

    A DMA transfer of 30M pixels to a DMA array of some number would take
    little time.

    But you need far more than just 30M, that wouldn't even give you a monochrome image just B&W.
    Just is the pixel on or off.

    RGB arrays are not a novel concept ...

    That why faster processors are used, we are pretty close to maxium presently at ~5Ghz
    the way we increase so called speed is by adding more cores.
    Camera processors have no need to be very fast at all. Certainly not up
    in the 5GHz range. Because: battery life.

    They;d need to be fast , processor in camera are getting faster.
    Again, image sampling ≠ image transfer / storage. It is an independent
    of the CPU operation.

    No it's not in the real world.

    There are thousands of sensor types that operate independently of the
    system CPU. The CPU might set up the sampling, gain, etc., but other
    than triggering a sample period, reading data, or setting parameters,
    it doesn't need to be part of the actual sampling.
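    A sketch of that division of labour in Python. The Sensor class and its
    methods below are invented purely for illustration (they are not any real
    camera or driver API): the CPU only sets parameters, fires the trigger and
    reads the result; the exposure itself runs on the sensor.

    import time

    class Sensor:
        # Hypothetical stand-in for the sensor hardware; not a real API.
        def configure(self, gain, exposure_s):
            self.gain, self.exposure_s = gain, exposure_s

        def trigger(self):
            # In hardware this is the strobe: every photosite starts
            # integrating at once, with no further CPU involvement.
            self._t0 = time.monotonic()

        def frame_ready(self):
            return time.monotonic() - self._t0 >= self.exposure_s

        def read_out(self):
            # Stand-in for the post-exposure readout / DMA step.
            return bytes(1024)

    sensor = Sensor()
    sensor.configure(gain=1.0, exposure_s=0.001)   # CPU: set parameters
    sensor.trigger()                               # CPU: start the sample period
    while not sensor.frame_ready():                # sampling happens on-sensor
        pass
    raw = sensor.read_out()                        # CPU/DMA: offload the image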


    But what is image sampling, then?

    What happens on the image sensor chip independently of the CPU.


    Done with this.

    --
    "...there are many humorous things in this world; among them the white
    man's notion that he is less savage than the other savages."
    -Samuel Clemens

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Alan Browne@21:1/5 to Whisky-dave on Tue Nov 9 08:53:16 2021
    On 2021-11-09 08:18, Whisky-dave wrote:
    On Thursday, 4 November 2021 at 21:30:20 UTC, Alan Browne wrote:

    Done with this.

    Yeah well, it's still not a global shutter, it's a rolling shutter, which doesn't sample all
    the sensor data at the same time.

    The discussion was a hypothetical. But you knew that. Right?

    Really done.

    --
    "...there are many humorous things in this world; among them the white
    man's notion that he is less savage than the other savages."
    -Samuel Clemens

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Whisky-dave@21:1/5 to Alan Browne on Tue Nov 9 05:18:44 2021
    On Thursday, 4 November 2021 at 21:30:20 UTC, Alan Browne wrote:

    Done with this.

    Yeah well, it's still not a global shutter, it's a rolling shutter, which doesn't sample all
    the sensor data at the same time.

    --
    "...there are many humorous things in this world; among them the white
    man's notion that he is less savage than the other savages."
    -Samuel Clemens

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Whisky-dave@21:1/5 to Alan Browne on Wed Nov 10 05:39:18 2021
    On Tuesday, 9 November 2021 at 13:53:22 UTC, Alan Browne wrote:
    On 2021-11-09 08:18, Whisky-dave wrote:
    On Thursday, 4 November 2021 at 21:30:20 UTC, Alan Browne wrote:

    Done with this.

    Yeah well, it's still not a global shutter, it's a rolling shutter, which doesn't sample all
    the sensor data at the same time.
    The discussion was a hypothetical. But you knew that. Right?

    Really done.

    As I said, not currently possible, not even with the latest Nikon Z9 or whatever it's called.
    It's just ditched the mechanical shutter; I can already choose not to use the mechanical shutter in my EOS M6 Mk II.


    As for hypothetical, well, Star Trek is like that: with a Star Trek camera, hypothetically
    you could go back in time and take a photo of something that happened before photography came about.
    Hypothetically all you need is to get really close to a black hole, or exceed the speed of light,
    but not sure if that's possible.
    But there's the Terminator method, though you have to be naked for that to work.

    Hypothetical is very close to fiction, and fantasy is even further away from current reality.

    Hypothetically such a sensor could be developed, but it would need a much better and faster processor
    to store whatever the sensor senses. The most efficient way would be to store that digitally,
    which would require a really fast A-D converter for each pixel; otherwise it would take a significant time
    before you could take another picture, perhaps a second or more with current tech.

    Which is why sampling 8K off a ~30MP sensor can only just be done fast enough
    for 60/120 fps.
    But you can buy cameras with global shutters; they are limited to about 4K, which is quite low for a photograph, and they cost about £30k.
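    Rough numbers behind the A-D converter point, as a sketch. The 8256 x 5504
    resolution below is assumed for a ~45.7MP sensor and the bit rates follow
    from it; nothing here is a published specification.

    # Column-parallel ADCs (the common design) vs one ADC per pixel.
    COLS, ROWS, FPS = 8256, 5504, 120

    # One ADC per column must digitise every row of its column each frame.
    per_column_rate = ROWS * FPS              # ~660,000 conversions/s per ADC
    # One ADC per pixel only needs to run at the frame rate...
    per_pixel_rate = FPS                      # 120 conversions/s per ADC
    # ...but you would need one converter for every photosite.
    per_pixel_adc_count = COLS * ROWS         # ~45.4 million ADCs

    print(per_column_rate, per_pixel_rate, per_pixel_adc_count)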

    --
    "...there are many humorous things in this world; among them the white
    man's notion that he is less savage than the other savages."
    -Samuel Clemens

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)