On 2021-10-28, RichA <rander3128@gmail.com> wrote:
https://petapixel.com/2021/10/28/nikon-unveils-the-z9-45-7mp-120fps-8k-and-no-mechanical-shutter/
Nikon have just pissed all over Canon and Sony with this release.
Although I'll be sticking with my D3 where having a pro body is
concerned, I really hope this puts Nikon back where they belong at the
top of the food chain.
On 2021-10-28 10:47, Incubus wrote:
On 2021-10-28, RichA <rander3128@gmail.com> wrote:
https://petapixel.com/2021/10/28/nikon-unveils-the-z9-45-7mp-120fps-8k-and-no-mechanical-shutter/
Nikon have just pissed all over Canon and Sony with this release.
Although I'll be sticking with my D3 where having a pro body is
concerned, I really hope this puts Nikon back where they belong at the
top of the food chain.
"Belong"? That's a pretty emotional statement. The only way to lead is
to innovate and make sales.
Also your food chain analogy is just wrong.
On Thursday, 28 October 2021 at 13:37:38 UTC-4, Alan Browne wrote:
On 2021-10-28 10:47, Incubus wrote:
On 2021-10-28, RichA <rande...@gmail.com> wrote:
https://petapixel.com/2021/10/28/nikon-unveils-the-z9-45-7mp-120fps-8k-and-no-mechanical-shutter/
Nikon have just pissed all over Canon and Sony with this release.
Although I'll be sticking with my D3 where having a pro body is
concerned, I really hope this puts Nikon back where they belong at the
top of the food chain.
"Belong"? That's a pretty emotional statement. The only way to lead is
to innovate and make sales.
Also your food chain analogy is just wrong.
Nikon and Canon HORRIBLY dragged their feet on mirrorless.
On Thursday, 28 October 2021 at 15:47:51 UTC+1, Incubus wrote:
On 2021-10-28, RichA <rande...@gmail.com> wrote:
https://petapixel.com/2021/10/28/nikon-unveils-the-z9-45-7mp-120fps-8k-and-no-mechanical-shutter/
Nikon have just pissed all over Canon and Sony with this release.
Although I'll be sticking with my D3 where having a pro body is
concerned, I really hope this puts Nikon back where they belong at the
top of the food chain.
My EOS M6 has an option to use an electronic shutter, and it's useful as it doesn't make such
a loud noise when operated, BUT certain other functions aren't available when this is selected, noticeably
not being able to use the high or low frame rates (FPS) when using 'drive', plus a few other things I've forgotten.
Another was something about distortion of fast-moving objects such as propellers.
But I'm not sure how this compares to a mechanical shutter.
I thought about doing a test with my fan heater, but unfortunately it's a Dyson, so it is effectively fanless
from a photography POV. Guess I'll have to wait for a helicopter to fly over. Or a drone of course; who do we know with a drone that could pop over to London for 1/2 an hour or so for me to run a test ;-)
I think newer sensors don't have the issue with propellers, rotor blades
etc. I don't know how LED lighting will work.
On 29.10.2021 16:34, Incubus wrote:
I think newer sensors don't have the issue with propellers, rotor blades
etc. I don't know how LED lighting will work.
I think the Z9 doesn't have a global shutter, only a fast readout
sensor. Fast moving things could be a problem.
On 2021-10-29, Alfred Molon <alfred...@yahoo.com> wrote:
On 29.10.2021 16:34, Incubus wrote:
I think newer sensors don't have the issue with propellers, rotor blades
etc. I don't know how LED lighting will work.
I think the Z9 doesn't have a global shutter, only a fast readout
sensor. Fast moving things could be a problem.
I can't imagine they would release it if fast moving things are a
problem, given that it is designed for fast moving things.
On Monday, 1 November 2021 at 13:53:31 UTC, Incubus wrote:
On 2021-10-29, Alfred Molon <alfred...@yahoo.com> wrote:
On 29.10.2021 16:34, Incubus wrote:
I think newer sensors don't have the issue with propellers, rotor blades
etc. I don't know how LED lighting will work.
I think the Z9 doesn't have a global shutter, only a fast readout
sensor. Fast moving things could be a problem.
I can't imagine they would release it if fast moving things are a
problem, given that it is designed for fast moving things.
I guess it depends on how fast a thing is actually moving as to whether it affects the final image.
But it's only 120 fps; my 6-year-old iPhone can do 240 fps, at lower resolution sure.
The fastest shutter speed is 1/32000, fast but not so incredible that nothing will have motion blur.
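The 1/32000 s point can be put in rough numbers. A minimal sketch, where the subject speed and the pixel scale are my illustrative assumptions, not figures from the thread: blur in pixels is just subject speed times exposure time, divided by how much of the subject one pixel covers.

```python
# Back-of-envelope: how far does a subject move during one exposure?
# Assumed numbers (mine, not from the thread): a propeller tip moving
# at 300 m/s, framed so that one pixel spans 5 mm of the subject.

def blur_pixels(speed_m_s, exposure_s, metres_per_pixel):
    """Distance the subject moves during the exposure, in pixels."""
    return speed_m_s * exposure_s / metres_per_pixel

# At 1/32000 s even a 300 m/s propeller tip smears under 2 pixels:
print(blur_pixels(300, 1 / 32000, 0.005))  # ~1.9 px

# At 1/500 s the same tip smears ~120 pixels - obvious blur:
print(blur_pixels(300, 1 / 500, 0.005))    # ~120 px
```

So for almost any real subject, 1/32000 does freeze motion; the remaining electronic-shutter artefact is a different effect, discussed below in the thread.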
On Nov 2, 2021, Whisky-dave wrote
(in article<8ec0fee5-b272-42a7...@googlegroups.com>):
On Monday, 1 November 2021 at 13:53:31 UTC, Incubus wrote:
On 2021-10-29, Alfred Molon<alfred...@yahoo.com> wrote:
Am 29.10.2021 um 16:34 schrieb Incubus:
I think newer sensors don't have the issue with propellors, rotor blades
etc. I don't know how LED lighting will work.
I think the Z9 doesn't have a global shutter, only a fast readout sensor. Fast moving things could be a problem.I can't imagine they would release if it fast moving things are a problem given that it is designed for fast moving things.
I guess it depends on how fast a thing is actually moving as to whether it affects the final image.
But it's only 120 fps my 6 yearv old iphone can do 240 fps lower resolution sure.
fastest shutter speed is 1/32000 fast but not so incredable that nothing will be have motion blur.The issue isn’t motion blur that is the problem with an electronic shutter.
The problem is the rolling shutter effect as we are yet to see a global shutter in a pro, or consumer camera.
<https://en.wikipedia.org/wiki/Rolling_shutter>
As for high frame rates of 120 fps, or 240 fps that is a video concern rather than a stills photography issue.
...and here is what rolling shutter can look like with an electronic shutter image. Check thehummingbird wing tip.
<https://photos.smugmug.com/photos/i-9d3QSvv/0/5bfddd31/O/i-9d3QSvv.jpg>
--
Regards,
Savageduck
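The skew in that hummingbird shot can be sketched with a toy model: each sensor row is sampled slightly later than the row above it, so each row records a rotating blade (or wing) at a different angle. All figures here are illustrative assumptions, not Z9 specifications.

```python
# Toy model of rolling-shutter skew on a rotating blade. Assumed
# numbers (mine): 2400 sensor rows read top-to-bottom over a 1/250 s
# frame scan, with the propeller turning at 2400 rpm.

ROWS = 2400                  # sensor rows, read top to bottom
READOUT_S = 1 / 250          # time to scan the whole frame row by row
RPM = 2400                   # propeller speed
DEG_PER_S = RPM / 60 * 360   # 14400 degrees per second

def blade_angle_at_row(row):
    """Blade angle (degrees) at the instant this row is sampled."""
    t = (row / ROWS) * READOUT_S  # later rows are sampled later
    return (DEG_PER_S * t) % 360

# The blade turns ~57.6 degrees between the first and last row, so the
# captured blade comes out bent/curved rather than straight:
print(blade_angle_at_row(0))         # 0.0
print(blade_angle_at_row(ROWS - 1))  # ~57.6
```

A faster readout (the Z9's claimed advantage) shrinks READOUT_S and hence the angle swept during the scan, which is why newer stacked sensors show far less of this distortion.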
My 'image' of a global shutter must be different then.
I assumed a global shutter would suffer from the Heisenberg principle, just like in Star Trek.
You just can't have such a sensor; you can't sample ~30 megapixels instantaneously, presently anyway, so I doubt Nikon has achieved it
even with the fastest processor we have today.
On 2021-11-03 09:21, Whisky-dave wrote:
My 'image' of a global shutter must be different then.
I assumed a global shutter would suffer from the Heisenberg principle, just like in Star Trek.
Whatever happens in Star Trek wrt the Heisenberg principle is likely
(at best) a misconstrued attempt at putting "science" into space opera.
You just can't have such a sensor; you can't sample ~30 megapixels instantaneously, presently anyway, so I doubt Nikon has achieved it
even with the fastest processor we have today.
An array can theoretically be devised that samples all 30 Mpixels
simultaneously for an arbitrary shutter period. It would require that
all pixel sites be triggered by a single sampling signal ("strobe").
(There may be a trivial propagation delay, on the nanosecond or smaller
scale, in the sampling signal.)
This is independent of the time to offload the data, which would define
the frame rate (or specifically the inter-frame delay), and thus
independent of the processor speed.
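The distinction being drawn here - one global strobe versus row-by-row sampling, with offload time only limiting frame rate - can be sketched as a toy timing model. The numbers are illustrative assumptions, not real sensor figures.

```python
# Toy timing model: *when* each row is sampled, versus how long the
# frame takes to offload. Assumed numbers (mine): 2400 rows and a
# 1/250 s offload time in both designs.

ROWS = 2400
READOUT_S = 1 / 250  # time to offload the whole frame either way

def sample_time_rolling(row):
    """Rolling readout: each row is sampled later than the one above."""
    return (row / ROWS) * READOUT_S

def sample_time_global(row):
    """Global shutter: one strobe freezes every row at t = 0; the
    stored charge is offloaded afterwards, without changing the image."""
    return 0.0

skew_rolling = sample_time_rolling(ROWS - 1) - sample_time_rolling(0)
skew_global = sample_time_global(ROWS - 1) - sample_time_global(0)
print(skew_rolling)  # ~0.004 s of intra-frame skew
print(skew_global)   # 0.0 - no skew; frame rate still limited by readout
```

The point of the sketch: the global design still takes READOUT_S to offload each frame, so its frame rate is no better, but the image itself has zero intra-frame time skew.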
On 2021-11-03, Alan Browne <bitbucket@blackhole.com> wrote:
On 2021-11-03 09:21, Whisky-dave wrote:
My 'image' of a global shutter must be different then.
I assumed a global shutter would suffer from the Heisenberg principle, just like in Star Trek.
Whatever happens in Star Trek wrt the Heisenberg principle is likely
(at best) a misconstrued attempt at putting "science" into space opera.
Damn. I thought it was a documentary, like The X Files.
You just can't have such a sensor; you can't sample ~30 megapixels instantaneously, presently anyway, so I doubt Nikon has achieved it
even with the fastest processor we have today.
An array can theoretically be devised that samples all 30 Mpixels
simultaneously for an arbitrary shutter period. It would require that
all pixel sites be triggered by a single sampling signal ("strobe").
(There may be a trivial propagation delay, on the nanosecond or smaller
scale, in the sampling signal.)
This is independent of the time to offload the data, which would define
the frame rate (or specifically the inter-frame delay), and thus
independent of the processor speed.
This must surely have been solved with analogue CCD cameras for live broadcast in the 1980s and earlier tube devices.
On Wednesday, 3 November 2021 at 15:29:08 UTC, Alan Browne wrote:
On 2021-11-03 09:21, Whisky-dave wrote:
My 'image' of a global shutter must be different then.
I assumed a global shutter would suffer from the Heisenberg principle, just like in Star Trek.
Whatever happens in Star Trek wrt the Heisenberg principle is likely
(at best) a misconstrued attempt at putting "science" into space opera.
No, it's only mentioned regarding the Heisenberg compensator after the scientists got involved
in keeping the Next Generation series more credible from a technology POV.
You just can't have such a sensor; you can't sample ~30 megapixels instantaneously, presently anyway, so I doubt Nikon has achieved it
even with the fastest processor we have today.
An array can theoretically be devised that samples all 30 Mpixels
simultaneously for an arbitrary shutter period.
Theoretically is not practically though.
Why use a shutter anyway?
Better to sample each pixel for a nanosecond; why have a shutter at all?
It would require that
all pixel sites be triggered by a single sampling signal ("strobe").
That signal would take a different time to reach each pixel, although you could possibly compensate
in a similar way to the Heisenberg compensator in Star Trek.
(There may be a trivial propagation delay, on the nanosecond or smaller
scale, in the sampling signal.)
A nanosecond is no longer that trivial, as light moves a whole foot every nanosecond, and electric current somewhat slower.
Trivial, but you'll need something similar to rotating mirrors to achieve it, as did the ultra-high-FPS camera used for photographing a light beam 'travelling'.
This is independent of the time to offload the data, which would define
the frame rate (or specifically the inter-frame delay), and thus
independent of the processor speed.
The clock speed of the processor normally dictates the speed.
That's why faster processors are used; we are pretty close to maximum presently at ~5 GHz,
and the way we increase so-called speed is by adding more cores.
On 2021-11-03 12:24, Whisky-dave wrote:
On Wednesday, 3 November 2021 at 15:29:08 UTC, Alan Browne wrote:
On 2021-11-03 09:21, Whisky-dave wrote:
My 'image' of a global shutter must be different then.
I assumed a global shutter would suffer from the Heisenberg principle, just like in Star Trek.
Whatever happens in Star Trek wrt the Heisenberg principle is likely
(at best) a misconstrued attempt at putting "science" into space opera.
No, it's only mentioned regarding the Heisenberg compensator after the scientists got involved
in keeping the Next Generation series more credible from a technology POV.
A Heisenberg compensator violates ....
When I was a child I knew there was 0 technology credibility in ST. Han Solo's "in 17 parsecs" line was more credible, where heaps of silliness
comes in (and yes, they "compensated" for that goof in the space opera
"Solo").
You just can't have such a sensor; you can't sample ~30 megapixels instantaneously, presently anyway, so I doubt Nikon has achieved it
even with the fastest processor we have today.
An array can theoretically be devised that samples all 30 Mpixels
simultaneously for an arbitrary shutter period.
Theoretically is not practically though.
It's quite feasible. It's just more transistors, "traces" and passive
components. About 60 .. 120M more, which is absolutely trivial in
today's chips, esp. a camera sensor.
Why use a shutter anyway?
Better to sample each pixel for a nanosecond; why have a shutter at all?
Depends on the shutter period. So even if the inter-pixel sampling
interval is 1 ns, the exposure still needs to be much longer for a
viable (low noise) period.
It would require that
all pixel sites be triggered by a single sampling signal ("strobe").
That signal would take a different time to reach each pixel, although you could possibly compensate
in a similar way to the Heisenberg compensator in Star Trek.
(There may be a trivial propagation delay, on the nanosecond or smaller
scale, in the sampling signal.)
A nanosecond is no longer that trivial, as light moves a whole foot every nanosecond, and electric current somewhat slower.
In an electronic circuit the propagation is somewhat slower than that,
but devised correctly, you would get the sample trigger everywhere
needed with negligible delay wrt the shutter period (which is in the
many-microseconds-and-slower domain).
Trivial, but you'll need something similar to rotating mirrors to achieve it, as did the ultra-high-FPS camera used for photographing a light beam 'travelling'.
You're way out there ...
This is independent of the time to offload the data, which would define
the frame rate (or specifically the inter-frame delay), and thus
independent of the processor speed.
The clock speed of the processor normally dictates the speed.
Capturing the image is completely independent of processor speed.
Processor speed goes to offloading, displaying, storing the image after
the fact.
That's why faster processors are used; we are pretty close to maximum presently at ~5 GHz,
and the way we increase so-called speed is by adding more cores.
Camera processors have no need to be very fast at all. Certainly not up
in the 5 GHz range. Because: battery life.
Could you get a proper news reader?
--
"...there are many humorous things in this world; among them the white
man's notion that he is less savage than the other savages."
-Samuel Clemens
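The propagation-delay argument above can be checked with back-of-envelope arithmetic. Assuming (my figures, not ones given in the thread) a 36 mm wide full-frame sensor and a strobe signal travelling at roughly half the speed of light in its traces:

```python
# Back-of-envelope check on strobe propagation delay across a sensor.
# Assumed values (mine): 36 mm sensor width, signal speed ~0.5c.

C = 3.0e8                # speed of light, m/s
PROP_SPEED = 0.5 * C     # typical signal speed in a conductor trace
SENSOR_WIDTH_M = 0.036   # full-frame sensor width

strobe_skew_s = SENSOR_WIDTH_M / PROP_SPEED
shutter_s = 1 / 32000    # fastest shutter period mentioned in the thread

print(strobe_skew_s)               # ~2.4e-10 s, i.e. ~0.24 ns
print(shutter_s / strobe_skew_s)   # shutter period is ~130,000x longer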
On Wednesday, 3 November 2021 at 16:40:03 UTC, Alan Browne wrote:
On 2021-11-03 12:24, Whisky-dave wrote:
On Wednesday, 3 November 2021 at 15:29:08 UTC, Alan Browne wrote:A Heisenberg compensator violates ....
On 2021-11-03 09:21, Whisky-dave wrote:No it's only mention regarding the Heisenberg compensator after the scientest got involved
My 'image' of a global shutter must be different then.Whatever happens in Star Trek wrt to the Heisenberg principle is likely >>>> (at best) a misconstrued attempt at putting "science" into space opera. >>>
I assumed a global shutter would suffer from Heisenberg princible just like in star trek.
There's nothing wrong with a bit of violation between consenting adults.
in keeping the next generation series more credible from a technology POV. >> When I was a child I knew there was 0 technology credibility in ST. HanSolo's "in 17 parsecs" line was more credible where heaps of silliness
comes in (and yes, they "compensated" for that goof in the space opera
"Solo".
Hans solo wasn't in star trek and parsecs is a distanace not a speed I knew that too, and it's
why of the reaseans I think star wars is science fantasy and star trek is science fiction.
Some people can't tell the differnce between the two.
It's quite feasible. It's just more transistors, "traces" and passiveYou just can;t have such a sensor you can't sample ~30meg pixels instantaneously, presently anyway so I doubt Nikon has achived itAn array can theoretically be devised that samples all 30Mpixels
even with the fastest proccessor we have today.
simultaneously for an arbitrary shutter period.
Theoretically is not practically though.
components. About 60 .. 120M more which is absolutely trivial in
today's chips, esp, a camera sensor.
But that won't do the job, the new macbooks M1 Max processor has 57 billion transistors
but it won't be able to sample a sensor fast enough for ever pixel to have the same timestamp.
Why use a shutter anyway ?Depends on the shutter period. So even if the inter pixel sampling
Better to sample each pixel for a nano second why have a shutter at all?
interval is 1 ns, the exposure still needs to be much longer for a
viable (low noise) period.
Which is partially while you get a blur effect.
In an electronic circuit the propagation is somewhat slower than that,
It would require that
all pixel sites be triggered by a single sampling signal ("strobe")
that signal would take a difernt time to reach each pixel, although you could possibley compensate
in a similar way to the Heisenberg compensator in star trek.
(There may be a trivial propagation delay in the nano second or less
scale of the sampling signal).
Nano second is no longer that trival as light moves a whole foot every nano second electric current somewhat slower.
but devised correctly, you would get the sample trigger everywhere
needed with negligible delay wrt the shutter period (which is in the
many microseconds and slower domain).
But not yet.
You're way out there ...
Trival but you'll need something similar to rotating mirrors to achieve it as did the ultra FPS camera used for photographing a light beam 'traveling'
if it;'s so easy to do with justa few million more transistors they;d have done it.
Capturing the image is completely independent of processor speed.
This is independent of the time to offload the data which would define >>>> the frame rate (or specifically the inter-frame delay). Thus
independent of the processor speed.
The clock speed of the processor normally dictates the speed.
Processor speed goes to offloading, displaying, storing the image after
the fact.
but you need the rest of the circuit to read all teh pixels at exactly the same time.
File could do that.
That why faster processors are used, we are pretty close to maxium presently at ~5GhzCamera processors have no need to be very fast at all. Certainly not up
the way we increase so called speed is by adding more cores.
in the 5GHz range. Because: battery life.
They;d need to be fast , processor in camera are getting faster.
On Tuesday, 2 November 2021 at 14:48:57 UTC, Savageduck wrote:
On Nov 2, 2021, Whisky-dave wrote
(in article <8ec0fee5-b272-42a7...@googlegroups.com>):
On Monday, 1 November 2021 at 13:53:31 UTC, Incubus wrote:
On 2021-10-29, Alfred Molon <alfred...@yahoo.com> wrote:
On 29.10.2021 16:34, Incubus wrote:
I think newer sensors don't have the issue with propellers, rotor blades
etc. I don't know how LED lighting will work.
I think the Z9 doesn't have a global shutter, only a fast readout
sensor. Fast moving things could be a problem.
I can't imagine they would release it if fast moving things are a
problem, given that it is designed for fast moving things.
I guess it depends on how fast a thing is actually moving as to whether it affects the final image.
But it's only 120 fps; my 6-year-old iPhone can do 240 fps, at lower resolution sure.
The fastest shutter speed is 1/32000, fast but not so incredible that nothing will have motion blur.
The issue with an electronic shutter isn't motion blur.
I think the problem will have the same effect, but the term 'shutter' is at fault here, which is why there is confusion in terms.
On 2021-11-03 12:57, Whisky-dave wrote:
On Wednesday, 3 November 2021 at 16:40:03 UTC, Alan Browne wrote:
On 2021-11-03 12:24, Whisky-dave wrote:
On Wednesday, 3 November 2021 at 15:29:08 UTC, Alan Browne wrote:
On 2021-11-03 09:21, Whisky-dave wrote:
My 'image' of a global shutter must be different then.
I assumed a global shutter would suffer from the Heisenberg principle, just like in Star Trek.
Whatever happens in Star Trek wrt the Heisenberg principle is likely
(at best) a misconstrued attempt at putting "science" into space opera.
A Heisenberg compensator violates ....
There's nothing wrong with a bit of violation between consenting adults.
No, it's only mentioned regarding the Heisenberg compensator after the scientists got involved
in keeping the Next Generation series more credible from a technology POV.
When I was a child I knew there was 0 technology credibility in ST. Han Solo's "in 17 parsecs" line was more credible, where heaps of silliness
comes in (and yes, they "compensated" for that goof in the space opera
"Solo").
Han Solo wasn't in Star Trek, and parsecs is a distance, not a speed. I knew that too, and it's
one of the reasons I think Star Wars is science fantasy and Star Trek is science fiction.
Some people can't tell the difference between the two.
You just can't have such a sensor; you can't sample ~30 megapixels instantaneously, presently anyway, so I doubt Nikon has achieved it
even with the fastest processor we have today.
An array can theoretically be devised that samples all 30 Mpixels
simultaneously for an arbitrary shutter period.
Theoretically is not practically though.
It's quite feasible. It's just more transistors, "traces" and passive
components. About 60 .. 120M more, which is absolutely trivial in
today's chips, esp. a camera sensor.
But that won't do the job; the new MacBooks' M1 Max processor has 57 billion transistors
but it won't be able to sample a sensor fast enough for every pixel to have the same timestamp.
This is NOT a CPU issue. It is a devoted device sampling gate issue.
As long as you can get the signal to where it is needed the sample is
started (and ended when the sampling signal goes to the opposite state).
A strobe (sampling signal) is akin to the processor timing signals all
over a CPU chip. So while they won't be perfectly simultaneous, they
will be at close to 0 lag pixel to pixel. The device does not "address"
each pixel; each pixel receives the sample signal simultaneously (with
minor variation due to propagation - sub-ns level).
Why use a shutter anyway?
Better to sample each pixel for a nanosecond; why have a shutter at all?
Depends on the shutter period. So even if the inter-pixel sampling
interval is 1 ns, the exposure still needs to be much longer for a
viable (low noise) period.
Which is partially why you get a blur effect.
Not at all. Rolling shutter is due to the CPU sampling one row at a
time; in turn each row being offloaded 1 pixel at a time.
What I'm describing is triggering the sensor to sample all pixels at once
(frozen for the same exposure period), and then offloading the entire image.
It would require that
all pixel sites be triggered by a single sampling signal ("strobe").
That signal would take a different time to reach each pixel, although you could possibly compensate
in a similar way to the Heisenberg compensator in Star Trek.
(There may be a trivial propagation delay, on the nanosecond or smaller
scale, in the sampling signal.)
A nanosecond is no longer that trivial, as light moves a whole foot every nanosecond, and electric current somewhat slower.
In an electronic circuit the propagation is somewhat slower than that,
but devised correctly, you would get the sample trigger everywhere
needed with negligible delay wrt the shutter period (which is in the
many-microseconds-and-slower domain).
But not yet.
Of course "yet". It's just more costly to do.
Trivial, but you'll need something similar to rotating mirrors to achieve it, as did the ultra-high-FPS camera used for photographing a light beam 'travelling'.
You're way out there ...
If it's so easy to do with just a few million more transistors, they'd have done it.
$
This is independent of the time to offload the data, which would define
the frame rate (or specifically the inter-frame delay), and thus
independent of the processor speed.
The clock speed of the processor normally dictates the speed.
Capturing the image is completely independent of processor speed.
Processor speed goes to offloading, displaying, storing the image after
the fact.
But you need the rest of the circuit to read all the pixels at exactly the same time.
Film could do that.
A DMA transfer of 30M pixels to a DMA array of some number would take
little time.
That's why faster processors are used; we are pretty close to maximum presently at ~5 GHz,
and the way we increase so-called speed is by adding more cores.
Camera processors have no need to be very fast at all. Certainly not up
in the 5 GHz range. Because: battery life.
They'd need to be fast; processors in cameras are getting faster.
Again, image sampling ≠ image transfer / storage. It is independent
of the CPU operation.
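The "DMA transfer would take little time" claim can be given rough numbers. These figures are my assumptions for illustration (frame size, bit depth, and link speed are not from the thread):

```python
# Rough numbers behind "a DMA transfer of 30M pixels would take little
# time". Assumed figures (mine): a 45.7 MP frame at 14 bits per pixel
# over a hypothetical 20 Gbit/s sensor-to-memory link.

PIXELS = 45_700_000
BITS_PER_PIXEL = 14
LINK_BPS = 20e9   # link capacity, bits per second

frame_bits = PIXELS * BITS_PER_PIXEL
offload_s = frame_bits / LINK_BPS
print(offload_s)  # ~0.032 s for one frame

# At 120 fps a frame slot is only ~8.3 ms, so one such link would not
# keep up - hence stacked sensors with many parallel readout channels.
print(1 / 120)    # ~0.0083 s
```

Which supports both sides of the exchange: offload is fast and independent of CPU clock speed, but at very high frame rates it is still the binding constraint.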
On 2021-11-04 11:07, Whisky-dave wrote:
On Wednesday, 3 November 2021 at 17:49:48 UTC, Alan Browne wrote:
On 2021-11-03 12:57, Whisky-dave wrote:
On Wednesday, 3 November 2021 at 16:40:03 UTC, Alan Browne wrote:
This is NOT a CPU issue. It is a dedicated device sampling-gate issue.
On 2021-11-03 12:24, Whisky-dave wrote:
On Wednesday, 3 November 2021 at 15:29:08 UTC, Alan Browne wrote:
On 2021-11-03 09:21, Whisky-dave wrote:
A Heisenberg compensator violates ....
My 'image' of a global shutter must be different then.
I assumed a global shutter would suffer from the Heisenberg principle just like in Star Trek.
Whatever happens in Star Trek wrt the Heisenberg principle is likely (at best) a misconstrued attempt at putting "science" into space opera.
No, it's only mentioned regarding the Heisenberg compensator after the scientist got involved in keeping the Next Generation series more credible from a technology POV.
There's nothing wrong with a bit of violation between consenting adults.
When I was a child I knew there was zero technology credibility in ST. Han Solo's "in 17 parsecs" line was more credible; that's where heaps of silliness comes in (and yes, they "compensated" for that goof in the space opera "Solo").
Han Solo wasn't in Star Trek, and parsecs are a distance, not a speed. I knew that too, and it's one of the reasons I think Star Wars is science fantasy and Star Trek is science fiction. Some people can't tell the difference between the two.
You just can't have such a sensor; you can't sample ~30M pixels instantaneously, presently anyway, so I doubt Nikon has achieved it even with the fastest processor we have today.
An array can theoretically be devised that samples all 30M pixels simultaneously for an arbitrary shutter period.
Theoretically is not practically though.
It's quite feasible. It's just more transistors, "traces" and passive components. About 60 .. 120M more, which is absolutely trivial in today's chips, esp. a camera sensor.
But that won't do the job; the new MacBook's M1 Max processor has 57 billion transistors, but it won't be able to sample a sensor fast enough for every pixel to have the same timestamp.
And what does that actually mean in real terms? It's something that can sample a 'signal' from a pixel, and what is that signal?
The CPU (or some other arbitrary origin signal) tells the device (sensor array) to begin sampling. It does. All pixels at once (in this one-shot version).
So each sensor "well" is cleared and begins accumulating charge from
that moment until the trigger says "stop" (change of state). So all
wells accumulate over the same period of time.
When the period ends, the entire array is read by the CPU. This takes non-zero time, but the information at each site is from the same period of time. No rolling shutter.
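The shared-exposure-window point above can be put in code. This is a toy model, not any real sensor's behaviour or API; the timing numbers are invented for illustration.

```python
# Toy model contrasting a global shutter (all pixels share one exposure
# window) with a rolling shutter (each row starts integrating later).

def global_shutter_windows(rows, exposure_ms):
    """Every row integrates over the identical (start, end) window."""
    return [(0.0, exposure_ms) for _ in range(rows)]

def rolling_shutter_windows(rows, exposure_ms, line_delay_ms):
    """Row r starts integrating line_delay_ms later than row r-1."""
    return [(r * line_delay_ms, r * line_delay_ms + exposure_ms)
            for r in range(rows)]

g = global_shutter_windows(4, 10.0)
r = rolling_shutter_windows(4, 10.0, 0.01)

# Global shutter: one shared exposure window for every row.
assert len(set(g)) == 1
# Rolling shutter: every row has a different start time,
# which is what produces skew artifacts on moving subjects.
assert len(set(w[0] for w in r)) == 4
```

The point is that "no rolling shutter" is purely about when each well integrates, not about how fast the data is read off afterwards.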
As long as you can get the signal to where it is needed the sample is
started (and ended when the sampling signal goes to the opposite state).
So what is this magic signal?
Trigger, strobe, whatever name you want. No magic.
A strobe (sampling signal) is akin to the processor timing signals all over a CPU chip. So while they won't be perfectly simultaneous, they will be at close to 0 lag pixel to pixel. The device does not "address" each pixel; each pixel receives the sample signal simultaneously (with minor variation due to propagation - sub-ns level).
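The "sub-ns" claim is easy to sanity-check with a back-of-envelope bound. The numbers below are assumptions for illustration (a full-frame sensor roughly 36 mm across, on-chip signal propagation around half the speed of light); real clock-tree skew depends entirely on the layout.

```python
# Rough upper bound on strobe arrival skew across a sensor die.
# Assumed figures, not from any datasheet.

C = 3.0e8                 # speed of light, m/s
die_width_m = 36e-3       # full-frame sensor width, ~36 mm
signal_speed = 0.5 * C    # order-of-magnitude on-chip propagation speed

skew_s = die_width_m / signal_speed
skew_ns = skew_s * 1e9

# Edge-to-edge skew works out to a fraction of a nanosecond,
# consistent with the "sub-ns level" figure above.
assert skew_ns < 1.0
```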
But this magic device would need at least 30 million inputs, one for each pixel.
No magic.
What I'm describing is triggering the sensor to sample all pixels at once (frozen for the same exposure period), and then offloading the entire image.
But how do you do this is where the problem starts, especially at a high enough quality.
It's just a matter of a lot of traces on the silicon. Lots. And each pixel well would need 2 .. 4 transistors to take in the signal, clear the well, and wait until the signal clears.
Added complexity. Not "magic".
This is why cameras are limited in frame rate.
Not sure what the best is currently, but 8K at 120 fps isn't that amazing.
Now 30MP at 10k fps would actually be impressive, but why isn't it done?
That makes for very short exposures per frame. Aka: high noise.
Another issue: it's one thing for the sensor to capture the image in situ. Another thing to read it off.
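The readout-versus-capture distinction can be made concrete with some raw-data arithmetic. Pixel counts and the 12-bit sample depth below are illustrative assumptions, and the figures ignore compression entirely.

```python
# Back-of-envelope raw readout rates for the frame rates discussed above.
# 12-bit uncompressed samples assumed for illustration.

def raw_rate_GBps(pixels, fps, bits_per_pixel=12):
    """Raw sensor data rate in gigabytes per second."""
    return pixels * fps * bits_per_pixel / 8 / 1e9

rate_8k120 = raw_rate_GBps(7680 * 4320, 120)   # 8K at 120 fps
rate_30mp10k = raw_rate_GBps(30e6, 10_000)     # 30 MP at 10,000 fps

# 8K/120 is a few GB/s of raw data; 30 MP at 10k fps is nearly two
# orders of magnitude more. The readout/offload path, not the
# in-situ capture, is where the wall is.
assert rate_30mp10k > 50 * rate_8k120
```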
But not yet.
Of course "yet". It's just more costly to do.
Yes, well outside the budget of people and most companies, so little point as yet in making such a camera.
Markets are for testing.
$Trivial, but you'll need something similar to rotating mirrors to achieve it, as did the ultra-FPS camera used for photographing a light beam 'traveling'.
You're way out there ...
If it's so easy to do with just a few million more transistors, they'd have done it.
Yep, same old story: I'd like a Lamborghini Sián FKP 37 if it wasn't for the $; they'd need to make more than just 63 of them too, so sold out, so I can't have one.
They are crappily built cars that devalue very quickly (for the most part) and that are extremely costly to operate and maintain. Only a rare model will climb in value. Esp. if you don't actually use it.
A DMA transfer of 30M pixels to a DMA array of some number would take
little time.
But you need far more than just 30M; that wouldn't even give you a monochrome image, just B&W.
Just: is the pixel on or off.
RGB arrays are not a novel concept ...
That's why faster processors are used; we are pretty close to the maximum presently at ~5GHz. The way we increase so-called speed is by adding more cores. They'd need to be fast; processors in cameras are getting faster.
Camera processors have no need to be very fast at all. Certainly not up in the 5GHz range. Because: battery life.
Again, image sampling ≠ image transfer / storage. It is independent of the CPU operation.
No it's not in the real world.
There are thousands of sensor types that operate independently of the system CPU. The CPU might set up the sampling, gain, etc., but other than triggering a sample period, reading data, or setting parameters, it doesn't need to be part of the actual sampling.
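That division of labour (CPU configures and triggers, the sensor hardware samples on its own) can be sketched as a toy model. Every class and method name below is invented for illustration; no real camera API is being described.

```python
# Toy model of a sensor as an autonomous peripheral: the CPU only sets
# parameters, fires the trigger, and reads results afterwards. What
# happens inside trigger() stands in for on-chip logic.

class SensorArray:
    def __init__(self, n_pixels):
        self.n_pixels = n_pixels
        self.gain = 1.0
        self.samples = None

    def configure(self, gain):
        """CPU role: set up sampling parameters ahead of time."""
        self.gain = gain

    def trigger(self, scene):
        """CPU role: start the sample period. The sampling itself
        models on-chip hardware with no CPU involvement."""
        self.samples = [v * self.gain for v in scene]

    def read_out(self):
        """CPU role: collect the data after the period ends."""
        return self.samples

sensor = SensorArray(n_pixels=4)
sensor.configure(gain=2.0)
sensor.trigger([1, 2, 3, 4])
assert sensor.read_out() == [2.0, 4.0, 6.0, 8.0]
```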
But what is image sampling then?
What happens on the image sensor chip independently of the CPU.
Done with this.
On 2021-11-09 08:18, Whisky-dave wrote:
On Thursday, 4 November 2021 at 21:30:20 UTC, Alan Browne wrote:
Done with this.
Yeah well, it's still not a global shutter, it's a rolling shutter, which doesn't sample all the sensor data at the same time.
The discussion was a hypothetical. But you knew that. Right?
Really done.