I notice that there is a modern trend to include the last few seconds
of the preparations of an interview ("Everyone ready?" and then a clapperboard) in documentary interviews, as the voiceover is
introducing a speaker. They did it on a documentary the other year
about the night Britannia Bridge (Anglesey) caught fire, and also in
the Channel 4 documentary last week about the Falklands War.
Both programmes looked to be made on video - I doubt whether film is
used much (or at all) for documentaries now. So why was a clapperboard
being used? I can see that the numbers on it could be useful for shot
identification, but wouldn't the sound be recorded by the camera (eg
by a sound mixer feeding the camera's audio track)? Or is sound
sometimes recorded separately (probably on disk rather than tape,
these days) rather than in-camera?
NY wrote:
why was a clapperboard being used?
To sync up multiple mics recorded separately with the video?
why was a clapperboard being used?
There seems to be more tolerance of such mismatch than in the golden
age. I suppose part of it is that there is just more material about now. (Tolerance among the originators I mean, of course; it still bugs _me_!)
On 31/03/2022 13:18, NY wrote:[]
Both programmes looked to be made on video - I doubt whether film is
used much (or at all) for documentaries now. So why was a clapperboard
being used? I can see that the numbers on it could be useful for shot
identification, but wouldn't the sound be recorded by the camera (eg
by a sound mixer feeding the camera's audio track)? Or is sound
sometimes recorded separately (probably on disk rather than tape,
these days) rather than in-camera?
None of that prevents the audio and video getting out of sync. In fact
it's more of a problem today than it was even 20 years ago!
On 31/03/2022 18:25, J. P. Gilliver (John) wrote:
There seems to be more tolerance of such mismatch than in the golden
age. I suppose part of it is that there is just more material about now.
(Tolerance among the originators I mean, of course; it still bugs _me_!)
In some cases, the material may have been obtained via cellphone or zoom,
and there has been no time to re-sync.
The general public are also now used to seeing poorly time-aligned
cellphone footage on YouTube and similar platforms, so no longer bother
complaining.
On Thu, 31 Mar 2022 20:02:14 +0100, "NY" <me@privacy.invalid> wrote:
One interesting thing that I noticed when I was watching Lewis (as in
Inspector Morse) being filmed: although it was being filmed on film not
video, I wasn't aware of clapperboards being used at all. Somehow the
sound was being recorded on a separate recorder in a way that could be
synchronised easily in post-production. Is a clapperboard needed if the
camera and sound recorder are both fed from the same timecode source
which marks each frame of film and each "frame" of sound data with a
code which is known to be in sync? If so, why are clapperboards still
needed?
Clapperboards shouldn't be needed at all if the timecode generators in
any unconnected equipment are properly adjusted. They were only ever
needed in the days of cine film because the sound and picture were
recorded on different equipment. Television was originally live, and
then when a method of recording it was invented, it recorded sound and picture on the same machine, usually from static equipment connected
by cables in a studio. I think it must have been the trend to use
multiple portable equipment - cameras and microphones - without cables
that regenerated the need for an identifiable event on both sound and
vision, or maybe they just didn't trust themselves to set up the
timecode generators properly.
One interesting thing that I noticed when I was watching Lewis (as in
Inspector Morse) being filmed: although it was being filmed on film not
video, I wasn't aware of clapperboards being used at all. Somehow the
sound was being recorded on a separate recorder in a way that could be
synchronised easily in post-production. Is a clapperboard needed if the
camera and sound recorder are both fed from the same timecode source
which marks each frame of film and each "frame" of sound data with a
code which is known to be in sync? If so, why are clapperboards still
needed?
I've seen film news crews with a cameraman and sound recordist tethered
by umbilical cord. And yet they still use clapperboards. Maybe as belt
and braces. Did the cable tend *only* to send sync pulses to cater for
film that may not run at *exactly* 25 fps, to alter the tape recorder's
speed, but without the ability to generate frame-accurate timecodes to
label the tape?
On Thu, 31 Mar 2022 21:04:16 +0100, "NY" <me@privacy.invalid> wrote:
I've seen film news crews with a cameraman and sound recordist
tethered by umbilical cord. And yet they still use clapperboards.
Maybe as belt and braces. Did the cable tend only to send sync
pulses to cater for film that may not run at exactly 25 fps, to
alter the tape recorder's speed, but without the ability to
generate frame-accurate timecodes to label the tape?
Recording synchronous sound with film usually meant using a film
camera and a tape recorder with very accurate and stable motor speeds governed by quartz crystal oscillators. The tape recording would later
be copied onto fully coated magnetic film stock ("Sepmag" as the BBC
called it) with the same gauge as the picture film for use on the
editing table. Picture and sound were thus recorded on separate
machines and so needed an identifiable event on both sound and vision
to be able to synchronise them later, hence the clapperboard. There
wasn't normally any connection between the camera and the sound
recorder, but the stability provided by the crystal oscillator enabled
them to keep pace for longer than the maximum running time of a roll
of film (about 10 minutes), which was usually longer than the running
time of a typical drama scene (about 3 or 4 minutes).
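Back-of-envelope arithmetic bears this out. The figures below are my own illustrative assumptions (a 5 ppm tolerance per crystal), not from the post, but they show why crystal control was good enough without any cable:

```python
# Worst-case drift between a crystal-governed camera and a
# crystal-governed recorder. The 5 ppm per-crystal tolerance is an
# assumed, illustrative figure.

def max_drift_ms(duration_s: float, ppm_each: float = 5.0) -> float:
    """Worst-case relative drift in milliseconds over duration_s,
    assuming the two crystals err in opposite directions."""
    combined_ppm = 2 * ppm_each
    return duration_s * combined_ppm * 1e-6 * 1000

roll_s = 10 * 60             # ~10-minute roll of film
frame_ms = 1000 / 25         # one frame at 25 fps = 40 ms
print(max_drift_ms(roll_s))  # 6.0 ms - well inside one frame
```

Even over a whole roll, the worst case is a small fraction of a frame, so a single clap at the head of the take is all the alignment needed.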
SMPTE timecode didn't appear until some time in the 60s or maybe early
70s as I recall, at first only used with electronic recording
equipment. Film cameras that could record the code optically on the
film didn't appear until some time later.
I don't know what an umbilical cable between a film camera and a sound recorder would be carrying. My best guess is it would be to enable use
of a standard stereo tape recorder (i.e. not a special moviemaking one
with crystal control built in) so the cable would be carrying frame
pulses that the machine would record on one of the audio tracks.
Rod.
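For anyone unfamiliar with it, non-drop-frame timecode is just a frame count in disguise, which is why one shared event (or a common generator) is enough to line two recordings up. A minimal sketch at 25 fps, my own illustration rather than anything from the thread:

```python
# Minimal sketch of 25 fps (EBU-style) non-drop-frame timecode arithmetic.
FPS = 25

def tc_to_frames(tc: str) -> int:
    """'HH:MM:SS:FF' -> absolute frame count."""
    h, m, s, f = (int(x) for x in tc.split(":"))
    return ((h * 60 + m) * 60 + s) * FPS + f

def frames_to_tc(n: int) -> str:
    """Absolute frame count -> 'HH:MM:SS:FF'."""
    s, f = divmod(n, FPS)
    m, s = divmod(s, 60)
    h, m = divmod(m, 60)
    return f"{h:02d}:{m:02d}:{s:02d}:{f:02d}"

# The offset between picture and sound timecode at one shared event
# (e.g. the clap) is all an editor needs to line the two up:
offset = tc_to_frames("10:00:05:12") - tc_to_frames("10:00:03:20")
print(offset)  # 42 frames
```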
The umbilical cord carried a 50Hz signal derived from the camera's
motor. This was recorded on the tape recorder's pilot track (Nagra
Neopilot). It also carried a bleep signal that recorded a tone on the
recorder's audio track at the same time as a flash was filmed in the
camera.
This was before crystal oscillators were used.
"Ashley Booth" <removetab@snglinks.com> wrote in message news:jao0sqF1bl1U1@mid.individual.net...
The umbilical cord carried a 50Hz signal derived from the camera's
motor. This was recorded on the tape recorder's pilot track (Nagra
Neopilot). It also carried a bleep signal that recorded a tone on the
recorder's audio track at the same time as a flash was filmed in the
camera.
This was before crystal oscillators were used.
Ah, do both the camera and the recorder have *separate* crystals? I
always thought that just one device (eg the camera) had the crystal,
and the umbilical cord was feeding this frequency as a reference to the
other device to control its motor. I can see the advantage of both
devices having their own crystal (as long as both are accurately the
same frequency) because it allows the recorder to be untethered from
the camera.
When the sound and pictures are being synchronised at the editing
stage, how is the audio allowed to "slip" relative to the film to get
them in sync? Is the sound still on magnetic *tape* at that stage, or
has it been dubbed onto sprocketed magnetic film? Presumably you want
to be able to adjust the position of the synchronising clap on the tape
to within a smaller resolution than the nearest frame (1/25 second or
40 milliseconds). Or is a mismatch of up to +/- 20 msec deemed to be
close enough not to be noticed?
I realise that once the sound and pictures have been synchronised, the
final master will either keep separate films (sepmag/sepopt) or else be
dubbed/printed onto an optical track on the film (commag/comopt). Was
news film always played out as sepmag, to keep things as simple as
possible?
When the sound and pictures are being synchronised at the editing
stage, how is the audio allowed to "slip" relative to the film to get
them in sync? Is the sound still on magnetic *tape* at that stage, or
has it been dubbed onto sprocketed magnetic film? Presumably you want
to be able to adjust the position of the synchronising clap on the tape
to within a smaller resolution than the nearest frame (1/25 second or
40 milliseconds). Or is a mismatch of up to +/- 20 msec deemed to be
close enough not to be noticed?
An editing table takes a cutting copy of the film, and a magnetically
coated sprocketed copy of the sound. They are brought into
synchronisation for editing by painstaking use of chinagraph
pencils and sticky tape, and the mechanism of the table keeps them
geared together. If the editor needs to remove or add a piece of
picture film as part of the edit, they have to remove or add pieces of magnetic film to match, so the entire reels on the editing table
remain in sync. Yes, it's laborious, but that's what they did.
I realise that once the sound and pictures have been synchronised, the
final master will either keep separate films (sepmag/sepopt) or else be
dubbed/printed onto an optical track on the film (commag/comopt). Was
news film always played out as sepmag, to keep things as simple as
possible?
Synchronous sound was unusual on news film reports. Usually it was
just effects tracks and voiceovers that were added later, sometimes
with the presenter doing the voiceover live.
The film was normally reversal stock, i.e. there was no negative but
the actual film that was used in the camera became the transmitted
copy. This was quicker to process but more vulnerable to loss or
damage during editing of course, but for news the speed was usually considered paramount, and the presenter could talk over it. Reversal
stock was available with a magnetic stripe, but I think this was
mostly used by amateur moviemakers, not professional news crews.
An editing table takes a cutting copy of the film, and a magnetically
coated sprocketed copy of the sound. They are brought into
synchronisation for editing by painstaking use of chinagraph
pencils and sticky tape, and the mechanism of the table keeps them
geared together. If the editor needs to remove or add a piece of
picture film as part of the edit, they have to remove or add pieces of
magnetic film to match, so the entire reels on the editing table
remain in sync. Yes, it's laborious, but that's what they did.
Could the sprockets of the sound film be moved relative to those of
the picture film (eg by a clutch that was temporarily disengaged) or
could you only sync the sound to the nearest sprocket on the sound
film?
I've seen film news crews with a cameraman and sound recordist tethered[]
by umbilical cord. And yet they still use clapperboards. Maybe as belt
and braces.
Now that we can do everything with so much more precision using
software editing of digital recordings, you'd think that timing errors
would be less frequent and less noticeable, but I get the impression
that this is not the case...
On 01/04/2022 15:18, NY wrote:
I imagine that because it is no longer necessary to synchronise the
sound with the film, and the sound and pictures are "welded together"
into a single AVI/MPG/whatever file or live data-stream, people don't
bother to check whether they are precisely in sync, given that by
definition they are always *almost* in sync.
But we still get sound and vision out of sync quite regularly.
I imagine that because it is no longer necessary to synchronise the
sound with the film, and the sound and pictures are "welded together"
into a single AVI/MPG/whatever file or live data-stream, people don't
bother to check whether they are precisely in sync, given that by
definition they are always *almost* in sync.
"Roderick Stewart" <rjfs@escapetime.myzen.co.uk> wrote in message news:5hmd4h1fodfs1d2ljpnr5vmessakfvceo3@4ax.com...[]
[]The film was normally reversal stock, i.e. there was no negative but
the actual film that was used in the camera became the transmitted
copy. This was quicker to process but more vulnerable to loss or
damage during editing of course, but for news the speed was usually
considered paramount, and the presenter could talk over it. Reversal
stock was available with a magnetic stripe, but I think this was
mostly used by amateur moviemakers, not professional news crews.
Yes, I believe they usually used Ektachrome 320 which was
tungsten-balanced, with a daylight filter for outdoors where there was
more daylight and so the loss of 2 (?) stops of speed due to the blue
filter was less critical, leaving maximum speed for indoor tungsten-lit
shots.
"MB" <MB@nospam.net> wrote in message news:t272t0$95k$1@dont-email.me...
On 01/04/2022 15:18, NY wrote:
I imagine that because it is no longer necessary to synchronise the
sound with the film, and the sound and pictures are "welded together"
into a single AVI/MPG/whatever file or live data-stream, people don't
bother to check whether they are precisely in sync, given that by
definition they are always *almost* in sync.
But we still get sound and vision out of sync quite regularly.
That was my point: people mistakenly think that sound and vision can't
get out of sync these days and therefore don't check before going live.
On Thu, 31 Mar 2022 at 21:04:16, NY <me@privacy.invalid> wrote (my
responses usually FOLLOW):
[]
I've seen film news crews with a cameraman and sound recordist tethered[]
by umbilical cord. And yet they still use clapperboards. Maybe as belt
and braces.
I suspect it's more a matter of using the board as shot identification -
i. e. it's what's written on the board (or shown in its display if a
modern fancy one) that's more important. Adding the "clack" adds almost
nothing to the time taken, so may as well be done for the rare case it
_is_ useful (and I suspect for some makes them feel they are doing
"real" film production!); if it's loud enough, it probably also has an
effect similar to someone shouting "quiet on set", in the situations
where more than one man and his dog are present anyway.
"MB" <MB@nospam.net> wrote in message
news:t272t0$95k$1@dont-email.me...
But we still get sound and vision out of sync
quite regularly.
That was my point: people mistakenly think that
sound and vision can't get out of sync these days
and therefore don't check before going live.
On 01/04/2022 12:08, Roderick Stewart wrote:
The film was normally reversal stock, i.e. there was no negative but
the actual film that was used in the camera became the transmitted
copy. This was quicker to process but more vulnerable to loss or
damage during editing of course, but for news the speed was usually
considered paramount, and the presenter could talk over it. Reversal
stock was available with a magnetic stripe, but I think this was
mostly used by amateur moviemakers, not professional news crews.
I worked on the final years of film for TV news.
COMMAG was the _normal_ method of recording sync sound for BBC news
crews, right up until the end of film for news. It had nothing to do
with unions, restrictive practices or whatever; rather, a large amount
of very expensive kit would have had to be replaced by (initially)
rather inferior video equipment.
Recordists used a separate (BBC designed & built) recording amplifier
(a bit bigger than an ASC Minx mixer, if you remember those). It had
one or two mic amps with a mixer circuit and very limited EQ (HPF for
stripping wind noise IIRC, although that was available on the T-power
box for gun mics). It had a yellow-spot PPM and headphone amp, and the
all-important record head driver circuit for the camera.
A clapperboard can contain a lot of useful information, a greyscale for picture matching, and also quietens things down on location, and it can
sync quite a few cameras, as long as it can be clearly seen by them.
The alternative is a timecode display on a tablet. This works well (longitudinal TC sent to camera and audio tracks), and cameras can simply frame up on it after any stop-start, but it doesn't have the authority of
a clapperboard (for the humans), and you still need cables or radio links
to carry the LTC.
That would explain the umbilical cord: not to keep the camera and
recorder in sync, but to feed the sound signal from the soundman's mike
and mixer to the recorder in the camera. Presumably it meant that any
sync sound would get cut at a different point to the corresponding
pictures, unless the commag track was laid off to magnetic film which
was cut synchronously with the film. I can't imagine there'd be time
for that in a news studio.
Interesting that you say that ENG equipment was initially inferior. OK,
you got smearing and stuck images (photographers' flashguns), but my
impression was that the picture was generally a lot better because it
didn't have the huge film grain and the drab muddy colours.
I wonder why there is the trend in the last few years to show the
preparation for the interview ("are you sitting comfortably?", camera
making final focus and framing adjustments, clapperboard) as filler
before the interview begins, as the interviewee is being introduced by
the narrator. Will it become as much of a cliche as the establishing
shot of an interviewee walking somewhere (assumed to be going to the
interview) before the closeup of them being interviewed?
Like the trend a few decades ago for "funny camera angles". I remember an item on Top Gear or Tomorrow's World in which all the shots of the presenter had been tilted by 45 degrees so his head was in the top left/right corner
of the frame and his legs/body were in the bottom right/left. For no good reason, other than to be "clever" or "arty".
It was usually Tungsten-balanced Ektachrome 160 or equivalent (E6
process), occasionally pushed 1 or 2 stops. This was done so that the
correction filter in the camera (Wratten 80A?) was added in daylight,
rather than indoors where there were already low light conditions. Film
did need a lot of additional light indoors, hence redheads, etc.
Also, don't underestimate the losses in TK: IMHO Telecine didn't come
close to doing film justice until Cintel Mk.IIIs came in. The digital
systems available for the last 20+ years give excellent results.
On Sat, 2 Apr 2022 11:11:15 +0100, "NY" <me@privacy.invalid> wrote:
I wonder why there is the trend in the last few years to show the
preparation for the interview ("are you sitting comfortably?", camera
making final focus and framing adjustments, clapperboard) as filler
before the interview begins, as the interviewee is being introduced by
the narrator. Will it become as much of a cliche as the establishing
shot of an interviewee walking somewhere (assumed to be going to the
interview) before the closeup of them being interviewed?
It's because in the absence of real creativity, style is inversely proportional to content.
I remember Jools Holland did an item on a Channel 4 programme years ago
where he compared film with video - to show that film was "better". But
it was a very unfair test. The film camera was viewing the scene with
the sun roughly behind it so the subject (himself standing beside a motorbike) and the background were both lit by the same amount. The
video camera was looking in roughly the opposite direction (so each
camera could see the other in its shot) and the background was lit far
more brightly than the subject (which the exposure was set for) so the
background was overexposed. With the two cameras side-by-side, both
seeing the same subject from the same direction and lighting, the test
would have been fairer because it would have removed one very big variable.
On 03/04/2022 11:59, NY wrote:
Like the trend a few decades ago for "funny camera angles". I remember
an item on Top Gear or Tomorrow's World in which all the shots of the
presenter had been tilted by 45 degrees so his head was in the top
left/right corner of the frame and his legs/body were in the bottom
right/left. For no good reason, other than to be "clever" or "arty".
Still happens, saw something a few days ago that was all shot at an
angle.
The Ipcress File ?
On 03/04/2022 16:50, Mark Carver wrote:
The Ipcress File ?
I think it was a short "film" on The One Show or something similar.
It was so distracting that I cannot remember and might have switched
off / over.
"Roderick Stewart" <rjfs@escapetime.myzen.co.uk> wrote in message news:tqii4hlbnhvirdtuvs0h9q24i945pnnhnr@4ax.com...
On Sat, 2 Apr 2022 11:11:15 +0100, "NY" <me@privacy.invalid> wrote:
I wonder why there is the trend in the last few years to show the
preparation for the interview ("are you sitting comfortably?", camera
making final focus and framing adjustments, clapperboard) as filler
before the interview begins, as the interviewee is being introduced by
the narrator. Will it become as much of a cliche as the establishing
shot of an interviewee walking somewhere (assumed to be going to the
interview) before the closeup of them being interviewed?
It's because in the absence of real creativity, style is inversely
proportional to content.
Like the trend a few decades ago for "funny camera angles". I
remember an item on Top Gear or Tomorrow's World in which all the
shots of the presenter had been tilted by 45 degrees so his head was
in the top left/right corner of the frame and his legs/body were in
the bottom right/left. For no good reason, other than to be "clever"
or "arty".
Then there was a documentary about the attempted kidnapping of
Princess Anne, and every single talking-head shot started with the interviewee in focus, then the image would become slightly
defocussed, go back into focus and out again, and then become sharp
for the rest of the interview. The first time, I thought the camera
operator was "hunting" the focus slightly, but when it happened every
time, always with very similar timing, I realised it was someone
pissing about in post-processing :-( Likewise for interviews that
switch between colour and BW for different camera angles, or with
fake venetian blind effect to distinguish modern footage from archive.
The funniest (but only if you are in the know) was a retrospective
about the Iranian Embassy Siege in which all the archive video
footage had had fake film grain and dirt added to it to say "this is archive".
"SimonM" <somewhere@large.in.the.world> wrote in message news:t29pog$brk$1@dont-email.me...
It was usually Tungsten-balanced Ektachrome 160 or equivalent (E6
process), occasionally pushed 1 or 2 stops. This was done so that the
correction filter in the camera(Wratten 80A?) was added in daylight,
rather than indoors where there were already low light conditions. Film
did need a lot of additional light indoors, hence redheads, etc.
I called it Ektachrome 320. I was wrong. It was 160: 320 was 160
already pushed by 1 stop ;-)
Also, don't underestimate the losses in TK: IMHO Telecine didn't come
close to doing film justice until Cintel Mk.IIIs came in. The digital
systems available for the last 20+ years give excellent results.
I first realised that when I saw restored (re-telecined) versions of
episodes of The Sweeney and compared them with original versions as
broadcast by ITV4. OK, so ITV4's version was 544x576 pixels rather
than 720x576, so there was a loss of horizontal resolution (the
episodes were also butchered to fit a 52-minute episode into a modern
46-minute "1 hour with 3 breaks" slot).
Leaving aside reduction in horizontal resolution, the most obvious
thing was much greater detail in shadows and highlights. This is the
same as if you compare a scan of a print from a 35 mm (still) negative
and a scan (with reversal) of the negative.
My impression is that film+TK and video have both improved to the point
that it is nowhere near as obvious to a viewer which is which these
days. Compare that with the often-seen shots of the Iranian Embassy
Siege in London. Normally you see the footage from the full-size TV
cameras, recorded on VT: sharp, vibrant but a lot of highlight-crushing
on the light-coloured stones of the building. Occasionally you see
footage from ground level from 16 mm film which looks very drab and
grainy.
I remember Jools Holland did an item on a Channel 4 programme years
ago where he compared film with video - to show that film was "better".
But it was a very unfair test. The film camera was viewing the scene
with the sun roughly behind it so the subject (himself standing beside
a motorbike) and the background were both lit by the same amount. The
video camera was looking in roughly the opposite direction (so each
camera could see the other in its shot) and the background was lit far
more brightly than the subject (which the exposure was set for) so the
background was overexposed. With the two cameras side-by-side, both
seeing the same subject from the same direction and lighting, the test
would have been fairer because it would have removed one very big
variable.
For many years, one of the most obvious differences between original
video and film when both are shown on television, was that film only
captures the action 25 times per second with the camera effectively
blind for half the time because of a mechanical shutter, while a
television camera is collecting light all the time and displaying new pictorial information at twice the rate, that is 50 times per second.
Even if the exposure, detail, and steadiness of a film had all been perfected, anything that moved would unavoidably look jerky in a way
that never happened with original video material.
Then somebody discovered that digital cameras could be switched to
read out their information every frame instead of every field - half
the normal rate - and it would look jerky like film, and for some
reason they liked this. Despite it being a less realistic depiction of movement, it looked "filmic", which is good, it seems. Perhaps there
was a psychological effect whereby something that looked as if it had
been made on film implied that more money had been spent on the
production, because this was usually the case with film that had been
shot for the cinema. Whatever the reason, it became popular amongst
some programme makers, and is still used for a lot of productions
today, even though I doubt that any are still made using actual film.
It's even possible for a modern digital video camera to use an
electronic equivalent of the mechanical shutter in a film camera, so
that it effectively misses half the action in the same way. Some
programme makers like this too.
I've forgotten that one. Yes there was the increased jerkiness because
you had 25 full-res pictures per second rather than 50 half-res
pictures per second, so you lost some of the fluidity of motion.
Mini-cams, used for POV shots in action sequences (eg of presenter
going down a zip wire) or cockpit views in small aircraft, often
default to a very high shutter speed. You can distinguish that footage
because objects look unnaturally sharp (no motion blur) and you may get
strobing on aircraft propellers.
I think blur is _desirable_, certainly at normal (25-60) frame rates, as
it makes the intermittent-motion less noticeable - rather like the way
still images of a two-level something - printed text, or a black-and-white cartoon/graph/similar - are often acceptable at a lower resolution if
using more levels of greyscale. Obviously only within limits - better for
a panning shot, you don't want a disappearing vehicle to leave a ghost
trail like something out of hyperspace. (Unless you're making that sort of material of course!) But blur IMO makes the non-continuous nature of film _or_ video less noticeable - fast stills draw attention to it.
It's even possible for a modern digital video camera to use an
electronic equivalent of the mechanical shutter in a film camera, so
that it effectively misses half the action in the same way. Some
programme makers like this too.
Mini-cams, used for POV shots in action sequences (eg of presenter
going down a zip wire) or cockpit views in small aircraft, often
default to a very high shutter speed. You can distinguish that footage
because objects look unnaturally sharp (no motion blur) and you may get
strobing on aircraft propellers.
Some cameras now appear to use the electronic "shutter" as a means of controlling exposure, rather than the traditional iris mechanism,
probably because being electronic it requires no moving parts and is therefore cheaper to implement. The inevitable result is that the
camera is only sensitive to light for a variable percentage of the
time, and therefore only capturing part of the action. This is worst
for high brightness situations, for example an aeroplane propeller
against the sky, which is often captured as a series of visibly
separate images (distorted as well, because of the way the shutter
works) rather than a continuous blur as the eye would see.
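The propeller case is easy to quantify. With illustrative figures of my own (the rpm and shutter times are assumptions, not from the thread), the difference between a cine-style exposure and a fast electronic shutter is stark:

```python
# Degrees a propeller turns during one exposure. The rpm and shutter
# figures are assumed for illustration.

def blur_degrees(rpm: float, shutter_s: float) -> float:
    """Rotation, in degrees, captured within a single exposure."""
    return rpm / 60 * 360 * shutter_s

# At an assumed 2400 rpm:
print(blur_degrees(2400, 1 / 50))    # ~288 degrees: a continuous blur disc
print(blur_degrees(2400, 1 / 2000))  # ~7 degrees: frozen, separate blades
```

At the slow shutter the blades smear into the disc the eye sees; at the fast one each frame freezes a distinct blade position, hence the strobing.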
If the shutter speed is fixed at 1/25 second, you need to be able to
vary the iris by a very large amount to cater for going from bright
sunlight to a dim interior. I imagine most lenses would not produce a
very good picture (because of diffraction) at f100 ;-)
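The numbers back up the smiley. The diffraction blur spot (Airy disc) has a diameter of roughly 2.44 x wavelength x f-number; the 550 nm green wavelength below is my own assumed figure for the arithmetic:

```python
# Airy disc diameter ~ 2.44 * wavelength * f-number.
# 550 nm (green) is an assumed representative wavelength.

def airy_diameter_um(f_number: float, wavelength_nm: float = 550.0) -> float:
    """Approximate diffraction blur-spot diameter in micrometres."""
    return 2.44 * wavelength_nm * 1e-3 * f_number

print(airy_diameter_um(100))  # ~134 um: dozens of sensor pixels wide
print(airy_diameter_um(4))    # ~5.4 um: around one pixel, still sharp
```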
Some cinema lenses are made with apertures of better than f/0.9 (they
aren't cheap!), to give the DoP best control of depth of field, and that's with a 35mm sensor/gate area. Obviously as the sensor size diminishes, the DoF increases, ceteris paribus, which is undesirable.
All that said, when I see aerial photography with curvy propellers, for example, I do wonder if the camera people know where their ND filters are
(or how to use them), or if it's deliberate...
I imagine it's unwanted but unavoidable, unless broadcasters can find minicams which can be stopped down or fitted with ND filters to prevent
the camera automatically compensating for bright light by reducing
exposure time.
think it's easily variable in the middle of a shot!). I can put my DSLR[]
in aperture priority mode, and have it vary the effective shutter speed
for me, but it's reactive*, and all the artefacts NY mentions will be
there. It's harder than one anticipates too, because there's actually a
fairly small dynamic range on the sensor.
On 06/04/2022 17:53, NY wrote:
I imagine it's unwanted but unavoidable, unless broadcasters can find
minicams which can be stopped down or fitted with ND filters to prevent
the camera automatically compensating for bright light by reducing
exposure time.
Just out of interest, how neutral can you make an LCD ND filter? It
could even be controlled to keep the light level at the sensor
constant.
On 06/04/2022 18:05, John Williamson wrote:
On 06/04/2022 17:53, NY wrote:
I imagine it's unwanted but unavoidable, unless broadcasters can find
minicams which can be stopped down or fitted with ND filters to prevent
the camera automatically compensating for bright light by reducing
exposure time.
Just out of interest, how neutral can you make an LCD ND filter? It
could even be controlled to keep the light level at the sensor
constant.
Dunno about LCD - they can be pretty clear - we have a see-through LCD
clock around somewhere, and the glass doesn't have any obvious tint to
it.
There are also "adjustable" ND filters. I think it's done with a
couple of linear polarizers.
It suddenly occurs to me that I might buy one and get two normal
polarizers out of it.
Separately I did a Google search for linear polarizers this morning
and didn't turn up anything. I do have one but it's for a Cokin
medium-format system of old, and a bit cumbersome for what I want.
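For the record, the two-polarizer trick follows Malus's law: an ideal pair passes half the (unpolarized) light times cos^2 of the angle between them. A quick idealized sketch (real polarizing film absorbs more and isn't perfectly neutral):

```python
import math

# Two stacked ideal linear polarizers as a variable ND filter
# (Malus's law). Idealized: real film absorbs more and has a colour cast.

def transmission(theta_deg: float) -> float:
    """Fraction of unpolarized light passed by two ideal linear
    polarizers rotated theta degrees apart: 0.5 * cos^2(theta)."""
    return 0.5 * math.cos(math.radians(theta_deg)) ** 2

def nd_stops(theta_deg: float) -> float:
    """Equivalent ND strength in photographic stops."""
    return -math.log2(transmission(theta_deg))

print(round(nd_stops(0), 2))   # 1.0 stop even when aligned (half the light lost)
print(round(nd_stops(60), 2))  # 3.0 stops at 60 degrees
```

Note the density heads for infinity as the pair approaches 90 degrees, which is why a small twist gives such a wide adjustment range.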
Several of our freelancers had DXC3000s!
On Sat, 2 Apr 2022 16:22:55 +0100, SimonM
<somewhere@large.in.the.world> wrote:
Several of our freelancers had DXC3000s!
It pains me to tell you that we still have at least one of
those bloody things on the shelves in our 'old junk' store.
God alone knows why.
Several of our freelancers had DXC3000s!
It pains me to tell you that we still have at least one of
those bloody things on the shelves in our 'old junk' store.
God alone knows why.
Heck, they were obsolete by around 1988!
Since the licence fee is probably going (yeah,
right), it must be time for a BBC Bristol eBay
account, Shirley?
I vaguely remember the picture couldn't be matched
to the 2001s (in either direction).
Was it the newsroom one or the one stuck up in the corner of the grid
in St.B, or something else?
PS: I couldn't get to Chris W.'s funeral (we had
family over from the US). Did he get a good
send-off? Very much hope so.