• Use of clapperboard in modern (eg 2022) interviews that are made on video

    From NY@21:1/5 to All on Thu Mar 31 13:18:30 2022
    I notice that there is a modern trend to include the last few seconds of the preparations of an interview ("Everyone ready?" and then a clapperboard) in documentary interviews, as the voiceover is introducing a speaker. They did
    it on a documentary the other year about the night Britannia Bridge
    (Anglesey) caught fire, and also in the Channel 4 documentary last week
    about the Falklands War.

    Both programmes looked to be made on video - I doubt whether film is used
    much (or at all) for documentaries now. So why was a clapperboard being
    used? I can see that the numbers on it could be useful for shot
    identification, but wouldn't the sound be recorded by the camera (eg by a
    sound mixer feeding the camera's audio track)? Or is sound sometimes
    recorded separately (probably on disk rather than tape, these days) rather
    than in-camera?

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Mark Carver@21:1/5 to All on Thu Mar 31 13:22:22 2022
    On 31/03/2022 13:18, NY wrote:
    I notice that there is a modern trend to include the last few seconds
    of the preparations of an interview ("Everyone ready?" and then a clapperboard) in documentary interviews, as the voiceover is
    introducing a speaker. They did it on a documentary the other year
    about the night Britannia Bridge (Anglesey) caught fire, and also in
    the Channel 4 documentary last week about the Falklands War.

    Both programmes looked to be made on video - I doubt whether film is
    used much (or at all) for documentaries now. So why was a clapperboard
    being used? I can see that the numbers on it could be useful for shot identification, but wouldn't the sound be recorded by the camera (eg
    by a sound mixer feeding the camera's audio track)? Or is sound
    sometimes recorded separately (probably on disk rather than tape,
    these days) rather than in-camera?

    None of that prevents the audio and video getting out of sync. In fact
    it's more of a problem today than it was even 20 years ago!

  • From Mark Carver@21:1/5 to Andy Burns on Thu Mar 31 13:53:39 2022
    On 31/03/2022 13:20, Andy Burns wrote:
    NY wrote:

    why was a clapperboard being used?

    To sync up multiple mics recorded separately with the video?

    and/or multiple cameras

  • From Andy Burns@21:1/5 to All on Thu Mar 31 13:20:53 2022
    NY wrote:

    why was a clapperboard being used?

    To sync up multiple mics recorded separately with the video?

  • From John Williamson@21:1/5 to All on Thu Mar 31 18:49:41 2022
    On 31/03/2022 18:25, J. P. Gilliver (John) wrote:

    There seems to be more tolerance of such mismatch than in the golden
    age. I suppose part of it is that there is just more material about now. (Tolerance among the originators I mean, of course; it still bugs _me_!)

    In some cases, the material may have been obtained via cellphone or
    Zoom, and there has been no time to re-sync.

    The general public are also now used to seeing poorly time-aligned
    cellphone footage on Youtube and similar platforms, so no longer bother complaining.

    --
    Tciao for Now!

    John.

  • From J. P. Gilliver (John)@21:1/5 to mark.carver@invalid.invalid on Thu Mar 31 18:25:44 2022
    On Thu, 31 Mar 2022 at 13:22:22, Mark Carver
    <mark.carver@invalid.invalid> wrote (my responses usually FOLLOW):
    On 31/03/2022 13:18, NY wrote:
    []
    Both programmes looked to be made on video - I doubt whether film is
    used much (or at all) for documentaries now. So why was a clapperboard
    being used? I can see that the numbers on it could be useful for shot
    identification, but wouldn't the sound be recorded by the camera (eg
    by a sound mixer feeding the camera's audio track)? Or is sound
    sometimes recorded separately (probably on disk rather than tape,
    these days) rather than in-camera?

    None of that prevents the audio and video getting out of sync. In fact
    it's more of a problem today than it was even 20 years ago!

    There seems to be more tolerance of such mismatch than in the golden
    age. I suppose part of it is that there is just more material about now. (Tolerance among the originators I mean, of course; it still bugs _me_!)
    --
    J. P. Gilliver. UMRA: 1960/<1985 MB++G()AL-IS-Ch++(p)Ar@T+H+Sh0!:`)DNAf

    He [Alfred Kinsey] wouldn't ask 'Have you ever slept with a horse?' He would say, 'When did you first sleep with a horse?' [RT 2018/5/5-11]

  • From NY@21:1/5 to John Williamson on Thu Mar 31 20:02:14 2022
    "John Williamson" <johnwilliamson@btinternet.com> wrote in message news:jam7tqFli1qU1@mid.individual.net...
    On 31/03/2022 18:25, J. P. Gilliver (John) wrote:

    There seems to be more tolerance of such mismatch than in the golden
    age. I suppose part of it is that there is just more material about now.
    (Tolerance among the originators I mean, of course; it still bugs _me_!)

    In some cases, the material may have been obtained via cellphone or Zoom,
    and there has been no time to re-sync.

    The general public are also now used to seeing poorly time-aligned
    cellphone footage on Youtube and similar platforms, so no longer bother complaining.

    I use Bluetooth earbuds to listen to my laptop when I'm watching recorded TV programmes on it. I quickly learned that the earbuds add a delay to the
    sound, and a bit of tweaking in VLC's Tools | Adjustments was needed to add
    a 200 millisecond delay to the pictures to bring them back into sync with
    the delayed sound. 200 msec doesn't sound much, but it is very noticeable.

    At least I knew the direction of the bad synchronisation - there was no way that the sound would be *before* the pictures, so I only had to increase the control in one direction until things looked right. Normally when sound and pictures are out of sync in a video file (assuming analogue headphones or speakers that don't introduce a delay) I have the greatest difficulty in working out whether the sound is leading or lagging, if it's a small amount.

    I'm surprised that when people are being interviewed over cellphone or Zoom, the studio doesn't get them to do a simple clapperboard test (even a hand clap) to make it easier for someone in the studio to resync the sound and vision if they happen to be out. Obviously this only works if the interview
    is recorded and there is time to precede the interview with a clapperboard
    and to resync afterwards.



    I noticed that the clapperboard which was used on the Falklands programme
    had LEDs which lit up when they detected the clap to make it easier for the editor to see when the clap occurred - easier than trying to see the first frame where the zig-zags on the clapper first come together.
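    What those slate LEDs automate - pinpointing the instant of the clap - amounts to finding a sharp transient in the audio track. A minimal sketch in Python (synthetic audio; the sample rate, threshold and clap position are illustrative assumptions, not details from the programme):

```python
import numpy as np

def find_clap(audio, sample_rate, fps=25):
    """Return the (sample, video frame) of the first sharp transient,
    i.e. the first sample whose amplitude jumps far above the background."""
    env = np.abs(audio)
    threshold = env.mean() + 20 * env.std()  # crude fixed threshold
    hits = np.nonzero(env > threshold)[0]
    if len(hits) == 0:
        return None
    sample = int(hits[0])
    frame = sample * fps // sample_rate      # integer maths avoids rounding error
    return sample, frame

# Synthetic test: 2 s of quiet noise with a single loud "clap" at 1.2 s.
rate = 48000
rng = np.random.default_rng(0)
audio = rng.normal(0.0, 0.01, 2 * rate)
audio[int(1.2 * rate)] = 1.0
sample, frame = find_clap(audio, rate)
print(sample, frame)  # → 57600 30
```

    An editor (human or software) then lines video frame 30 up with the visible closing of the clapper sticks.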

    I imagine that when filming actors/reporters who are out of mike cable or
    even radio range, it is easier to record the sound locally, completely unsynchronised, and do a clapperboard test. I experimented with that as a "class exercise" when I was teaching myself to use Premiere Elements: record the sound and vision on a normal camcorder, but also record the sound on my mobile phone. Ditch the camcorder sound, and then try to move the phone
    sound up and down the timeline until it matched the picture: I tapped a
    pencil on the desk to provide the sound and pictures to sync and then
    reframed the camera.
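    Sliding the phone track along the timeline until the tap lines up is exactly what cross-correlation automates (and roughly what an NLE's auto-sync feature does under the hood). A hedged sketch in Python with synthetic audio - the 400-sample delay is an invented example:

```python
import numpy as np

def find_offset(camera_audio, phone_audio):
    """Lag, in samples, by which phone_audio trails camera_audio,
    taken from the peak of their cross-correlation. Positive means
    the phone track starts later and should be slid earlier."""
    corr = np.correlate(phone_audio, camera_audio, mode="full")
    return int(np.argmax(corr)) - (len(camera_audio) - 1)

# Synthetic pencil-tap: identical sound, phone copy delayed by 400 samples.
rng = np.random.default_rng(1)
tap = rng.normal(0.0, 1.0, 1000)
camera = np.concatenate([tap, np.zeros(400)])
phone = np.concatenate([np.zeros(400), tap])
print(find_offset(camera, phone))  # → 400
```

    The sign of the result also settles the lead-or-lag question: a negative offset would mean the second track runs ahead of the first.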


    One interesting thing that I noticed when I was watching Lewis (as in
    Inspector Morse) being filmed: although it was being filmed on film not
    video, I wasn't aware of clapperboards being used at all. Somehow the sound
    was being recorded on a separate recorder in a way that could be
    synchronised easily in post-production. Is a clapperboard really needed if the
    camera and sound recorder are both fed from the same timecode source, which marks each frame of film and each "frame" of sound data with a code that is known to be in sync? If not, why are clapperboards still used?
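    For what it's worth, the matching that shared timecode enables is plain arithmetic: each device stamps its media with hh:mm:ss:ff, and the synchroniser just equates frame counts. A sketch for non-drop-frame 25 fps timecode (the function names are my own, not from any particular system):

```python
def timecode_to_frames(tc, fps=25):
    """Convert an hh:mm:ss:ff timecode string to an absolute frame count."""
    hh, mm, ss, ff = (int(part) for part in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

def frames_to_timecode(frames, fps=25):
    """Inverse of the above."""
    ss, ff = divmod(frames, fps)
    mm, ss = divmod(ss, 60)
    hh, mm = divmod(mm, 60)
    return f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d}"

print(timecode_to_frames("10:00:04:12"))  # → 900112
print(frames_to_timecode(900112))         # → 10:00:04:12
```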

  • From NY@21:1/5 to Roderick Stewart on Thu Mar 31 21:04:16 2022
    "Roderick Stewart" <rjfs@escapetime.myzen.co.uk> wrote in message news:orvb4h51a113o9hhe4bobvtneqnvh7rkij@4ax.com...
    On Thu, 31 Mar 2022 20:02:14 +0100, "NY" <me@privacy.invalid> wrote:

    One interesting thing that I noticed when I was watching Lewis (as in
    Inspector Morse) being filmed: although it was being filmed on film not
    video, I wasn't aware of clapperboards being used at all. Somehow the sound
    was being recorded on a separate recorder in a way that could be
    synchronised easily in post-production. Is a clapperboard needed if the
    camera and sound recorder are both fed from the same timecode source which
    marks each frame of film and each "frame" of sound data with a code which is
    known to be in sync. If so, why are clapperboards still needed?

    Clapperboards shouldn't be needed at all if the timecode generators in
    any unconnected equipment are properly adjusted. They were only ever
    needed in the days of cine film because the sound and picture were
    recorded on different equipment. Television was originally live, and
    then when a method of recording it was invented, it recorded sound and picture on the same machine, usually from static equipment connected
    by cables in a studio. I think it must have been the trend to use
    multiple portable equipment - cameras and microphones - without cables
    that regenerated the need for an identifiable event on both sound and
    vision, or maybe they just didn't trust themselves to set up the
    timecode generators properly.

    I've seen film news crews with a cameraman and sound recordist tethered by umbilical cord. And yet they still use clapperboards. Maybe as belt and
    braces. Did the cable tend *only* to send sync pulses to cater for film that may not run at *exactly* 25 fps, to alter the tape recorder's speed, but without the ability to generate frame-accurate timecodes to label the tape?

    Did 16mm film news crews ever have one person doing both jobs - filming with
    a camera and recording sound on a separate recorder over their shoulder,
    maybe from a camera-mounted mike - or was it always a two-man job (partly to keep unions happy, partly for technical reasons)? I realise that modern ENG crews record sound directly onto the camera, either by cable-connected mike
    or radio-linked mike that a second person may control if it's not mounted on-camera.

  • From Roderick Stewart@21:1/5 to me@privacy.invalid on Thu Mar 31 20:30:11 2022
    On Thu, 31 Mar 2022 20:02:14 +0100, "NY" <me@privacy.invalid> wrote:

    One interesting thing that I noticed when I was watching Lewis (as in
    Inspector Morse) being filmed: although it was being filmed on film not
    video, I wasn't aware of clapperboards being used at all. Somehow the sound
    was being recorded on a separate recorder in a way that could be
    synchronised easily in post-production. Is a clapperboard needed if the
    camera and sound recorder are both fed from the same timecode source which
    marks each frame of film and each "frame" of sound data with a code which is
    known to be in sync. If so, why are clapperboards still needed?

    Clapperboards shouldn't be needed at all if the timecode generators in
    any unconnected equipment are properly adjusted. They were only ever
    needed in the days of cine film because the sound and picture were
    recorded on different equipment. Television was originally live, and
    then when a method of recording it was invented, it recorded sound and
    picture on the same machine, usually from static equipment connected
    by cables in a studio. I think it must have been the trend to use
    multiple portable equipment - cameras and microphones - without cables
    that regenerated the need for an identifiable event on both sound and
    vision, or maybe they just didn't trust themselves to set up the
    timecode generators properly.

    Rod.

  • From Roderick Stewart@21:1/5 to me@privacy.invalid on Fri Apr 1 10:08:37 2022
    On Thu, 31 Mar 2022 21:04:16 +0100, "NY" <me@privacy.invalid> wrote:

    I've seen film news crews with a cameraman and sound recordist tethered by
    umbilical cord. And yet they still use clapperboards. Maybe as belt and
    braces. Did the cable tend *only* to send sync pulses to cater for film that
    may not run at *exactly* 25 fps, to alter the tape recorder's speed, but
    without the ability to generate frame-accurate timecodes to label the tape?

    Recording synchronous sound with film usually meant using a film
    camera and a tape recorder with very accurate and stable motor speeds
    governed by quartz crystal oscillators. The tape recording would later
    be copied onto fully coated magnetic film stock ("Sepmag" as the BBC
    called it) with the same gauge as the picture film for use on the
    editing table. Picture and sound were thus recorded on separate
    machines and so needed an identifiable event on both sound and vision
    to be able to synchronise them later, hence the clapperboard. There
    wasn't normally any connection between the camera and the sound
    recorder, but the stability provided by the crystal oscillator enabled
    them to keep pace for longer than the maximum running time of a roll
    of film (about 10 minutes), which was usually longer than the running
    time of a typical drama scene (about 3 or 4 minutes).
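    That stability claim is easy to sanity-check with back-of-envelope numbers. Assuming (my figure, purely illustrative) each crystal is accurate to about 10 ppm, so up to 20 ppm of relative error between camera and recorder:

```python
# Worst-case drift between two free-running crystal-controlled motors
# over a full roll of film, against half a frame at 25 fps.
roll_seconds = 10 * 60          # maximum running time of a roll (~10 min)
ppm_mismatch = 20               # assumed combined crystal error (10 ppm each)
drift_ms = roll_seconds * ppm_mismatch * 1000 / 1e6
half_frame_ms = 1000 / 25 / 2
print(drift_ms, half_frame_ms)  # → 12.0 20.0
```

    So under that assumption a whole roll drifts less than half a frame, which fits the observation that one clap at the head of a take was enough.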

    SMPTE timecode didn't appear until some time in the 60s or maybe early
    70s as I recall, at first only used with electronic recording
    equipment. Film cameras that could record the code optically on the
    film didn't appear until some time later.

    I don't know what an umbilical cable between a film camera and a sound
    recorder would be carrying. My best guess is it would be to enable use
    of a standard stereo tape recorder (i.e. not a special moviemaking one
    with crystal control built in) so the cable would be carrying frame
    pulses that the machine would record on one of the audio tracks.

    Rod.

  • From Ashley Booth@21:1/5 to Roderick Stewart on Fri Apr 1 10:02:02 2022
    Roderick Stewart wrote:

    On Thu, 31 Mar 2022 21:04:16 +0100, "NY" <me@privacy.invalid> wrote:

    I've seen film news crews with a cameraman and sound recordist
    tethered by umbilical cord. And yet they still use clapperboards.
    Maybe as belt and braces. Did the cable tend only to send sync
    pulses to cater for film that may not run at exactly 25 fps, to
    alter the tape recorder's speed, but without the ability to
    generate frame-accurate timecodes to label the tape?

    Recording synchronous sound with film usually meant using a film
    camera and a tape recorder with very accurate and stable motor speeds governed by quartz crystal oscillators. The tape recording would later
    be copied onto fully coated magnetic film stock ("Sepmag" as the BBC
    called it) with the same gauge as the picture film for use on the
    editing table. Picture and sound were thus recorded on separate
    machines and so needed an identifiable event on both sound and vision
    to be able to synchronise them later, hence the clapperboard. There
    wasn't normally any connection between the camera and the sound
    recorder, but the stability provided by the crystal oscillator enabled
    them to keep pace for longer than the maximum running time of a roll
    of film (about 10 minutes), which was usually longer than the running
    time of a typical drama scene (about 3 or 4 minutes).

    SMPTE timecode didn't appear until some time in the 60s or maybe early
    70s as I recall, at first only used with electronic recording
    equipment. Film cameras that could record the code optically on the
    film didn't appear until some time later.

    I don't know what an umbilical cable between a film camera and a sound recorder would be carrying. My best guess is it would be to enable use
    of a standard stereo tape recorder (i.e. not a special moviemaking one
    with crystal control built in) so the cable would be carrying frame
    pulses that the machine would record on one of the audio tracks.

    Rod.

    The umbilical cord carried a 50Hz signal derived from the camera's
    motor. This was recorded on the tape recorder's pilot track (Nagra
    neopilot). It also carried a bleep signal that recorded a tone on the
    recorder's audio track at the same time as a flash was filmed in the
    camera.

    This was before crystal oscillators were used.

    Ashley
    Nagra and Sondor (Sepmag) repairer 1969-1974.


  • From NY@21:1/5 to Ashley Booth on Fri Apr 1 11:41:01 2022
    "Ashley Booth" <removetab@snglinks.com> wrote in message news:jao0sqF1bl1U1@mid.individual.net...

    The umbilical cord carried a 50Hz signal derived from the camera's
    motor. This was recorded on the tape recorder's pilot track (Nagra
    neopilot). It also carried a bleep signal that recorded a tone on the
    recorder's audio track at the same time as a flash was filmed in the
    camera.

    This was before crystal oscillators were used.

    Ah, do both the camera and the recorder have *separate* crystals? I always thought that just one device (eg the camera) had the crystal, and the
    umbilical cord was feeding this frequency as a reference to the other device
    to control its motor. I can see the advantage of both devices having their
    own crystal (as long as both are accurately the same frequency) because it allows the recorder to be untethered from the camera.

    When the sound and pictures are being synchronised at the editing stage, how
    is the audio allowed to "slip" relative to the film to get them in sync? Is
    the sound still on magnetic *tape* at that stage, or has it been dubbed onto sprocketed magnetic film? Presumably you want to be able to adjust the
    position of the synchronising clap on the tape to within a finer
    resolution than the nearest frame (1/25 second, or 40 milliseconds). Or is a
    mismatch of up to +/- 20 msec deemed to be close enough not to be noticed?
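    Putting numbers on the nearest-frame question: at 25 fps a frame lasts 40 ms, so snapping a clap to the nearest frame leaves at most a ±20 ms error. A tiny worked example (the measured clap time is made up):

```python
fps = 25
frame_ms = 1000 / fps                      # 40.0 ms per frame at 25 fps
clap_ms = 1234.0                           # hypothetical clap time on the tape
nearest_frame = round(clap_ms / frame_ms)  # frame the editor would cut to
error_ms = clap_ms - nearest_frame * frame_ms
print(nearest_frame, error_ms)             # → 31 -6.0 (always within ±20 ms)
```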

    I realise that once the sound and pictures have been synchronised, the final master will either keep separate films (sepmag/sepopt) or else have the sound dubbed/printed onto a magnetic stripe or optical track on the film (commag/comopt). Was news
    film always played out as sepmag, to keep things as simple as possible?

  • From Roderick Stewart@21:1/5 to me@privacy.invalid on Fri Apr 1 12:08:44 2022
    On Fri, 1 Apr 2022 11:41:01 +0100, "NY" <me@privacy.invalid> wrote:

    "Ashley Booth" <removetab@snglinks.com> wrote in message news:jao0sqF1bl1U1@mid.individual.net...

    The umbilical cord carried a 50Hz signal derived from the camera's
    motor. This was recorded on the tape recorder's pilot track. (Nagra
    neopilot) It also carried a bleep signal that recorded a tone on the
    recorder's audio track at the same time as a flash was filmed in the
    camera.

    This was before crystal oscillators were used.

    Ah, do both the camera and the recorder have *separate* crystals?

    Yes

    I always thought that just one device (eg the camera) had the crystal, and
    the umbilical cord was feeding this frequency as a reference to the other
    device to control its motor. I can see the advantage of both devices having
    their own crystal (as long as both are accurately the same frequency)
    because it allows the recorder to be untethered from the camera.

    Exactly that. The stability of the crystals was more than good enough
    for the longest continuous take anyone was likely to do.

    When the sound and pictures are being synchronised at the editing stage, how
    is the audio allowed to "slip" relative to the film to get them in sync? Is
    the sound still on magnetic *tape* at that stage, or has it been dubbed onto
    sprocketed magnetic film? Presumably you want to be able to adjust the
    position of the synchronising clap on the tape to within a finer resolution
    than the nearest frame (1/25 second, or 40 milliseconds). Or is a mismatch
    of up to +/- 20 msec deemed to be close enough not to be noticed?

    An editing table takes a cutting copy of the film, and a magnetically
    coated sprocketed copy of the sound. They are brought into
    synchronisation for editing by painstaking use of chinagraph
    pencils and sticky tape, and the mechanism of the table keeps them
    geared together. If the editor needs to remove or add a piece of
    picture film as part of the edit, they have to remove or add pieces of
    magnetic film to match, so the entire reels on the editing table
    remain in sync. Yes, it's laborious, but that's what they did.

    I realise that once the sound and pictures have been synchronised, the final
    master will either keep separate films (sepmag/sepopt) or else be
    dubbed/printed onto an optical track on the film (commag/comopt). Was news
    film always played out as sepmag, to keep things as simple as possible?

    Synchronous sound was unusual on news film reports. Usually it was
    just effects tracks and voiceovers that were added later, sometimes
    with the presenter doing the voiceover live.

    The film was normally reversal stock, i.e. there was no negative but
    the actual film that was used in the camera became the transmitted
    copy. This was quicker to process but more vulnerable to loss or
    damage during editing of course, but for news the speed was usually
    considered paramount, and the presenter could talk over it. Reversal
    stock was available with a magnetic stripe, but I think this was
    mostly used by amateur moviemakers, not professional news crews.

    Rod.

  • From NY@21:1/5 to Roderick Stewart on Fri Apr 1 13:41:36 2022
    "Roderick Stewart" <rjfs@escapetime.myzen.co.uk> wrote in message news:5hmd4h1fodfs1d2ljpnr5vmessakfvceo3@4ax.com...
    When the sound and pictures are being synchronised at the editing stage, how
    is the audio allowed to "slip" relative to the film to get them in sync? Is
    the sound still on magnetic *tape* at that stage, or has it been dubbed onto
    sprocketed magnetic film? Presumably you want to be able to adjust the
    position of the synchronising clap on the tape to within a finer resolution
    than the nearest frame (1/25 second, or 40 milliseconds). Or is a mismatch
    of up to +/- 20 msec deemed to be close enough not to be noticed?

    An editing table takes a cutting copy of the film, and a magnetically
    coated sprocketed copy of the sound. They are brought into
    synchronisation for editing by painstaking use of chinagraph
    pencils and sticky tape, and the mechanism of the table keeps them
    geared together. If the editor needs to remove or add a piece of
    picture film as part of the edit, they have to remove or add pieces of magnetic film to match, so the entire reels on the editing table
    remain in sync. Yes, it's laborious, but that's what they did.

    Could the sprockets of the sound film be moved relative to those of the
    picture film (eg by a clutch that was temporarily disengaged) or could you
    only sync the sound to the nearest sprocket on the sound film?

    I've seen a bit on Youtube about editing film on a Steenbeck and with a splicing block.

    Then we get into all the extra complications of A/B-roll editing of
    negatives, where alternate shots were spliced into two different films, with appropriate lengths of black film to match the length of film-with-pictures
    on the other roll, and then the two films were each printed in turn onto the same positive - all to avoid the dreaded splicing tape being visible on the print. Some of the Southern TV programmes (eg Freewheelers) that Talking Pictures show evidently didn't have the budget for that extra finesse, and
    you can see splicing tape and joins at every shot change :-(


    I realise that once the sound and pictures have been synchronised, the final
    master will either keep separate films (sepmag/sepopt) or else be
    dubbed/printed onto an optical track on the film (commag/comopt). Was news
    film always played out as sepmag, to keep things as simple as possible?

    Synchronous sound was unusual on news film reports. Usually it was
    just effects tracks and voiceovers that were added later, sometimes
    with the presenter doing the voiceover live.

    I suppose the main time that there would be sync sound would be speeches, interviews and reporters' pieces to camera. As you say, a lot of the report just needs effects (either library or recorded "wild") with the reporter's voice dubbed on after editing.

    The film was normally reversal stock, i.e. there was no negative but
    the actual film that was used in the camera became the transmitted
    copy. This was quicker to process but more vulnerable to loss or
    damage during editing of course, but for news the speed was usually considered paramount, and the presenter could talk over it. Reversal
    stock was available with a magnetic stripe, but I think this was
    mostly used by amateur moviemakers, not professional news crews.

    Yes, I believe they usually used Ektachrome 320 which was tungsten-balanced, with a daylight filter for outdoors where there was more daylight and so the loss of 2 (?) stops of speed due to the correction filter was less critical,
    leaving maximum speed for indoor tungsten-lit shots. And then it might be push-processed if light was low, which is why a lot of news film looks
    rather high-contrast, with little shadow and highlight detail and garish colours. I've seen the effect of doing this on 35 mm still photos when I
    took some photos under tungsten light and needed high speed to freeze motion
    as much as possible. The results were not pretty :-(

    Ektachrome was used because it was simple E6 processing, rather than the
    more laborious processing of Kodachrome which was slower film anyway. Did
    any labs other than Kodak (eg at Hemel Hempstead) actually do Kodachrome processing? What processing systems did other reversal films use - eg Agfa
    and Fuji? Were they proprietary or E6?

    Editing striped film had the problem that the sound was recorded with an
    offset (40 frames?) from the corresponding picture so the film had some distance to get back to a stabilised speed after the intermittent motion of
    the gate, so you had to allow for that in choosing an edit point - or copy
    the sound off to separate mag tape, edit it separately and then copy it back
    to the stripe or treat it as sepmag. I imagine striped film was only used
    for news if they needed the most compact setup possible (camera only, mike
    on camera, no separate tape recorder).

    The only time I've seen 16 mm film with sound being filmed/recorded was when
    I was one of several patients of a chiropractor who were filmed being
    treated, for an Anglia TV documentary series in the late 70s about
    alternative therapy. I got my train fare up to London paid for and got a
    free treatment. There was a lot of faffing about setting up very bright
    lights, blue film over a window that was in shot (so the daylight through it matched the tungsten lights) and repetitions while the same thing was filmed from several angles, and lots of jargon that I didn't understand: "wild
    track" sound [effects not intended to be synchronised with the pictures],
    BCU [big close up], end-board [when the sound recordist forgot to use the clapperboard before the shot, so did a clap *after* the shot]. There was
    even a hiatus when a crystal failed (can't remember whether it was camera or recorder) and they had to get another one from the van. It was all very exciting. But when the episode was eventually broadcast, the buggers didn't
    use any of the shots of the patients' treatment or the interviews with us:
    the only bit of the episode that was devoted to my chiropractor was a minute
    or so of an interview with him done on another day. But they still found
    time for lots of padding shots of a gymnast on parallel bars, though she was undoubtedly more photogenic than me! What a con! But par for the course:
    they always shoot more than they need.

  • From Roderick Stewart@21:1/5 to me@privacy.invalid on Fri Apr 1 14:46:53 2022
    On Fri, 1 Apr 2022 13:41:36 +0100, "NY" <me@privacy.invalid> wrote:

    An editing table takes a cutting copy of the film, and a magnetically
    coated sprocketed copy of the sound. They are brought into
    synchronisation for editing by painstaking use of chinagraph
    pencils and sticky tape, and the mechanism of the table keeps them
    geared together. If the editor needs to remove or add a piece of
    picture film as part of the edit, they have to remove or add pieces of
    magnetic film to match, so the entire reels on the editing table
    remain in sync. Yes, it's laborious, but that's what they did.

    Could the sprockets of the sound film be moved relative to those of the
    picture film (eg by a clutch that was temporarily disengaged) or could you
    only sync the sound to the nearest sprocket on the sound film?

    I've never seen the guts of a Steenbeck, but the KEM editing table at
    a place I once worked had a system of toothed belts and clutches
    underneath the deck. I don't remember how closely they could be
    matched, but even if you could synchronise one track with a resolution
    of less than one frame (i.e. one sprocket hole for 16mm), if you then
    had to splice another bit of sound into it, you could only do that to
    the nearest sprocket. The maximum error would be half a frame, or
    20ms, but if that wasn't good enough all you could do would be to
    transfer that piece of audio from the original tape onto sepmag and
    hope that its timing relative to the sprocket holes was different.

    Now that we can do everything with so much more precision using
    software editing of digital recordings, you'd think that timing errors
    would be less frequent and less noticeable, but I get the impression
    that this is not the case...

    Rod.

  • From J. P. Gilliver (John)@21:1/5 to me@privacy.invalid on Fri Apr 1 15:37:04 2022
    On Thu, 31 Mar 2022 at 21:04:16, NY <me@privacy.invalid> wrote (my
    responses usually FOLLOW):
    []
    I've seen film news crews with a cameraman and sound recordist tethered
    by umbilical cord. And yet they still use clapperboards. Maybe as belt
    and braces.
    []
    I suspect it's more a matter of using the board as shot identification -
    i.e. it's what's written on the board (or shown in its display if a
    modern fancy one) that's more important. Adding the "clack" adds almost
    nothing to the time taken, so may as well be done for the rare case it
    _is_ useful (and I suspect for some makes them feel they are doing
    "real" film production!); if it's loud enough, it probably also has an
    effect similar to someone shouting "quiet on set", in the situations
    where more than one man and his dog are present anyway.
    --
    J. P. Gilliver. UMRA: 1960/<1985 MB++G()AL-IS-Ch++(p)Ar@T+H+Sh0!:`)DNAf

    A lot of people think that being skinny is the happy ending, and its not.
    Being happy is the happy ending. - Sarah Millican, in Radio Times 3-9 March 2012

  • From NY@21:1/5 to Roderick Stewart on Fri Apr 1 15:18:14 2022
    "Roderick Stewart" <rjfs@escapetime.myzen.co.uk> wrote in message news:movd4hd4hl90q8jf9icfb9f9blql5aq2b2@4ax.com...
    Now that we can do everything with so much more precision using
    software editing of digital recordings, you'd think that timing errors
    would be less frequent and less noticeable, but I get the impression
    that this is not the case...

    I imagine that because it is no longer necessary to synchronise the sound
    with the film, and the sound and pictures are "welded together" into a
    single AVI/MPG/whatever file or live data-stream, people don't bother to
    check whether they are precisely in sync, given that by definition they are always *almost* in sync.

    It's the standard problem with people forgetting that an automatic process
    that usually works without any human intervention may still need the odd
    tweak. Just like with auto-focus, auto-exposure and auto-white-balance:
    often they get it just right, but sometimes a tweak and a bit of human brain-power is needed.

    Do modern video cameras still display zebra-stripes in the viewfinder (if
    the cameraman enables the feature) to draw attention to one or more of the
    RGB channels being overexposed? I see a lot of documentaries (and even some dramas which have probably been through proper grading) where there are
    areas of people's faces where one or more of the channels has maxed-out. Featureless orange areas on people's faces are not very flattering ;-)
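
    (As an aside, the check a zebra display performs is easy to sketch in
    Python with NumPy. The 8-bit RGB frame and the 254 threshold are my
    assumptions for illustration - real zebras work on video level,
    usually before gamma/grading - not how any particular camera does it:)

```python
import numpy as np

def clipped_mask(frame: np.ndarray, threshold: int = 254) -> np.ndarray:
    """Flag pixels where any RGB channel has maxed out (zebra-style check).

    `frame` is an (H, W, 3) uint8 array; the threshold is an assumption.
    """
    return (frame >= threshold).any(axis=-1)

# Tiny synthetic frame with one overexposed pixel.
frame = np.zeros((2, 2, 3), dtype=np.uint8)
frame[0, 0] = [255, 200, 180]   # red channel clipped
mask = clipped_mask(frame)
print(int(mask.sum()))          # -> 1
```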

  • From NY@21:1/5 to MB@nospam.net on Fri Apr 1 15:45:36 2022
    "MB" <MB@nospam.net> wrote in message news:t272t0$95k$1@dont-email.me...
    On 01/04/2022 15:18, NY wrote:
    I imagine that because it is no longer necessary to synchronise the sound
    with the film, and the sound and pictures are "welded together" into a
    single AVI/MPG/whatever file or live data-stream, people don't bother to
    check whether they are precisely in sync, given that by definition they
    are
    always *almost* in sync.

    But we still get sound and vision out of sync quite regularly.

    That was my point: people mistakenly think that sound and vision can't get
    out of sync these days and therefore don't check before going live.

  • From MB@21:1/5 to All on Fri Apr 1 15:40:32 2022
    On 01/04/2022 15:18, NY wrote:
    I imagine that because it is no longer necessary to synchronise the sound with the film, and the sound and pictures are "welded together" into a
    single AVI/MPG/whatever file or live data-stream, people don't bother to check whether they are precisely in sync, given that by definition they are always *almost* in sync.

    But we still get sound and vision out of sync quite regularly.

  • From J. P. Gilliver (John)@21:1/5 to me@privacy.invalid on Fri Apr 1 16:01:01 2022
    On Fri, 1 Apr 2022 at 13:41:36, NY <me@privacy.invalid> wrote (my
    responses usually FOLLOW):
    "Roderick Stewart" <rjfs@escapetime.myzen.co.uk> wrote in message news:5hmd4h1fodfs1d2ljpnr5vmessakfvceo3@4ax.com...
    []
    The film was normally reversal stock, i.e. there was no negative but
    the actual film that was used in the camera became the transmitted
    copy. This was quicker to process but more vulnerable to loss or
    damage during editing of course, but for news the speed was usually
    considered paramount, and the presenter could talk over it. Reversal
    stock was available with a magnetic stripe, but I think this was
    mostly used by amateur moviemakers, not professional news crews.

    Yes, I believe they usually used Ektachrome 320 which was
    tungsten-balanced, with a daylight filter for outdoors where there was
    more daylight and so the loss of 2 (?) stops of speed due to the blue
    filter was less critical, leaving maximum speed for indoor tungsten-lit >shots.
    []
    Yes, I remember when I was shooting 8mm. For Standard 8, I had a
    clockwork camera (very similar to, or possibly the same model as, Mr. Zapruder's), which had auto-exposure (_big_ selenium cell on the front),
    but which was set for 10 ASA (!), which by the time I was filming only
    Perutz made; I did do some reels of Kodak (Kodachrome II, IIRC), which was
    25, mostly by looking at where the autoex was going to go (there was a
    little visible needle) and closing down another stop. Needed
    pre-consideration, though, so couldn't be unplanned (unless left where
    set last time and hoped for the best).
    For super 8, about the only stock _available_ was Kodak (Kodachrome
    again I think) 40ASA indoor, and the camera did indeed have a filter
    that in effect gave you 25ASA outdoor; it was in by default, and
    disengaged by inserting a sort of square key into a slot in the bottom
    of the camera handle. IIRC the filter was reddish in colour, but that
    might have been a "blue filter" in that it filtered out some of the blue
    (in the same way a "UV filter" isn't UV in colour).

    [I still have the cameras. Skip if I've mentioned this before: somewhere
    in the last few years, while discussing the Wolverine film-scanner and
    its clones/competitors with a local historian, I was showing him, and
    pressed the trigger - and it whirred away, from the winding I'd put into the
    spring decades before. Made me very sad: a piece of precision equipment
    - OK, mass-produced, but all cast and machined metal, not plastic - that
    will never be used again. (The Super 8 one I always felt was inferior -
    OK, better pictures as a larger film area, but IIRC fixed exposure -
    certainly had the "feel" of the 1970s mass market. I used it mainly as
    it was a gift from a great-uncle.)]
    --
    J. P. Gilliver. UMRA: 1960/<1985 MB++G()AL-IS-Ch++(p)Ar@T+H+Sh0!:`)DNAf

    A lot of people think that being skinny is the happy ending, and its not.
    Being happy is the happy ending. - Sarah Millican, in Radio Times 3-9 March 2012

  • From J. P. Gilliver (John)@21:1/5 to me@privacy.invalid on Fri Apr 1 16:04:20 2022
    On Fri, 1 Apr 2022 at 15:45:36, NY <me@privacy.invalid> wrote (my
    responses usually FOLLOW):
    "MB" <MB@nospam.net> wrote in message news:t272t0$95k$1@dont-email.me...
    On 01/04/2022 15:18, NY wrote:
    I imagine that because it is no longer necessary to synchronise the sound
    with the film, and the sound and pictures are "welded together" into a
    single AVI/MPG/whatever file or live data-stream, people don't bother to
    check whether they are precisely in sync, given that by definition they are
    always *almost* in sync.

    But we still get sound and vision out of sync quite regularly.

    Especially, it seems, on archive material on YouTube, where you'd think
    there wouldn't be time pressure.

    That was my point: people mistakenly think that sound and vision can't
    get out of sync these days and therefore don't check before going live.

    And wouldn't know how (or in some cases have the facility) to fix it if
    they did check and notice.
    --
    J. P. Gilliver. UMRA: 1960/<1985 MB++G()AL-IS-Ch++(p)Ar@T+H+Sh0!:`)DNAf

    "You _are_ Zaphod Beeblebrox? _The_ Zaphod Beeblebrox?"
    "No, just _a_ Zaphod Beeblebrox. I come in six-packs." (from the link episode)

  • From Roderick Stewart@21:1/5 to G6JPG@255soft.uk on Sat Apr 2 07:49:56 2022
    On Fri, 1 Apr 2022 15:37:04 +0100, "J. P. Gilliver (John)"
    <G6JPG@255soft.uk> wrote:

    On Thu, 31 Mar 2022 at 21:04:16, NY <me@privacy.invalid> wrote (my
    responses usually FOLLOW):
    []
    I've seen film news crews with a cameraman and sound recordist tethered
    by umbilical cord. And yet they still use clapperboards. Maybe as belt
    and braces.
    []
    I suspect it's more a matter of using the board as shot identification -
    i. e. it's what's written on the board (or shown in its display if a
    modern fancy one) that's more important. Adding the "clack" adds almost nothing to the time taken, so may as well be done for the rare case it
    _is_ useful (and I suspect for some makes them feel they are doing
    "real" film production!); if it's loud enough, it probably also has an
    effect similar to someone shouting "quiet on set", in the situations
    where more than one man and his dog are present anyway.

    I think what you are describing is called "tradition". It's a concept
    that seems curiously out of place in engineering, or any practical
    enterprise, but ultimately human nature prevails everywhere, even when
    we think we're behaving logically.

    Rod.

  • From SimonM@21:1/5 to Roderick Stewart on Sat Apr 2 08:39:46 2022
    On 01/04/2022 12:08, Roderick Stewart wrote:
    The film was normally reversal stock, i.e. there was no negative but
    the actual film that was used in the camera became the transmitted
    copy. This was quicker to process but more vulnerable to loss or
    damage during editing of course, but for news the speed was usually considered paramount, and the presenter could talk over it. Reversal
    stock was available with a magnetic stripe, but I think this was
    mostly used by amateur moviemakers, not professional news crews.

    I worked on the final years of film for TV news.

    COMMAG was the _normal_ method of recording sync
    sound for BBC news crews, right up until the end
    of film for news. It had nothing to do with
    unions, restrictive practices or whatever;
    rather, there was a large amount of very
    expensive kit that would have had to be
    replaced by (initially) rather inferior
    video equipment.

    Recordists used a separate (BBC designed & built)
    recording amplifier (a bit
    bigger than an ASC Minx mixer, if you remember
    those). It had one or two mic amps with a mixer
    circuit and very limited EQ (HPF for stripping
    wind noise IIRC, although that was available on
    the T-power box for gun mics). It had a
    yellow-spot PPM and headphone amp, and the
    all-important record head driver circuit for the
    camera.

    Obviously there was no "off tape" monitoring, and
    commag head clogs were far from unknown - the mag
    stripe was also noisy and prone to oxide shedding.

    I have a feeling that there were no electronics in
    the camera itself, although there might have been
    a bias oscillator (if not then that too was in the
    recordist's unit).

    I have seen them on eBay from time to time,
    usually wrongly described, as the owners don't
    know what they were for.

    Sound was recorded, IIRC, 40 frames** early. This
    didn't matter in TK, as their playback head was
    correctly positioned, but it did matter in the
    cutting room, as a cut introduced a 40-frame gap
    on the outgoing, and a thump/click/whatever as the
    incoming started under the wrong pictures. 40
    frames is rather a lot - getting on for two seconds.

    So you would often get instructions in the studio
    (from the editor) to hold the channel closed on
    prefade until the end of a spurious sentence, then
    whang* it open just before the first wanted words
    of the interview / piece to camera / whatever, to
    match the shot change into sync vision. I've done
    live "dubs" on air this way. Obviously the
    outgoing has a 40 frame hole, but you'd usually be
    covering GVs with grams or sound lifted from the
    commag earlier (that could be used non-sync, i.e.
    "wild"). If the sync-in was "clean", i.e. it
    didn't have embarrassing stuff at the front, a
    tidy fade-up at the tail of the outgoing shot was
    practical.

    The TK (telecine) preview monitor available to the
    studio sound supervisor could have a footage
    counter burned in, which is easier to use than
    seconds, and if there was time, you'd get some
    sort of cue sheet from the editor, (or a tap on
    the shoulder at the right point!).

    The mixer would usually get a grams swinger for
    this sort of thing (for specific effects or
    covering atmos), and a rehearsal if it was complex
    but important to do neatly (and there was time
    pre-transmission). In regional news that could be
    an audio assistant, or a floor manager, or the
    actual film editor (best of all, as he/she knew
    the piece intimately).

    S.

    *whang: faster than a fade-up, but not featuring
    any cut button ("We don't mix like that, laddie!").

    ** 40 frames is a linear foot of raw 16mm stock.
    Not to be confused with the 35mm foot, the unit
    used in editing/dubbing, which is 16 frames (it's
    a physical 35mm foot and roughly 0.6 sec at
    25fps). 35mm feet are much easier to count in
    one's head, and the only practical thing to use
    when working with mixed formats, e.g. a 16mm
    reduction work print, but 35mm triple-track sound
    (or vice-versa).
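
    (The unit conversions above, as a tiny
    Python sketch - 25 fps assumed throughout:)

```python
FPS = 25                   # UK television frame rate
FRAMES_PER_16MM_FOOT = 40  # one linear foot of raw 16mm stock
FRAMES_PER_35MM_FOOT = 16  # the editing/dubbing "foot"

def feet_to_seconds(feet: float, frames_per_foot: int, fps: int = FPS) -> float:
    """Convert film footage to running time in seconds."""
    return feet * frames_per_foot / fps

print(feet_to_seconds(1, FRAMES_PER_16MM_FOOT))  # -> 1.6 (the 40-frame commag offset)
print(feet_to_seconds(1, FRAMES_PER_35MM_FOOT))  # -> 0.64 (roughly 0.6 sec)
```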

  • From SimonM@21:1/5 to All on Sat Apr 2 08:55:27 2022
    On 01/04/2022 15:45, NY wrote:
    "MB" <MB@nospam.net> wrote in message
    news:t272t0$95k$1@dont-email.me...
    But we still get sound and vision out of sync
    quite regularly.

    That was my point: people mistakenly think that
    sound and vision can't get out of sync these days
    and therefore don't check before going live.

    It's a lot harder to check sync than in the analogue
    days. Galleries usually have huge screens
    configured as a bunch of virtual monitors, with
    undefined delays.

    Until recently the Technical Manager and the Sound
    Supervisor in Bristol A still had CRTs,
    specifically for sync checking, but even that only
    means "It's all right leaving me!" and any sync
    issues downstream of studio output cannot be taken
    account of. Paul could give a better explanation,
    if he picks this up.

    But getting back on-topic, clapperboards are still
    very convenient for a raft of reasons. My Canon
    camera is basically a 35mm full-frame stills unit,
    but its video performance is pretty good (will do
    4k). It has timecode, but that's pretty much
    unusable - you cannot jam-sync it to anything, nor
    can you easily get it to display or auto-sync
    stuff in much editing software.

    A clapperboard can contain a lot of useful
    information, a greyscale for picture matching, and
    also quietens things down on location, and it can
    sync quite a few cameras, as long as it can be
    clearly seen by them.

    The alternative is a timecode display on a tablet.
    This works well (longitudinal TC sent to camera
    and audio tracks), and cameras can simply frame up
    on it after any stop-start, but it doesn't have
    the authority of a clapperboard (for the humans),
    and you still need cables or radio links to carry
    the LTC.
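
    (Without LTC, editing software can still
    auto-sync tracks on the clap itself - a
    shared transient - by cross-correlation. A
    minimal NumPy sketch with made-up signals
    and a hypothetical function name, not any
    particular editor's implementation:)

```python
import numpy as np

def clap_offset(ref: np.ndarray, other: np.ndarray, rate: int) -> float:
    """Estimate how late `other` runs relative to `ref`, in seconds, by
    cross-correlating the two audio tracks around a shared transient
    (such as a clapperboard's clack)."""
    corr = np.correlate(other, ref, mode="full")
    lag = int(corr.argmax()) - (len(ref) - 1)  # samples of delay
    return lag / rate

# Synthetic example: the same 'clap' impulse, 100 samples later on track 2.
rate = 48_000
ref = np.zeros(1_000)
ref[200] = 1.0
other = np.zeros(1_000)
other[300] = 1.0
offset = clap_offset(ref, other, rate)
print(offset)  # about 0.00208 s (100/48000)
```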

  • From NY@21:1/5 to SimonM on Sat Apr 2 11:16:55 2022
    "SimonM" <somewhere@large.in.the.world> wrote in message news:t28uk2$eqg$1@dont-email.me...
    On 01/04/2022 12:08, Roderick Stewart wrote:
    The film was normally reversal stock, i.e. there was no negative but
    the actual film that was used in the camera became the transmitted
    copy. This was quicker to process but more vulnerable to loss or
    damage during editing of course, but for news the speed was usually
    considered paramount, and the presenter could talk over it. Reversal
    stock was available with a magnetic stripe, but I think this was
    mostly used by amateur moviemakers, not professional news crews.

    I worked on the final years of film for TV news.

    COMMAG was the _normal_ method of recording sync sound for BBC news crews, right up until the end of film for news. It had nothing to do with unions, restrictive practices or whatever, instead a large amount of very
    expensive kit, that would have to be replaced by rather inferior
    (initially) video equipment.

    Recordists used a separate (BBC designed & built) recording amplifier for
    the recordist (a bit bigger than an ASC Minx mixer, if you remember
    those). It had one or two mic amps with a mixer circuit and very limited
    EQ (HPF for stripping wind noise IIRC, although that was available on the T-power box for gun mics). It had a yellow-spot PPM and headphone amp, and the all-important record head driver circuit for the camera.

    That would explain the umbilical cord: not to keep the camera and recorder
    in sync, but to feed the sound signal from the soundman's mike and mixer to
    the recorder in the camera. Presumably it meant that any sync sound would
    get cut at a different point to the corresponding pictures, unless the
    commag track was laid off to magnetic film which was cut synchronously with
    the film. I can't imagine there'd be time for that in a news studio.

    Interesting that you say that ENG equipment was initially inferior. OK, you
    got smearing and stuck images (photographers' flashguns), but my impression
    was that the picture was generally a lot better because it didn't have the
    huge film grain and the drab muddy colours.

  • From NY@21:1/5 to SimonM on Sat Apr 2 11:11:15 2022
    "SimonM" <somewhere@large.in.the.world> wrote in message news:t28vhg$v1b$1@dont-email.me...
    A clapperboard can contain a lot of useful information, a greyscale for picture matching, and also quietens things down on location, and it can
    sync quite a few cameras, as long as it can be clearly seen by them.

    The alternative is a timecode display on a tablet. This works well (longitudinal TC sent to camera and audio tracks), and cameras can simply frame up on it after any stop-start, but it doesn't have the authority of
    a clapperboard (for the humans), and you still need cables or radio links
    to carry the LTC.

    The clapperboard in the documentary that I initially referred to displayed a timecode, as well as having LEDs which detected the sound of the clap and
    lit up for a few frames at that point. Any shot-ID writing was too small to read on SD as broadcast, though it may just about have been legible on the
    HD original.

    I wonder why there is the trend in the last few years to show the
    preparation for the interview ("are you sitting comfortably?", camera making final focus and framing adjustments, clapperboard) as filler before the interview begins as the interviewee is being introduced by the narrator.
    Will it become as much of a cliche as the establishing shot of an interviewee walking somewhere (assumed to be going to the interview) before the closeup
    of them being interviewed?

  • From SimonM@21:1/5 to All on Sat Apr 2 16:22:55 2022
    On 02/04/2022 11:16, NY wrote:

    That would explain the umbilical cord: not to keep
    the camera and recorder in sync, but to feed the
    sound signal from the soundman's mike and mixer to
    the recorder in the camera. Presumably it meant
    that any sync sound would get cut at a different
    point to the corresponding pictures, unless the
    commag track was laid off to magnetic film which
    was cut synchronously with the film. I can't
    imagine there'd be time for that in a news studio.

    It was done occasionally, if there was time and
    the item merited it. In Bristol this required a TK
    channel for the transfer, as the Audio Unit, who
    ran transfers, didn't have commag-capable equipment.

    Yes, I explained elsewhere, a 40 frame offset,
    IIRC (but I'd have to check - I have Samuelson's
    book on cameras downstairs somewhere).

    Interesting that you say that ENG equipment was
    initially inferior. OK, you got smearing and stuck
    images (photographers' flashguns), but my
    impression was that the picture was generally a
    lot better because it didn't have the huge film
    grain and the drab muddy colours.

    It was usually Tungsten-balanced Ektachrome 160 or
    equivalent (E6 process), occasionally pushed 1 or
    2 stops. This was done so that the correction
    filter in the camera (Wratten 80A?) was added in
    daylight, rather than indoors where there were
    already low light conditions. Film did need a lot
    of additional light indoors, hence redheads, etc.

    And yes, 160 pushed 2 stops did look grainy. But
    some of the later cameras, for example Aatons,
    were remarkably free of weave, etc. and gave
    pretty good results. I knew at least two cameramen
    who used Aatons in super 16, although they were
    cropped in TK. The full frame was pretty impressive.

    Also, don't underestimate the losses in TK: IMHO
    Telecine didn't come close to doing film justice
    until Cintel Mk.IIIs came in. The digital systems
    available for the last 20+ years give excellent
    results.

    I worked in Ealing's Theatre B for a short while.
    It used to be the music scoring stage, and still
    had a full-throw projection system. The screen was
    huge: about 12ft wide at a guess, with a
    mechanical footage counter below it, which had
    sometimes to be reset manually via a stepladder.
    But the picture, with a good 16mm showprint, was
    stunning - way more detail than from an electronic
    camera. The viewing distance from the mix position
    (control room) was about 75-100 feet, IIRC.

    In contrast, very few early ENG systems had
    Ikegamis or similar. Several of our freelancers
    had DXC3000s! Recording onto Hi-band U-matic.

    It wasn't until Beta SP and later DigiBeta that
    the pictures really improved.

    ENG was cheaper in stock costs and process (no lab
    necessary, for example), but still usually a 2-man
    crew. It was usually also faster (for short
    items).

  • From Roderick Stewart@21:1/5 to me@privacy.invalid on Sun Apr 3 08:26:11 2022
    On Sat, 2 Apr 2022 11:11:15 +0100, "NY" <me@privacy.invalid> wrote:

    I wonder why there is the trend in the last few years to show the
    preparation for the interview ("are you sitting comfortably?", camera making
    final focus and framing adjustments, clapperboard) as filler before the
    interview begins as the interviewee is being introduced by the narrator.
    Will it become as much of a cliche as the establish shot of an interview
    walking somewhere (assumed to be going to the interview) before the closeup
    of them being interviewed.

    It's because in the absence of real creativity, style is inversely
    proportional to content.

    Rod.

  • From MB@21:1/5 to All on Sun Apr 3 12:02:38 2022
    On 03/04/2022 11:59, NY wrote:
    Like the trend a few decades ago for "funny camera angles". I remember an item on Top Gear or Tomorrow's World in which all the shots of the presenter had been tilted by 45 degrees so his head was in the top left/right corner
    of the frame and his legs/body were in the bottom right/left. For no good reason, other than to be "clever" or "arty".

    Still happens, saw something a few days ago that was all shot at an angle.

  • From NY@21:1/5 to SimonM on Sun Apr 3 11:50:36 2022
    "SimonM" <somewhere@large.in.the.world> wrote in message news:t29pog$brk$1@dont-email.me...
    It was usually Tungsten-balanced Ektachrome 160 or equivalent (E6
    process), occasionally pushed 1 or 2 stops. This was done so that the correction filter in the camera(Wratten 80A?) was added in daylight,
    rather than indoors where there were already low light conditions. Film
    did need a lot of additional light indoors, hence redheads, etc.

    I called it Ektachrome 320. I was wrong. It was 160: 320 was 160 already
    pushed by 1 stop ;-)


    Also, don't underestimate the losses in TK: IMHO Telecine didn't come
    close to doing film justice until Cintel Mk.IIIs came in. The digital
    systems available for the last 20+ years give excellent results.


    I first realised that when I saw restored (re-telecined) versions of
    episodes of The Sweeney and compared them with original versions as
    broadcast by ITV4. OK, so ITV4's version was 544x576 pixels rather than 720x576, so there was a loss of horizontal resolution (the episodes were
    also butchered to fit a 52-minute episode into a modern 46-minute "1 hour
    with 3 breaks" slot).

    Leaving aside reduction in horizontal resolution, the most obvious thing was much greater detail in shadows and highlights. This is the same as if you compare a scan of a print from a 35 mm (still) negative and a scan (with reversal) of the negative.

    My impression is that film+TK and video have both improved to the point that
    it is nowhere near as obvious to a viewer which is which these days. Compare that with the often-seen shots of the Iranian Embassy Siege in London.
    Normally you see the footage from the full-size TV cameras, recorded on VT: sharp, vibrant but a lot of highlight-crushing on the light-coloured stones
    of the building. Occasionally you see footage from ground level from 16 mm
    film which looks very drab and grainy.

    I remember Jools Holland did an item on a Channel 4 programme years ago
    where he compared film with video - to show that film was "better". But it
    was a very unfair test. The film camera was viewing the scene with the sun roughly behind it so the subject (himself standing beside a motorbike) and
    the background were both lit by the same amount. The video camera was
    looking in roughly the opposite direction (so each camera could see the
    other in its shot) and the background was lit far more brightly than the subject (which had been exposed for) so the background was overexposed. With the two cameras side-by-side, both seeing the same subject from the same direction and lighting, the test would have been fairer because it would
    have removed one very big variable.

  • From NY@21:1/5 to Roderick Stewart on Sun Apr 3 11:59:30 2022
    "Roderick Stewart" <rjfs@escapetime.myzen.co.uk> wrote in message news:tqii4hlbnhvirdtuvs0h9q24i945pnnhnr@4ax.com...
    On Sat, 2 Apr 2022 11:11:15 +0100, "NY" <me@privacy.invalid> wrote:

    I wonder why there is the trend in the last few years to show the
    preparation for the interview ("are you sitting comfortably?", camera making
    final focus and framing adjustments, clapperboard) as filler before the
    interview begins as the interviewee is being introduced by the narrator.
    Will it become as much of a cliche as the establish shot of an interview
    walking somewhere (assumed to be going to the interview) before the closeup
    of them being interviewed.

    It's because in the absence of real creativity, style is inversely proportional to content.

    Like the trend a few decades ago for "funny camera angles". I remember an
    item on Top Gear or Tomorrow's World in which all the shots of the presenter had been tilted by 45 degrees so his head was in the top left/right corner
    of the frame and his legs/body were in the bottom right/left. For no good reason, other than to be "clever" or "arty".

    Then there was a documentary about the attempted kidnapping of Princess
    Anne, and every single talking-head shot started with the interviewee in
    focus, then the image would become slightly defocussed, go back into focus
    and out again, and then become sharp for the rest of the interview. The
    first time, I thought the camera operator was "hunting" the focus slightly,
    but when it happened every time, always with very similar timing, I realised
    it was someone pissing about in post-processing :-( Likewise for interviews that switch between colour and BW for different camera angles, or with a fake venetian-blind effect to distinguish modern footage from archive.

    The funniest (but only if you are in the know) was a retrospective about the Iranian Embassy Siege in which all the archive *video* footage had had fake film grain and dirt added to it to say "this is archive".

  • From John Williamson@21:1/5 to All on Sun Apr 3 12:00:45 2022
    On 03/04/2022 11:50, NY wrote:

    I remember Jools Holland did an item on a Channel 4 programme years ago
    where he compared film with video - to show that film was "better". But
    it was a very unfair test. The film camera was viewing the scene with
    the sun roughly behind it so the subject (himself standing beside a motorbike) and the background were both lit by the same amount. The
    video camera was looking in roughly the opposite direction (so each
    camera could see the other in its shot) and the background was lit far
    more brightly than the subject (which had been exposed for) so the
    background was overexposed. With the two cameras side-by-side, both
    seeing the same subject from the same direction and lighting, the test
    would have been fairer because it would have removed one very big variable.

    The camera never lies?

    Except that the cameraman needs to want to tell the truth. I can make
    the camera tell pretty much any story I wish, in still or video format,
    and that's before I start using photoshop on digital or filtering,
    dodging and shading on a print.

    --
    Tciao for Now!

    John.

  • From Mark Carver@21:1/5 to All on Sun Apr 3 16:50:10 2022
    On 03/04/2022 12:02, MB wrote:
    On 03/04/2022 11:59, NY wrote:
    Like the trend a few decades ago for "funny camera angles". I remember an
    item on Top Gear or Tomorrow's World in which all the shots of the presenter
    had been tilted by 45 degrees so his head was in the top left/right corner
    of the frame and his legs/body were in the bottom right/left. For no good
    reason, other than to be "clever" or "arty".

    Still happens, saw something a few days ago that was all shot at an
    angle.

    The Ipcress File ?

  • From MB@21:1/5 to Mark Carver on Sun Apr 3 22:11:59 2022
    On 03/04/2022 16:50, Mark Carver wrote:
    The Ipcress File ?

    I think it was a short "film" on The One Show or something similar.


    It was so distracting that I cannot remember and might have switched off
    / over.

  • From Mark Carver@21:1/5 to All on Mon Apr 4 08:52:35 2022
    On 03/04/2022 22:11, MB wrote:
    On 03/04/2022 16:50, Mark Carver wrote:
    The Ipcress File ?

    I think it was a short "film" on The One Show or something similar.


    It was so distracting that I cannot remember and might have switched
    off / over.

    It was used in the original Batman TV series, whenever there was a scene requiring an air of 'menace'. It's quite good for that.

  • From Ashley Booth@21:1/5 to All on Mon Apr 4 07:26:46 2022
    NY wrote:

    "Roderick Stewart" <rjfs@escapetime.myzen.co.uk> wrote in message news:tqii4hlbnhvirdtuvs0h9q24i945pnnhnr@4ax.com...
    On Sat, 2 Apr 2022 11:11:15 +0100, "NY" <me@privacy.invalid> wrote:

    I wonder why there is the trend in the last few years to show the preparation for the interview ("are you sitting comfortably?", camera making final focus and framing adjustments, clapperboard) as filler before the interview begins, as the interviewee is being introduced by the narrator. Will it become as much of a cliche as the establishing shot of an interviewee walking somewhere (assumed to be going to the interview) before the closeup of them being interviewed?

    It's because in the absence of real creativity, style is inversely proportional to content.

    Like the trend a few decades ago for "funny camera angles". I
    remember an item on Top Gear or Tomorrow's World in which all the
    shots of the presenter had been tilted by 45 degrees so his head was
    in the top left/right corner of the frame and his legs/body were in
    the bottom right/left. For no good reason, other than to be "clever"
    or "arty".

    Then there was a documentary about the attempted kidnapping of Princess Anne, and every single talking-head shot started with the interviewee in focus, then the image would become slightly defocussed, go back into focus and out again, and then become sharp for the rest of the interview. The first time, I thought the camera operator was "hunting" the focus slightly, but when it happened every time, always with very similar timing, I realised it was someone pissing about in post-processing :-( Likewise for interviews that switch between colour and B&W for different camera angles, or with a fake venetian blind effect to distinguish modern footage from archive.

    The funniest (but only if you are in the know) was a retrospective about the Iranian Embassy Siege in which all the archive video footage had had fake film grain and dirt added to it to say "this is archive".




  • From Roderick Stewart@21:1/5 to me@privacy.invalid on Mon Apr 4 11:36:57 2022
    On Sun, 3 Apr 2022 11:50:36 +0100, "NY" <me@privacy.invalid> wrote:

    "SimonM" <somewhere@large.in.the.world> wrote in message news:t29pog$brk$1@dont-email.me...
    It was usually Tungsten-balanced Ektachrome 160 or equivalent (E6
    process), occasionally pushed 1 or 2 stops. This was done so that the
    correction filter in the camera(Wratten 80A?) was added in daylight,
    rather than indoors where there were already low light conditions. Film
    did need a lot of additional light indoors, hence redheads, etc.

    I called it Ektachrome 320. I was wrong. It was 160: 320 was 160 already pushed by 1 stop ;-)


    Also, don't underestimate the losses in TK: IMHO Telecine didn't come
    close to doing film justice until Cintel Mk.IIIs came in. The digital
    systems available for the last 20+ years give excellent results.


    I first realised that when I saw restored (re-telecined) versions of episodes of The Sweeney and compared them with original versions as broadcast by ITV4. OK, so ITV4's version was 544x576 pixels rather than 720x576, so there was a loss of horizontal resolution (the episodes were also butchered to fit a 52-minute episode into a modern 46-minute "1 hour with 3 breaks" slot).

    Leaving aside the reduction in horizontal resolution, the most obvious thing was much greater detail in shadows and highlights. This is the same as if you compare a scan of a print from a 35 mm (still) negative and a scan (with reversal) of the negative.

    My impression is that film+TK and video have both improved to the point that it is nowhere near as obvious to a viewer which is which these days. Compare that with the often-seen shots of the Iranian Embassy Siege in London. Normally you see the footage from the full-size TV cameras, recorded on VT: sharp, vibrant, but with a lot of highlight-crushing on the light-coloured stones of the building. Occasionally you see footage from ground level on 16 mm film which looks very drab and grainy.

    I remember Jools Holland did an item on a Channel 4 programme years ago where he compared film with video - to show that film was "better". But it was a very unfair test. The film camera was viewing the scene with the sun roughly behind it, so the subject (himself standing beside a motorbike) and the background were both lit by the same amount. The video camera was looking in roughly the opposite direction (so each camera could see the other in its shot) and the background was lit far more brightly than the subject (which the exposure had been set for), so the background was overexposed. With the two cameras side-by-side, both seeing the same subject from the same direction and lighting, the test would have been fairer because it would have removed one very big variable.

    For many years, one of the most obvious differences between original
    video and film when both are shown on television, was that film only
    captures the action 25 times per second with the camera effectively
    blind for half the time because of a mechanical shutter, while a
    television camera is collecting light all the time and displaying new
    pictorial information at twice the rate, that is 50 times per second.
    Even if the exposure, detail, and steadiness of a film had all been
    perfected, anything that moved would unavoidably look jerky in a way
    that never happened with original video material.

    Then somebody discovered that digital cameras could be switched to
    read out their information every frame instead of every field - half
    the normal rate - and it would look jerky like film, and for some
    reason they liked this. Despite it being a less realistic depiction of movement, it looked "filmic", which is good, it seems. Perhaps there
    was a psychological effect whereby something that looked as if it had
    been made on film implied that more money had been spent on the
    production, because this was usually the case with film that had been
    shot for the cinema. Whatever the reason, it became popular amongst
    some programme makers, and is still used for a lot of productions
    today, even though I doubt that any are still made using actual film.

    It's even possible for a modern digital video camera to use an
    electronic equivalent of the mechanical shutter in a film camera, so
    that it effectively misses half the action in the same way. Some
    programme makers like this too.

    It seems odd to me that the computer gaming fraternity spends vast
    amounts of money on ever increasing computing power to achieve higher
    frame rates - over a hundred in many cases - in their quest for more
    realistic depiction of movement, while some video programme makers
    appear to think that updating the picture at lower rates will achieve
    the same thing.

    Thus, in comparing restored film with modern video, we need to be
    careful before we draw conclusions about how much better it has
    become, because we also need to consider the ways in which modern
    video may have become worse.

    Rod.

  • From NY@21:1/5 to Roderick Stewart on Mon Apr 4 13:49:07 2022
    "Roderick Stewart" <rjfs@escapetime.myzen.co.uk> wrote in message news:2tfl4hl9ngs26kt4dsu8diarb0jfjicts1@4ax.com...

    For many years, one of the most obvious differences between original
    video and film when both are shown on television, was that film only
    captures the action 25 times per second with the camera effectively
    blind for half the time because of a mechanical shutter, while a
    television camera is collecting light all the time and displaying new pictorial information at twice the rate, that is 50 times per second.
    Even if the exposure, detail, and steadiness of a film had all been perfected, anything that moved would unavoidably look jerky in a way
    that never happened with original video material.

    I'd forgotten that one. Yes, there was the increased jerkiness because you had 25 full-res pictures per second rather than 50 half-res pictures per second, so you lost some of the fluidity of motion.

    Then somebody discovered that digital cameras could be switched to
    read out their information every frame instead of every field - half
    the normal rate - and it would look jerky like film, and for some
    reason they liked this. Despite it being a less realistic depiction of movement, it looked "filmic", which is good, it seems. Perhaps there
    was a psychological effect whereby something that looked as if it had
    been made on film implied that more money had been spent on the
    production, because this was usually the case with film that had been
    shot for the cinema. Whatever the reason, it became popular amongst
    some programme makers, and is still used for a lot of productions
    today, even though I doubt that any are still made using actual film.

    I remember the outcry when Casualty changed to a rather crude filmic effect. It wasn't the jerkier motion that was the problem, it was the fake film gamma, which made the interiors look "stagey" like a studio set rather than allowing the viewer to suspend disbelief and think they were seeing the inside of a real hospital. It's hard to describe, but it somehow killed the illusion stone dead.

    It may not have been the filmic gamma that was the problem: maybe the gamma change required scenes to be lit differently.

    It's even possible for a modern digital video camera to use an
    electronic equivalent of the mechanical shutter in a film camera, so
    that it effectively misses half the action in the same way. Some
    programme makers like this too.

    Mini-cams, used for POV shots in action sequences (eg of presenter going
    down a zip wire) or cockpit views in small aircraft, often default to a very high shutter speed. You can distinguish that footage because objects look unnaturally sharp (no motion blur) and you may get strobing on aircraft propellers.
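    The propeller strobing can be sketched with simple sampling arithmetic, assuming a very short exposure per frame (the figures below are illustrative, not from the thread):

```python
def apparent_rev_per_s(blades, rpm, fps):
    """Aliased rotation seen on video with a very short exposure: the blade
    pattern repeats every blade-pass, so the camera only sees how far the
    pattern advances per frame, folded to the nearest whole repeat."""
    passes_per_frame = blades * rpm / 60 / fps
    residual = (passes_per_frame + 0.5) % 1 - 0.5   # nearest-integer residual
    return residual * fps / blades                  # apparent revs per second

# A 2-blade prop at 2400 rpm (really 40 rev/s) filmed at 25 fps appears
# to turn at only about 2.5 rev/s:
print(apparent_rev_per_s(2, 2400, 25))
```

    With motion blur (a long exposure) those separate samples smear into the continuous disc the eye expects, which is why the effect betrays a short shutter.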

  • From J. P. Gilliver (John)@21:1/5 to me@privacy.invalid on Mon Apr 4 14:07:12 2022
    On Mon, 4 Apr 2022 at 13:49:07, NY <me@privacy.invalid> wrote (my
    responses usually FOLLOW):
    []
    I've forgotten that one. Yes there was the increased jerkiness because
    you had 25 full-res pictures per second rather than 50 half-res
    pictures per second, so you lost some of the fluidity of motion.

    And some archive material - admittedly I haven't noticed it of late on YouTube, so maybe people/equipment are getting better - presumably from originally interlaced source material, has noticeable jaggies on moving things; I presume because 50 (or 60) interlaced fields have been digitised at, or are playing back at, 25 (or 30) frames (but using both fields to retain vertical resolution).
    []
    Mini-cams, used for POV shots in action sequences (eg of presenter going down a zip wire) or cockpit views in small aircraft, often default to a very high shutter speed. You can distinguish that footage because objects look unnaturally sharp (no motion blur) and you may get strobing on aircraft propellers.

    I think it's almost universal in the latter case: propellers and
    helicopter blades either static, or just turning very slowly.

    I first became aware of it on Top Gear - they seemed to use such
    deliberately, especially for any kerb-level shot, so you saw multiple
    fixed wheels - most distracting IMO.

    I think blur is _desirable_, certainly at normal (25-60) frame rates, as
    it makes the intermittent-motion less noticeable - rather like the way
    still images of a two-level something - printed text, or a
    black-and-white cartoon/graph/similar - are often acceptable at a lower resolution if using more levels of greyscale. Obviously only within
    limits - better for a panning shot, you don't want a disappearing
    vehicle to leave a ghost trail like something out of hyperspace. (Unless
    you're making that sort of material of course!) But blur IMO makes the non-continuous nature of film _or_ video less noticeable - fast stills
    draw attention to it.
    --
    J. P. Gilliver. UMRA: 1960/<1985 MB++G()AL-IS-Ch++(p)Ar@T+H+Sh0!:`)DNAf

    If a tree falls in the forest and it lands on a bear that's having a shit... williamwright <wrightsaerials"f2s.com> in uk.texh.broadcast, 2021-12-8

  • From NY@21:1/5 to G6JPG@255soft.uk on Mon Apr 4 14:18:29 2022
    "J. P. Gilliver (John)" <G6JPG@255soft.uk> wrote in message news:fx53rghA2uSiFwDa@a.a...

    I think blur is _desirable_, certainly at normal (25-60) frame rates, as
    it makes the intermittent-motion less noticeable - rather like the way
    still images of a two-level something - printed text, or a black-and-white cartoon/graph/similar - are often acceptable at a lower resolution if
    using more levels of greyscale. Obviously only within limits - better for
    a panning shot, you don't want a disappearing vehicle to leave a ghost
    trail like something out of hyperspace. (Unless you're making that sort of material of course!) But blur IMO makes the non-continuous nature of film _or_ video less noticeable - fast stills draw attention to it.

    I imagine it's for this reason that TV cameras (even digital ones) are
    normally set to use a shutter speed of 1/25 second (or 1/50 for interlaced) rather than a much shorter one, even if it requires a very small aperture or strong ND filters to achieve the correct exposure in sunlight.
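    As a rough sketch of why, using the standard exposure relation EV = log2(N²/t) at ISO 100 (the scene EV figures are assumed round numbers: about EV 15 for direct sun, EV 5 for a dim interior):

```python
import math

def f_number_for(ev, shutter_s):
    """f-number needed for a given scene EV (ISO 100) at a fixed shutter
    time, from EV = log2(N^2 / t)  =>  N = sqrt(t * 2^EV)."""
    return math.sqrt(shutter_s * 2 ** ev)

shutter = 1 / 50                              # fixed field-rate shutter
print(round(f_number_for(15, shutter), 1))    # direct sun: f/25.6
print(round(f_number_for(5, shutter), 1))     # dim interior: f/0.8
```

    Ten stops of scene range at a fixed shutter is a 32x span of f-number, which is exactly the gap ND filters are used to close.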

  • From Roderick Stewart@21:1/5 to me@privacy.invalid on Mon Apr 4 15:01:19 2022
    On Mon, 4 Apr 2022 13:49:07 +0100, "NY" <me@privacy.invalid> wrote:

    It's even possible for a modern digital video camera to use an
    electronic equivalent of the mechanical shutter in a film camera, so
    that it effectively misses half the action in the same way. Some
    programme makers like this too.

    Mini-cams, used for POV shots in action sequences (eg of presenter going down a zip wire) or cockpit views in small aircraft, often default to a very high shutter speed. You can distinguish that footage because objects look unnaturally sharp (no motion blur) and you may get strobing on aircraft propellers.

    Some cameras now appear to use the electronic "shutter" as a means of controlling exposure, rather than the traditional iris mechanism,
    probably because being electronic it requires no moving parts and is
    therefore cheaper to implement. The inevitable result is that the
    camera is only sensitive to light for a variable percentage of the
    time, and therefore only capturing part of the action. This is worst
    for high brightness situations, for example an aeroplane propellor
    against the sky, which is often captured as a series of visibly
    separate images (distorted as well, because of the way the shutter
    works) rather than a continuous blur as the eye would see.

    Rod.

  • From NY@21:1/5 to Roderick Stewart on Tue Apr 5 10:20:06 2022
    "Roderick Stewart" <rjfs@escapetime.myzen.co.uk> wrote in message news:u3ul4hl676klq2315ehgnjm7usqedcnu60@4ax.com...
    Some cameras now appear to use the electronic "shutter" as a means of controlling exposure, rather than the traditional iris mechanism,
    probably because being electronic it requires no moving parts and is therefore cheaper to implement. The inevitable result is that the
    camera is only sensitive to light for a variable percentage of the
    time, and therefore only capturing part of the action. This is worst
    for high brightness situations, for example an aeroplane propellor
    against the sky, which is often captured as a series of visibly
    separate images (distorted as well, because of the way the shutter
    works) rather than a continuous blur as the eye would see.

    If the shutter speed is fixed at 1/25 second, you need to be able to vary the iris by a very large amount to cater for going from bright sunlight to a dim interior. I imagine most lenses would not produce a very good picture (because of diffraction) at f/100 ;-)

    But as you say, a minicam which adjusts the shutter speed needs no moving parts.

    The distortion is the same as you got with a focal-plane shutter on a film camera. It's the "rolling shutter" problem which still exists if the CCD
    sensor reads each row of pixels in turn, remaining "open" until that row is read, rather than turning off all the rows at the same instant and then
    reading them out sequentially. Some cameras do one, some do the other. My
    Nikon DSLR, when used as a video camera, suffers from the "rolling shutter" effect which is apparent even with fairly long shutter speeds - passing
    buses etc are blurred (so long shutter speed) but the verticals are sloping.
    I presume a tube TV camera would have had the same problem because the
    sensor was accepting and integrating the light on a given row from the
    instant that a row was scanned until the same time on the next frame, so the bottom line is seeing the image almost 1/25 second later than the top line;
    I imagine it was less noticeable because of motion blur.
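    The sloping verticals can be sketched with a toy model, assuming the readout spans a full frame period and a vertical edge moves sideways at constant speed (all numbers purely illustrative):

```python
def rolling_shutter_skew(speed_px_per_s, readout_s, rows):
    """Horizontal offset (px) at each sampled row of a vertical edge moving
    sideways, when rows are exposed/read top-to-bottom over readout_s."""
    return [speed_px_per_s * readout_s * r / (rows - 1) for r in range(rows)]

# An edge moving at 500 px/s with a 1/25 s top-to-bottom readout, sampled
# at 5 rows: the offset grows linearly down the frame, ~[0, 5, 10, 15, 20] px,
# so the edge renders as a slope.
print(rolling_shutter_skew(500, 1 / 25, 5))
```

    A global shutter samples every row from the same instant, so all the offsets would be equal and the edge would stay vertical.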

  • From SimonM@21:1/5 to All on Wed Apr 6 09:26:57 2022
    On 05/04/2022 10:20, NY wrote:

    If the shutter speed is fixed at 1/25 second, you
    need to be able to vary the iris by a very large
    amount to cater for going from bright sunlight to
    a dim interior. I imagine most lenses would not
    produce a very good picture (because of
    diffraction) at f100 ;-)

    Spot on. And even if you could somehow get around
    diffraction at stops of f/32 or smaller, the
    images would be unacceptable.

    In most stills photography/video & film the
    subject is accentuated deliberately by limiting
    depth of field. There are exceptions, where you
    actually want everything in focus (from the
    objective lens out to infinity), but generally
    it's true of documentaries and even interview
    situations. In drama, it predominates.

    Some cinema lenses are made with apertures of
    better than f/0.9 (they aren't cheap!), to give
    the DoP best control of depth of field, and that's
    with a 35mm sensor/gate area. Obviously as the
    sensor size diminishes, the DoF increases, ceteris
    paribus, which is undesirable.

    It's one reason I bought a full-frame DSLR - you
    cannot match the in-camera difference compared to
    an APS sensor, never mind smaller sensors still,
    as were common on video cameras in times past.

    Of course you can use ND filters to drop the light
    level, but handling a transition, such as a
    tracking shot through a doorway to/from the open
    air is hard (and anyway, you cannot switch in the
    ND filter in the middle of a shot).

    So a decent range of shutter speed is helpful, and
    film cameras have had some capability in this area
    for decades (although on 16mm I don't think it's
    easily variable in the middle of a shot!). I can
    put my DSLR in aperture priority mode, and have it
    vary the effective shutter speed for me, but it's
    reactive*, and all the artefacts NY mentions will
    be there. It's harder than one anticipates too,
    because there's actually a fairly small dynamic
    range on the sensor.

    All that said, when I see aerial photography with
    curvy propellers, for example, I do wonder if the
    camera people know where their ND filters are (or
    how to use them), or if it's deliberate...

    S.

    *I run Magic Lantern (Canon 6D), which makes many
    video functions programmable. I haven't
    investigated, but it might be possible to do a
    timed shutter speed change. That said it would
    need careful rehearsal so not handy for the
    limited amount of video I do.
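    The diffraction point above can be put in rough numbers with the Airy-disk approximation d ≈ 2.44·λ·N (λ = 550 nm green light is an assumed figure):

```python
def airy_disk_um(f_number, wavelength_nm=550):
    """Approximate diffraction blur-spot (Airy disk) diameter in
    micrometres: d = 2.44 * wavelength * f-number."""
    return 2.44 * wavelength_nm * 1e-3 * f_number

# Blur-spot diameter grows linearly with f-number:
for n in (8, 32, 100):
    print(f"f/{n}: {airy_disk_um(n):.1f} um")   # ~10.7, ~42.9, ~134.2 um
```

    Against a pixel pitch of a few micrometres, f/32 is already several pixels of blur, and a notional f/100 is hopeless - hence ND filters rather than tiny stops.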

  • From NY@21:1/5 to SimonM on Wed Apr 6 17:53:14 2022
    "SimonM" <somewhere@large.in.the.world> wrote in message news:t2jisi$ij7$1@dont-email.me...

    Some cinema lenses are made with apertures of better than f/0.9 (they
    aren't cheap!), to give the DoP best control of depth of field, and that's with a 35mm sensor/gate area. Obviously as the sensor size diminishes, the DoF increases, ceteris paribus, which is undesirable.

    I once saw a design for an attachment which allows lenses designed for 35 mm film to be used on a 16 mm camera or video camera of equivalent sensor size, with an intermediate frosted screen onto which the 35 mm lens projects its image as if it were film (and with the shallow DoF of a large-aperture lens) and which is viewed by a close-up lens on the 16 mm camera. It allows
    shallower DoF than a lens designed for 16 mm can achieve, without
    (apparently) cutting down the amount of light as much as if a smaller
    aperture had been used on the 16 mm lens.

    It works on the principle that a lens that gives a certain field of view on
    a larger film format will have a shallower DoF for the same light
    transmission than a lens for a smaller film format or smaller CCD on a video camera.
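    The principle is often summarised as an "equivalent aperture": for the same field of view and viewing size, a smaller format renders depth of field like a full-frame lens stopped down by the crop factor (the Super 16 crop factor of ~2.9 used below is an assumed round figure):

```python
def equivalent_f_number(f_number, crop_factor):
    """Full-frame DoF-equivalent f-number for a lens used on a smaller
    format: multiply the marked f-number by the crop factor."""
    return f_number * crop_factor

# An f/2 lens on Super 16 (crop ~2.9) gives roughly the DoF of f/5.8
# on full frame:
print(f"f/{equivalent_f_number(2, 2.9):.1f} on full frame")
```

    Which is why the frosted-screen adapter is attractive: the depth of field is set by the 35 mm lens in front of the screen, not by the small sensor behind it.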

    All that said, when I see aerial photography with curvy propellers, for example, I do wonder if the camera people know where their ND filters are
    (or how to use them), or if it's deliberate...

    I imagine it's unwanted but unavoidable, unless broadcasters can find
    minicams which can be stopped down or fitted with ND filters to prevent the camera automatically compensating for bright light by reducing exposure
    time.

  • From John Williamson@21:1/5 to All on Wed Apr 6 18:05:54 2022
    On 06/04/2022 17:53, NY wrote:

    I imagine it's unwanted but unavoidable, unless broadcasters can find minicams which can be stopped down or fitted with ND filters to prevent
    the camera automatically compensating for bright light by reducing
    exposure time.



    Just out of interest, how neutral can you make an LCD ND filter? It
    could even be controlled to keep the light level at the sensor constant.

    --
    Tciao for Now!

    John.

  • From J. P. Gilliver (John)@21:1/5 to All on Wed Apr 6 21:12:35 2022
    On Wed, 6 Apr 2022 at 09:26:57, SimonM <somewhere@large.in.the.world>
    wrote (my responses usually FOLLOW):
    []
    think it's easily variable in the middle of a shot!). I can put my DSLR
    in aperture priority mode, and have it vary the effective shutter speed
    for me, but it's reactive*, and all the artefacts NY mentions will be
    there. It's harder than one anticipates too, because there's actually a
    fairly small dynamic range on the sensor.
    []
    My old Bell-and-Howell clockwork standard 8 camera had auto-aperture; it
    didn't have a multiblade iris, just two long narrow curved Vs at
    right-angles (so the aperture was more or less square), connected
    basically _as_ the needle to the light meter connected to the selenium
    cell. OK, it _was_ reactive, but most of the time fast enough.
    Obviously, it was reacting directly to the external light level, not
    anything based on the output of the sensor!, so there wasn't a lag in
    that sense.
    --
    J. P. Gilliver. UMRA: 1960/<1985 MB++G()AL-IS-Ch++(p)Ar@T+H+Sh0!:`)DNAf

    He [Alfred Kinsey] wouldn't ask 'Have you ever slept with a horse?' He would say, 'When did you first sleep with a horse?' [RT 2018/5/5-11]

  • From SimonM@21:1/5 to John Williamson on Wed Apr 6 20:56:15 2022
    On 06/04/2022 18:05, John Williamson wrote:
    On 06/04/2022 17:53, NY wrote:

    I imagine it's unwanted but unavoidable, unless
    broadcasters can find
    minicams which can be stopped down or fitted
    with ND filters to prevent
    the camera automatically compensating for bright
    light by reducing
    exposure time.



    Just out of interest, how neutral can you make an
    LCD ND filter? It could even be controlled to keep
    the light level at the sensor constant.

    Dunno about LCD - they can be pretty clear - we
    have a see-through LCD clock around somewhere, and
    the glass doesn't have any obvious tint to it.

    There are also "adjustable" ND filters. I think
    it's done with a couple of linear polarizers

    It suddenly occurs to me that I might buy one and
    get two normal polarizers out of it. Separately I
    did a Google search for linear polarizers this
    morning and didn't turn up anything. I do have one
    but it's for a Cokin medium-format system of old,
    and a bit cumbersome for what I want.

  • From J. P. Gilliver (John)@21:1/5 to All on Wed Apr 6 21:21:30 2022
    On Wed, 6 Apr 2022 at 20:56:15, SimonM <somewhere@large.in.the.world>
    wrote (my responses usually FOLLOW):
    On 06/04/2022 18:05, John Williamson wrote:
    On 06/04/2022 17:53, NY wrote:

    I imagine it's unwanted but unavoidable, unless broadcasters can
    find
    minicams which can be stopped down or fitted with ND filters to
    prevent
    the camera automatically compensating for bright light by reducing
    exposure time.



    Just out of interest, how neutral can you make an LCD ND filter? It
    could even be controlled to keep the light level at the sensor
    constant.

    Dunno about LCD - they can be pretty clear - we have a see-through LCD
    clock around somewhere, and the glass doesn't have any obvious tint to
    it.

    Yes, but the segments of that are presumably full on or full off; to
    work as JW has in mind, they'd need to remain neutral over the full
    range.

    I _suspect_ the problem might be response time: I think most LCDs have a quarter to a third of a second response time. That might be fast enough, though, except for things like flash photography - especially if there
    is some outside-wanted-frame detection to give a bit of pre-warning,
    when panning at least.

    There are also "adjustable" ND filters. I think it's done with a couple
    of linear polarizers

    It suddenly occurs to me that I might buy one and get two normal
    polarizers out of it.

    Or a pair of cheap polarized sunglasses ... (-:

    Separately I did a Google search for linear polarizers this morning
    and didn't turn up anything. I do have one but it's for a Cokin
    medium-format system of old, and a bit cumbersome for what I want.

    Though I think crossed-polarizers tend to have a bluish tint, so might
    not be viable for this purpose.
    --
    J. P. Gilliver. UMRA: 1960/<1985 MB++G()AL-IS-Ch++(p)Ar@T+H+Sh0!:`)DNAf

    He [Alfred Kinsey] wouldn't ask 'Have you ever slept with a horse?' He would say, 'When did you first sleep with a horse?' [RT 2018/5/5-11]

  • From NY@21:1/5 to SimonM on Thu Apr 7 09:59:05 2022
    "SimonM" <somewhere@large.in.the.world> wrote in message news:t2kr8v$4ld$1@dont-email.me...
    There are also "adjustable" ND filters. I think it's done with a couple of linear polarizers

    Crossed polaroids (in my experience) tend to give a deep blue tint (*), so
    they don't constitute a neutral (no tint) filter. BTDTGTTTS. But that can be corrected for in software.


    (*) The more they are crossed the darker they are (obviously) and the
    stronger the blue cast.
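    Setting the blue cast aside, the attenuation of a two-polarizer variable ND follows Malus's law, T = cos²θ; the factor of one half below is the idealised loss of the first polarizer on unpolarised light:

```python
import math

def crossed_polarizer_stops(angle_deg):
    """Light loss in stops through a pair of linear polarizers at the
    given relative angle: T = 0.5 * cos^2(theta), stops = -log2(T)."""
    t = 0.5 * math.cos(math.radians(angle_deg)) ** 2
    return -math.log2(t)

# 0 deg gives the ~1-stop baseline loss, 45 deg gives 2 stops, and the
# attenuation climbs steeply as the polarizers approach fully crossed:
for a in (0, 45, 80):
    print(f"{a} deg: {crossed_polarizer_stops(a):.1f} stops")
```

    At 90 degrees T goes to zero in the ideal model; real filters leak, and that residual transmission is where the colour cast shows most.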

  • From Paul Ratcliffe@21:1/5 to somewhere@large.in.the.world on Sat Apr 30 23:41:39 2022
    On Sat, 2 Apr 2022 16:22:55 +0100, SimonM
    <somewhere@large.in.the.world> wrote:

    Several of our freelancers had DXC3000s!

    It pains me to tell you that we still have at least one of
    those bloody things on the shelves in our 'old junk' store.
    God alone knows why.

  • From SimonM@21:1/5 to Paul Ratcliffe on Mon May 2 09:13:00 2022
    On 01/05/2022 00:41, Paul Ratcliffe wrote:
    On Sat, 2 Apr 2022 16:22:55 +0100, SimonM
    <somewhere@large.in.the.world> wrote:

    Several of our freelancers had DXC3000s!

    It pains me to tell you that we still have at least one of
    those bloody things on the shelves in our 'old junk' store.
    God alone knows why.


    Heck, they were obsolete by around 1988!

    Since the licence fee is probably going (yeah,
    right), it must be time for a BBC Bristol eBay
    account, Shirley?

    Having seen on YouTube the junk people cherish,
    someone will pay silly money for it, especially if
    it has what passed in the late 1980s for a CCU.

    I vaguely remember the picture couldn't be matched
    to the 2001s (in either direction).

    Was it the newsroom one or the one stuck up in the
    corner of the grid in St.B, or something else?

    Happy days.

    S.

    PS: I couldn't get to Chris W.'s funeral (we had
    family over from the US). Did he get a good
    send-off? Very much hope so.

  • From Brian Gaff (Sofa)@21:1/5 to Paul Ratcliffe on Mon May 9 15:37:46 2022
    Is it not just to isolate the start/end of a piece fairly accurately for looking at before the digital editing is done?
    Brian

    --

    This newsgroup posting comes to you directly from...
    The Sofa of Brian Gaff...
    briang1@blueyonder.co.uk
    Blind user, so no pictures please
    Note this Signature is meaningless.!
    "Paul Ratcliffe" <abuse@orac12.clara34.co56.uk78> wrote in message news:slrnt6ri9i.4q44.abuse@news.pr.network...
    On Sat, 2 Apr 2022 16:22:55 +0100, SimonM
    <somewhere@large.in.the.world> wrote:

    Several of our freelancers had DXC3000s!

    It pains me to tell you that we still have at least one of
    those bloody things on the shelves in our 'old junk' store.
    God alone knows why.

  • From Paul Ratcliffe@21:1/5 to somewhere@large.in.the.world on Fri May 27 00:02:27 2022
    On Mon, 2 May 2022 09:13:00 +0100, SimonM
    <somewhere@large.in.the.world> wrote:

    Several of our freelancers had DXC3000s!

    It pains me to tell you that we still have at least one of
    those bloody things on the shelves in our 'old junk' store.
    God alone knows why.

    Heck, they were obsolete by around 1988!

    That never stopped us, did it?

    Since the licence fee is probably going (yeah,
    right), it must be time for a BBC Bristol eBay
    account, Shirley?

    Everything must go. Really this time. No reprieve now after
    today's announcements. Closing down sale.

    I vaguely remember the picture couldn't be matched
    to the 2001s (in either direction).

    No, nor to the Links either. Not surprising really though.

    Was it the newsroom one or the one stuck up in the
    corner of the grid in St.B, or something else?.

    Dunno.

    PS: I couldn't get to Chris W.'s funeral (we had
    family over from the US). Did he get a good
    send-off? Very much hope so.

    Yep. Well north of 150. And the sun was out!
