• Non-paedophile paedophilia

    From GB@21:1/5 to All on Wed Oct 25 13:25:44 2023
    "Paedophiles using AI to turn singers and film stars into kids.

    Paedophiles are using artificial intelligence (AI) to create images of celebrities as children.

    The Internet Watch Foundation (IWF) said images of a well-known female
    singer reimagined as a child are being shared by predators."


    https://www.bbc.co.uk/news/technology-67172231


    One of the arguments for prosecuting possession of paedophile images is
    that the making and dissemination of those images involves harm to
    children. Yet, child pornography can now apparently be produced without involving children.

    Should we be encouraging this, rather than discouraging it, as providing
    a relatively harmless outlet for paedophiles?

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Mark Goodge@21:1/5 to NOTsomeone@microsoft.invalid on Wed Oct 25 17:52:33 2023
    On Wed, 25 Oct 2023 13:25:44 +0100, GB <NOTsomeone@microsoft.invalid> wrote:

    "Paedophiles using AI to turn singers and film stars into kids.

    Paedophiles are using artificial intelligence (AI) to create images of celebrities as children.

    The Internet Watch Foundation (IWF) said images of a well-known female
    singer reimagined as a child are being shared by predators."


    https://www.bbc.co.uk/news/technology-67172231


    One of the arguments for prosecuting possession of paedophile images is
    that the making and dissemination of those images involves harm to
    children. Yet, child pornography can now apparently be produced without involving children.

    Should we be encouraging this, rather than discouraging it, as providing
    a relatively harmless outlet for paedophiles?

    It's been the case for some time now that pseudo-photographs which depict certain forms of indecent imagery are just as illegal as real photographs showing the same content. Whether it's done with AI or Photoshop doesn't
    really change that.

    The justification for criminalising pseudo-photographs of this nature is twofold. One argument is that such images are often a pathway to real
    images, which do involve actual abuse. It's widely recognised that people
    can become addicted to porn, and those who do often find themselves seeking
    out ever more extreme material as they become desensitised to that which
    they are familiar with. By this argument, criminalising such images does
    help prevent actual abuse.

    This argument is somewhat controversial, but it's not unreasonable to accept
    that, unless it is comprehensively refuted, the precautionary principle
    probably falls in favour of criminalisation.

    The other argument is that as long as the origin of the image is irrelevant,
    the prosecution doesn't have to prove that it is a real photograph and not a
    pseudo-photograph - it's purely what it looks like to the hypothetical
    traveller on the Peckham Borisbus that matters, not aspects which can only
    be determined by detailed forensic analysis (and often not even by that).
    Pornographic photographs, like ducks, are recognised by sight.

    This has the benefit that there is no possible defence of "it's only a pseudo-photograph", which, in the absence of information leading back to a genuine victim, may be very hard for the prosecution to refute to the
    standard necessary for a criminal conviction. If the CPS had to prove that a photograph was real, then it would be practically impossible to secure a conviction in very many cases. So the criminalisation of pseudo-photographs does, somewhat paradoxically, make it easier to prosecute possession of real photographs of unlawful material.

    Personally, though, I think that illegal AI porn is less of an issue than
    the BBC news article implies. The problem with fighting Child Sexual Abuse Material (CSAM) isn't how it's made, it's how it's distributed. It's just as easy to share digital copies of real abuse as it is of AI-generated abuse.
    So the number of sources isn't really a major issue. It's the number of consumers which is the problem. And I don't think that's going to rise just because people can now generate it using AI as well as with a camera.

    What seems to me to be more of an issue with AI porn is not unlawful porn,
    but lawful porn (that is, depicting adults engaged in consensual sexual activities) which is AI-generated or AI-manipulated to represent real
    people. If someone takes a photo of a person fully clothed, and then runs it through an AI "nudify" app, then the resulting image isn't unlawful porn in itself. But publishing that AI-manipulated photo may well be very
    distressing to that person. The problem is, there are no laws which would currently make it illegal. It isn't unlawful imagery per se, it isn't an invasion of privacy, and it isn't revenge porn. But it will happen. I
    suspect it's already happening, on the kind of websites that I'd prefer not
    to look at (not even in the name of research[1]).

    I suspect, though, that this won't receive any legislative attention until a deepfake of Keir Starmer rogering Rishi Sunak starts to do the rounds. In
    the meantime, "won't somebody think of the children" is a far more potent campaign slogan.

    [1] Actually, if you do a Google[2] image search for female celebrities with safe search turned off, the chances are you'll find one in the results. Fake nudes have been a thing for a while, but AI is making them more widespread.

    [2] Other search engines are available.

    Mark

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jethro_uk@21:1/5 to All on Wed Oct 25 18:16:02 2023
    On Wed, 25 Oct 2023 13:25:44 +0100, GB wrote:

    "Paedophiles using AI to turn singers and film stars into kids.

    Paedophiles are using artificial intelligence (AI) to create images of celebrities as children.

    The Internet Watch Foundation (IWF) said images of a well-known female
    singer reimagined as a child are being shared by predators."


    https://www.bbc.co.uk/news/technology-67172231


    One of the arguments for prosecuting possession of paedophile images is
    that the making and dissemination of those images involves harm to
    children. Yet, child pornography can now apparently be produced without involving children.

    Should we be encouraging this, rather than discouraging it, as providing
    a relatively harmless outlet for paedophiles?

    Depends on what you feel society is. Should it do its best to minimise
    harm? Or should it enforce morality irrespective of harm?

    I rather lost interest in the whole issue once it became clear it's the
    latter.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From TTman@21:1/5 to All on Wed Oct 25 23:12:29 2023
    On 25/10/2023 13:25, GB wrote:
    "Paedophiles using AI to turn singers and film stars into kids.

    Paedophiles are using artificial intelligence (AI) to create images of celebrities as children.

    The Internet Watch Foundation (IWF) said images of a well-known female
    singer reimagined as a child are being shared by predators."


    https://www.bbc.co.uk/news/technology-67172231


    One of the arguments for prosecuting possession of paedophile images is
    that the making and dissemination of those images involves harm to
    children. Yet, child pornography can now apparently be produced without involving children.

    Should we be encouraging this, rather than discouraging it, as providing
    a relatively harmless outlet for paedophiles?


    Surely if the image is 'indecent' does it matter if the face is 'real'
    or AI ?


    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jethro_uk@21:1/5 to TTman on Thu Oct 26 06:28:04 2023
    On Wed, 25 Oct 2023 23:12:29 +0100, TTman wrote:

    On 25/10/2023 13:25, GB wrote:
    "Paedophiles using AI to turn singers and film stars into kids.

    Paedophiles are using artificial intelligence (AI) to create images of
    celebrities as children.

    The Internet Watch Foundation (IWF) said images of a well-known female
    singer reimagined as a child are being shared by predators."


    https://www.bbc.co.uk/news/technology-67172231


    One of the arguments for prosecuting possession of paedophile images is
    that the making and dissemination of those images involves harm to
    children. Yet, child pornography can now apparently be produced without
    involving children.

    Should we be encouraging this, rather than discouraging it, as
    providing a relatively harmless outlet for paedophiles?


    Surely if the image is 'indecent' does it matter if the face is 'real'
    or AI ?

    Well then we are well down the road to censorship pure and simple.
    Because who gets to decide "indecent" ? Bearing in mind the moment the
    public were allowed to judge, an awful lot of what we had been told was indecent was judged not to be.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Roger Hayter@21:1/5 to TTman on Thu Oct 26 07:52:56 2023
    On 25 Oct 2023 at 23:12:29 BST, "TTman" <kraken.sankey@gmail.com> wrote:

    On 25/10/2023 13:25, GB wrote:
    "Paedophiles using AI to turn singers and film stars into kids.

    Paedophiles are using artificial intelligence (AI) to create images of
    celebrities as children.

    The Internet Watch Foundation (IWF) said images of a well-known female
    singer reimagined as a child are being shared by predators."


    https://www.bbc.co.uk/news/technology-67172231


    One of the arguments for prosecuting possession of paedophile images is
    that the making and dissemination of those images involves harm to
    children. Yet, child pornography can now apparently be produced without
    involving children.

    Should we be encouraging this, rather than discouraging it, as providing
    a relatively harmless outlet for paedophiles?


    Surely if the image is 'indecent' does it matter if the face is 'real'
    or AI ?

    That is begging the question of whether drawings and paintings of indecent things should be illegal in the first place.

    --
    Roger Hayter

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Max Demian@21:1/5 to All on Thu Oct 26 11:45:20 2023
    On 25/10/2023 13:25, GB wrote:
    "Paedophiles using AI to turn singers and film stars into kids.

    Paedophiles are using artificial intelligence (AI) to create images of celebrities as children.

    The Internet Watch Foundation (IWF) said images of a well-known female
    singer reimagined as a child are being shared by predators."


    https://www.bbc.co.uk/news/technology-67172231


    One of the arguments for prosecuting possession of paedophile images is
    that the making and dissemination of those images involves harm to
    children. Yet, child pornography can now apparently be produced without involving children.

    Should we be encouraging this, rather than discouraging it, as providing
    a relatively harmless outlet for paedophiles?

    Yes, well, eventually the "accommodation" argument was applied to male homosexuality, fornication and adultery.

    Maybe it's just a matter of time...

    --
    Max Demian

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Max Demian@21:1/5 to All on Thu Oct 26 11:50:45 2023
    On 26/10/2023 07:28, Jethro_uk wrote:
    On Wed, 25 Oct 2023 23:12:29 +0100, TTman wrote:
    On 25/10/2023 13:25, GB wrote:
    "Paedophiles using AI to turn singers and film stars into kids.

    Paedophiles are using artificial intelligence (AI) to create images of
    celebrities as children.

    The Internet Watch Foundation (IWF) said images of a well-known female
    singer reimagined as a child are being shared by predators."


    https://www.bbc.co.uk/news/technology-67172231


    One of the arguments for prosecuting possession of paedophile images is
    that the making and dissemination of those images involves harm to
    children. Yet, child pornography can now apparently be produced without
    involving children.

    Should we be encouraging this, rather than discouraging it, as
    providing a relatively harmless outlet for paedophiles?


    Surely if the image is 'indecent' does it matter if the face is 'real'
    or AI ?

    Well then we are well down the road to censorship pure and simple.
    Because who gets to decide "indecent" ? Bearing in mind the moment the
    public were allowed to judge, an awful lot of what we had been told was indecent was judged not to be.

    And vice versa: for some, any image of a child's genitalia is indecent,
    even if posted by the child's parents. "Won't the child be embarrassed
    when he grows up." (I don't know whether that's the argument.)

    --
    Max Demian

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Norman Wells@21:1/5 to TTman on Thu Oct 26 11:56:13 2023
    On 25/10/2023 23:12, TTman wrote:
    On 25/10/2023 13:25, GB wrote:
    "Paedophiles using AI to turn singers and film stars into kids.

    Paedophiles are using artificial intelligence (AI) to create images of
    celebrities as children.

    The Internet Watch Foundation (IWF) said images of a well-known female
    singer reimagined as a child are being shared by predators."


    https://www.bbc.co.uk/news/technology-67172231


    One of the arguments for prosecuting possession of paedophile images
    is that the making and dissemination of those images involves harm to
    children. Yet, child pornography can now apparently be produced
    without involving children.

    Should we be encouraging this, rather than discouraging it, as
    providing a relatively harmless outlet for paedophiles?


    Surely if the image is 'indecent' does it matter if the face is 'real'
    or AI ?

    Usually, laws exist to stop others being harmed.

    If no-one is harmed by the activity concerned, why should the law
    prohibit it?

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Mark Goodge@21:1/5 to jethro_uk@hotmailbin.com on Thu Oct 26 13:44:18 2023
    On Thu, 26 Oct 2023 06:28:04 -0000 (UTC), Jethro_uk
    <jethro_uk@hotmailbin.com> wrote:

    On Wed, 25 Oct 2023 23:12:29 +0100, TTman wrote:

    Surely if the image is 'indecent' does it matter if the face is 'real'
    or AI ?

    Well then we are well down the road to censorship pure and simple.

    We already have censorship pure and simple. AI hasn't changed that, and
    won't change that.

    Because who gets to decide "indecent" ?

    The court, using the usual yardstick of the disinterested observer. If it
    looks indecent, then it is indecent.

    Bear in mind that "indecent" per se is not the same as illegal. There's a
    lot of indecent imagery which is perfectly legal. There's a massive industry built around generating and distributing such material. It only becomes
    illegal if it depicts certain people, or certain actions, as well as being indecent.

    Mark

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jethro_uk@21:1/5 to Norman Wells on Thu Oct 26 20:44:42 2023
    On Thu, 26 Oct 2023 11:56:13 +0100, Norman Wells wrote:

    On 25/10/2023 23:12, TTman wrote:
    On 25/10/2023 13:25, GB wrote:
    "Paedophiles using AI to turn singers and film stars into kids.

    Paedophiles are using artificial intelligence (AI) to create images of
    celebrities as children.

    The Internet Watch Foundation (IWF) said images of a well-known female
    singer reimagined as a child are being shared by predators."


    https://www.bbc.co.uk/news/technology-67172231


    One of the arguments for prosecuting possession of paedophile images
    is that the making and dissemination of those images involves harm to
    children. Yet, child pornography can now apparently be produced
    without involving children.

    Should we be encouraging this, rather than discouraging it, as
    providing a relatively harmless outlet for paedophiles?


    Surely if the image is 'indecent' does it matter if the face is 'real'
    or AI ?

    Usually, laws exist to stop others being harmed.

    If no-one is harmed by the activity concerned, why should the law
    prohibit it?

    Thank you John Stuart Mill.

    Now explain our drugs laws.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Max Demian@21:1/5 to All on Fri Oct 27 10:47:10 2023
    On 26/10/2023 21:44, Jethro_uk wrote:
    On Thu, 26 Oct 2023 11:56:13 +0100, Norman Wells wrote:

    On 25/10/2023 23:12, TTman wrote:
    On 25/10/2023 13:25, GB wrote:
    "Paedophiles using AI to turn singers and film stars into kids.

    Paedophiles are using artificial intelligence (AI) to create images of celebrities as children.

    The Internet Watch Foundation (IWF) said images of a well-known female singer reimagined as a child are being shared by predators."


    https://www.bbc.co.uk/news/technology-67172231


    One of the arguments for prosecuting possession of paedophile images
    is that the making and dissemination of those images involves harm to
    children. Yet, child pornography can now apparently be produced
    without involving children.

    Should we be encouraging this, rather than discouraging it, as
    providing a relatively harmless outlet for paedophiles?


    Surely if the image is 'indecent' does it matter if the face is 'real'
    or AI ?

    Usually, laws exist to stop others being harmed.

    If no-one is harmed by the activity concerned, why should the law
    prohibit it?

    Thank you John Stuart Mill.

    Now explain our drugs laws.

    Well it could be said that drugs harm the taker, which TPTB consider
    themselves responsible for. Most porn harms no-one, unless you think the (possibly now grown-up) subject is "re-victimised" every time it's
    viewed, even if unaware.

    --
    Max Demian

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Mark Goodge@21:1/5 to All on Fri Oct 27 10:57:09 2023
    On Thu, 26 Oct 2023 11:50:45 +0100, Max Demian <max_demian@bigfoot.com>
    wrote:

    And vice versa: for some, any image of a child's genitalia is indecent,
    even if posted by the child's parents. "Won't the child be embarrassed
    when he grows up." (I don't know whether that's the argument.)

    It may well be indecent, but it isn't illegal unless either there was some
    form of abuse involved in taking the photo or there is some sexual aspect to the photo. Photos of naked children at a nudist beach aren't illegal per se.

    Drawing the line between lawful and unlawful images can be difficult, not
    least because both "indecent" and "sexual" are often a matter of perception. But the CPS charging guidelines do attempt to address the nuance of it.

    Mark

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From The Todal@21:1/5 to Mark Goodge on Fri Oct 27 11:01:10 2023
    On 26/10/2023 13:44, Mark Goodge wrote:
    On Thu, 26 Oct 2023 06:28:04 -0000 (UTC), Jethro_uk <jethro_uk@hotmailbin.com> wrote:

    On Wed, 25 Oct 2023 23:12:29 +0100, TTman wrote:

    Surely if the image is 'indecent' does it matter if the face is 'real'
    or AI ?

    Well then we are well down the road to censorship pure and simple.

    We already have censorship pure and simple. AI hasn't changed that, and
    won't change that.

    Because who gets to decide "indecent" ?

    The court, using the usual yardstick of the disinterested observer. If it looks indecent, then it is indecent.

    In fact, the jury rather than the court decides what is or is not
    indecent. The jury is expected to use its common sense rather than look
    to any "expert" guidance.




    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From The Todal@21:1/5 to Mark Goodge on Fri Oct 27 11:21:28 2023
    On 27/10/2023 10:57, Mark Goodge wrote:
    On Thu, 26 Oct 2023 11:50:45 +0100, Max Demian <max_demian@bigfoot.com> wrote:

    And vice versa: for some, any image of a child's genitalia is indecent,
    even if posted by the child's parents. "Won't the child be embarrassed
    when he grows up." (I don't know whether that's the argument.)

    It may well be indecent, but it isn't illegal unless either there was some form of abuse involved in taking the photo or there is some sexual aspect to the photo. Photos of naked children at a nudist beach aren't illegal per se.

    Sorry, but you are wrong. You are merely offering your own opinion,
    which is quite valueless unless you happen to be serving on a jury at
    the time.

    Many photos of naked children involve no "abuse" (eg upskirt photo of a
    girl on a swing) but are deemed indecent and criminal by a jury, and
    many would say rightly so.





    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From AnthonyL@21:1/5 to usenet@listmail.good-stuff.co.uk on Fri Oct 27 11:32:00 2023
    On Wed, 25 Oct 2023 17:52:33 +0100, Mark Goodge <usenet@listmail.good-stuff.co.uk> wrote:


    [Mark Goodge's post quoted in full; snipped]


    I wonder if one unintended side effect of not being able to readily
    distinguish between fake and real will be that blackmailing could
    become ineffective. "That photo is not of me - you can send it to
    whom you want".


    --
    AnthonyL

    Why ever wait to finish a job before starting the next?

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Fredxx@21:1/5 to The Todal on Fri Oct 27 12:54:21 2023
    On 27/10/2023 11:21, The Todal wrote:
    On 27/10/2023 10:57, Mark Goodge wrote:
    On Thu, 26 Oct 2023 11:50:45 +0100, Max Demian <max_demian@bigfoot.com>
    wrote:

    And vice versa: for some, any image of a child's genitalia is indecent,
    even if posted by the child's parents. "Won't the child be embarrassed
    when he grows up." (I don't know whether that's the argument.)

    It may well be indecent, but it isn't illegal unless either there was some
    form of abuse involved in taking the photo or there is some sexual aspect
    to the photo. Photos of naked children at a nudist beach aren't illegal
    per se.

    Sorry, but you are wrong. You are merely offering your own opinion,
    which is quite valueless unless you happen to be serving on a jury at
    the time.

    Many photos of naked children involve no "abuse" (eg upskirt photo of a
    girl on a swing) but are deemed indecent and criminal by a jury, and
    many would say rightly so.

    Cartoons of naked children can also land you in gaol. Any depiction of indecency can land you in gaol, even if the actors are well over 18.

    I would prefer those with a certain sexual orientation to masturbate to
    these cartoons to satisfy themselves rather than aim directly for
    children. Your morality on the subject suggests otherwise?

    https://www.gazettelive.co.uk/news/teesside-news/anime-fan-convicted-over-illegal-7958896

    But a child was saved!





    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From billy bookcase@21:1/5 to AnthonyL on Fri Oct 27 14:19:22 2023
    "AnthonyL" <nospam@please.invalid> wrote in message news:653b9f29.1398867578@news.eternal-september.org...
    On Wed, 25 Oct 2023 17:52:33 +0100, Mark Goodge <usenet@listmail.good-stuff.co.uk> wrote:


    [Mark Goodge's post quoted in full; snipped]


    I wonder if one unintended side effect of not being able to readily distinguish between fake and real will be that blackmailing could
    become ineffective. "That photo is not of me - you can send it to
    whom you want".

    Blackmailers are more likely to rely on video, which nowadays is
    much easier to obtain than was formerly the case with film. Film was
    often used nevertheless; at least according to a 60's film starring
    Nigel Patrick as a framed Detective Inspector.

    Such that, had Randy Andy ever actually been filmed in action, there
    would simply be no room for the doubts which otherwise persist. In
    some people's minds, at least.


    bb

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Adam Funk@21:1/5 to All on Fri Oct 27 15:23:05 2023
    On 2023-10-26, Jethro_uk wrote:

    On Thu, 26 Oct 2023 11:56:13 +0100, Norman Wells wrote:

    On 25/10/2023 23:12, TTman wrote:
    On 25/10/2023 13:25, GB wrote:
    "Paedophiles using AI to turn singers and film stars into kids.

    Paedophiles are using artificial intelligence (AI) to create images of celebrities as children.

    The Internet Watch Foundation (IWF) said images of a well-known female singer reimagined as a child are being shared by predators."


    https://www.bbc.co.uk/news/technology-67172231


    One of the arguments for prosecuting possession of paedophile images
    is that the making and dissemination of those images involves harm to
    children. Yet, child pornography can now apparently be produced
    without involving children.

    Should we be encouraging this, rather than discouraging it, as
    providing a relatively harmless outlet for paedophiles?


    Surely if the image is 'indecent' does it matter if the face is 'real'
    or AI ?

    Usually, laws exist to stop others being harmed.

    If no-one is harmed by the activity concerned, why should the law
    prohibit it?

    Thank you John Stuart Mill.

    Now explain our drugs laws.

    Economic productivity and political power.

    Stoners don't pull the ploughs of the economy as hard as the rest of
    us oxen.

    Timothy Leary said "Kids who take psychedelics won't fight your wars
    or join your corporations," so Nixon called him "the most dangerous
    man in America".

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jethro_uk@21:1/5 to Mark Goodge on Fri Oct 27 14:31:36 2023
    On Fri, 27 Oct 2023 10:57:09 +0100, Mark Goodge wrote:

    On Thu, 26 Oct 2023 11:50:45 +0100, Max Demian <max_demian@bigfoot.com> wrote:

    And vice versa: for some, any image of a child's genitalia is indecent, even if posted by the child's parents. "Won't the child be embarrassed
    when he grows up." (I don't know whether that's the argument.)

    It may well be indecent, but it isn't illegal unless either there was
    some form of abuse involved in taking the photo or there is some sexual aspect to the photo. Photos of naked children at a nudist beach aren't illegal per se.

    Drawing the line between lawful and unlawful images can be difficult,
    not least because both "indecent" and "sexual" are often a matter of perception.
    But the CPS charging guidelines do attempt to address the nuance of it.

    Aren't we now going to get another category of "legal, but harmful" ?

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jethro_uk@21:1/5 to The Todal on Fri Oct 27 14:33:19 2023
    On Fri, 27 Oct 2023 11:01:10 +0100, The Todal wrote:

    On 26/10/2023 13:44, Mark Goodge wrote:
    On Thu, 26 Oct 2023 06:28:04 -0000 (UTC), Jethro_uk
    <jethro_uk@hotmailbin.com> wrote:

    On Wed, 25 Oct 2023 23:12:29 +0100, TTman wrote:

    Surely if the image is 'indecent' does it matter if the face is 'real'
    or AI ?

    Well then we are well down the road to censorship pure and simple.

    We already have censorship pure and simple. AI hasn't changed that, and
    won't change that.

    Because who gets to decide "indecent" ?

    The court, using the usual yardstick of the disinterested observer. If
    it looks indecent, then it is indecent.

    In fact, the jury rather than the court decides what is or is not
    indecent. The jury is expected to use its common sense rather than look
    to any "expert" guidance.

    As I said. It was a jury that declined to find Lady Chatterley's Lover
    obscene. They didn't even ask their wives or servants (which was pretty
    much a paradigm of how the law was supposed to work ...)

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From The Todal@21:1/5 to Fredxx on Fri Oct 27 16:22:33 2023
    On 27/10/2023 12:54, Fredxx wrote:
    On 27/10/2023 11:21, The Todal wrote:
    On 27/10/2023 10:57, Mark Goodge wrote:
    On Thu, 26 Oct 2023 11:50:45 +0100, Max Demian <max_demian@bigfoot.com>
    wrote:

    And vice versa: for some, any image of a child's genitalia is indecent,
    even if posted by the child's parents. "Won't the child be embarrassed
    when he grows up." (I don't know whether that's the argument.)

    It may well be indecent, but it isn't illegal unless either there was some
    form of abuse involved in taking the photo or there is some sexual aspect
    to the photo. Photos of naked children at a nudist beach aren't illegal
    per se.

    Sorry, but you are wrong. You are merely offering your own opinion,
    which is quite valueless unless you happen to be serving on a jury at
    the time.

    Many photos of naked children involve no "abuse" (eg upskirt photo of
    a girl on a swing) but are deemed indecent and criminal by a jury, and
    many would say rightly so.

    Cartoons of naked children can also land you in gaol. Any depiction of indecency can land you in gaol, even if the actors are well over 18.

    I would prefer those with a certain sexual orientation to masturbate to
    these cartoons to satisfy themselves rather than aim directly for
    children. Your morality on the subject suggests otherwise?

    *My* morality? I resent the implication that my own opinions are those
    that are held by many people. I try not to agree with the majority.

    It is probable that the CPS will pick and choose which images require a
    prosecution and which are too tame. Faced with a defendant and a
    prosecutor, I think most juries will tamely accept that a photograph of a
    child is indecent.

    I am reminded of the remarks of the Court of Appeal in 2000 when hearing
    an appeal against conviction pursued by David Mould (the chap
    subsequently convicted of rather more serious offences).

    quote

    We turn to the two other grounds which Mr Burton has argued before us.
    He submits, first of all, that the photograph itself could not possibly
    ever be said to be indecent. He submits that similar photographs can be
    found in medical text books. To label this photograph as indecent would
    mean that photographs of a similar kind in medical text books would also
    be indecent.

    If the test for deciding whether a photograph is indecent or not is
    whether or not it is the kind of photograph which appears in medical
    text books, then many of the photographs with which these courts are all
    too familiar could not be classified as indecent. ... It is not
    suggested that he is the parent of the child or that he was doing this
    as part of some medical research. We take the view that a jury was
    entitled to reach the conclusion that this photograph was indecent as
    the prosecution alleged.










    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From GB@21:1/5 to The Todal on Fri Oct 27 17:01:34 2023
    On 27/10/2023 16:22, The Todal wrote:

    quote

    We turn to the two other grounds which Mr Burton has argued before us.
    He submits, first of all, that the photograph itself could not possibly
    ever be said to be indecent. He submits that similar photographs can be
    found in medical text books. To label this photograph as indecent would
    mean that photographs of a similar kind in medical text books would also
    be indecent.

    If the test for deciding whether a photograph is indecent or not is
    whether or not it is the kind of photograph which appears in medical
    text books, then many of the photographs with which these courts are all
    too familiar could not be classified as indecent. ... It is not
    suggested that he is the parent of the child or that he was doing this
    as part of some medical research. We take the view that a jury was
    entitled to reach the conclusion that this photograph was indecent as
    the prosecution alleged.



    Is that really saying that a photo in one context is indecent, and in a different context it isn't indecent? Or, is it saying that the photo is indecent, but there's a defence for owning it in some circumstances?

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Fredxx@21:1/5 to The Todal on Fri Oct 27 17:37:02 2023
    On 27/10/2023 16:22, The Todal wrote:
    On 27/10/2023 12:54, Fredxx wrote:
    On 27/10/2023 11:21, The Todal wrote:
    On 27/10/2023 10:57, Mark Goodge wrote:
    On Thu, 26 Oct 2023 11:50:45 +0100, Max Demian <max_demian@bigfoot.com> wrote:

    And vice versa: for some, any image of a child's genitalia is indecent,
    even if posted by the child's parents. "Won't the child be embarrassed
    when he grows up." (I don't know whether that's the argument.)

    It may well be indecent, but it isn't illegal unless either there was some
    form of abuse involved in taking the photo or there is some sexual aspect
    to the photo. Photos of naked children at a nudist beach aren't illegal
    per se.

    Sorry, but you are wrong. You are merely offering your own opinion,
    which is quite valueless unless you happen to be serving on a jury at
    the time.

    Many photos of naked children involve no "abuse" (eg upskirt photo of
    a girl on a swing) but are deemed indecent and criminal by a jury,
    and many would say rightly so.

    Cartoons of naked children can also land you in gaol. Any depiction of
    indecency can land you in gaol, even if the actors are well over 18.

    I would prefer those with a certain sexual orientation to masturbate
    to these cartoons to satisfy themselves rather than aim directly for
    children. Your morality on the subject suggests otherwise?

    *My* morality?  I resent the implication that my own opinions are those
    that are held by many people. I try not to agree with the majority.

    My apologies; I took it from your phraseology that you were endorsing
    the findings of juries.

    It is probable that the CPS will pick and choose which images require a
    prosecution and which are too tame. Faced with a defendant and a
    prosecutor, I think most juries will tamely accept that a photograph of a
    child is indecent.

    I am reminded of the remarks of the Court of Appeal in 2000 when hearing
    an appeal against conviction pursued by David Mould (the chap
    subsequently convicted of rather more serious offences).

    quote

    We turn to the two other grounds which Mr Burton has argued before us.
    He submits, first of all, that the photograph itself could not possibly
    ever be said to be indecent. He submits that similar photographs can be
    found in medical text books. To label this photograph as indecent would
    mean that photographs of a similar kind in medical text books would also
    be indecent.

    If I recall correctly, it was a single photo.

    If the test for deciding whether a photograph is indecent or not is
    whether or not it is the kind of photograph which appears in medical
    text books, then many of the photographs with which these courts are all
    too familiar could not be classified as indecent. ... It is not
    suggested that he is the parent of the child or that he was doing this
    as part of some medical research. We take the view that a jury was
    entitled to reach the conclusion that this photograph was indecent as
    the prosecution alleged.




    https://www.gazettelive.co.uk/news/teesside-news/anime-fan-convicted-over-illegal-7958896

    But a child was saved!

    Drawing the line between lawful and unlawful images can be difficult, not
    least because both "indecent" and "sexual" are often a matter of perception.
    But the CPS charging guidelines do attempt to address the nuance of it.
    Mark

    Funnily enough I find the depiction of death through wanton violence
    equally indecent, but hey.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Mark Goodge@21:1/5 to The Todal on Sat Oct 28 20:43:24 2023
    On Fri, 27 Oct 2023 11:21:28 +0100, The Todal <the_todal@icloud.com> wrote:

    On 27/10/2023 10:57, Mark Goodge wrote:
    On Thu, 26 Oct 2023 11:50:45 +0100, Max Demian <max_demian@bigfoot.com>
    wrote:

    And vice versa: for some, any image of a child's genitalia is indecent,
    even if posted by the child's parents. "Won't the child be embarrassed
    when he grows up." (I don't know whether that's the argument.)

    It may well be indecent, but it isn't illegal unless either there was some
    form of abuse involved in taking the photo or there is some sexual aspect to
    the photo. Photos of naked children at a nudist beach aren't illegal per se.

    Sorry, but you are wrong. You are merely offering your own opinion,
    which is quite valueless unless you happen to be serving on a jury at
    the time.

    I'm restating a widely held opinion that nudity, even of a child, is not in itself unlawful provided there is no abusive or sexual aspect to the photo.
    You might want to ask yourself why there were no prosecutions over the photo
    by Nan Goldin titled "Klara and Edda Belly-dancing"[1][2] for example. Or, indeed, comments made in R v Oliver[2] and the judgment in R v
    Graham-Kerr[2].

    It's also worth noting that the IWF, which can be presumed to have at least
    a certain amount of expertise in the matter (and whose report is the basis
    for the news article at the head of this thread), does not consider that websites carrying photos of naked children in a genuinely nudist setting
    should be added to the Cleanfeed blocklist. Indeed, the IWF refers to R v Oliver in their own assessment guidelines[3], as well as the Sentencing Council's Sexual Offences Definitive Guidelines[4]. The latter explicitly mentions the need to avoid criminalising "innocuous" pictures such as those taken by a child's own parents. This concurs with case law that "nakedness
    in a legitimate setting" is not pornographic.

    So, while it may be my opinion, I do think I'm on pretty safe ground in asserting that photos of naked children at a nudist beach aren't illegal per se. That is, of course, contextual, and there will be cases where they are illegal. If the photos are voyeuristic rather than consensual, then I expect the courts would conclude that they are indecent. And if the children in question were persuaded to pose in a sexually suggestive manner then that,
    too, would bring them within the remit of the legislation. But simple, non-sexual nudity, either adult or child, in a situation where nudity is to
    be expected, is not indecent, and consensual photography of such nudity is
    not unlawful either.

    (For avoidance of doubt, I would not permit anyone to take photos of my own children naked, and I would not put them into a situation where there was
    any significant risk that such photos might be taken non-consensually. Nor
    have I ever taken any myself of them naked, or even semi-naked in any
    context other than swimwear. And if they ever want to go to a nudist beach, they'll have to wait until they're old enough to get there on their own, because I'm certainly not taking them. But, nonetheless, nudists do exist,
    and, while I can think of few things less appealing to me than getting my
    kit off in public, it's not for me to judge those who think otherwise. And
    if, for them, nakedness is as normal as being clothed is for me, then it
    would be utterly perverse to suggest that they should be prevented from documenting their activities on camera in exactly the same way that I do
    with my family).

    Many photos of naked children involve no "abuse" (eg upskirt photo of a
    girl on a swing) but are deemed indecent and criminal by a jury, and
    many would say rightly so.

    Upskirts are specifically unlawful, and the reason it's a specific offence
    is because it was difficult to prosecute under existing legislation, particularly when taking into consideration the factors referred to above.

    [1] Very, very NSFW if you feel like Googling it.

    [2] See ulm passim for many references to these.

    [3] https://tinyurl.com/4u9zn7s5 as shortened from https://www.iwf.org.uk/about-us/how-we-assess-and-remove-content/our-mou-the-law-and-assessing-content/

    [4] https://tinyurl.com/53ua8ptz as shortened from https://www.sentencingcouncil.org.uk/wp-content/uploads/Final_Sexual_Offences_Response_to_Consultation_web1.pdf

    Mark

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Mark Goodge@21:1/5 to The Todal on Sat Oct 28 21:18:05 2023
    On Fri, 27 Oct 2023 11:01:10 +0100, The Todal <the_todal@icloud.com> wrote:

    On 26/10/2023 13:44, Mark Goodge wrote:
    On Thu, 26 Oct 2023 06:28:04 -0000 (UTC), Jethro_uk
    <jethro_uk@hotmailbin.com> wrote:

    On Wed, 25 Oct 2023 23:12:29 +0100, TTman wrote:

    Surely if the image is 'indecent' does it matter if the face is 'real'
    or AI ?

    Well then we are well down the road to censorship pure and simple.

    We already have censorship pure and simple. AI hasn't changed that, and
    won't change that.

    Because who gets to decide "indecent" ?

    The court, using the usual yardstick of the disinterested observer. If it
    looks indecent, then it is indecent.

    In fact, the jury rather than the court decides what is or is not
    indecent. The jury is expected to use its common sense rather than look
    to any "expert" guidance.

    The jury is part of the court. It's the part which makes decisions on
    questions of fact in a Crown Court trial. In other types of court,
    magistrates or judges make decisions on questions of fact. Referring to "the court" is merely a simple shorthand for "the people in a court whose responsibility it is to make decisions on questions of fact".

    Mark

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Mark Goodge@21:1/5 to NOTsomeone@microsoft.invalid on Sat Oct 28 21:15:08 2023
    On Fri, 27 Oct 2023 17:01:34 +0100, GB <NOTsomeone@microsoft.invalid> wrote:

    On 27/10/2023 16:22, The Todal wrote:

    quote

    We turn to the two other grounds which Mr Burton has argued before us.
    He submits, first of all, that the photograph itself could not possibly
    ever be said to be indecent. He submits that similar photographs can be
    found in medical text books. To label this photograph as indecent would
    mean that photographs of a similar kind in medical text books would also
    be indecent.

    If the test for deciding whether a photograph is indecent or not is
    whether or not it is the kind of photograph which appears in medical
    text books, then many of the photographs with which these courts are all
    too familiar could not be classified as indecent. ... It is not
    suggested that he is the parent of the child or that he was doing this
    as part of some medical research. We take the view that a jury was
    entitled to reach the conclusion that this photograph was indecent as
    the prosecution alleged.

    Is that really saying that a photo in one context is indecent, and in a different context it isn't indecent?

    Not quite. It's a bit more nuanced than that. It's saying that mere
    similarity to an image known to be lawful does not automatically make a different image lawful (or, indeed, vice versa, similarity to an unlawful
    image does not ipso facto make a new image unlawful). Each image has to be judged on its own merits.

    There are a number of factors involved in determining whether an image is indecent, and one of those is the question of whether the image arises from
    a "legitimate setting". So, for example, there is case law[1] to the effect that a consensual photo of a naked child taken by a swimming instructor at a nudist swimming session is not unlawful, because you expect children participating in a nudist swimming session to be naked and therefore the
    only issue is one of consent. But if a swimming instructor at a normal
    session persuaded a child to remove their swimwear in order to have a photo taken naked, that would be unlawful, because it would not be a legitimate setting. Despite the fact that to someone looking at the two photos they may
    be practically identical, one would be unlawful and the other would not be.

    That's the broad thrust of the court's response to the argument cited above. Although Mr Mould's photos looked similar to those in medical textbooks,
    their setting was different. A photo taken of a naked child for publication
    in a medical textbook is a legitimate setting, a photo taken of a naked
    child for publication in a pornographic magazine is not. Even if, to the observer, the photos are extremely similar.

    It may seem somewhat counterintuitive that identical photos can be either lawful or unlawful depending on the context of how they were taken. But
    that's a necessary distinction if you want to avoid things like
    criminalising parents who take naked photos of their own children. Or,
    indeed, people who take photos for the purposes of education or reportage.
    The (in)famous photo of Phan Thi Kim Phuc isn't unlawful, either.

    Or, is it saying that the photo is
    indecent, but there's a defence for owning it in some circumstances?

    That's an entirely different argument. There are some defences to possessing
    an unlawful image. But they are, deliberately, few and far between.

    [1] R v Graham-Kerr (1989) 88 Cr App R 302

    Mark

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Roger Hayter@21:1/5 to usenet@listmail.good-stuff.co.uk on Sun Oct 29 09:20:02 2023
    On 28 Oct 2023 at 20:15:08 BST, "Mark Goodge" <usenet@listmail.good-stuff.co.uk> wrote:

    [nested quotes snipped]
    It may seem somewhat counterintuitive that identical photos can be either lawful or unlawful depending on the context of how they were taken. But that's a necessary distinction if you want to avoid things like
    criminalising parents who take naked photos of their own children. Or, indeed, people who take photos for the purposes of education or reportage. The (in)famous photo of Phan Thi Kim Phuc isn't unlawful, either.

    You could equally avoid criminalising parents by not pretending a photograph
    is indecent when it isn't. How is this logic applied to pseudophotographs and cartoons? Are they only indecent when the possessor is a paedophile? You may notice a certain paradox here.




    Or, is it saying that the photo is
    indecent, but there's a defence for owning it in some circumstances?

    That's an entirely different argument. There are some defences to possessing an unlawful image. But they are, deliberately, few and far between.

    [1] R v Graham-Kerr (1989) 88 Cr App R 302

    Mark


    --
    Roger Hayter

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jethro_uk@21:1/5 to Mark Goodge on Sun Oct 29 09:53:46 2023
    On Sat, 28 Oct 2023 20:43:24 +0100, Mark Goodge wrote:

    You might want to ask yourself why there were no prosecutions over the
    photo by Nan Goldin titled "Klara and Edda Belly-dancing"[1][2] for
    example. Or, indeed, comments made in R v Oliver[2] and the judgment in
    R v Graham-Kerr[2].

    Or the cover to Blind Faith's eponymous 1969 album

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From The Todal@21:1/5 to Roger Hayter on Sun Oct 29 12:38:50 2023
    On 29/10/2023 09:20, Roger Hayter wrote:
    On 28 Oct 2023 at 20:15:08 BST, "Mark Goodge" <usenet@listmail.good-stuff.co.uk> wrote:

    On Fri, 27 Oct 2023 17:01:34 +0100, GB <NOTsomeone@microsoft.invalid> wrote:
    On 27/10/2023 16:22, The Todal wrote:

    quote

    We turn to the two other grounds which Mr Burton has argued before us.
    He submits, first of all, that the photograph itself could not possibly
    ever be said to be indecent. He submits that similar photographs can be
    found in medical text books. To label this photograph as indecent would
    mean that photographs of a similar kind in medical text books would also
    be indecent.

    If the test for deciding whether a photograph is indecent or not is
    whether or not it is the kind of photograph which appears in medical
    text books, then many of the photographs with which these courts are all
    too familiar could not be classified as indecent. ... It is not
    suggested that he is the parent of the child or that he was doing this
    as part of some medical research. We take the view that a jury was
    entitled to reach the conclusion that this photograph was indecent as
    the prosecution alleged.

    Is that really saying that a photo in one context is indecent, and in a
    different context it isn't indecent?

    Not quite. It's a bit more nuanced than that. It's saying that mere
    similarity to an image known to be lawful does not automatically make a
    different image lawful (or, indeed, vice versa, similarity to an unlawful
    image does not ipso facto make a new image unlawful). Each image has to be
    judged on its own merits.

    There are a number of factors involved in determining whether an image is
    indecent, and one of those is the question of whether the image arises from
    a "legitimate setting". So, for example, there is case law[1] to the effect
    that a consensual photo of a naked child taken by a swimming instructor at a
    nudist swimming session is not unlawful, because you expect children
    participating in a nudist swimming session to be naked and therefore the
    only issue is one of consent. But if a swimming instructor at a normal
    session persuaded a child to remove their swimwear in order to have a photo
    taken naked, that would be unlawful, because it would not be a legitimate
    setting. Despite the fact that to someone looking at the two photos they may
    be practically identical, one would be unlawful and the other would not be.

    That's the broad thrust of the court's response to the argument cited above.
    Although Mr Mould's photos looked similar to those in medical textbooks,
    their setting was different. A photo taken of a naked child for publication
    in a medical textbook is a legitimate setting, a photo taken of a naked
    child for publication in a pornographic magazine is not. Even if, to the
    observer, the photos are extremely similar.

    It may seem somewhat counterintuitive that identical photos can be either
    lawful or unlawful depending on the context of how they were taken. But
    that's a necessary distinction if you want to avoid things like
    criminalising parents who take naked photos of their own children. Or,
    indeed, people who take photos for the purposes of education or reportage.
    The (in)famous photo of Phan Thi Kim Phuc isn't unlawful, either.

    You could equally avoid criminalising parents by not pretending a photograph is indecent when it isn't. How is this logic applied to pseudophotographs and cartoons? Are they only indecent when the possessor is a paedophile? You may notice a certain paradox here.


    Medical textbooks may contain many "indecent" photographs. However, if
    there is a legitimate excuse for including them in a publication (eg to
    educate and instruct doctors) then there is unlikely to be a conviction,
    so no need for a prosecution.

    If you, as an ordinary member of the public, were to scan a photo of a
    naked child from a medical textbook and save it to a folder on your
    computer, then you would be at risk of being prosecuted and convicted.
    The circumstances in which the photo was originally "taken" are actually irrelevant.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From The Todal@21:1/5 to Mark Goodge on Sun Oct 29 12:40:38 2023
    On 28/10/2023 21:18, Mark Goodge wrote:
    On Fri, 27 Oct 2023 11:01:10 +0100, The Todal <the_todal@icloud.com> wrote:

    On 26/10/2023 13:44, Mark Goodge wrote:
    On Thu, 26 Oct 2023 06:28:04 -0000 (UTC), Jethro_uk
    <jethro_uk@hotmailbin.com> wrote:

    On Wed, 25 Oct 2023 23:12:29 +0100, TTman wrote:

    Surely if the image is 'indecent' does it matter if the face is 'real'
    or AI?

    Well then we are well down the road to censorship pure and simple.

    We already have censorship pure and simple. AI hasn't changed that, and
    won't change that.

    Because who gets to decide "indecent" ?

    The court, using the usual yardstick of the disinterested observer. If it
    looks indecent, then it is indecent.

    In fact, the jury rather than the court decides what is or is not
    indecent. The jury is expected to use its common sense rather than look
    to any "expert" guidance.

    The jury is part of the court. It's the part which makes decisions on questions of fact in a Crown Court trial. In other types of court, magistrates or judges make decisions on questions of fact. Referring to "the court" is merely a simple shorthand for "the people in a court whose responsibility it is to make decisions on questions of fact".

    But when you say "the court" many people might wrongly assume that it's
    the judge who makes the decision about whether or not a photograph is
    indecent.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From The Todal@21:1/5 to Mark Goodge on Sun Oct 29 13:04:10 2023
    On 28/10/2023 20:43, Mark Goodge wrote:
    On Fri, 27 Oct 2023 11:21:28 +0100, The Todal <the_todal@icloud.com> wrote:

    On 27/10/2023 10:57, Mark Goodge wrote:
    On Thu, 26 Oct 2023 11:50:45 +0100, Max Demian <max_demian@bigfoot.com>
    wrote:

    And vice versa: for some, any image of a child's genitalia is indecent,
    even if posted by the child's parents. "Won't the child be embarrassed
    when he grows up." (I don't know whether that's the argument.)

    It may well be indecent, but it isn't illegal unless either there was some
    form of abuse involved in taking the photo or there is some sexual aspect to
    the photo. Photos of naked children at a nudist beach aren't illegal per se.

    Sorry, but you are wrong. You are merely offering your own opinion,
    which is quite valueless unless you happen to be serving on a jury at
    the time.

    I'm restating a widely held opinion that nudity, even of a child, is not in itself unlawful provided there is no abusive or sexual aspect to the photo.

    It may be a widely held opinion somewhere or other in the UK but it has
    no basis in law. However, "sexual aspect" is obviously in the eye of the beholder. In the Mould case it was a naked child simply standing there
    without anything being done to it. It was deemed indecent by a jury and
    there is no good reason to believe that any guidance from the CPS or the
    courts would produce a different jury decision today.

    Perhaps some juries would say "not indecent" but if they say "indecent"
    that's final and no appellate court will overrule that decision.


    You might want to ask yourself why there were no prosecutions over the photo by Nan Goldin titled "Klara and Edda Belly-dancing"[1][2] for example. Or, indeed, comments made in R v Oliver[2] and the judgment in R v Graham-Kerr[2].

    I don't need to ask myself. It would be expensive and controversial to prosecute an artist.

    The Graham-Kerr case doesn't say what you seem to think it says.

    Court of Appeal 1988

    A photographer's motive was irrelevant to the issue of whether or not a photograph of a child was an indecent photograph. D was a swimming pool attendant. One photograph was a frontal view of a naked boy, the other
    a rear view. D admitted (to the police) that he found the boy
    attractive. Held, that where the only question for the jury was whether
    the photographs were indecent, D's motives were irrelevant. D's state
    of mind was only relevant if there was a dispute about whether the
    photographs represented what D intended to photograph. In the present
    case there was no such dispute. The only material that should have been
    put before the jury were the photographs. The correct formulation of the
    test for the jury to consider was (1) did D take the photograph
    deliberately and intentionally (2) was the photograph indecent? In
    answering (2) the jury were to apply the test of the recognised
    standards of propriety.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Mark Goodge@21:1/5 to The Todal on Sun Oct 29 16:56:36 2023
    On Sun, 29 Oct 2023 12:40:38 +0000, The Todal <the_todal@icloud.com> wrote:

    On 28/10/2023 21:18, Mark Goodge wrote:

    The jury is part of the court. It's the part which makes decisions on
    questions of fact in a Crown Court trial. In other types of court,
    magistrates or judges make decisions on questions of fact. Referring to "the
    court" is merely a simple shorthand for "the people in a court whose
    responsibility it is to make decisions on questions of fact".

    But when you say "the court" many people might wrongly assume that it's
    the judge who makes the decision about whether or not a photograph is
    indecent.

    This is uk.legal.moderated, where I hope that the majority of participants would not make that mistake.

    Mark

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Mark Goodge@21:1/5 to Roger Hayter on Sun Oct 29 13:41:55 2023
    On 29 Oct 2023 09:20:02 GMT, Roger Hayter <roger@hayter.org> wrote:

    On 28 Oct 2023 at 20:15:08 BST, "Mark Goodge" <usenet@listmail.good-stuff.co.uk> wrote:

    It may seem somewhat counterintuitive that identical photos can be either
    lawful or unlawful depending on the context of how they were taken. But
    that's a necessary distinction if you want to avoid things like
    criminalising parents who take naked photos of their own children. Or,
    indeed, people who take photos for the purposes of education or reportage.
    The (in)famous photo of Phan Thi Kim Phuc isn't unlawful, either.

    You could equally avoid criminalising parents by not pretending a photograph
    is indecent when it isn't.

    There is an argument for treating all non-sexual nudity, irrespective of context, as lawful, yes. The difficulty with that, though, at least as far
    as real human subjects are concerned, is that just because something is non-sexual (or, at least, not obviously sexual) doesn't mean it's also non-abusive. Particularly with children, voyeuristic or coercive nude
    imagery is abusive, and as things stand legislators are of the opinion that this justifies criminalising it. But that's precisely what creates the
    scenario where two near-identical photographs can have different legal
    status depending on the context in which they were taken.

    I don't think there's an easy solution to this, to be honest. And I think
    that anyone who does think there's an easy solution simply isn't thinking
    about it deeply enough. On the whole, legislation in this respect attempts
    to reflect public morality. But if you ask the public, they will tell you
    contradictory things. On the one hand, I think there would be a significant
    majority in favour of criminalising the creation and consumption of photos
    of naked children for the purposes of self-gratification. But, on the other hand, there would also be a significant majority in favour of not criminalising, say, parents who take a photo of their own children in the
    bath, or nudists who take holiday snaps of the family in the buff. The law
    at the moment does try to draw the line in approximately the same place as typical public morality, even though that line is, in reality, very fuzzy.

    How is this logic applied to pseudophotographs and
    cartoons? Are they only indecent when the possessor is a paedophile? You may
    notice a certain paradox here.

    It's not the motive of the person in possession of a photo which makes it lawful or unlawful, it's the content and context of the photo itself. With cartoons, that's reasonably straightforward: if it's a cartoon of a child engaging in sexual activity or in a sexual context then it's unlawful, but
    if there are no sexual aspects then it isn't (and note that child nudity
    isn't a required element; a cartoon of a child giving an adult a blow job
    would be unlawful even if the child is depicted as fully clothed). But a pseudo-photograph is much more complicated. The question would be whether
    the context of the photo is a "legitimate setting", in the wording of the relevant case law. But what is the setting of a pseudo-photograph? And how
    can the court make that determination?

    As with real photos, I don't think there's an easy solution to this, at
    least not unless there is a significant shift in public opinion. The CPS
    guidance does try to navigate the nuances of it, and, for low-risk offenders,
    there's a fairly high bar to prosecution for simple possession of low-level
    indecent images - the CPS really doesn't want to get involved in prosecuting
    genuine nudists or parents, or people who may simply have come across
    low-level indecency on the web. But, of course, when they do prosecute, one
    of the common defences is "I wasn't really looking for them" or "it's just
    nudism, not abuse", so reported cases may well make it appear as if they do.

    As with all reported court cases where we don't have a full transcript, you have to read between the lines somewhat and take the media reports with a
    pinch of salt. Just because it looks like someone has been unfairly treated doesn't mean they really have been. And, conversely, just because it looks
    as if someone has got away with it doesn't mean they don't have a genuine defence. The fact that the law on indecent imagery is widely misunderstood doesn't help.

    Mark

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Roger Hayter@21:1/5 to The Todal on Sun Oct 29 13:21:19 2023
    On 29 Oct 2023 at 12:38:50 GMT, "The Todal" <the_todal@icloud.com> wrote:

    On 29/10/2023 09:20, Roger Hayter wrote:
    On 28 Oct 2023 at 20:15:08 BST, "Mark Goodge"
    <usenet@listmail.good-stuff.co.uk> wrote:

    On Fri, 27 Oct 2023 17:01:34 +0100, GB <NOTsomeone@microsoft.invalid> wrote:

    On 27/10/2023 16:22, The Todal wrote:

    quote

    We turn to the two other grounds which Mr Burton has argued before us.
    He submits, first of all, that the photograph itself could not possibly
    ever be said to be indecent. He submits that similar photographs can be
    found in medical text books. To label this photograph as indecent would
    mean that photographs of a similar kind in medical text books would also
    be indecent.

    If the test for deciding whether a photograph is indecent or not is
    whether or not it is the kind of photograph which appears in medical
    text books, then many of the photographs with which these courts are all
    too familiar could not be classified as indecent. ... It is not
    suggested that he is the parent of the child or that he was doing this
    as part of some medical research. We take the view that a jury was
    entitled to reach the conclusion that this photograph was indecent as
    the prosecution alleged.

    Is that really saying that a photo in one context is indecent, and in a
    different context it isn't indecent?

    Not quite. It's a bit more nuanced than that. It's saying that mere
    similarity to an image known to be lawful does not automatically make a
    different image lawful (or, indeed, vice versa, similarity to an unlawful
    image does not ipso facto make a new image unlawful). Each image has to be
    judged on its own merits.

    There are a number of factors involved in determining whether an image is
    indecent, and one of those is the question of whether the image arises from
    a "legitimate setting". So, for example, there is case law[1] to the effect
    that a consensual photo of a naked child taken by a swimming instructor at a
    nudist swimming session is not unlawful, because you expect children
    participating in a nudist swimming session to be naked and therefore the
    only issue is one of consent. But if a swimming instructor at a normal
    session persuaded a child to remove their swimwear in order to have a photo
    taken naked, that would be unlawful, because it would not be a legitimate
    setting. Despite the fact that to someone looking at the two photos they may
    be practically identical, one would be unlawful and the other would not be.

    That's the broad thrust of the court's response to the argument cited above.
    Although Mr Mould's photos looked similar to those in medical textbooks,
    their setting was different. A photo taken of a naked child for publication
    in a medical textbook is a legitimate setting, a photo taken of a naked
    child for publication in a pornographic magazine is not. Even if, to the
    observer, the photos are extremely similar.

    It may seem somewhat counterintuitive that identical photos can be either
    lawful or unlawful depending on the context of how they were taken. But
    that's a necessary distinction if you want to avoid things like
    criminalising parents who take naked photos of their own children. Or,
    indeed, people who take photos for the purposes of education or reportage.
    The (in)famous photo of Phan Thi Kim Phuc isn't unlawful, either.

    You could equally avoid criminalising parents by not pretending a photograph
    is indecent when it isn't. How is this logic applied to pseudophotographs and
    cartoons? Are they only indecent when the possessor is a paedophile? You may
    notice a certain paradox here.


    Medical textbooks may contain many "indecent" photographs. However, if
    there is a legitimate excuse for including them in a publication (eg to educate and instruct doctors) then there is unlikely to be a conviction,
    so no need for a prosecution.

    If you, as an ordinary member of the public, were to scan a photo of a
    naked child from a medical textbook and save it to a folder on your
    computer, then you would be at risk of being prosecuted and convicted.
    The circumstances in which the photo was originally "taken" are actually irrelevant.

    A reasonable POV, but Mark Goodge has just told us the complete opposite;
    that the motive for taking the picture and the circumstances it was taken in are vital to deciding if a picture is indecent. He even had a case to prove it - the first David Mould case.

    --
    Roger Hayter

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Mark Goodge@21:1/5 to The Todal on Sun Oct 29 16:12:31 2023
    On Sun, 29 Oct 2023 13:04:10 +0000, The Todal <the_todal@icloud.com> wrote:

    On 28/10/2023 20:43, Mark Goodge wrote:
    On Fri, 27 Oct 2023 11:21:28 +0100, The Todal <the_todal@icloud.com> wrote:
    On 27/10/2023 10:57, Mark Goodge wrote:
    On Thu, 26 Oct 2023 11:50:45 +0100, Max Demian <max_demian@bigfoot.com> wrote:

    And vice versa: for some, any image of a child's genitalia is indecent,
    even if posted by the child's parents. "Won't the child be embarrassed
    when he grows up." (I don't know whether that's the argument.)

    It may well be indecent, but it isn't illegal unless either there was some
    form of abuse involved in taking the photo or there is some sexual aspect to
    the photo. Photos of naked children at a nudist beach aren't illegal per se.

    Sorry, but you are wrong. You are merely offering your own opinion,
    which is quite valueless unless you happen to be serving on a jury at
    the time.

    I'm restating a widely held opinion that nudity, even of a child, is not in
    itself unlawful provided there is no abusive or sexual aspect to the photo.

    It may be a widely held opinion somewhere or other in the UK but it has
    no basis in law.

    Well, the IWF and the CPS both share that view, so I'm inclined to think
    they might be right. I agree that there's no explicit basis for it in
    statute, but both organisations reference case law to that effect.

    Mark

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jethro_uk@21:1/5 to The Todal on Sun Oct 29 16:54:53 2023
    On Sun, 29 Oct 2023 12:38:50 +0000, The Todal wrote:

    On 29/10/2023 09:20, Roger Hayter wrote:
    On 28 Oct 2023 at 20:15:08 BST, "Mark Goodge"
    <usenet@listmail.good-stuff.co.uk> wrote:

    On Fri, 27 Oct 2023 17:01:34 +0100, GB <NOTsomeone@microsoft.invalid>
    wrote:

    On 27/10/2023 16:22, The Todal wrote:

    quote

    We turn to the two other grounds which Mr Burton has argued before
    us. He submits, first of all, that the photograph itself could not
    possibly ever be said to be indecent. He submits that similar
    photographs can be found in medical text books. To label this
    photograph as indecent would mean that photographs of a similar kind
    in medical text books would also be indecent.

    If the test for deciding whether a photograph is indecent or not is
    whether or not it is the kind of photograph which appears in medical
    text books, then many of the photographs with which these courts are
    all too familiar could not be classified as indecent. ... It is not
    suggested that he is the parent of the child or that he was doing
    this as part of some medical research. We take the view that a jury
    was entitled to reach the conclusion that this photograph was
    indecent as the prosecution alleged.

    Is that really saying that a photo in one context is indecent, and in
    a different context it isn't indecent?

    Not quite. It's a bit more nuanced than that. It's saying that mere
    similarity to an image known to be lawful does not automatically make
    a different image lawful (or, indeed, vice versa, similarity to an
    unlawful image does not ipso facto make a new image unlawful). Each
    image has to be judged on its own merits.

    There are a number of factors involved in determining whether an image
    is indecent, and one of those is the question of whether the image
    arises from a "legitimate setting". So, for example, there is case
    law[1] to the effect that a consensual photo of a naked child taken by
    a swimming instructor at a nudist swimming session is not unlawful,
    because you expect children participating in a nudist swimming session
    to be naked and therefore the only issue is one of consent. But if a
    swimming instructor at a normal session persuaded a child to remove
    their swimwear in order to have a photo taken naked, that would be
    unlawful, because it would not be a legitimate setting. Despite the
    fact that to someone looking at the two photos they may be practically
    identical, one would be unlawful and the other would not be.

    That's the broad thrust of the court's response to the argument cited
    above.
    Although Mr Mould's photos looked similar to those in medical
    textbooks, their setting was different. A photo taken of a naked child
    for publication in a medical textbook is a legitimate setting, a photo
    taken of a naked child for publication in a pornographic magazine is
    not. Even if, to the observer, the photos are extremely similar.

    It may seem somewhat counterintuitive that identical photos can be
    either lawful or unlawful depending on the context of how they were
    taken. But that's a necessary distinction if you want to avoid things
    like criminalising parents who take naked photos of their own
    children. Or, indeed, people who take photos for the purposes of
    education or reportage. The (in)famous photo of Phan Thi Kim Phuc
    isn't unlawful, either.

    You could equally avoid criminalising parents by not pretending a
    photograph is indecent when it isn't. How is this logic applied to
    pseudophotographs and cartoons? Are they only indecent when the
    possessor is a paedophile? You may notice a certain paradox here.


    Medical textbooks may contain many "indecent" photographs. However, if
    there is a legitimate excuse for including them in a publication (eg to educate and instruct doctors) then there is unlikely to be a conviction,
    so no need for a prosecution.

    If you, as an ordinary member of the public, were to scan a photo of a
    naked child from a medical textbook and save it to a folder on your
    computer, then you would be at risk of being prosecuted and convicted.
    The circumstances in which the photo was originally "taken" are actually irrelevant.

    Am I wrong in believing that you are not allowed to use the provenance of
    the image (e.g. the fact it came from a legitimate textbook) in your
    defence?

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Mark Goodge@21:1/5 to jethro_uk@hotmailbin.com on Sun Oct 29 16:55:16 2023
    On Sun, 29 Oct 2023 09:53:46 -0000 (UTC), Jethro_uk
    <jethro_uk@hotmailbin.com> wrote:

    On Sat, 28 Oct 2023 20:43:24 +0100, Mark Goodge wrote:

    You might want to ask yourself why there were no prosecutions over the
    photo by Nan Goldin titled "Klara and Edda Belly-dancing"[1][2] for
    example. Or, indeed, comments made in R v Oliver[2] and the judgment in
    R v Graham-Kerr[2].

    Or the cover to Blind Faith's eponymous 1969 album

    There's something of a fudge there (and also in the case of the notorious "Virgin Killer" sleeve by Scorpions, which did result in the IWF blocking
    the image on Wikipedia), in that it's generally considered not in the public interest to prosecute possession of images that were not unlawful at the
    time they were created or acquired, but may be considered so now. There are quite a lot of newspaper archives, for example, which hold copies of The Sun which predate the change of minimum age from 16 to 18, but there are no
    moves to take any action against them.

    See also the photographic work of Lewis Carroll, which includes images which
    would be considered unlawful if taken now, but are nonetheless widely
    available. Although Carroll's oeuvre illustrates one of the issues here.
    Carroll was a prolific photographer; over his lifetime he created around
    3,000 images of a wide variety of subjects including children, adults, landscapes, statues, dogs, trees and even skeletons. Unsurprisingly, though, given the timespan since then, less than a thousand of his images survive.
    But, of those which do survive, more than half are photographs of young
    girls - a disproportionate sample of his work. And almost all of his photos
    of young girls in a state of partial or total undress have survived. It
    seems that when it comes to preserving amateur photography, some subjects
    are more popular than others. Arguments persist as to whether Carroll
    himself was a paedophile. But even if he wasn't, it's likely that many of
    those who collected his photos were.

    Mark

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jethro_uk@21:1/5 to Mark Goodge on Sun Oct 29 19:18:36 2023
    On Sun, 29 Oct 2023 13:41:55 +0000, Mark Goodge wrote:

    On 29 Oct 2023 09:20:02 GMT, Roger Hayter <roger@hayter.org> wrote:

    [quoted text muted]

    There is an argument for treating all non-sexual nudity, irrespective of context, as lawful, yes. The difficulty with that, though, at least as
    far as real human subjects are concerned, is that just because something
    is non-sexual (or, at least, not obviously sexual) doesn't mean it's
    also non-abusive.

    Some people find feet and photos thereof - especially with shoes -
    sexually arousing. As an example of beginning the long journey of reductio
    ad absurdum

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From JNugent@21:1/5 to All on Sun Oct 29 21:19:11 2023
    On 29/10/2023 09:53 am, Jethro_uk wrote:

    On Sat, 28 Oct 2023 20:43:24 +0100, Mark Goodge wrote:

    You might want to ask yourself why there were no prosecutions over the
    photo by Nan Goldin titled "Klara and Edda Belly-dancing"[1][2] for
    example. Or, indeed, comments made in R v Oliver[2] and the judgment in
    R v Graham-Kerr[2].

    Or the cover to Blind Faith's eponymous 1969 album

    While some sites mask the lower part of the front cover, I see that www.discogs.com has no compunction about showing the whole of it.

    Mind you, neither did any of the big record stores of the time.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Mark Goodge@21:1/5 to jethro_uk@hotmailbin.com on Sun Oct 29 22:31:01 2023
    On Sun, 29 Oct 2023 16:54:53 -0000 (UTC), Jethro_uk
    <jethro_uk@hotmailbin.com> wrote:

    On Sun, 29 Oct 2023 12:38:50 +0000, The Todal wrote:

    If you, as an ordinary member of the public, were to scan a photo of a
    naked child from a medical textbook and save it to a folder on your
    computer, then you would be at risk of being prosecuted and convicted.
    The circumstances in which the photo was originally "taken" are actually
    irrelevant.

    Am I wrong in believing that you are not allowed to use the provenance of
    the image (e.g. the fact it came from a legitimate textbook) in your
    defence?

    It's complicated. Fundamentally, it's the image which is unlawful, not the person in possession of it. That is, an unlawful image is unlawful for
    anyone to possess (unless they have a defence), irrespective of their
    motives. If someone had in their possession a library of medical textbooks which included photos of naked children, then that would not be an offence
    even if they were routinely rubbing one out while looking at those photos.

    However, what complicates it is the fact that the courts have decided that
    making a digital copy of a photo counts as "making" for the purposes of the
    law[1]. And the reason this complicates it is that this means that the
    circumstances of the creation of the copy are relevant, just as much as the
    circumstances of the creation of the original. So if someone was making a
    collection of copies of photos sourced from medical textbooks, then those
    collected photos could, potentially, be deemed unlawful even if the
    originals were not.

    [1] FWIW, I think this was a poor decision[2], and I think that the
    legislation should be amended to make it clear that merely making a digital copy of an image is not the same as creating the image in the first place.
    But I suspect that there is little appetite in government circles to make
    such a change.

    [2] Not because I have any particular sympathy for perverts who collect indecent images, but simply because, as an IT professional, the idea that copying a file is the same as creating a file seems to me to be utterly bizarre. And it's also directly opposite to established legislation and case law in the realm of Intellectual Property, where it's firmly established
    that merely making a copy is *not* creating something new.
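
    To put the technical point concretely - a minimal sketch, with the file
    names and byte content invented for the example - a digital copy is
    bit-for-bit identical to its source, so nothing new comes into existence
    beyond a second instance of the same bytes:

        import hashlib
        import os
        import shutil
        import tempfile

        def sha256(path):
            # Hash the file's bytes so content can be compared exactly.
            with open(path, "rb") as f:
                return hashlib.sha256(f.read()).hexdigest()

        with tempfile.TemporaryDirectory() as d:
            original = os.path.join(d, "original.jpg")    # invented name
            duplicate = os.path.join(d, "duplicate.jpg")  # invented name
            with open(original, "wb") as f:
                f.write(b"\xff\xd8\xff\xe0 stand-in image bytes")
            shutil.copyfile(original, duplicate)  # the disputed "making"
            # The copy is indistinguishable from the source, byte for byte.
            assert sha256(original) == sha256(duplicate)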

    Mark

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From The Todal@21:1/5 to Roger Hayter on Mon Oct 30 10:55:19 2023
    On 29/10/2023 13:21, Roger Hayter wrote:
    On 29 Oct 2023 at 12:38:50 GMT, "The Todal" <the_todal@icloud.com> wrote:

    On 29/10/2023 09:20, Roger Hayter wrote:
    On 28 Oct 2023 at 20:15:08 BST, "Mark Goodge"
    <usenet@listmail.good-stuff.co.uk> wrote:

    On Fri, 27 Oct 2023 17:01:34 +0100, GB <NOTsomeone@microsoft.invalid> wrote:

    On 27/10/2023 16:22, The Todal wrote:

    quote

    We turn to the two other grounds which Mr Burton has argued before us.
    He submits, first of all, that the photograph itself could not possibly
    ever be said to be indecent. He submits that similar photographs can be
    found in medical text books. To label this photograph as indecent would
    mean that photographs of a similar kind in medical text books would also
    be indecent.

    If the test for deciding whether a photograph is indecent or not is
    whether or not it is the kind of photograph which appears in medical
    text books, then many of the photographs with which these courts are all
    too familiar could not be classified as indecent. ... It is not
    suggested that he is the parent of the child or that he was doing this
    as part of some medical research. We take the view that a jury was
    entitled to reach the conclusion that this photograph was indecent as
    the prosecution alleged.

    Is that really saying that a photo in one context is indecent, and in a
    different context it isn't indecent?

    Not quite. It's a bit more nuanced than that. It's saying that mere
    similarity to an image known to be lawful does not automatically make a
    different image lawful (or, indeed, vice versa, similarity to an unlawful
    image does not ipso facto make a new image unlawful). Each image has to be
    judged on its own merits.

    There are a number of factors involved in determining whether an image is
    indecent, and one of those is the question of whether the image arises from
    a "legitimate setting". So, for example, there is case law[1] to the effect
    that a consensual photo of a naked child taken by a swimming instructor at a
    nudist swimming session is not unlawful, because you expect children
    participating in a nudist swimming session to be naked and therefore the
    only issue is one of consent. But if a swimming instructor at a normal
    session persuaded a child to remove their swimwear in order to have a photo
    taken naked, that would be unlawful, because it would not be a legitimate
    setting. Despite the fact that to someone looking at the two photos they may
    be practically identical, one would be unlawful and the other would not be.

    That's the broad thrust of the court's response to the argument cited above.
    Although Mr Mould's photos looked similar to those in medical textbooks,
    their setting was different. A photo taken of a naked child for publication
    in a medical textbook is a legitimate setting, a photo taken of a naked
    child for publication in a pornographic magazine is not. Even if, to the
    observer, the photos are extremely similar.

    It may seem somewhat counterintuitive that identical photos can be either
    lawful or unlawful depending on the context of how they were taken. But
    that's a necessary distinction if you want to avoid things like
    criminalising parents who take naked photos of their own children. Or,
    indeed, people who take photos for the purposes of education or reportage.
    The (in)famous photo of Phan Thi Kim Phuc isn't unlawful, either.

    You could equally avoid criminalising parents by not pretending a photograph
    is indecent when it isn't. How is this logic applied to pseudophotographs and
    cartoons? Are they only indecent when the possessor is a paedophile? You may
    notice a certain paradox here.


    Medical textbooks may contain many "indecent" photographs. However, if
    there is a legitimate excuse for including them in a publication (eg to
    educate and instruct doctors) then there is unlikely to be a conviction,
    so no need for a prosecution.

    If you, as an ordinary member of the public, were to scan a photo of a
    naked child from a medical textbook and save it to a folder on your
    computer, then you would be at risk of being prosecuted and convicted.
    The circumstances in which the photo was originally "taken" are actually
    irrelevant.

    A reasonable POV, but Mark Goodge has just told us the complete opposite; that the motive for taking the picture and the circumstances it was taken in are vital to deciding if a picture is indecent. He even had a case to prove it
    - the first David Mould case.


    The first David Mould case shows no such thing.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From The Todal@21:1/5 to Mark Goodge on Mon Oct 30 10:59:42 2023
    On 29/10/2023 16:56, Mark Goodge wrote:
    On Sun, 29 Oct 2023 12:40:38 +0000, The Todal <the_todal@icloud.com> wrote:

    On 28/10/2023 21:18, Mark Goodge wrote:

    The jury is part of the court. It's the part which makes decisions on
    questions of fact in a Crown Court trial. In other types of court,
    magistrates or judges make decisions on questions of fact. Referring to "the
    court" is merely a simple shorthand for "the people in a court whose
    responsibility it is to make decisions on questions of fact".

    But when you say "the court" many people might wrongly assume that it's
    the judge who makes the decision about whether or not a photograph is
    indecent.

    This is uk.legal.moderated, where I hope that the majority of participants would not make that mistake.


    This is uk.legal.moderated where most people aren't lawyers and when you
    say "the court will decide" they would reasonably assume that it would
    be a reliable decision based on expert evidence as to what is or is not indecent, plus case law.

    Rather than, as actually happens, the decision of 12 randomly
    chosen jurors applying their notion of what society regards as decent,
    without any guidance from experts.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From The Todal@21:1/5 to Mark Goodge on Mon Oct 30 11:15:49 2023
    On 29/10/2023 16:12, Mark Goodge wrote:
    On Sun, 29 Oct 2023 13:04:10 +0000, The Todal <the_todal@icloud.com> wrote:

    On 28/10/2023 20:43, Mark Goodge wrote:
    On Fri, 27 Oct 2023 11:21:28 +0100, The Todal <the_todal@icloud.com> wrote:
    On 27/10/2023 10:57, Mark Goodge wrote:
    On Thu, 26 Oct 2023 11:50:45 +0100, Max Demian <max_demian@bigfoot.com> wrote:

    And vice versa: for some, any image of a child's genitalia is indecent,
    even if posted by the child's parents. "Won't the child be embarrassed
    when he grows up." (I don't know whether that's the argument.)

    It may well be indecent, but it isn't illegal unless either there was some
    form of abuse involved in taking the photo or there is some sexual aspect to
    form of abuse involved in taking the photo or there is some sexual aspect to
    the photo. Photos of naked children at a nudist beach aren't illegal per se.

    Sorry, but you are wrong. You are merely offering your own opinion,
    which is quite valueless unless you happen to be serving on a jury at
    the time.

    I'm restating a widely held opinion that nudity, even of a child, is not in
    itself unlawful provided there is no abusive or sexual aspect to the photo.
    It may be a widely held opinion somewhere or other in the UK but it has
    no basis in law.

    Well, the IWF and the CPS both share that view, so I'm inclined to think
    they might be right. I agree that there's no explicit basis for it in statute, but both organisations reference case law to that effect.


    That proves my point.

    Neither the IWF nor the CPS can tell the nation what is or is not
    indecent. All they can do is apply a grading system which they regard as
    useful when deciding whether or not to prosecute.

    If there is a prosecution the views of the IWF and CPS will not be
    admissible in evidence or in guidance from the judge.

    To put it very simply, the CPS decides whether or not to prosecute and
    if there is a prosecution the jury will usually convict. Faced with a defendant, a photograph and a prosecutor the jury will decide that
    although in their ordinary lives the word "indecent" has no meaning,
    they will do their civic duty and convict a defendant whom they assume
    will be a bad 'un who is probably a danger to kids.

    How often have you looked at a photograph or a video and thought "that's indecent"? It is a word that no longer has a clear meaning in ordinary
    life, when we see naked people on dating shows and seemingly even if
    that's indecent (as it probably is) nobody regards it as a reason for
    banning the show.

    However, nobody would ever come up with a better law to catch those who
    collect and exchange photographs of naked children. It would be wrong to
    repeal the law and not substitute a better law.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Max Demian@21:1/5 to Roger Hayter on Mon Oct 30 12:00:56 2023
    On 29/10/2023 13:21, Roger Hayter wrote:
    On 29 Oct 2023 at 12:38:50 GMT, "The Todal" <the_todal@icloud.com> wrote:

    Medical textbooks may contain many "indecent" photographs. However, if
    there is a legitimate excuse for including them in a publication (eg to
    educate and instruct doctors) then there is unlikely to be a conviction,
    so no need for a prosecution.

    If you, as an ordinary member of the public, were to scan a photo of a
    naked child from a medical textbook and save it to a folder on your
    computer, then you would be at risk of being prosecuted and convicted.
    The circumstances in which the photo was originally "taken" are actually
    irrelevant.

    A reasonable POV, but Mark Goodge has just told us the complete opposite; that the motive for taking the picture and the circumstances it was taken in are vital to deciding if a picture is indecent. He even had a case to prove it
    - the first David Mould case.

    I think it's more the circumstances in which it is *viewed*.

    --
    Max Demian

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Max Demian@21:1/5 to All on Mon Oct 30 12:08:25 2023
    On 29/10/2023 19:18, Jethro_uk wrote:
    On Sun, 29 Oct 2023 13:41:55 +0000, Mark Goodge wrote:

    There is an argument for treating all non-sexual nudity, irrespective of
    context, as lawful, yes. The difficulty with that, though, at least as
    far as real human subjects are concerned, is that just because something
    is non-sexual (or, at least, not obviously sexual) doesn't mean it's
    also non-abusive.

    Some people find feet and photos thereof - especially with shoes -
    sexually arousing. As an example of beginning the long journey of reductio
    ad absurdum

    All sex is evil (except when it isn't).

    Does anyone know the difference between a paraphilia and a fetish?

    --
    Max Demian

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Roger Hayter@21:1/5 to usenet@listmail.good-stuff.co.uk on Mon Oct 30 12:15:38 2023
    On 29 Oct 2023 at 22:31:01 GMT, "Mark Goodge" <usenet@listmail.good-stuff.co.uk> wrote:

    On Sun, 29 Oct 2023 16:54:53 -0000 (UTC), Jethro_uk <jethro_uk@hotmailbin.com> wrote:

    On Sun, 29 Oct 2023 12:38:50 +0000, The Todal wrote:

    If you, as an ordinary member of the public, were to scan a photo of a
    naked child from a medical textbook and save it to a folder on your
    computer, then you would be at risk of being prosecuted and convicted.
    The circumstances in which the photo was originally "taken" are actually
    irrelevant.

    Am I wrong in believing that you are not allowed to use the provenance of
    the image (e.g. the fact it came from a legitimate textbook) in your
    defence ?

    It's complicated. Fundamentally, it's the image which is unlawful, not the person in possession of it. That is, an unlawful image is unlawful for
    anyone to possess (unless they have a defence), irrespective of their motives. If someone had in their possession a library of medical textbooks which included photos of naked children, then that would not be an offence even if they were routinely rubbing one out while looking at those photos.

    However, what complicates it is the fact that the courts have decided that
    making a digital copy of a photo counts as "making" for the purposes of the
    law[1]. And the reason this complicates it is that this means that the
    circumstances of the creation of the copy are relevant, just as much as the
    circumstances of the creation of the original. So if someone was making a
    collection of copies of photos sourced from medical textbooks, then those
    collected photos could, potentially, be deemed unlawful even if the
    originals were not.


    Your first two paragraphs contradict one another. First you say it is the
    image itself that is indecent not the circumstances of its production. Then
    you say that when making a copy it is the circumstances of the copy that make it indecent. This is, to say the least, inconsistent. Note, I am not blaming you for this inconsistency. Winston Smith had the same problem when trying to be a loyal citizen and expound doublethink.




    [1] FWIW, I think this was a poor decision[2], and I think that the legislation should be amended to make it clear that merely making a digital copy of an image is not the same as creating the image in the first place. But I suspect that there is little appetite in government circles to make such a change.

    [2] Not because I have any particular sympathy for perverts who collect indecent images, but simply because, as an IT professional, the idea that copying a file is the same as creating a file seems to me to be utterly bizarre. And it's also directly opposite to established legislation and case law in the realm of Intellectual Property, where it's firmly established
    that merely making a copy is *not* creating something new.

    While I agree about copying not being making, I don't think copyright law
    helps us at all. It is no defence to breaching copyright law that one has used the copied item to make an otherwise wholly original and superior work.





    Mark


    --
    Roger Hayter

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Roger Hayter@21:1/5 to The Todal on Mon Oct 30 12:23:21 2023
    On 30 Oct 2023 at 10:59:42 GMT, "The Todal" <the_todal@icloud.com> wrote:

    On 29/10/2023 16:56, Mark Goodge wrote:
    On Sun, 29 Oct 2023 12:40:38 +0000, The Todal <the_todal@icloud.com> wrote:
    On 28/10/2023 21:18, Mark Goodge wrote:

    The jury is part of the court. It's the part which makes decisions on
    questions of fact in a Crown Court trial. In other types of court,
    magistrates or judges make decisions on questions of fact. Referring to "the
    court" is merely a simple shorthand for "the people in a court whose
    responsibility it is to make decisions on questions of fact".

    But when you say "the court" many people might wrongly assume that it's
    the judge who makes the decision about whether or not a photograph is
    indecent.

    This is uk.legal.moderated, where I hope that the majority of participants
    would not make that mistake.


    This is uk.legal.moderated where most people aren't lawyers and when you
    say "the court will decide" they would reasonably assume that it would
    be a reliable decision based on expert evidence as to what is or is not indecent, plus case law.

    Rather than, as what actually happens, the decision of 12 randomly
    chosen jurors applying their notion of what society regards as decent, without any guidance from experts.

    I would argue that anyone who makes their career out of becoming an expert on the indecency of photographs has an overwhelming vested interest in there
    being such a thing as an "indecent photograph".

    --
    Roger Hayter

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From The Todal@21:1/5 to Max Demian on Mon Oct 30 13:18:07 2023
    On 30/10/2023 12:00, Max Demian wrote:
    On 29/10/2023 13:21, Roger Hayter wrote:
    On 29 Oct 2023 at 12:38:50 GMT, "The Todal" <the_todal@icloud.com> wrote:

    Medical textbooks may contain many "indecent" photographs. However, if
    there is a legitimate excuse for including them in a publication (eg to
    educate and instruct doctors) then there is unlikely to be a conviction,
    so no need for a prosecution.

    If you, as an ordinary member of the public, were to scan a photo of a
    naked child from a medical textbook and save it to a folder on your
    computer, then you would be at risk of being prosecuted and convicted.
    The circumstances in which the photo was originally "taken" are actually
    irrelevant.

    A reasonable POV, but Mark Goodge has just told us the complete opposite;
    that the motive for taking the picture and the circumstances it was taken
    in are vital to deciding if a picture is indecent. He even had a case to
    prove it - the first David Mould case.

    I think it's more the circumstances in which it is *viewed*.


    No.

    The stages in the decision are: (a) did the defendant intend to take
    that photo (or save it onto his computer)? (b) is it indecent (a decision
    purely for the jury, based on their understanding of what society regards
    as decent or indecent)? and (c) does the defendant have a legitimate
    excuse for possessing the photo (which, again, is for the jury to decide)?

    If the defendant is doing genuine academic research into art or
    pornography that might mean he has a legitimate excuse but it does not
    mean that the photograph has ceased to be indecent. If the defendant is
    a police officer he might try to argue that he had a legitimate excuse
    but that isn't likely to convince anyone if he has broken the guidelines
    of his job and saved the photos to his home computer.

    No defendant is likely to admit that he has the photo for the purposes
    of masturbation. Obviously that wouldn't be regarded as a legitimate
    excuse. Excuses along the lines of "I was abused as a child and I wanted
    to write about it after fully understanding the motives of my abuser"
    are likely to be regarded, by the jury, as self-serving lies.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Roger Hayter@21:1/5 to Max Demian on Mon Oct 30 13:15:32 2023
    On 30 Oct 2023 at 12:08:25 GMT, "Max Demian" <max_demian@bigfoot.com> wrote:

    On 29/10/2023 19:18, Jethro_uk wrote:
    On Sun, 29 Oct 2023 13:41:55 +0000, Mark Goodge wrote:

    There is an argument for treating all non-sexual nudity, irrespective of
    context, as lawful, yes. The difficulty with that, though, at least as
    far as real human subjects are concerned, is that just because something
    is non-sexual (or, at least, not obviously sexual) doesn't mean it's
    also non-abusive.

    Some people find feet and photos thereof - especially with shoes -
    sexually arousing. As an example of beginning the long journey of reductio
    ad absurdum

    All sex is evil (except when it isn't).

    Does anyone know the difference between a paraphilia and a fetish?

    I guess that a paraphilia relates to whole organisms, or groups of them;
    while a fetish relates to part of an organism or a non-biological object.
    Where plants come in this scheme I am not sure - but they don't seem to figure much. BICBW.

    --
    Roger Hayter

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From billy bookcase@21:1/5 to Max Demian on Mon Oct 30 13:37:59 2023
    "Max Demian" <max_demian@bigfoot.com> wrote in message news:uho6bo$ei7r$2@dont-email.me...

    Does anyone know the difference between a paraphilia and a fetish?

    A paraphilia is what somebody wearing a white coat would call it

    A fetish is what somebody wearing a rubber suit would call it.


    bb

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Max Demian@21:1/5 to The Todal on Mon Oct 30 13:54:19 2023
    On 30/10/2023 13:18, The Todal wrote:

    No defendant is likely to admit that he has the photo for the purposes
    of masturbation.

    Why not? That's what porn is for.

    --
    Max Demian

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Max Demian@21:1/5 to Roger Hayter on Mon Oct 30 13:59:23 2023
    On 30/10/2023 13:15, Roger Hayter wrote:
    On 30 Oct 2023 at 12:08:25 GMT, "Max Demian" <max_demian@bigfoot.com> wrote:

    All sex is evil (except when it isn't).

    Does anyone know the difference between a paraphilia and a fetish?

    I guess that a paraphilia relates to whole organisms, or groups of them;
    while a fetish relates to part of an organism or a non-biological object.
    Where plants come in this scheme I am not sure - but they don't seem to figure
    much. BICBW.

    How about urolagnia? Fetish or paraphilia?

    A recent TV programme cited exhibitionism and voyeurism as examples of paraphilia.

    In the words of the old song, "Let's call the whole thing off."

    --
    Max Demian

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Max Demian@21:1/5 to The Todal on Mon Oct 30 14:10:26 2023
    On 30/10/2023 11:15, The Todal wrote:
    On 29/10/2023 16:12, Mark Goodge wrote:
    On Sun, 29 Oct 2023 13:04:10 +0000, The Todal <the_todal@icloud.com>
    wrote:
    On 28/10/2023 20:43, Mark Goodge wrote:

    I'm restating a widely held opinion that nudity, even of a child, is not
    in itself unlawful provided there is no abusive or sexual aspect to the
    photo.

    It may be a widely held opinion somewhere or other in the UK but it has
    no basis in law.

    Well, the IWF and the CPS both share that view, so I'm inclined to think
    they might be right. I agree that there's no explicit basis for it in
    statute, but both organisations reference case law to that effect.

    That proves my point.

    Neither the IWF nor the CPS can tell the nation what is or is not
    indecent. All they can do is apply a grading system which they regard as useful when deciding whether or not to prosecute.

    It seems to be rather peculiar that the CPS can only classify something
    after the jury has decided that it is indecent. So if the jury
    perversely decides that some hardcore isn't, after all, indecent, the
    CPS just have to twiddle their thumbs in frustration.

    If there is a prosecution the views of the IWF and CPS will not be
    admissible in evidence or in guidance from the judge.

    To put it very simply, the CPS decides whether or not to prosecute and
    if there is a prosecution the jury will usually convict. Faced with a defendant, a photograph and a prosecutor the jury will decide that
    although in their ordinary lives the word "indecent" has no meaning,
    they will do their civic duty and convict a defendant whom they assume
    will be a bad 'un who is probably a danger to kids.

    How often have you looked at a photograph or a video and thought "that's
    indecent"? It is a word that no longer has a clear meaning in ordinary
    life, when we see naked people on dating shows and, seemingly, even if
    that's indecent (as it probably is), nobody regards it as a reason for
    banning the show.

    However, it is unlikely that anyone would ever come up with a better law
    to catch those who collect and exchange photographs of naked children.
    It would be wrong to repeal the law without substituting a better one.

    It's unclear why the collection and exchange of photographs of naked
    children (for example from old copies of "Health and Efficiency" - which included children to show that nudity isn't all about sex) need to be
    "caught" at all.

    --
    Max Demian

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From The Todal@21:1/5 to Mark Goodge on Mon Oct 30 15:13:39 2023
    On 30/10/2023 15:02, Mark Goodge wrote:
    On 30 Oct 2023 12:15:38 GMT, Roger Hayter <roger@hayter.org> wrote:

    On 29 Oct 2023 at 22:31:01 GMT, "Mark Goodge"
    <usenet@listmail.good-stuff.co.uk> wrote:

    On Sun, 29 Oct 2023 16:54:53 -0000 (UTC), Jethro_uk
    <jethro_uk@hotmailbin.com> wrote:

    On Sun, 29 Oct 2023 12:38:50 +0000, The Todal wrote:

    If you, as an ordinary member of the public, were to scan a photo of a
    naked child from a medical textbook and save it to a folder on your
    computer, then you would be at risk of being prosecuted and convicted.
    The circumstances in which the photo was originally "taken" are actually
    irrelevant.

    Am I wrong in believing that you are not allowed to use the provenance of
    the image (e.g. the fact it came from a legitimate textbook) in your
    defence ?

    It's complicated. Fundamentally, it's the image which is unlawful, not the
    person in possession of it. That is, an unlawful image is unlawful for
    anyone to possess (unless they have a defence), irrespective of their
    motives. If someone had in their possession a library of medical textbooks
    which included photos of naked children, then that would not be an offence
    even if they were routinely rubbing one out while looking at those photos.

    However, what complicates it is the fact that the courts have decided that
    making a digital copy of a photo counts as "making" for the purposes of the
    law[1]. And the reason this complicates it is that this means that the
    circumstances of the creation of the copy are relevant, just as much as the
    circumstances of the creation of the original. So if someone was making a
    collection of copies of photos sourced from medical textbooks, then those
    collected photos could, potentially, be deemed unlawful even if the
    originals were not.


    Your first two paragraphs contradict one another. First you say it is the
    image itself that is indecent not the circumstances of its production.

    No, the circumstances of the production are a part of what makes it
    indecent. That was part of the judgment in R v Graham-Kerr. That is, that a photo of mere nakedness in a "legitimate setting" is not indecent. That is a ruling of fact by a precedent-setting court, and it's not open to a junior court to disregard it.

    Once again, you are wrong. You are misleading readers about the ratio
    decidendi of the Graham-Kerr case. This is irresponsible of you. Some
    might actually act to their detriment as a result of such advice.

    The judges did not say that a photo of mere nakedness in a legitimate
    setting is not indecent. No such precedent has ever been set.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Mark Goodge@21:1/5 to Roger Hayter on Mon Oct 30 15:02:26 2023
    On 30 Oct 2023 12:15:38 GMT, Roger Hayter <roger@hayter.org> wrote:

    On 29 Oct 2023 at 22:31:01 GMT, "Mark Goodge"
    <usenet@listmail.good-stuff.co.uk> wrote:

    On Sun, 29 Oct 2023 16:54:53 -0000 (UTC), Jethro_uk
    <jethro_uk@hotmailbin.com> wrote:

    On Sun, 29 Oct 2023 12:38:50 +0000, The Todal wrote:

    If you, as an ordinary member of the public, were to scan a photo of a
    naked child from a medical textbook and save it to a folder on your
    computer, then you would be at risk of being prosecuted and convicted.
    The circumstances in which the photo was originally "taken" are actually
    irrelevant.

    Am I wrong in believing that you are not allowed to use the provenance of
    the image (e.g. the fact it came from a legitimate textbook) in your
    defence ?

    It's complicated. Fundamentally, it's the image which is unlawful, not the
    person in possession of it. That is, an unlawful image is unlawful for
    anyone to possess (unless they have a defence), irrespective of their
    motives. If someone had in their possession a library of medical textbooks
    which included photos of naked children, then that would not be an offence
    even if they were routinely rubbing one out while looking at those photos.

    However, what complicates it is the fact that the courts have decided that
    making a digital copy of a photo counts as "making" for the purposes of the
    law[1]. And the reason this complicates it is that this means that the
    circumstances of the creation of the copy are relevant, just as much as the
    circumstances of the creation of the original. So if someone was making a
    collection of copies of photos sourced from medical textbooks, then those
    collected photos could, potentially, be deemed unlawful even if the
    originals were not.


    Your first two paragraphs contradict one another. First you say it is the
    image itself that is indecent not the circumstances of its production.

    No, the circumstances of the production are a part of what makes it
    indecent. That was part of the judgment in R v Graham-Kerr. That is, that a photo of mere nakedness in a "legitimate setting" is not indecent. That is a ruling of fact by a precedent-setting court, and it's not open to a junior court to disregard it.

    The fact that this merely serves to shift the argument onto the question of what settings are legitimate is an entirely different matter.

    Then you say that when making a copy it is the circumstances of the copy
    that make it indecent. This is, to say the least, inconsistent. Note, I am
    not blaming you for this inconsistency. Winston Smith had the same problem
    when trying to be a loyal citizen and expound doublethink.

    It's only inconsistent because of the parallel ruling, by a different precedent-setting court, that making a digital copy counts as "making", and
    not merely possession. Which means that if you then apply the principle from Graham-Kerr, then the circumstances of the making of the copy are relevant
    as well as the circumstances of the making of the original.

    FWIW, I think this is bad law, and I think it should be changed. But one of
    the reasons it hasn't been, is that there is yet more case law which has determined that possession requires the image, if digital, to actually be stored somewhere accessible to the defendant - that is, in the defendant's "custody or control". A deleted image is not normally considered to be in
    the possession of the defendant. Nor is it normally considered that an image which exists locally in only the browser cache is in the possession of the defendant. The fact that it is possible to retrieve images from the cache,
    and often possible to retrieve them after they have been deleted, is not in itself enough to support a charge of possession - rather, the prosecution
    has to prove that the defendant does have the requisite knowledge and
    ability to be able to do so.

    "Making", on the other hand, doesn't require any element of current control
    or custody. It is enough that the image was, at some time in the past,
    provably made by the defendant. And the making of an image in the cache, or
    one which has subsequently been deleted, is sufficient to make out the offence.
    So in the case of digital images, it's a lot easier to prove making than it
    is to prove possession, particularly when the images have been viewed on a website or other remote system and never specifically downloaded as a local copy other than in the browser or app cache.
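
    To make the mechanics concrete, here is a minimal Python sketch of what a
    browser-style cache does behind the scenes (an illustration only, not any
    real browser's code; the URL and cache directory are hypothetical).
    Merely viewing a remote image writes a copy of it to disk, with no
    deliberate "save" by the user - which, on the "copying is making"
    reading, is all the making there is:

        import hashlib
        import pathlib
        import urllib.request

        # Hypothetical stand-in for a browser's cache directory.
        CACHE_DIR = pathlib.Path.home() / ".example_browser_cache"

        def view_image(url: str) -> bytes:
            """Fetch an image the way a browser would, caching it on disk."""
            CACHE_DIR.mkdir(exist_ok=True)
            key = hashlib.sha256(url.encode()).hexdigest()  # cache key from URL
            cached = CACHE_DIR / key
            if cached.exists():
                return cached.read_bytes()  # a local copy was already "made"
            with urllib.request.urlopen(url) as resp:
                data = resp.read()
            cached.write_bytes(data)  # the copy is written with no user action
            return data

        view_image("https://example.com/image.jpg")  # hypothetical URL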

    So the CPS typically prefers to prosecute making rather than possession,
    unless they also want to prosecute the more serious offence of distribution
    (in which case possession is a necessary prerequisite). Which means that the
    circumstances of the making of the copy come into play.

    An interesting consequence of this decision by the CPS is that the
    Sentencing Council guidelines now state that if the only "making" is the
    making of a copy, rather than creating a new, original work, then the
    sentence should be calculated as if the charge had been possession (which is
    a lesser offence) rather than making.

    [2] Not because I have any particular sympathy for perverts who collect
    indecent images, but simply because, as an IT professional, the idea that
    copying a file is the same as creating a file seems to me to be utterly
    bizarre. And it's also directly opposite to established legislation and case
    law in the realm of Intellectual Property, where it's firmly established
    that merely making a copy is *not* creating something new.

    While I agree about copying not being making, I don't think copyright law
    helps us at all. It is no defence to breaching copyright law that one has
    used the copied item to make an otherwise wholly original and superior work.

    That's not the point I'm making, sorry. I'm not referring to IP offences,
    but to the more basic principles. One of which is that a simple copy of a
    work does not create a new original work. If you have an image, and you make
    a copy of it, you do not thereby acquire any rights in the copy. All of the rights in the copy rest with the original creator, just as all the rights in the image you took the copy from rest with the original creator.

    Applying that back to indecent images, it seems to me that merely making a
    copy of an indecent image is not making a new indecent image. The maker of
    an image is the person who, at the time the image was made, would have intellectual property rights in that image (irrespective of whether they assert, trade or waive those rights). Lewis Carroll made a whole load of
    images of naked children, but people are not re-making them every time they view them online. Even if a copy ends up in their browser cache.
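
    As a concrete illustration of that point (a minimal sketch; the file
    names and stand-in bytes are hypothetical), an ordinary file copy
    produces output that is byte-for-byte identical to its original, as the
    matching hashes show - nothing new, in the sense of a new work, has come
    into being:

        import hashlib
        import pathlib
        import shutil
        import tempfile

        def sha256(path: pathlib.Path) -> str:
            """Hash a file's contents, to test bit-for-bit identity."""
            return hashlib.sha256(path.read_bytes()).hexdigest()

        with tempfile.TemporaryDirectory() as tmp:
            original = pathlib.Path(tmp) / "original.jpg"
            original.write_bytes(b"stand-in image bytes")  # hypothetical content
            duplicate = pathlib.Path(tmp) / "copy.jpg"
            shutil.copy2(original, duplicate)  # copies content and metadata
            # By content, the "copy" is indistinguishable from the original.
            assert sha256(original) == sha256(duplicate)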

    The law as it stands, though, is a horrible mess. Particularly when it comes
    to images that might, or might not, be innocuous. But that's partly because public opinion is a horrible mess. It's a highly emotional topic, and
    people's reactions to it are not particularly consistent. And legislators -
    and even judges - are people.

    Mark

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From The Todal@21:1/5 to All on Mon Oct 30 15:29:01 2023
    On 29/10/2023 16:54, Jethro_uk wrote:
    On Sun, 29 Oct 2023 12:38:50 +0000, The Todal wrote:


    Medical textbooks may contain many "indecent" photographs. However, if
    there is a legitimate excuse for including them in a publication (eg to
    educate and instruct doctors) then there is unlikely to be a conviction,
    so no need for a prosecution.

    If you, as an ordinary member of the public, were to scan a photo of a
    naked child from a medical textbook and save it to a folder on your
    computer, then you would be at risk of being prosecuted and convicted.
    The circumstances in which the photo was originally "taken" are actually
    irrelevant.

    Am I wrong in believing that you are not allowed to use the provenance of
    the image (e.g. the fact it came from a legitimate textbook) in your
    defence ?


    That's an interesting question - it should not be conclusive as to
    whether the photograph is indecent but might be relevant to whether you
    have a reasonable excuse for possessing the indecent photograph.

    Consider this case:

    https://www.bailii.org/ew/cases/EWCA/Crim/2011/461.html

    A striking feature of counts 1 to 5 is that all the photographs to which
    they related are contained in books of photographs of well-known
    photographers that are widely available from reputable outlets. The
    books in question are: "The Age of Innocence" by David Hamilton, "At
    Twelve" by Sally Mann, and "Notes" by Jock Sturges. As was set out in
    agreed facts at the trial, those books are available for purchase in
    store or on-line from one or more major retail outlets such as WH Smith, Waterstones, Tesco and Amazon. One of the images in count 3 also appears
    as the front cover of another published book by Sally Mann, "Immediate
    Family", a copy of which was also seized from the appellant's home but
    which did not form the basis of a separate charge. The same photograph
    appeared in an article in The Guardian newspaper in May 2010 and was
    published in the on-line edition of that newspaper. Counsel formerly
    instructed on behalf of the Crown, Miss Oliver, who is unwell and unable
    to appear at today's hearing, informed the court in a written note of
    the result of her inquiries in relation to those other matters. She told
    the court that the publication of the photograph in The Guardian had
    been brought to the attention of the Crown Prosecution Service, which
    decided that no charges should be preferred. She also ascertained that
    the photographs in count 3 have since been on display at an exhibition
    in London between June and September 2010, where prints were on sale to visitors to the exhibition. Those facts were brought to the attention of
    the Crown Prosecution Service, which again decided that no charges
    should be preferred. Whilst that was the position communicated in Miss
    Oliver's written note, Mr Gray informs us today that there has not been
    a final decision in respect of those matters, although he accepts that
    there is no prosecution pending in relation to them.

    snip

    We have no doubt that we should refuse the application for a retrial. In
    our judgment a retrial is not in the interests of justice. We do not
    dispute for one moment that the question of indecency of photographs of
    this kind is one for a properly directed jury. It is, however, very
    unfair for an individual in the position of the appellant to be
    prosecuted for possession of photographs which are contained in widely available books. If it is wished to test whether the photographs in the
    books are indecent, the right way of dealing with the matter is by way
    of prosecution of the publisher or the retailer, not of an individual purchaser. We have already suggested that the decision to proceed
    against the appellant may have been influenced by the addition of count
    6. It seems to us to be particularly unfair to put the appellant through
    the trial process again when, for reasons already covered by us, count 6
    has long dropped out of the picture.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Mark Goodge@21:1/5 to Roger Hayter on Mon Oct 30 15:44:08 2023
    On 30 Oct 2023 12:23:21 GMT, Roger Hayter <roger@hayter.org> wrote:

    I would argue that anyone who makes their career out of becoming an expert
    on the indecency of photographs has an overwhelming vested interest in
    there being such a thing as an "indecent photograph".

    I don't think that's necessarily the case. I don't think that the members of the BBFC, for example, have a vested interest in there being material that
    they refuse to certify.

    Also, there's material that even a non-expert can easily identify. A photo
    of a child being raped, for example, doesn't need an expert to tell you that it's unlawful. It's only the borderline cases where an expert can provide useful assistance.

    Mark

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Mark Goodge@21:1/5 to The Todal on Mon Oct 30 15:37:53 2023
    On Mon, 30 Oct 2023 11:15:49 +0000, The Todal <the_todal@icloud.com> wrote:

    On 29/10/2023 16:12, Mark Goodge wrote:
    On Sun, 29 Oct 2023 13:04:10 +0000, The Todal <the_todal@icloud.com> wrote:
    On 28/10/2023 20:43, Mark Goodge wrote:
    On Fri, 27 Oct 2023 11:21:28 +0100, The Todal <the_todal@icloud.com> wrote:

    On 27/10/2023 10:57, Mark Goodge wrote:
    On Thu, 26 Oct 2023 11:50:45 +0100, Max Demian <max_demian@bigfoot.com>
    wrote:

    And vice versa: for some, any image of a child's genitalia is indecent,
    even if posted by the child's parents. "Won't the child be embarrassed
    when he grows up." (I don't know whether that's the argument.)

    It may well be indecent, but it isn't illegal unless either there was some
    form of abuse involved in taking the photo or there is some sexual aspect to
    the photo. Photos of naked children at a nudist beach aren't illegal per se.

    Sorry, but you are wrong. You are merely offering your own opinion,
    which is quite valueless unless you happen to be serving on a jury at
    the time.

    I'm restating a widely held opinion that nudity, even of a child, is not
    in itself unlawful provided there is no abusive or sexual aspect to the
    photo.
    itself unlawful provided there is no abusive or sexual aspect to the photo.

    It may be a widely held opinion somewhere or other in the UK but it has
    no basis in law.

    Well, the IWF and the CPS both share that view, so I'm inclined to think
    they might be right. I agree that there's no explicit basis for it in
    statute, but both organisations reference case law to that effect.


    That proves my point.

    Neither the IWF nor the CPS can tell the nation what is or is not
    indecent. All they can do is apply a grading system which they regard as
    useful when deciding whether or not to prosecute.

    If there is a prosecution the views of the IWF and CPS will not be
    admissible in evidence or in guidance from the judge.

    To put it very simply, the CPS decides whether or not to prosecute and
    if there is a prosecution the jury will usually convict. Faced with a
    defendant, a photograph and a prosecutor the jury will decide that
    although in their ordinary lives the word "indecent" has no meaning,
    they will do their civic duty and convict a defendant whom they assume
    will be a bad 'un who is probably a danger to kids.

    According to the CPS (citing case law),

    Whilst members of a jury are representative of the public, it remains
    essential for them to consider the issue of indecency by reference to an
    objective test, rather than applying their wholly subjective views of the
    matter (R v Neal [2011] EWCA Crim 461).

    and

    The provisions are complex, not least because they involve a mix of legal
    and evidential burdens. Careful directions to the jury will be required.

    It seems to me, therefore, that where someone is on trial for possessing
    (or making) an indecent image, then the judge is expected to give the jury
    some guidance on how to decide what constitutes "indecent". Of course, in a lot
    of cases, that won't be an issue because the images will clearly be sexual
    in nature. But where there is a possibility that the images are innocuous,
    then the jury is expected to apply an objective test, and the judge is
    entitled to give them guidance on how to apply that test.

    How often have you looked at a photograph or a video and thought "that's
    indecent"? It is a word that no longer has a clear meaning in ordinary
    life, when we see naked people on dating shows and, seemingly, even if
    that's indecent (as it probably is), nobody regards it as a reason for
    banning the show.

    However, it is unlikely that anyone would ever come up with a better law
    to catch those who collect and exchange photographs of naked children.
    It would be wrong to repeal the law without substituting a better one.

    I'm not suggesting repealing the law. I do think it needs to be updated. I think it needs to redefine "possession" and "making" to be closer to what
    the average person on the Balham Borisbus would understand by them and that would be consistent with other legislation which uses the same terminology
    (for example, that dealing with possessing or making terrorist material, and the definitions of making used in intellectual property law). And I think it also needs to add a new offence of "viewing" (or "consuming") to avoid
    having to decide whether looking at an image online is possessing or making.
    I also think it needs to make the motive of both the creator of the image
    and the consumer of the image a relevant consideration. Although I've cited
    R v Graham-Kerr a lot in support of the assertion that innocuous nudity is
    not unlawful, reading the case as a whole it's hard to avoid the conclusion that Mr Graham-Kerr was actually a perv and got away with it. And the reason
    he got away with it is that the appeal court couldn't find anything in the legislation which would criminalise him while not also criminalising the official photographer at the event, which would clearly be an absurd conclusion. Had they been able to take his motive into account, the outcome could have been different.

    Mark

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From GB@21:1/5 to The Todal on Mon Oct 30 16:03:39 2023
    On 30/10/2023 13:18, The Todal wrote:

    The stages in the decision are: (a) did the defendant intend to take
    that photo (or save it onto his computer)?

    I don't recall many of the details, but wasn't this image just a
    thumbnail? Presumably, the caches on all our computers are stuffed with thumbnails we didn't even notice?

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From GB@21:1/5 to The Todal on Mon Oct 30 16:18:56 2023
    On 30/10/2023 15:29, The Todal wrote:

    Consider this case:

    https://www.bailii.org/ew/cases/EWCA/Crim/2011/461.html


    "Count 6 related to a DVD depicting an adult male of large proportions penetrating with his penis the anus of a female child who had not yet
    achieved puberty. The reason why the Recorder directed an acquittal on
    that count was his acceptance that there was no evidence that one of the relevant statutory criteria was satisfied, namely that the image
    portrayed in an explicit and realistic way "an act which results or is
    likely to result in serious injury to a person's anus" (section
    60(7)(b)). As the Recorder explained to the jury, there were no sounds
    of distress (there was no sound track to the DVD) and there were no
    obvious signs of distress displayed in the body language of the subject.
    No expert evidence had been called to suggest that serious injury would
    be likely to result from the act depicted, and the disparity in size
    between the man and the female in the image was insufficient to provide
    a proper evidential basis for conviction."


    Surely, the CPS failed badly here?

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Mark Goodge@21:1/5 to The Todal on Mon Oct 30 16:30:43 2023
    On Mon, 30 Oct 2023 15:13:39 +0000, The Todal <the_todal@icloud.com> wrote:

    On 30/10/2023 15:02, Mark Goodge wrote:

    No, the circumstances of the production are a part of what makes it
    indecent. That was part of the judgment in R v Graham-Kerr. That is, that a
    photo of mere nakedness in a "legitimate setting" is not indecent. That is a
    ruling of fact by a precedent-setting court, and it's not open to a junior
    court to disregard it.

    Once again, you are wrong. You are misleading readers about the ratio
    decidendi of the Graham-Kerr case. This is irresponsible of you. Some
    might actually act to their detriment as a result of such advice.

    The judges did not say that a photo of mere nakedness in a legitimate
    setting is not indecent. No such precedent has ever been set.

    Well, their actual words were that nakedness in a legitimate setting does
    not in itself give rise to a pornographic image. I agree that "pornographic"
    is not the same word as "indecent", and some might argue that an image can
    be indecent even though it is not pornographic. There were other aspects to
    the case, one of which was that the motive of the photographer was not a
    relevant consideration. So I also agree that the question of legitimacy is not the
    only consideration. But, nonetheless, the offence for which Mr Graham-Kerr
    had originally been convicted was making an indecent image, and that was the conviction overturned on appeal. So the court's ultimate conclusion was that the image was not indecent (since the fact of it having been taken by Mr Graham-Kerr was not in dispute), and the comments related to pornography
    have to be interpreted in that light. And this is the interpretation placed
    on it by the IWF, among others.

    Now, you may not be persuaded by the IWF's opinions, but you cannot deny
    that they have some degree of authority in this field. And, in particular,
    this case (and others like it) caused the IWF to change their own policies
    and adopt a narrower definition of material that warrants being blocked.
    Given that the majority of criticism of the IWF revolves around overreach rather than underreach, I think that an explicit change of their policy to narrow their definitions is worthy of note.

    Mark

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From The Todal@21:1/5 to All on Mon Oct 30 17:13:23 2023
    On 30/10/2023 16:03, GB wrote:
    On 30/10/2023 13:18, The Todal wrote:

    The stages in the decision are: (a) did the defendant intend to take
    that photo (or save it onto his computer)?

    I don't recall many of the details, but wasn't this image just a
    thumbnail? Presumably, the caches on all our computers are stuffed with thumbnails we didn't even notice?


    Are you referring to the David Mould case? If so, it was certainly a
    live issue at trial as to whether the image had been deliberately saved
    to the computer.

    "The material disclosed in the exhibits showed that it was more likely
    that he had created the .bmp file deliberately rather than accidentally,
    as he claimed".

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From The Todal@21:1/5 to Mark Goodge on Mon Oct 30 17:33:35 2023
    On 30/10/2023 16:30, Mark Goodge wrote:
    On Mon, 30 Oct 2023 15:13:39 +0000, The Todal <the_todal@icloud.com> wrote:

    On 30/10/2023 15:02, Mark Goodge wrote:

    No, the circumstances of the production are a part of what makes it
    indecent. That was part of the judgment in R v Graham-Kerr. That is, that a
    photo of mere nakedness in a "legitimate setting" is not indecent. That is a
    ruling of fact by a precedent-setting court, and it's not open to a junior
    court to disregard it.

    Once again, you are wrong. You are misleading readers about the ratio
    decidendi of the Graham-Kerr case. This is irresponsible of you. Some
    might actually act to their detriment as a result of such advice.

    The judges did not say that a photo of mere nakedness in a legitimate
    setting is not indecent. No such precedent has ever been set.

    Well, their actual words were that nakedness in a legitimate setting does
    not in itself give rise to a pornographic image. I agree that "pornographic" is not the same word as "indecent", and some might argue that an image can
    be indecent even though it is not pornographic. There were other aspects to
    the case, one of which was that the motive of the photographer was not a
    relevant consideration. So I also agree that the question of legitimacy is
    not the only consideration. But, nonetheless, the offence for which Mr
    Graham-Kerr had originally been convicted was making an indecent image, and
    that was the conviction overturned on appeal. So the court's ultimate
    conclusion was that the image was not indecent (since the fact of it having
    been taken by Mr Graham-Kerr was not in dispute), and the comments related
    to pornography
    have to be interpreted in that light. And this is the interpretation placed on it by the IWF, among others.


    If you have a full transcript of the Graham-Kerr Court of Appeal
    decision then please quote from it or provide a link. I have referred to
    a service called "Current Law" which provides a reliable precis of each important piece of case law, but does not quote the full judgment.

    I think you have misunderstood the judgment and maybe the IWF have misunderstood it too. It would be a wholly eccentric statement of law
    to say that certain photographs of naked children cannot be indecent. It
    would be inconsistent with the rest of the case law before and since.
    And it is not an interpretation that appears in Archbold or other
    authoritative legal textbooks.

    In the Graham-Kerr case the Court of Appeal did *not* rule that the
    photographs of naked children at a swimming pool were not indecent.

    They allowed his appeal because the judge had misdirected the jury by
    saying that the Defendant's admission that he found the photographs
    sexually stimulating was relevant to whether the photographs were
    indecent. His motive, his admissions about his sexual attraction to
    children, should have been excluded from the material put before the jury.

    Plainly, he was very lucky. He won his appeal not because the
    photographs were innocuous but because the judge f*cked up.


    Now, you may not be persuaded by the IWF's opinions, but you cannot deny
    that they have some degree of authority in this field. And, in particular, this case (and others like it) caused the IWF to change their own policies and adopt a narrower definition of material that warrants being blocked. Given that the majority of criticism of the IWF revolves around overreach rather than underreach, I think that an explicit change of their policy to narrow their definitions is worthy of note.


    I think it might reasonably be said that it should not be up to the IWF
    or the CPS to decide whether a photograph is or is not lawful, in the
    manner of the Lord Chamberlain. Fashions change and one day it might be possible to collect photos of naked children (photos that are thought to
    be beautiful as distinct from those abhorrent ones that show sexual
    abuse) and not face prosecution. The prosecutors are in effect trying to preserve in aspic the attitudes of society at the time when the
    legislation was passed. Whether I personally approve of the IWF's
    opinions is not relevant. I have no interest in collecting photographs
    of that sort.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From GB@21:1/5 to The Todal on Mon Oct 30 17:38:02 2023
    On 30/10/2023 17:13, The Todal wrote:
    On 30/10/2023 16:03, GB wrote:
    On 30/10/2023 13:18, The Todal wrote:

    The stages in the decision are: (a) did the defendant intend to take
    that photo (or save it onto his computer)?

    I don't recall many of the details, but wasn't this image just a
    thumbnail? Presumably, the caches on all our computers are stuffed
    with thumbnails we didn't even notice?


    Are you referring to the David Mould case?  If so, it was certainly a
    live issue at trial as to whether the image had been deliberately saved
    to the computer.

    Yes. Well worked out. :)

    It was the DM case I was referring to. For some reason, I was convinced
    you had mentioned it in your post, whereas it was further back in the
    thread. Sorry about that.



    "The material disclosed in the exhibits showed that it was more likely
    that he had created the .bmp file deliberately rather than accidentally,
    as he claimed".

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Mark Goodge@21:1/5 to The Todal on Mon Oct 30 21:18:57 2023
    On Mon, 30 Oct 2023 17:33:35 +0000, The Todal <the_todal@icloud.com> wrote:

    If you have a full transcript of the Graham-Kerr Court of Appeal
    decision then please quote from it or provide a link. I have referred to
    a service called "Current Law" which provides a reliable precis of each
    important piece of case law, but does not quote the full judgment.

    I don't have a full transcript. It's not on Bailii. I'm basing my comments
    on published articles about the case, which essentially make the point that I've been making.

    In the Graham-Kerr case the Court of Appeal did *not* rule that the
    photographs of naked children at a swimming pool were not indecent.

    They allowed his appeal because the judge had misdirected the jury by
    saying that the Defendant's admission that he found the photographs
    sexually stimulating were relevant to whether the photographs were
    indecent. His motive, his admissions about his sexual attraction to
    children, should have been excluded from the material put before the jury.

    Plainly, he was very lucky. He won his appeal not because the
    photographs were innocuous but because the judge f*cked up.

    But if the photos were not innocuous, then his conviction would still stand. Or, at least, a retrial would be in order. If he had taken a photo of, say,
    a child engaging in sexual activity, then the photo would be plainly
    indecent irrespective of his motives. It's the fact that it wasn't a sexual
    photo which caused the original court to go off on the red herring of motive.

    FWIW, as I've said elsewhere, I think the law should allow the court to take motive into consideration. That's the only way you can realistically distinguish between the actions of Mr Graham-Kerr and the actions of the official photographer at the event, given that the photos taken by both of
    them were essentially identical in form. Because if the photos taken by the official photographer were innocuous (and there has been absolutely no suggestion that they were not), then so must those taken by Mr Graham-Kerr.
    The difference between them lies not in the content, but in the
    circumstances of their being taken.

    I think it might reasonably be said that it should not be up to the IWF
    or the CPS to decide whether a photograph is or is not lawful, in the
    manner of the Lord Chamberlain.

    Given the IWF's role, it is their responsibility to make at least some assessment of whether an image is or is not lawful. Obviously, only a court
    can make a definitive ruling. But the IWF couldn't do their job if they
    didn't have a definition for their internal purposes which is broadly in accordance with the law.

    Fashions change and one day it might be
    possible to collect photos of naked children (photos that are thought to
    be beautiful as distinct from those abhorrent ones that show sexual
    abuse) and not face prosecution. The prosecutors are in effect trying to
    preserve in aspic the attitudes of society at the time when the
    legislation was passed. Whether I personally approve of the IWF's
    opinions is not relevant. I have no interest in collecting photographs
    of that sort.

    I suspect it would be lawful now, provided the collector could demonstrate
    (as in the case of R v Neal, which you referenced earlier) that the photos
    were all readily available elsewhere and all they were doing was collecting them. I don't think there's any realistic prospect of someone being
    convicted for having a full set of Lewis Carroll's photos of naked children, for example. But I'm not sure that public morality would agree with that.

    Mark

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Mark Goodge@21:1/5 to NOTsomeone@microsoft.invalid on Mon Oct 30 21:36:03 2023
    On Mon, 30 Oct 2023 16:18:56 +0000, GB <NOTsomeone@microsoft.invalid> wrote:

    On 30/10/2023 15:29, The Todal wrote:

    Consider this case:

    https://www.bailii.org/ew/cases/EWCA/Crim/2011/461.html


    "Count 6 related to a DVD depicting an adult male of large proportions
    penetrating with his penis the anus of a female child who had not yet
    achieved puberty. The reason why the Recorder directed an acquittal on
    that count was his acceptance that there was no evidence that one of the
    relevant statutory criteria was satisfied, namely that the image
    portrayed in an explicit and realistic way "an act which results or is
    likely to result in serious injury to a person's anus" (section
    60(7)(b)). As the Recorder explained to the jury, there were no sounds
    of distress (there was no sound track to the DVD) and there were no
    obvious signs of distress displayed in the body language of the subject.
    No expert evidence had been called to suggest that serious injury would
    be likely to result from the act depicted, and the disparity in size
    between the man and the female in the image was insufficient to provide
    a proper evidential basis for conviction."


    Surely, the CPS failed badly here?

    It looks like it, yes. Had they gone for a charge of simple possession of
    the DVD, it would have been a slam-dunk conviction. And, given the content,
    the sentence would have been at the upper end of the range. But by going for the more serious charge despite the absence of the evidence necessary to
    make it stick, they not only lost that but lost the opportunity to prosecute the less serious.

    I also suspect that the CPS's mistake with the DVD charge is part of what
    led the jury to reach a legally untenable conclusion on the other charges. Reading between the lines a bit (and speculating, of course, since we can
    never know for certain what was talked about in the jury room) I have a
    feeling that their reasoning went something along the lines of "Well, we
    know he's a wrong 'un, the DVD proves that, but we can't convict him for
    that because the CPS chose the wrong charge, so instead we'll convict him
    on the others to make sure that the paedo gets his just deserts". And, from
    a purely moral perspective, that's not necessarily invalid. It would be the
    "right" result, albeit reached via the wrong process. But, of course, the
    law doesn't allow that. Even though we know he's a wrong 'un, we can't
    convict him of a crime he hasn't committed just to make up for the fact
    that, due to someone else's error, he's got away with a crime he has
    committed.

    Mark

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From GB@21:1/5 to Mark Goodge on Tue Oct 31 09:16:03 2023
    On 30/10/2023 21:36, Mark Goodge wrote:
    On Mon, 30 Oct 2023 16:18:56 +0000, GB <NOTsomeone@microsoft.invalid> wrote:

    On 30/10/2023 15:29, The Todal wrote:

    Consider this case:

    https://www.bailii.org/ew/cases/EWCA/Crim/2011/461.html


    "Count 6 related to a DVD depicting an adult male of large proportions
    penetrating with his penis the anus of a female child who had not yet
    achieved puberty. The reason why the Recorder directed an acquittal on
    that count was his acceptance that there was no evidence that one of the
    relevant statutory criteria was satisfied, namely that the image
    portrayed in an explicit and realistic way "an act which results or is
    likely to result in serious injury to a person's anus" (section
    60(7)(b)). As the Recorder explained to the jury, there were no sounds
    of distress (there was no sound track to the DVD) and there were no
    obvious signs of distress displayed in the body language of the subject.
    No expert evidence had been called to suggest that serious injury would
    be likely to result from the act depicted, and the disparity in size
    between the man and the female in the image was insufficient to provide
    a proper evidential basis for conviction."


    Surely, the CPS failed badly here?

    It looks like it, yes. Had they gone for a charge of simple possession of
    the DVD, it would have been a slam-dunk conviction. And, given the content, the sentence would have been at the upper end of the range. But by going for the more serious charge despite the absence of the evidence necessary to
    make it stick, they not only lost that but lost the opportunity to prosecute the less serious.

    I also suspect that the CPS's mistake with the DVD charge is part of what
    led the jury to reach a legally untenable conclusion on the other charges. Reading between the lines a bit (and speculating, of course, since we can never know for certain what was talked about in the jury room) I have a feeling that their reasoning went something along the lines of "Well, we
    know he's a wrong 'un, the DVD proves that, but we can't convict him for
    that because the CPS chose the wrong charge, so instead we'll convict him
    on the others to make sure that the paedo gets his just deserts". And, from
    a purely moral perspective, that's not necessarily invalid. It would be the
    "right" result, albeit reached via the wrong process. But, of course, the
    law doesn't allow that. Even though we know he's a wrong 'un, we can't
    convict him of a crime he hasn't committed just to make up for the fact
    that, due to someone else's error, he's got away with a crime he has committed.

    Mark

    That's what the COA thought, too:

    "The next question, however, is whether, having directed an acquittal on
    count 6 and given that explanation, the Recorder ought to have
    discharged the jury. There was no application by defence counsel for the
    jury to be discharged. It is submitted on appeal that the Recorder ought nonetheless to have discharged the jury of his own motion. Mr Gray, who
    appears today on behalf of the Crown, concedes that the jury should have
    been discharged given the potential prejudice caused in relation to the remaining counts by the jury's knowledge of the subject matter of count
    6. We agree that the jury ought to have been discharged. There was, in
    our judgment, a very real risk of the jury being prejudiced by what they
    had seen of the DVD in their assessment of the case against the
    appellant on the remaining counts (counts 1 to 5)."

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Adam Funk@21:1/5 to billy bookcase on Tue Oct 31 11:25:54 2023
    On 2023-10-30, billy bookcase wrote:


    "Max Demian" <max_demian@bigfoot.com> wrote in message news:uho6bo$ei7r$2@dont-email.me...

    Does anyone know the difference between a paraphilia and a fetish?

    A paraphilia is what somebody wearing a white coat would call it

    A fetish is what somebody wearing a rubber suit would call it.

    I was going to say something along the same lines but your version is
    funnier. Good work.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)