"Paedophiles using AI to turn singers and film stars into kids.
Paedophiles are using artificial intelligence (AI) to create images of celebrities as children.
The Internet Watch Foundation (IWF) said images of a well-known female
singer reimagined as a child are being shared by predators."
https://www.bbc.co.uk/news/technology-67172231
One of the arguments for prosecuting possession of paedophile images is
that the making and dissemination of those images involves harm to
children. Yet, child pornography can now apparently be produced without involving children.
Should we be encouraging this, rather than discouraging it, as providing
a relatively harmless outlet for paedophiles?
"Paedophiles using AI to turn singers and film stars into kids.
Paedophiles are using artificial intelligence (AI) to create images of celebrities as children.
The Internet Watch Foundation (IWF) said images of a well-known female
singer reimagined as a child are being shared by predators."
https://www.bbc.co.uk/news/technology-67172231
One of the arguments for prosecuting possession of paedophile images is
that the making and dissemination of those images involves harm to
children. Yet, child pornography can now apparently be produced without involving children.
Should we be encouraging this, rather than discouraging it, as providing
a relatively harmless outlet for paedophiles?
"Paedophiles using AI to turn singers and film stars into kids.
Paedophiles are using artificial intelligence (AI) to create images of celebrities as children.
The Internet Watch Foundation (IWF) said images of a well-known female
singer reimagined as a child are being shared by predators."
https://www.bbc.co.uk/news/technology-67172231
One of the arguments for prosecuting possession of paedophile images is
that the making and dissemination of those images involves harm to
children. Yet, child pornography can now apparently be produced without involving children.
Should we be encouraging this, rather than discouraging it, as providing
a relatively harmless outlet for paedophiles?
On 25/10/2023 13:25, GB wrote:
"Paedophiles using AI to turn singers and film stars into kids.Surely if the image is 'indecent' does it matter if the face is 'real'
Paedophiles are using artificial intelligence (AI) to create images of
celebrities as children.
The Internet Watch Foundation (IWF) said images of a well-known female
singer reimagined as a child are being shared by predators."
https://www.bbc.co.uk/news/technology-67172231
One of the arguments for prosecuting possession of paedophile images is
that the making and dissemination of those images involves harm to
children. Yet, child pornography can now apparently be produced without
involving children.
Should we be encouraging this, rather than discouraging it, as
providing a relatively harmless outlet for paedophiles?
or AI ?
On Wed, 25 Oct 2023 23:12:29 +0100, TTman wrote:
Surely if the image is 'indecent' does it matter if the face is 'real' or AI?
Well then we are well down the road to censorship pure and simple.
Because who gets to decide "indecent" ? Bearing in mind the moment the
public were allowed to judge, an awful lot of what we had been told was indecent was judged not to be.
On 25/10/2023 23:12, TTman wrote:
Surely if the image is 'indecent' does it matter if the face is 'real' or AI?
Usually, laws exist to stop others being harmed.
If no-one is harmed by the activity concerned, why should the law
prohibit it?
On Thu, 26 Oct 2023 11:56:13 +0100, Norman Wells wrote:
Usually, laws exist to stop others being harmed.
If no-one is harmed by the activity concerned, why should the law
prohibit it?
Thank you John Stuart Mill.
Now explain our drugs laws.
And vice versa: for some, any image of a child's genitalia is indecent,
even if posted by the child's parents. "Won't the child be embarrassed
when he grows up." (I don't know whether that's the argument.)
On Thu, 26 Oct 2023 06:28:04 -0000 (UTC), Jethro_uk <jethro_uk@hotmailbin.com> wrote:
On Wed, 25 Oct 2023 23:12:29 +0100, TTman wrote:
Surely if the image is 'indecent' does it matter if the face is 'real'
or AI ?
Well then we are well down the road to censorship pure and simple.
We already have censorship pure and simple. AI hasn't changed that, and
won't change that.
Because who gets to decide "indecent" ?
The court, using the usual yardstick of the disinterested observer. If it looks indecent, then it is indecent.
Bear in mind that "indecent" per se is not the same as illegal. There's a
lot of indecent imagery which is perfectly legal. There's a massive industry built around generating and distributing such material. It only becomes illegal if it depicts certain people, or certain actions, as well as being indecent.
Mark
On Thu, 26 Oct 2023 11:50:45 +0100, Max Demian <max_demian@bigfoot.com> wrote:
And vice versa: for some, any image of a child's genitalia is indecent,
even if posted by the child's parents. "Won't the child be embarrassed
when he grows up." (I don't know whether that's the argument.)
It may well be indecent, but it isn't illegal unless either there was some form of abuse involved in taking the photo or there is some sexual aspect to the photo. Photos of naked children at a nudist beach aren't illegal per se.
Drawing the line between lawful and unlawful images can be difficult, not least because both "indecent" and "sexual" are often a matter of perception. But the CPS charging guidelines do attempt to address the nuance of it.
Mark
It's been the case for some time now that pseudo-photographs which depict certain forms of indecent imagery are just as illegal as real photographs showing the same content. Whether it's done with AI or Photoshop doesn't really change that.
The justification for criminalising pseudo-photographs of this nature is twofold. One argument is that such images are often a pathway to real images, which do involve actual abuse. It's widely recognised that people can become addicted to porn, and those who do often find themselves seeking out ever more extreme material as they become desensitised to that which they are familiar with. By this argument, criminalising such images does help prevent actual abuse.
This argument is somewhat controversial, but it's not unreasonable to accept that unless it is comprehensively refuted the precautionary principle probably falls in favour of criminalisation.
The other argument is that as long as the origin of the image is irrelevant, the prosecution doesn't have to prove that it is a real photograph and not a pseudo-photograph - it's purely what it looks like to the hypothetical traveller on the Peckham Borisbus that matters, not aspects which can only be determined by detailed forensic analysis (and, often, not even by that). Pornographic photographs, like ducks, are recognised by sight.
This has the benefit that there is no possible defence of "it's only a pseudo-photograph", which, in the absence of information leading back to a genuine victim, may be very hard for the prosecution to refute to the standard necessary for a criminal conviction. If the CPS had to prove that a photograph was real, then it would be practically impossible to secure a conviction in very many cases. So the criminalisation of pseudo-photographs does, somewhat paradoxically, make it easier to prosecute possession of real photographs of unlawful material.
Personally, though, I think that illegal AI porn is less of an issue than the BBC news article implies. The problem with fighting Child Sexual Abuse Material (CSAM) isn't how it's made, it's how it's distributed. It's just as easy to share digital copies of real abuse as it is of AI-generated abuse. So the number of sources isn't really a major issue. It's the number of consumers which is the problem. And I don't think that's going to rise just because people can now generate it using AI as well as with a camera.
What seems to me to be more of an issue with AI porn is not unlawful porn, but lawful porn (that is, depicting adults engaged in consensual sexual activities) which is AI-generated or AI-manipulated to represent real people. If someone takes a photo of a person fully clothed, and then runs it through an AI "nudify" app, then the resulting image isn't unlawful porn in itself. But publishing that AI-manipulated photo may well be very distressing to that person. The problem is, there are no laws which would currently make it illegal. It isn't unlawful imagery per se, it isn't an invasion of privacy, and it isn't revenge porn. But it will happen. I suspect it's already happening, on the kind of websites that I'd prefer not to look at (not even in the name of research[1]).
I suspect, though, that this won't receive any legislative attention until a deepfake of Keir Starmer rogering Rishi Sunak starts to do the rounds. In the meantime, "won't somebody think of the children" is a far more potent campaign slogan.
[1] Actually, if you do a Google[2] image search for female celebrities with safe search turned off, the chances are you'll find one in the results. Fake nudes have been a thing for a while, but AI is making them more widespread.
[2] Other search engines are available.
On 27/10/2023 10:57, Mark Goodge wrote:
On Thu, 26 Oct 2023 11:50:45 +0100, Max Demian <max_demian@bigfoot.com> wrote:
And vice versa: for some, any image of a child's genitalia is indecent, even if posted by the child's parents. "Won't the child be embarrassed when he grows up." (I don't know whether that's the argument.)
It may well be indecent, but it isn't illegal unless either there was some form of abuse involved in taking the photo or there is some sexual aspect to the photo. Photos of naked children at a nudist beach aren't illegal per se.
Sorry, but you are wrong. You are merely offering your own opinion,
which is quite valueless unless you happen to be serving on a jury at
the time.
Many photos of naked children involve no "abuse" (eg upskirt photo of a
girl on a swing) but are deemed indecent and criminal by a jury, and
many would say rightly so.
On Wed, 25 Oct 2023 17:52:33 +0100, Mark Goodge <usenet@listmail.good-stuff.co.uk> wrote:
I wonder if one unintended side effect of not being able to readily distinguish between fake and real will be that blackmailing could
become ineffective. "That photo is not of me - you can send it to
whom you want".
On 26/10/2023 13:44, Mark Goodge wrote:
The court, using the usual yardstick of the disinterested observer. If it looks indecent, then it is indecent.
In fact, the jury rather than the court decides what is or is not
indecent. The jury is expected to use its common sense rather than look
to any "expert" guidance.
On 27/10/2023 11:21, The Todal wrote:
Sorry, but you are wrong. You are merely offering your own opinion,
which is quite valueless unless you happen to be serving on a jury at
the time.
Many photos of naked children involve no "abuse" (eg upskirt photo of
a girl on a swing) but are deemed indecent and criminal by a jury, and
many would say rightly so.
Cartoons of naked children can also land you in gaol. Any depiction of indecency can land you in gaol, even if the actors are well over 18.
I would prefer those with a certain sexual orientation to masturbate to
these cartoons to satisfy themselves rather than aim directly for
children. Your morality on the subject suggests otherwise?
https://www.gazettelive.co.uk/news/teesside-news/anime-fan-convicted-over-illegal-7958896
But a child was saved!
On 27/10/2023 12:54, Fredxx wrote:
Cartoons of naked children can also land you in gaol. Any depiction of
indecency can land you in gaol, even if the actors are well over 18.
I would prefer those with a certain sexual orientation to masturbate
to these cartoons to satisfy themselves rather than aim directly for
children. Your morality on the subject suggests otherwise?
*My* morality? I resent the implication that my own opinions are those
that are held by many people. I try not to agree with the majority.
It is probable that the CPS will pick and choose which images require a prosecution and which are too tame. Faced with a defendant and a
prosecutor I think most juries will tamely accept that a photograph of a child is indecent.
I am reminded of the remarks of the Court of Appeal in 2000 when hearing
an appeal against conviction pursued by David Mould (the chap
subsequently convicted of rather more serious offences).
quote
We turn to the two other grounds which Mr Burton has argued before us.
He submits, first of all, that the photograph itself could not possibly
ever be said to be indecent. He submits that similar photographs can be
found in medical text books. To label this photograph as indecent would
mean that photographs of a similar kind in medical text books would also
be indecent.
If the test for deciding whether a photograph is indecent or not is
whether or not it is the kind of photograph which appears in medical
text books, then many of the photographs with which these courts are all
too familiar could not be classified as indecent. ... It is not
suggested that he is the parent of the child or that he was doing this
as part of some medical research. We take the view that a jury was
entitled to reach the conclusion that this photograph was indecent as
the prosecution alleged.
On 27/10/2023 16:22, The Todal wrote:
quote
We turn to the two other grounds which Mr Burton has argued before us.
He submits, first of all, that the photograph itself could not possibly
ever be said to be indecent. He submits that similar photographs can be
found in medical text books. To label this photograph as indecent would
mean that photographs of a similar kind in medical text books would also
be indecent.
If the test for deciding whether a photograph is indecent or not is
whether or not it is the kind of photograph which appears in medical
text books, then many of the photographs with which these courts are all
too familiar could not be classified as indecent. ... It is not
suggested that he is the parent of the child or that he was doing this
as part of some medical research. We take the view that a jury was
entitled to reach the conclusion that this photograph was indecent as
the prosecution alleged.
Is that really saying that a photo in one context is indecent, and in a different context it isn't indecent? Or, is it saying that the photo is indecent, but there's a defence for owning it in some circumstances?
On Fri, 27 Oct 2023 17:01:34 +0100, GB <NOTsomeone@microsoft.invalid> wrote:
Is that really saying that a photo in one context is indecent, and in a
different context it isn't indecent?
Not quite. It's a bit more nuanced than that. It's saying that mere similarity to an image known to be lawful does not automatically make a different image lawful (or, indeed, vice versa, similarity to an unlawful image does not ipso facto make a new image unlawful). Each image has to be judged on its own merits.
There are a number of factors involved in determining whether an image is indecent, and one of those is the question of whether the image arises from
a "legitimate setting". So, for example, there is case law[1] to the effect that a consensual photo of a naked child taken by a swimming instructor at a nudist swimming session is not unlawful, because you expect children participating in a nudist swimming session to be naked and therefore the
only issue is one of consent. But if a swimming instructor at a normal session persuaded a child to remove their swimwear in order to have a photo taken naked, that would be unlawful, because it would not be a legitimate setting. Despite the fact that to someone looking at the two photos they may be practically identical, one would be unlawful and the other would not be.
That's the broad thrust of the court's response to the argument cited above. Although Mr Mould's photos looked similar to those in medical textbooks, their setting was different. A photo taken of a naked child for publication in a medical textbook is a legitimate setting, a photo taken of a naked
child for publication in a pornographic magazine is not. Even if, to the observer, the photos are extremely similar.
It may seem somewhat counterintuitive that identical photos can be either lawful or unlawful depending on the context of how they were taken. But that's a necessary distinction if you want to avoid things like
criminalising parents who take naked photos of their own children. Or, indeed, people who take photos for the purposes of education or reportage. The (in)famous photo of Phan Thi Kim Phuc isn't unlawful, either.
Or, is it saying that the photo is
indecent, but there's a defence for owning it in some circumstances?
That's an entirely different argument. There are some defences to possessing an unlawful image. But they are, deliberately, few and far between.
[1] R v Graham-Kerr (1989) 88 Cr App R 302
Mark
On 28 Oct 2023 at 20:15:08 BST, "Mark Goodge" <usenet@listmail.good-stuff.co.uk> wrote:
It may seem somewhat counterintuitive that identical photos can be either lawful or unlawful depending on the context of how they were taken. But that's a necessary distinction if you want to avoid things like criminalising parents who take naked photos of their own children. Or, indeed, people who take photos for the purposes of education or reportage. The (in)famous photo of Phan Thi Kim Phuc isn't unlawful, either.
You could equally avoid criminalising parents by not pretending a photograph is indecent when it isn't. How is this logic applied to pseudophotographs and cartoons? Are they only indecent when the possessor is a paedophile? You may notice a certain paradox here.
On Fri, 27 Oct 2023 11:01:10 +0100, The Todal <the_todal@icloud.com> wrote:
In fact, the jury rather than the court decides what is or is not
indecent. The jury is expected to use its common sense rather than look
to any "expert" guidance.
The jury is part of the court. It's the part which makes decisions on questions of fact in a Crown Court trial. In other types of court, magistrates or judges make decisions on questions of fact. Referring to "the court" is merely a simple shorthand for "the people in a court whose responsibility it is to make decisions on questions of fact".
On Fri, 27 Oct 2023 11:21:28 +0100, The Todal <the_todal@icloud.com> wrote:
Sorry, but you are wrong. You are merely offering your own opinion,
which is quite valueless unless you happen to be serving on a jury at
the time.
I'm restating a widely held opinion that nudity, even of a child, is not in itself unlawful provided there is no abusive or sexual aspect to the photo.
You might want to ask yourself why there were no prosecutions over the photo by Nan Goldin titled "Klara and Edda Belly-dancing"[1][2] for example. Or, indeed, comments made in R v Oliver[2] and the judgment in R v Graham-Kerr[2].
On 28/10/2023 21:18, Mark Goodge wrote:
The jury is part of the court. It's the part which makes decisions on
questions of fact in a Crown Court trial. In other types of court,
magistrates or judges make decisions on questions of fact. Referring to "the court" is merely a simple shorthand for "the people in a court whose
responsibility it is to make decisions on questions of fact".
But when you say "the court" many people might wrongly assume that it's
the judge who makes the decision about whether or not a photograph is indecent.
On 29/10/2023 09:20, Roger Hayter wrote:
You could equally avoid criminalising parents by not pretending a photograph is indecent when it isn't. How is this logic applied to pseudophotographs and cartoons? Are they only indecent when the possessor is a paedophile? You may notice a certain paradox here.
Medical textbooks may contain many "indecent" photographs. However, if
there is a legitimate excuse for including them in a publication (eg to educate and instruct doctors) then there is unlikely to be a conviction,
so no need for a prosecution.
If you, as an ordinary member of the public, were to scan a photo of a
naked child from a medical textbook and save it to a folder on your
computer, then you would be at risk of being prosecuted and convicted.
The circumstances in which the photo was originally "taken" are actually irrelevant.
On 28/10/2023 20:43, Mark Goodge wrote:
I'm restating a widely held opinion that nudity, even of a child, is not in itself unlawful provided there is no abusive or sexual aspect to the photo.
It may be a widely held opinion somewhere or other in the UK but it has
no basis in law.
On Sat, 28 Oct 2023 20:43:24 +0100, Mark Goodge wrote:
You might want to ask yourself why there were no prosecutions over the photo by Nan Goldin titled "Klara and Edda Belly-dancing"[1][2] for example. Or, indeed, comments made in R v Oliver[2] and the judgment in R v Graham-Kerr[2].
Or the cover to Blind Faith's eponymous 1969 album.
On 29 Oct 2023 09:20:02 GMT, Roger Hayter <roger@hayter.org> wrote:
[quoted text muted]
There is an argument for treating all non-sexual nudity, irrespective of context, as lawful, yes. The difficulty with that, though, at least as
far as real human subjects are concerned, is that just because something
is non-sexual (or, at least, not obviously sexual) doesn't mean it's
also non-abusive.
On Sun, 29 Oct 2023 12:38:50 +0000, The Todal wrote:
If you, as an ordinary member of the public, were to scan a photo of a
naked child from a medical textbook and save it to a folder on your
computer, then you would be at risk of being prosecuted and convicted.
The circumstances in which the photo was originally "taken" are actually
irrelevant.
Am I wrong in believing that you are not allowed to use the provenance of
the image (e.g. the fact it came from a legitimate textbook) in your
defence ?
On 29 Oct 2023 at 12:38:50 GMT, "The Todal" <the_todal@icloud.com> wrote:
Medical textbooks may contain many "indecent" photographs. However, if
there is a legitimate excuse for including them in a publication (eg to
educate and instruct doctors) then there is unlikely to be a conviction,
so no need for a prosecution.
If you, as an ordinary member of the public, were to scan a photo of a
naked child from a medical textbook and save it to a folder on your
computer, then you would be at risk of being prosecuted and convicted.
The circumstances in which the photo was originally "taken" are actually
irrelevant.
A reasonable POV, but Mark Goodge has just told us the complete opposite; that the motive for taking the picture and the circumstances it was taken in are vital to deciding if a picture is indecent. He even had a case to prove it
- the first David Mould case.
On Sun, 29 Oct 2023 12:40:38 +0000, The Todal <the_todal@icloud.com> wrote:
But when you say "the court" many people might wrongly assume that it's
the judge who makes the decision about whether or not a photograph is
indecent.
This is uk.legal.moderated, where I hope that the majority of participants would not make that mistake.
On Sun, 29 Oct 2023 13:04:10 +0000, The Todal <the_todal@icloud.com> wrote:
On 28/10/2023 20:43, Mark Goodge wrote:
On Fri, 27 Oct 2023 11:21:28 +0100, The Todal <the_todal@icloud.com> wrote:
On 27/10/2023 10:57, Mark Goodge wrote:
On Thu, 26 Oct 2023 11:50:45 +0100, Max Demian <max_demian@bigfoot.com> wrote:
And vice versa: for some, any image of a child's genitalia is indecent, even if posted by the child's parents. "Won't the child be embarrassed when he grows up." (I don't know whether that's the argument.)
It may well be indecent, but it isn't illegal unless either there was some form of abuse involved in taking the photo or there is some sexual aspect to the photo. Photos of naked children at a nudist beach aren't illegal per se.
Sorry, but you are wrong. You are merely offering your own opinion, which is quite valueless unless you happen to be serving on a jury at the time.
I'm restating a widely held opinion that nudity, even of a child, is not in itself unlawful provided there is no abusive or sexual aspect to the photo.
It may be a widely held opinion somewhere or other in the UK but it has no basis in law.
Well, the IWF and the CPS both share that view, so I'm inclined to think they might be right. I agree that there's no explicit basis for it in statute, but both organisations reference case law to that effect.
On 29 Oct 2023 at 12:38:50 GMT, "The Todal" <the_todal@icloud.com> wrote:
Medical textbooks may contain many "indecent" photographs. However, if
there is a legitimate excuse for including them in a publication (eg to
educate and instruct doctors) then there is unlikely to be a conviction,
so no need for a prosecution.
If you, as an ordinary member of the public, were to scan a photo of a
naked child from a medical textbook and save it to a folder on your
computer, then you would be at risk of being prosecuted and convicted.
The circumstances in which the photo was originally "taken" are actually
irrelevant.
A reasonable POV, but Mark Goodge has just told us the complete opposite; that the motive for taking the picture and the circumstances it was taken in are vital to deciding if a picture is indecent. He even had a case to prove it
- the first David Mould case.
On Sun, 29 Oct 2023 13:41:55 +0000, Mark Goodge wrote:
There is an argument for treating all non-sexual nudity, irrespective of
context, as lawful, yes. The difficulty with that, though, at least as
far as real human subjects are concerned, is that just because something
is non-sexual (or, at least, not obviously sexual) doesn't mean it's
also non-abusive.
Some people find feet and photos thereof - especially with shoes -
sexually arousing. As an example of beginning the long journey of reductio
ad absurdum.
On Sun, 29 Oct 2023 16:54:53 -0000 (UTC), Jethro_uk <jethro_uk@hotmailbin.com> wrote:
On Sun, 29 Oct 2023 12:38:50 +0000, The Todal wrote:
If you, as an ordinary member of the public, were to scan a photo of a
naked child from a medical textbook and save it to a folder on your
computer, then you would be at risk of being prosecuted and convicted.
The circumstances in which the photo was originally "taken" are actually irrelevant.
Am I wrong in believing that you are not allowed to use the provenance of
the image (e.g. the fact it came from a legitimate textbook) in your
defence ?
It's complicated. Fundamentally, it's the image which is unlawful, not the person in possession of it. That is, an unlawful image is unlawful for
anyone to possess (unless they have a defence), irrespective of their motives. If someone had in their possession a library of medical textbooks which included photos of naked children, then that would not be an offence even if they were routinely rubbing one out while looking at those photos.
However, what complicates it is the fact that the courts have decided that making a digital copy of a photo counts as "making" for the purposes of the law[1]. And the reason this complicates it is that this means that the circumstances of the creation of the copy are relevant, just as much as the circumstances of the creation of the original. So if someone was making a collection of copies of photos sourced from medical textbooks, then those collected photos could, potentially, be deemed unlawful even if the
originals were not.
[1] FWIW, I think this was a poor decision[2], and I think that the legislation should be amended to make it clear that merely making a digital copy of an image is not the same as creating the image in the first place. But I suspect that there is little appetite in government circles to make such a change.
[2] Not because I have any particular sympathy for perverts who collect indecent images, but simply because, as an IT professional, the idea that copying a file is the same as creating a file seems to me to be utterly bizarre. And it's also directly opposite to established legislation and case law in the realm of Intellectual Property, where it's firmly established
that merely making a copy is *not* creating something new.
Mark
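As an aside, the IT point is easy to demonstrate. Here is a minimal sketch in Python (the file name and the random bytes are just placeholders, nothing from any actual case): a copy made by the operating system is byte-for-byte identical to its source, which is why treating the copy as a freshly "made" image strikes IT people as bizarre.

    import hashlib
    import os
    import shutil
    import tempfile

    def sha256(path):
        """Hash a file's contents in chunks."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(8192), b""):
                h.update(chunk)
        return h.hexdigest()

    # A stand-in "original" file; any image file would behave the same.
    workdir = tempfile.mkdtemp()
    original = os.path.join(workdir, "original.bmp")
    with open(original, "wb") as f:
        f.write(os.urandom(1024))

    # "Making" a copy, as a save-as or a browser cache would.
    copy = os.path.join(workdir, "copy.bmp")
    shutil.copyfile(original, copy)

    # The two files are indistinguishable: nothing new has been created.
    assert sha256(original) == sha256(copy)
    print("hashes match:", sha256(original))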
On 29/10/2023 16:56, Mark Goodge wrote:
On Sun, 29 Oct 2023 12:40:38 +0000, The Todal <the_todal@icloud.com> wrote:
On 28/10/2023 21:18, Mark Goodge wrote:
The jury is part of the court. It's the part which makes decisions on
questions of fact in a Crown Court trial. In other types of court,
magistrates or judges make decisions on questions of fact. Referring to "the
court" is merely a simple shorthand for "the people in a court whose
responsibility it is to make decisions on questions of fact".
But when you say "the court" many people might wrongly assume that it's
the judge who makes the decision about whether or not a photograph is
indecent.
This is uk.legal.moderated, where I hope that the majority of participants would not make that mistake.
This is uk.legal.moderated where most people aren't lawyers and when you
say "the court will decide" they would reasonably assume that it would
be a reliable decision based on expert evidence as to what is or is not indecent, plus case law.
Rather than, as actually happens, the decision of 12 randomly
chosen jurors applying their notion of what society regards as decent, without any guidance from experts.
On 29/10/2023 13:21, Roger Hayter wrote:
On 29 Oct 2023 at 12:38:50 GMT, "The Todal" <the_todal@icloud.com> wrote:
Medical textbooks may contain many "indecent" photographs. However, if there is a legitimate excuse for including them in a publication (eg to educate and instruct doctors) then there is unlikely to be a conviction, so no need for a prosecution.
If you, as an ordinary member of the public, were to scan a photo of a naked child from a medical textbook and save it to a folder on your computer, then you would be at risk of being prosecuted and convicted. The circumstances in which the photo was originally "taken" are actually irrelevant.
A reasonable POV, but Mark Goodge has just told us the complete opposite; that the motive for taking the picture and the circumstances it was taken in are vital to deciding if a picture is indecent. He even had a case to prove it - the first David Mould case.
I think it's more the circumstances in which it is *viewed*.
On 29/10/2023 19:18, Jethro_uk wrote:
On Sun, 29 Oct 2023 13:41:55 +0000, Mark Goodge wrote:
There is an argument for treating all non-sexual nudity, irrespective of context, as lawful, yes. The difficulty with that, though, at least as far as real human subjects are concerned, is that just because something is non-sexual (or, at least, not obviously sexual) doesn't mean it's also non-abusive.
Some people find feet and photos thereof - especially with shoes - sexually arousing. As an example of beginning the long journey of reductio ad absurdum.
All sex is evil (except when it isn't).
Does anyone know the difference between a paraphilia and a fetish?
No defendant is likely to admit that he has the photo for the purposes
of masturbation.
On 30 Oct 2023 at 12:08:25 GMT, "Max Demian" <max_demian@bigfoot.com> wrote:
All sex is evil (except when it isn't).
Does anyone know the difference between a paraphilia and a fetish?
I guess that a paraphilia relates to whole organisms, or groups of them; while a fetish relates to part of an organism or a non-biological object. Where plants come in this scheme I am not sure - but they don't seem to figure
much. BICBW.
On 29/10/2023 16:12, Mark Goodge wrote:
On Sun, 29 Oct 2023 13:04:10 +0000, The Todal <the_todal@icloud.com>
wrote:
On 28/10/2023 20:43, Mark Goodge wrote:
I'm restating a widely held opinion that nudity, even of a child, is not in itself unlawful provided there is no abusive or sexual aspect to the photo.
It may be a widely held opinion somewhere or other in the UK but it has
no basis in law.
Well, the IWF and the CPS both share that view, so I'm inclined to think
they might be right. I agree that there's no explicit basis for it in
statute, but both organisations reference case law to that effect.
That proves my point.
Neither the IWF nor the CPS can tell the nation what is or is not
indecent. All they can do is apply a grading system which they regard as useful when deciding whether or not to prosecute.
If there is a prosecution the views of the IWF and CPS will not be
admissible in evidence or in guidance from the judge.
To put it very simply, the CPS decides whether or not to prosecute and
if there is a prosecution the jury will usually convict. Faced with a defendant, a photograph and a prosecutor the jury will decide that
although in their ordinary lives the word "indecent" has no meaning,
they will do their civic duty and convict a defendant whom they assume
will be a bad 'un who is probably a danger to kids.
How often have you looked at a photograph or a video and thought "that's indecent"? It is a word that no longer has a clear meaning in ordinary life: we see naked people on dating shows and, even though that's probably indecent, nobody seems to regard it as a reason for banning the show.
However, nobody would ever come up with a better law to catch those who collect and exchange photographs of naked children. It would be wrong to repeal the law and not substitute a better law.
On 30 Oct 2023 12:15:38 GMT, Roger Hayter <roger@hayter.org> wrote:
On 29 Oct 2023 at 22:31:01 GMT, "Mark Goodge"
<usenet@listmail.good-stuff.co.uk> wrote:
On Sun, 29 Oct 2023 16:54:53 -0000 (UTC), Jethro_uk <jethro_uk@hotmailbin.com> wrote:
On Sun, 29 Oct 2023 12:38:50 +0000, The Todal wrote:
If you, as an ordinary member of the public, were to scan a photo of a naked child from a medical textbook and save it to a folder on your computer, then you would be at risk of being prosecuted and convicted. The circumstances in which the photo was originally "taken" are actually irrelevant.
Am I wrong in believing that you are not allowed to use the provenance of the image (e.g. the fact it came from a legitimate textbook) in your defence ?
It's complicated. Fundamentally, it's the image which is unlawful, not the person in possession of it. That is, an unlawful image is unlawful for anyone to possess (unless they have a defence), irrespective of their motives. If someone had in their possession a library of medical textbooks which included photos of naked children, then that would not be an offence even if they were routinely rubbing one out while looking at those photos.
However, what complicates it is the fact that the courts have decided that making a digital copy of a photo counts as "making" for the purposes of the law[1]. And the reason this complicates it is that this means that the circumstances of the creation of the copy are relevant, just as much as the circumstances of the creation of the original. So if someone was making a collection of copies of photos sourced from medical textbooks, then those collected photos could, potentially, be deemed unlawful even if the originals were not.
Your first two paragraphs contradict one another. First you say it is the
image itself that is indecent, not the circumstances of its production.
No, the circumstances of the production are a part of what makes it
indecent. That was part of the judgment in R v Graham-Kerr. That is, that a photo of mere nakedness in a "legitimate setting" is not indecent. That is a ruling of fact by a precedent-setting court, and it's not open to a junior court to disregard it.
On 29 Oct 2023 at 22:31:01 GMT, "Mark Goodge" <usenet@listmail.good-stuff.co.uk> wrote:
On Sun, 29 Oct 2023 16:54:53 -0000 (UTC), Jethro_uk <jethro_uk@hotmailbin.com> wrote:
On Sun, 29 Oct 2023 12:38:50 +0000, The Todal wrote:
If you, as an ordinary member of the public, were to scan a photo of a naked child from a medical textbook and save it to a folder on your computer, then you would be at risk of being prosecuted and convicted. The circumstances in which the photo was originally "taken" are actually irrelevant.
Am I wrong in believing that you are not allowed to use the provenance of the image (e.g. the fact it came from a legitimate textbook) in your defence ?
It's complicated. Fundamentally, it's the image which is unlawful, not the person in possession of it. That is, an unlawful image is unlawful for anyone to possess (unless they have a defence), irrespective of their motives. If someone had in their possession a library of medical textbooks which included photos of naked children, then that would not be an offence even if they were routinely rubbing one out while looking at those photos.
However, what complicates it is the fact that the courts have decided that making a digital copy of a photo counts as "making" for the purposes of the law[1]. And the reason this complicates it is that this means that the circumstances of the creation of the copy are relevant, just as much as the circumstances of the creation of the original. So if someone was making a collection of copies of photos sourced from medical textbooks, then those collected photos could, potentially, be deemed unlawful even if the originals were not.
Your first two paragraphs contradict one another. First you say it is the image itself that is indecent, not the circumstances of its production. Then you say that when making a copy it is the circumstances of the copy that make it indecent. This is, to say the least, inconsistent. Note, I am not blaming you for this inconsistency. Winston Smith had the same problem when trying to be a loyal citizen and expound doublethink.
[2] Not because I have any particular sympathy for perverts who collect indecent images, but simply because, as an IT professional, the idea that copying a file is the same as creating a file seems to me to be utterly bizarre. And it's also directly opposite to established legislation and case law in the realm of Intellectual Property, where it's firmly established that merely making a copy is *not* creating something new.
While I agree about copying not being making, I don't think copyright law helps us at all. It is no defence to breaching copyright law that one has used the copied item to make an otherwise wholly original and superior work.
On Sun, 29 Oct 2023 12:38:50 +0000, The Todal wrote:
Medical textbooks may contain many "indecent" photographs. However, if
there is a legitimate excuse for including them in a publication (eg to
educate and instruct doctors) then there is unlikely to be a conviction,
so no need for a prosecution.
If you, as an ordinary member of the public, were to scan a photo of a
naked child from a medical textbook and save it to a folder on your
computer, then you would be at risk of being prosecuted and convicted.
The circumstances in which the photo was originally "taken" are actually
irrelevant.
Am I wrong in believing that you are not allowed to use the provenance of
the image (e.g. the fact it came from a legitimate textbook) in your
defence ?
I would argue that anyone who makes their career out of becoming an expert on the indecency of photographs has an overwhelming vested interest in there being such a thing as an "indecent photograph".
The stages in the decision are: (a) did the defendant intend to take that photo (or save it onto his computer)?
Consider this case:
https://www.bailii.org/ew/cases/EWCA/Crim/2011/461.html
On 30/10/2023 15:02, Mark Goodge wrote:
No, the circumstances of the production are a part of what makes it
indecent. That was part of the judgment in R v Graham-Kerr. That is, that a photo of mere nakedness in a "legitimate setting" is not indecent. That is a ruling of fact by a precedent-setting court, and it's not open to a junior court to disregard it.
Once again, you are wrong. You are misleading readers about the ratio decidendi of the Graham-Kerr case. This is irresponsible of you. Some
might actually act to their detriment as a result of such advice.
The judges did not say that a photo of mere nakedness in a legitimate
setting is not indecent. No such precedent has ever been set.
On 30/10/2023 13:18, The Todal wrote:
The stages in the decision are: (a) did the defendant intend to take
that photo (or save it onto his computer)?
I don't recall many of the details, but wasn't this image just a
thumbnail? Presumably, the caches on all our computers are stuffed with thumbnails we didn't even notice?
On Mon, 30 Oct 2023 15:13:39 +0000, The Todal <the_todal@icloud.com> wrote:
On 30/10/2023 15:02, Mark Goodge wrote:
No, the circumstances of the production are a part of what makes it indecent. That was part of the judgment in R v Graham-Kerr. That is, that a photo of mere nakedness in a "legitimate setting" is not indecent. That is a ruling of fact by a precedent-setting court, and it's not open to a junior court to disregard it.
Once again, you are wrong. You are misleading readers about the ratio
decidendi of the Graham-Kerr case. This is irresponsible of you. Some
might actually act to their detriment as a result of such advice.
The judges did not say that a photo of mere nakedness in a legitimate
setting is not indecent. No such precedent has ever been set.
Well, their actual words were that nakedness in a legitimate setting does
not in itself give rise to a pornographic image. I agree that "pornographic" is not the same word as "indecent", and some might argue that an image can
be indecent even though it is not pornographic. There were other aspects to the case, one of which was that the motive of the photographer was not a relevant consideration. So I also agree that the question of legitimacy is not the only consideration. But, nonetheless, the offence for which Mr Graham-Kerr had originally been convicted was making an indecent image, and that was the conviction overturned on appeal. So the court's ultimate conclusion was that the image was not indecent (since the fact of it having been taken by Mr Graham-Kerr was not in dispute), and the comments related to pornography
have to be interpreted in that light. And this is the interpretation placed on it by the IWF, among others.
Now, you may not be persuaded by the IWF's opinions, but you cannot deny
that they have some degree of authority in this field. And, in particular, this case (and others like it) caused the IWF to change their own policies and adopt a narrower definition of material that warrants being blocked. Given that the majority of criticism of the IWF revolves around overreach rather than underreach, I think that an explicit change of their policy to narrow their definitions is worthy of note.
On 30/10/2023 16:03, GB wrote:
On 30/10/2023 13:18, The Todal wrote:
The stages in the decision are: (a) did the defendant intend to take
that photo (or save it onto his computer?
I don't recall many of the details, but wasn't this image just a
thumbnail? Presumably, the caches on all our computers are stuffed
with thumbnails we didn't even notice?
Are you referring to the David Mould case? If so, it was certainly a
live issue at trial as to whether the image had been deliberately saved
to the computer.
"The material disclosed in the exhibits showed that it was more likely
that he had created the .bmp file deliberately rather than accidentally,
as he claimed".
If you have a full transcript of the Graham-Kerr Court of Appeal
decision then please quote from it or provide a link. I have referred to
a service called "Current Law" which provides a reliable precis of each important piece of case law, but does not quote the full judgment.
In the Graham-Kerr case the Court of Appeal did *not* rule that the photographs of naked children at a swimming pool were not indecent.
They allowed his appeal because the judge had misdirected the jury by saying that the Defendant's admission that he found the photographs sexually stimulating was relevant to whether the photographs were indecent. His motive, his admissions about his sexual attraction to children, should have been excluded from the material put before the jury.
Plainly, he was very lucky. He won his appeal not because the
photographs were innocuous but because the judge f*cked up.
I think it might reasonably be said that it should not be up to the IWF
or the CPS to decide whether a photograph is or is not lawful, in the
manner of the Lord Chamberlain.
Fashions change and one day it might be
possible to collect photos of naked children (photos that are thought to
be beautiful as distinct from those abhorrent ones that show sexual
abuse) and not face prosecution. The prosecutors are in effect trying to preserve in aspic the attitudes of society at the time when the
legislation was passed. Whether I personally approve of the IWF's
opinions is not relevant. I have no interest in collecting photographs
of that sort.
On 30/10/2023 15:29, The Todal wrote:
Consider this case:
https://www.bailii.org/ew/cases/EWCA/Crim/2011/461.html
"Count 6 related to a DVD depicting an adult male of large proportions >penetrating with his penis the anus of a female child who had not yet >achieved puberty. The reason why the Recorder directed an acquittal on
that count was his acceptance that there was no evidence that one of the >relevant statutory criteria was satisfied, namely that the image
portrayed in an explicit and realistic way "an act which results or is
likely to result in serious injury to a person's anus" (section
60(7)(b)). As the Recorder explained to the jury, there were no sounds
of distress (there was no sound track to the DVD) and there were no
obvious signs of distress displayed in the body language of the subject.
No expert evidence had been called to suggest that serious injury would
be likely to result from the act depicted, and the disparity in size
between the man and the female in the image was insufficient to provide
a proper evidential basis for conviction."
Surely, the CPS failed badly here?
On Mon, 30 Oct 2023 16:18:56 +0000, GB <NOTsomeone@microsoft.invalid> wrote:
On 30/10/2023 15:29, The Todal wrote:
Consider this case:
https://www.bailii.org/ew/cases/EWCA/Crim/2011/461.html
"Count 6 related to a DVD depicting an adult male of large proportions
penetrating with his penis the anus of a female child who had not yet
achieved puberty. The reason why the Recorder directed an acquittal on
that count was his acceptance that there was no evidence that one of the
relevant statutory criteria was satisfied, namely that the image
portrayed in an explicit and realistic way "an act which results or is
likely to result in serious injury to a person's anus" (section
60(7)(b)). As the Recorder explained to the jury, there were no sounds
of distress (there was no sound track to the DVD) and there were no
obvious signs of distress displayed in the body language of the subject.
No expert evidence had been called to suggest that serious injury would
be likely to result from the act depicted, and the disparity in size
between the man and the female in the image was insufficient to provide
a proper evidential basis for conviction."
Surely, the CPS failed badly here?
It looks like it, yes. Had they gone for a charge of simple possession of
the DVD, it would have been a slam-dunk conviction. And, given the content, the sentence would have been at the upper end of the range. But by going for the more serious charge despite the absence of the evidence necessary to
make it stick, they not only lost that but lost the opportunity to prosecute the less serious.
I also suspect that the CPS's mistake with the DVD charge is part of what
led the jury to reach a legally untenable conclusion on the other charges. Reading between the lines a bit (and speculating, of course, since we can never know for certain what was talked about in the jury room) I have a feeling that their reasoning went something along the lines of "Well, we
know he's a wrong-un, the DVD proves that, but we can't convict him for that because the CPS chose the wrong charge, so instead we'll convict him on the others to make sure that the pedo gets his just deserts". And, from a
purely moral perspective, that's not necessarily invalid. It would be the "right" result, albeit reached via the wrong process. But, of course, the
law doesn't allow that. Even though we know he's a wrong-un, we can't
convict him of a crime he hasn't committed just to make up for the fact
that, due to someone else's error, he's got away with a crime he has committed.
Mark
"Max Demian" <max_demian@bigfoot.com> wrote in message news:uho6bo$ei7r$2@dont-email.me...
Does anyone know the difference between a paraphilia and a fetish?
A paraphilia is what somebody wearing a white coat would call it.
A fetish is what somebody wearing a rubber suit would call it.