On 9/16/2021 10:40 AM, Wilf wrote:
<snip>
The only thing is, and this is something I've seen elsewhere, so it's
not my own, could a malevolent government insist that other hashes
were included in the database and therefore hijack the purpose of
this move?
That's one of the objections to this back door (or whatever anyone
wants to call it), but not the only one. And the question is not "could
a malevolent government insist...?"; it's a virtual certainty that they
would insist.
The real question is whether Apple would cave if such a government insisted.
stock up on the tin foil. you are running low.
On Sep 16, 2021, nospam wrote
(in article<news:160920211410457746%nospam@nospam.invalid>):
And the question is not "could a
malevolent government insist...?," it's a virtual certainty that they
would insist.
they might insist, but the question is will they succeed.
the answer is a resounding no.
Why then did Apple say they follow all the laws everywhere iPhones are sold?
On Sep 16, 2021, nospam wrote
(in article<news:160920211425149884%nospam@nospam.invalid>):
Why then did Apple say they follow all the laws everywhere iPhones are sold?
they do.
Then Apple isn't going to say no to governments who expand these back doors
Ron, the humblest guy in town.
In article <p8flgfea8xm5$.dlg@news.solani.org>, RonTheGuy
<ron@null.invalid> wrote:
Apple can't control malware & government coders who open these back doors.
there aren't any backdoors to open.
seriously, read the white papers and learn what a backdoor actually is
before you go throwing around words you don't understand.
On 9/16/2021 11:22 AM, RonTheGuy wrote:
On Sep 16, 2021, nospam wrote
(in article<news:160920211410457746%nospam@nospam.invalid>):
And the question is not "could a
malevolent government insist...?," it's a virtual certainty that they
would insist.
they might insist, but the question is will they succeed.
the answer is a resounding no.
Why then did Apple say they follow all the laws everywhere iPhones are sold?
Is Apple now going to say a "resounding no" to the laws of those countries?
Based on the past, the answer is that it's likely that Apple would
comply with the insistence of a government.
On 16/09/2021 at 10:39, Wilf wrote:
On 16/09/2021 at 03:23, Lewis wrote:
In message <16y8ioruep95q$.dlg@news.solani.org> RonTheGuy <ron@null.invalid> wrote:
On Sep 15, 2021, nospam wrote
(in article<news:150920210158598871%nospam@nospam.invalid>):
there are multiple ways to opt-out of csam checking,
How did you opt out to hackers & governments using the back doors?
There is no back door.
I've been watching this conversation and I really am perplexed, so
I'm hoping that one of you knowledgeable folk here can clarify for me
and for others, maybe.
Why do some people think there is or is going to be a "back door"? What
do they classify as a "back door", and why do some here say there is no
"back door"? TIA.
Very helpful and interesting responses, thank you all. And, yes, it was
a genuine query.
On Sep 16, 2021, Jolly Roger wrote
(in article<news:iqh9l9F84bcU1@mid.individual.net>):
Since these are all optional services, opting out is very easy to do.
How do you opt out of malware code
Ron, the dimmest guy in town.
read the white papers on how it actually works
The reason for the emergency white papers
was for Apple to explain to a
sceptical world how Apple hopes it would work.
Governments and malware writers will read Apple's white papers to learn how Apple expects the software to be used.
But why do you think malware writers & governments will care what Apple can only wish they'd do?
Maybe a 'Front Door'? But
it could allow governments to insist that other hashes be included in
the database used to identify offending pictures.
multiple governments would need to conspire against the same images
(good luck getting multiple governments to agree on anything) *and* get
the manual review team to go along with it.
<https://i1.wp.com/9to5mac.com/wp-content/uploads/sites/6/2021/08/Screen-Shot-2021-08-13-at-2.01.20-PM.jpeg>
Thank you, that does make sense.
Except it doesn't when considering Islamic countries who find the very same things objectionable under their Sharia law.
And it doesn't even have to
be images, how long before it gets expanded to keywords?
The iMessage
scanning has the potential to be the one focused on by governments.
On Sep 16, 2021, nospam wrote
(in article<news:160920211452116867%nospam@nospam.invalid>):
stock up on the tin foil. you are running low.
What do you know that many experts at the EFF & elsewhere don't know?
In article <nf5ajfbfm3tz.dlg@news.solani.org>, RonTheGuy
<ron@null.invalid> wrote:
Since these are all optional services, opting out is very easy to do.
How do you opt out of malware code & government use of the new back doors?
what part of there is no backdoor is unclear?
did you even read the white papers? (no, you didn't).
do you know what a backdoor is? (no, you don't).
are you going to keep repeating the same bullshit? (yes, you are).
On Sep 16, 2021, nospam wrote
(in article<news:160920211452126928%nospam@nospam.invalid>):
are you going to keep repeating the same bullshit? (yes, you are).
You keep repeating you feel these new back doors aren't really back doors.
Ron, the most foolish guy in town.
nospam <nospam@nospam.invalid> wrote:
In article <nf5ajfbfm3tz.dlg@news.solani.org>, RonTheGuy
<ron@null.invalid> wrote:
Since these are all optional services, opting out is very easy to do.
How do you opt out of malware code & government use of the new back doors?
what part of there is no backdoor is unclear?
did you even read the white papers? (no, you didn't).
do you know what a backdoor is? (no, you don't).
are you going to keep repeating the same bullshit? (yes, you are).
So you are saying there is no way in hell this new csam scanning proposal could ever be used nefariously by anyone?
they did add an faq to clarify things
The answer is "no".
Apple is on record refusing similar requests.
In article <si04l5$v6o$1@dont-email.me>, badgolferman <REMOVETHISbadgolferman@gmail.com> wrote:
Since these are all optional services, opting out is very easy to do.
How do you opt out of malware code & government use of the new back doors?
what part of there is no backdoor is unclear?
did you even read the white papers? (no, you didn't).
do you know what a backdoor is? (no, you don't).
are you going to keep repeating the same bullshit? (yes, you are).
So you are saying there is no way in hell this new csam scanning proposal
could ever be used nefariously by anyone?
nothing is 100% perfect so it can't be described as 'no way in hell',
but the number of obstacles to overcome is extraordinarily high to be
able to do it.
Unfortunately perception is 90% of reality and in this case the public and security experts perceive this as a potential back door.
Your constant
denials won't change that perception and only sound like damage control.
Apple is on record confirming they always comply with all laws.
On Sep 16, 2021, Jolly Roger wrote
(in article<news:iqhj1lF9rulU3@mid.individual.net>):
The answer is "no".
What makes you think malware writers care
Ron, the most gullible fearful fool in town.
On Sep 16, 2021, Jolly Roger wrote
(in article<news:iqhj7uF9rulU6@mid.individual.net>):
Apple is on record refusing similar requests.
Apple is on record confirming they always comply with all laws.
Ron, the biggest dipshit troll in town.
Apple is on record confirming they always comply with all laws.
as any ethical company should be.
So you are saying there is no way in hell this new csam scanning proposal could ever be used nefariously by anyone?
nothing is 100% perfect so it can't be described as 'no way in hell',
but the number of obstacles to overcome is extraordinarily high to be
able to do it.
So then why do it?
No one wants it but Apple.
Do your csam scanning on
iCloud just like everyone else. There will be a lot less outrage from the general public than scanning their personal devices.
Apple is on record confirming they always comply with all laws.
as any ethical company should be.
Then let's hope this "on hold" decision by Apple becomes permanent.
nospam <nospam@nospam.invalid> wrote:
In article <si04l5$v6o$1@dont-email.me>, badgolferman
<REMOVETHISbadgolferman@gmail.com> wrote:
Since these are all optional services, opting out is very easy to
do.
How do you opt out of malware code & government use of the new
back doors?
what part of there is no backdoor is unclear?
did you even read the white papers? (no, you didn't). do you know
what a backdoor is? (no, you don't). are you going to keep
repeating the same bullshit? (yes, you are).
So you are saying there is no way in hell this new csam scanning
proposal could ever be used nefariously by anyone?
nothing is 100% perfect so it can't be described as 'no way in hell',
but the number of obstacles to overcome is extraordinarily high to be
able to do it.
So then why do it?
No one wants it but Apple.
Do your csam scanning on iCloud just like everyone else.
There will be a lot less outrage
stop babbling about things you clearly do not understand at all.
In article <si05gl$62t$1@dont-email.me>, badgolferman
<REMOVETHISbadgolferman@gmail.com> wrote:
Unfortunately perception is 90% of reality and in this case the
public and security experts perceive this as a potential back door.
perception of a backdoor does not make it one.
if there's any denials, it's from those who deny what the facts
actually are.
the information is out there, and has been all along.
How is it you read the same Apple white papers and Apple faq on how Apple hopes it will work that thousands of security experts at EFF and around the world read, and you come to a different conclusion than those experts did?
put simply, apple's system is significantly more difficult for someone
to use it for nefarious purposes.
protecting children isn't something we should stop doing just because a
few gullible idiots are "outraged" by it anyway.
Why do you think Apple put this on hold if there are no problems with it?
In article <si0781$i3a$1@dont-email.me>, badgolferman <REMOVETHISbadgolferman@gmail.com> wrote:
So you are saying there is no way in hell this new csam scanning proposal could ever be used nefariously by anyone?
nothing is 100% perfect so it can't be described as 'no way in hell',
but the number of obstacles to overcome is extraordinarily high to be
able to do it.
So then why do it?
unfortunately, there are sexual predators in this world
No one wants it but Apple.
no one wants to remove sexual predators from society? really?? is that
your claim?
Do your csam scanning on
iCloud just like everyone else. There will be a lot less outrage from the
general public than scanning their personal devices.
along with a lot less privacy, with a system that can't be audited.
with what you suggest, a government or other entity could force apple
to expand the search parameters and nobody would ever know.
with apple's proposed system, there are multiple stages where any
changes as to what's going on would be detected.
put simply, apple's system is significantly more difficult for someone
to use it for nefarious purposes.
the eff is pushing their propaganda story to gain popularity and
donations. it's that simple.
Why do you think Apple put this on hold if there are no problems with it?
because there are a lot of very vocal people with an agenda pushing disinformation for their own gains.
No one wants it but Apple.
no one wants to remove sexual predators from society? really?? is that
your claim?
No one wants that system on their personal devices.
Do your csam scanning on
iCloud just like everyone else. There will be a lot less outrage from the general public than scanning their personal devices.
along with a lot less privacy, with a system that can't be audited.
with what you suggest, a government or other entity could force apple
to expand the search parameters and nobody would ever know.
That would be better because they will catch more perverts or criminals. As it is they will create new content or just not backup to iCloud.
with apple's proposed system, there are multiple stages where any
changes as to what's going on would be detected.
put simply, apple's system is significantly more difficult for someone
to use it for nefarious purposes.
It's basically the same thing as turning on the camera and microphone to
look in on your private matters. It's an unauthorized scan of your personal device.
Why were there hundreds of privacy organizations against what Apple did?
Unfortunately perception is 90% of reality and in this case the public and security experts perceive this as a potential back door. Your constant denials won't change that perception and only sound like damage control.
So then why do it? No one wants it but Apple. Do your csam scanning on
iCloud just like everyone else. There will be a lot less outrage from the general public than scanning their personal devices.
On Sep 16, 2021, nospam wrote
(in article<news:160920211519244822%nospam@nospam.invalid>):
stop babbling about things you clearly do not understand at all.
How is it you read the same Apple white papers and Apple faq on how Apple hopes it will work that thousands of security experts at EFF and around the world read, and you come to a different conclusion than those experts did?
In article <si095h$101$1@dont-email.me>, badgolferman <REMOVETHISbadgolferman@gmail.com> wrote:
No one wants it but Apple.
no one wants to remove sexual predators from society? really?? is that
your claim?
No one wants that system on their personal devices.
only because they don't understand how it actually works and why it's
better than what google, facebook, etc. are currently doing.
Do your csam scanning on
iCloud just like everyone else. There will be a lot less outrage from the general public than scanning their personal devices.
along with a lot less privacy, with a system that can't be audited.
with what you suggest, a government or other entity could force apple
to expand the search parameters and nobody would ever know.
That would be better because they will catch more perverts or criminals. As it is they will create new content or just not backup to iCloud.
if apple was forced to expand their search parameters, the very thing
you don't want to happen, it could catch a *lot* of criminals.
worse, you'd never know they expanded it or what might qualify as
illegal.
that's the system you want?
with apple's proposed system, there are multiple stages where any
changes as to what's going on would be detected.
^this^
put simply, apple's system is significantly more difficult for someone
to use it for nefarious purposes.
It's basically the same thing as turning on the camera and microphone to
look in on your private matters. It's an unauthorized scan of your personal device.
no, it's nothing at all like that.
only photos uploaded to icloud are checked.
photos not uploaded are never checked.
nospam <nospam@nospam.invalid> wrote:
In article <si0781$i3a$1@dont-email.me>, badgolferman
<REMOVETHISbadgolferman@gmail.com> wrote:
So you are saying there is no way in hell this new csam scanning
proposal could ever be used nefariously by anyone?
nothing is 100% perfect so it can't be described as 'no way in
hell', but the number of obstacles to overcome is extraordinarily
high to be able to do it.
So then why do it?
unfortunately, there are sexual predators in this world
No one wants it but Apple.
no one wants to remove sexual predators from society? really?? is
that your claim?
No one wants that system on their personal devices.
Do your csam scanning on iCloud just like everyone else. There will
be a lot less outrage from the general public than scanning their
personal devices.
along with a lot less privacy, with a system that can't be audited.
with what you suggest, a government or other entity could force apple
to expand the search parameters and nobody would ever know.
That would be better
with apple's proposed system, there are multiple stages where any
changes as to what's going on would be detected.
put simply, apple's system is significantly more difficult for
someone to use it for nefarious purposes.
It's basically the same thing as turning on the camera and microphone
to look in on your private matters
On Sep 16, 2021, Jolly Roger wrote
(in article<news:iqhm3bF9rulU17@mid.individual.net>):
protecting children isn't something we should stop doing just because
a few gullible idiots are "outraged" by it anyway.
What else does Apple need to do
On Sep 16, 2021, nospam wrote
(in article<news:160920211519244822%nospam@nospam.invalid>):
stop babbling about things you clearly do not understand at all.
How is it you read the same Apple white papers and Apple faq on how
Apple hopes it will work that thousands of security experts at EFF and
around the world read, and you come to a different conclusion than
those experts did?
On Sep 16, 2021, nospam wrote
(in article<news:160920211610097573%nospam@nospam.invalid>):
the eff is pushing their propaganda story to gain popularity and
donations. it's that simple.
Was it only the EFF who was adamantly against what Apple was planning
to do?
The EFF doesn't look at perception, they look at reality. And to be
fair, they did at least state that it is a "narrow back door."
A "white paper," published by the entity that is opening the backdoor in
the first place, doesn't have much influence on the very smart people at
EFF. The old joke of "we're shipping data sheets in volume," applies
equally to "white papers."
"When ignorance screams, intelligence shuts up"
If they're being uploaded to iCloud then check them there.
No need to check them on your private personal device and use up
system resources.
How do you select which photos get uploaded and which don't anyway?
Isn't it all or nothing?
Do your csam scanning on
iCloud just like everyone else. There will be a lot less outrage from the
general public than scanning their personal devices.
along with a lot less privacy, with a system that can't be audited.
with what you suggest, a government or other entity could force apple
to expand the search parameters and nobody would ever know.
That would be better because they will catch more perverts or criminals. As
it is they will create new content or just not backup to iCloud.
if apple was forced to expand their search parameters, the very thing
you don't want to happen, it could catch a *lot* of criminals.
worse, you'd never know they expanded it or what might qualify as
illegal.
It's still better than invading my personal device.
If they're being uploaded to iCloud then check them there. No need to check them on your private personal device and use up system resources.
How do you select which photos get uploaded and which don't anyway? Isn't
it all or nothing?
Not everyone has the critical thinking skills and expertise of those at
the EFF. Those that are endlessly insisting that there is no "back door"
lack the knowledge to understand how all this works and the enormous potential for abuse. Or they really do understand and for some reason
feel compelled to defend the initial proposal. You choose.
Not everyone has the critical thinking skills and expertise of those at
the EFF. Those that are endlessly insisting that there is no "back door"
lack the knowledge to understand how all this works and the enormous
potential for abuse. Or they really do understand and for some reason
feel compelled to defend the initial proposal. You choose.
the correct choice is you don't understand anything about csam or
iphones in general, refuse to learn, and resort to logical fallacies
and insults because you have absolutely nothing to back up your
baseless and often laughable claims.
You don't seem to mind the constant insults being spewed by Jolly Roger, Lewis, Rod Speed. Why don't you chastise them too?
In article <si0arc$c1m$1@dont-email.me>, sms
<scharf.steven@geemail.com> wrote:
Not everyone has the critical thinking skills and expertise of those at
the EFF. Those that are endlessly insisting that there is no "back door"
lack the knowledge to understand how all this works and the enormous
potential for abuse. Or they really do understand and for some reason
feel compelled to defend the initial proposal. You choose.
the correct choice is you don't understand anything about csam or
iphones in general, refuse to learn, and resort to logical fallacies
and insults because you have absolutely nothing to back up your
baseless and often laughable claims.
only because they don't understand how it actually works and why it's
better than what google, facebook, etc. are currently doing.
only photos uploaded to icloud are checked.
photos not uploaded are never checked.
No thanks. Most of Apple's customers don't want Apple to decrypt their
data on Apple's servers - and neither does Apple.
Not everyone has the critical thinking skills and expertise of those at
the EFF. Those that are endlessly insisting that there is no "back door"
lack the knowledge to understand how all this works and the enormous potential for abuse. Or they really do understand and for some reason
feel compelled to defend the initial proposal. You choose.
i do, but other than rod speed, they're usually correct in their facts.
only because they don't understand how it actually works and why it's better than what google, facebook, etc. are currently doing.
It's telling how the apologists always claim Apple is as bad as Facebook
and Google are (and, in this case, far worse).
On 16/09/2021 at 17:28, Jolly Roger wrote:
On 2021-09-16, Lewis <g.kreme@kreme.dont-email.me> wrote:
In message <shv3ci$2b5$1@gioia.aioe.org> Wilf <wilf@postingx.uk> wrote:
On 16/09/2021 at 03:23, Lewis wrote:
In message <16y8ioruep95q$.dlg@news.solani.org> RonTheGuy
<ron@null.invalid> wrote:
On Sep 15, 2021, nospam wrote
(in article<news:150920210158598871%nospam@nospam.invalid>):
there are multiple ways to opt-out of csam checking,
How did you opt out to hackers & governments using the back doors?
There is no back door.
I've been watching this conversation and I really am perplexed, so
I'm hoping that one of you knowledgeable folk here can clarify for me
and for others, maybe.
Why do some people think there is or is going to be a "back door"?
A few possible reasons:
o They are Androol trolls
o They only read headlines
o They only read posts from Androol trolls
o They don't know what a back door is.
What do they classify as a "back door", and why do some here say there is
no "back door"? TIA.
A back door is a hidden feature or a flaw in software that allows
unknown and unauthorized access to a computer's data. There is no
hidden feature or flaw, no unknown access, no unauthorized access, no
back door.
Assuming Wilf really wants to know how Apple's proposed system works:
Photo Library Scanning
* photos being uploaded to Apple's iCloud servers from your photo
library are examined on device just before upload (rather than after
upload on the server); they are examined on device rather than in the
cloud by generating a hash of the photo being uploaded and comparing
that hash with an on-device hash database of known child abuse images
* end users cannot access, view, or modify the database of known child
sexual abuse photos, nor can they identify which images were flagged
* photos that match the hashes of known child sexual abuse photos are
flagged as potential violations by generating an encrypted safety
voucher containing metadata about the matching photo and a visual
derivative of the photo with sensitive portions obscured; the safety
voucher is also uploaded to iCloud
* your photos in iCloud are encrypted both in transit and at rest on the
servers, and Apple does not decrypt them or access them on the servers
in order to scan them, which means Apple employees know absolutely
nothing about photos that do not match hashes of known child sexual
abuse material, nor do they know anything about photos that are not
uploaded to Apple servers
* the risk of the system incorrectly flagging an account is extremely
low (1 in 1 trillion probability of incorrectly flagging a given
account)
* only accounts with safety vouchers that exceed a threshold of multiple
(over 30) matches to known child sexual abuse photos are able to be
reviewed by Apple employees; until this threshold is exceeded, the
encrypted vouchers cannot be viewed by anyone, thanks to an encryption
technology called private set intersection (also known as threshold
secret sharing)
* only vouchers that are reviewed and verified by a human being to be
actual child abuse material are forwarded to authorities
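For anyone who wants a mental model, the match-and-threshold flow described above can be sketched in a few lines. This is an illustrative toy only: the SHA-256 exact-match hashing, the voucher dictionaries, and the database contents are stand-ins of my own, since Apple's actual system uses a perceptual NeuralHash and private set intersection rather than anything this simple.

```python
# Toy model of upload-time hash matching with a review threshold.
# NOT Apple's implementation: real vouchers are encrypted and opaque,
# and matching uses a perceptual hash, not SHA-256.
import hashlib

THRESHOLD = 30  # matches must exceed this before human review is possible

# Hypothetical database of known-image hashes (opaque to the user on a real device)
known_hashes = {hashlib.sha256(f"known-{i}".encode()).hexdigest() for i in range(100)}

def make_voucher(photo_bytes):
    """Create a 'safety voucher' at upload time, recording whether the
    photo's hash matched the on-device database."""
    digest = hashlib.sha256(photo_bytes).hexdigest()
    return {"hash": digest, "matched": digest in known_hashes}

def account_reviewable(vouchers):
    """Vouchers stay unreadable until the match count exceeds the threshold."""
    return sum(v["matched"] for v in vouchers) > THRESHOLD

library = [make_voucher(f"holiday-{i}".encode()) for i in range(1000)]
print(account_reviewable(library))   # False: no matches among ordinary photos
library += [make_voucher(f"known-{i}".encode()) for i in range(31)]
print(account_reviewable(library))   # True: 31 matches exceeds the threshold of 30
```

Note the design point this illustrates: a handful of accidental matches never makes an account reviewable; only crossing the threshold does.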
Child Protection in Messages
* this feature is only available for accounts set up as families with
iCloud Family Sharing
* parent/guardian accounts must opt in to turn on the feature for their
family group
* parental notifications can only be enabled by parents/guardians for
child accounts age 12 or younger
* when enabled, the feature uses on-device AI (rather than the
aforementioned database of hashes) to analyze images that are sent and
received by the child - this happens privately so that Apple does not
have any knowledge of it
* if a child account sends or receives sexually explicit images, the
photo will be blurred and the child will be warned, presented with
helpful resources, and reassured it is okay if they do not want to
view or send the photo
* if parents desire (they must opt-in), as additional precaution, young
children can also be told that, to make sure they are safe, their
parents will get a message if they do view it
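As a rough sketch of that opt-in Messages flow: the score threshold, field names, and return shape below are hypothetical stand-ins, since the on-device classifier's internals are not public; only the decision logic (opt-in gating, blur-and-warn, and the age-12 cutoff for parental notification) follows the description above.

```python
# Toy sketch of the opt-in Messages child-safety flow. All names and the
# 0.9 score cutoff are illustrative assumptions, not Apple's code.
def handle_image(explicit_score, feature_enabled, child_age, parent_opt_in):
    """Decide whether to blur/warn, and whether a parent is notified."""
    if not feature_enabled or explicit_score < 0.9:
        # Feature off, or the on-device classifier did not flag the image.
        return {"blurred": False, "warned": False, "parent_notified": False}
    # Flagged: blur the image, warn the child, offer resources.
    # Parents are notified only if they opted in AND the child is 12 or younger.
    return {"blurred": True, "warned": True,
            "parent_notified": parent_opt_in and child_age <= 12}

# A 13-year-old's flagged image never notifies parents, even when opted in:
print(handle_image(0.95, True, 13, True))
# {'blurred': True, 'warned': True, 'parent_notified': False}
```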
With this in mind, we know that if this matching activity concerns you,
you can opt out by refraining from uploading photos to iCloud (by
disabling iCloud Photos, My Photo Stream, and iMessage).
* Since these are all optional services, opting out is very easy to do.
* Claims stating that Apple is supposedly scanning your entire device
24/7 are unfounded.
* Claims that Apple is scanning every single photo on your device are
also unfounded.
Thank you, that is very interesting. And yes, I do "want to know"!!!
The only thing is, and this is something I've seen elsewhere, so it's not
my own, could a malevolent government insist that other hashes were
included in the database and therefore hijack the purpose of this move?
nospam wrote
it's designed specifically to prevent that.
Apple can't control malware & government coders who open these back doors.
nospam <nospam@nospam.invalid> wrote
Lewis <g.kreme@kreme.dont-email.me> wrote
What do they classify as a "back door", and why do some here say there is no "back door"? TIA.
A back door is a hidden feature or a flaw in software that allows
unknown
and unauthorized access to a computer's data. There is no hidden
feature or flaw, not unknown access, no unauthorized access, no back
door.
yep. it's the very opposite.
It is the juiciest piece of meat ever dangled in front of hackers.
Even if it doesn't meet the definition of a back door in your mind now,
it won't be long before it will become one.
On 2021-09-16, Wilf <wilf@postingx.uk> wrote:
Thank you, that is very interesting. And yes, I do "want to know"!!!
That's comforting, because the trolls here are definitely *not*
interested in having a rational, factual discussion about this. Their
primary interest is trolling.
The only thing is, and this is something I've seen elsewhere, so it's
not my own, could a malevolent government insist that other hashes were
included in the database and therefore hijack the purpose of this move?
No, because Apple would learn that the database was modified, and Apple employees who verify the secure safety vouchers as mentioned above would
see that the matching images are not actually CSAM. And if that happened
on a regular basis, it would be very obvious that the database is
tainted.
From Apple's FAQ (available at https://www.apple.com/child-safety):
---
Can the CSAM detection system in iCloud Photos be used to detect things
other than CSAM?
Our process is designed to prevent that from happening. CSAM detection
for iCloud Photos is built so that the system only works with CSAM image hashes provided by NCMEC and other child safety organizations. This
set of image hashes is based on images acquired and validated to be
CSAM by at least two child safety organizations. There is no automated reporting to law enforcement, and Apple conducts human review before
making a report to NCMEC. As a result, the system is only designed to
report photos that are known CSAM in iCloud Photos. In most countries, including the United States, simply possessing these images is a crime
and Apple is obligated to report any instances we learn of to the
appropriate authorities.
Could governments force Apple to add non-CSAM images to the hash list?
No. Apple would refuse such demands and our system has been designed to prevent that from happening. Apple's CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that have
been identified by experts at NCMEC and other child safety groups. The
set of image hashes used for matching are from known, existing images of
CSAM and only contains entries that were independently submitted by two
or more child safety organizations operating in separate sovereign jurisdictions. Apple does not add to the set of known CSAM image hashes,
and the system is designed to be auditable. The same set of hashes is
stored in the operating system of every iPhone and iPad user, so
targeted attacks against only specific individuals are not possible
under this design. Furthermore, Apple conducts human review before
making a report to NCMEC. In a case where the system identifies photos
that do not match known CSAM images, the account would not be disabled
and no report would be filed to NCMEC.
We have faced demands to build and deploy government-mandated changes
that degrade the privacy of users before, and have steadfastly refused
those demands. We will continue to refuse them in the future. Let us be clear, this technology is limited to detecting CSAM stored in iCloud and
we will not accede to any government's request to expand it.
Can non-CSAM images be "injected" into the system to identify accounts for things other than CSAM?
Our process is designed to prevent that from happening. The set of image hashes used for matching are from known, existing images of CSAM that
have been acquired and validated by at least two child safety
organizations. Apple does not add to the set of known CSAM image hashes.
The same set of hashes is stored in the operating system of every
iPhone and iPad user, so targeted attacks against only specific
individuals are not possible under our design. Finally, there is no
automated reporting to law enforcement, and Apple conducts human review before making a report to NCMEC. In the unlikely event of the system identifying images that do not match known CSAM images, the account
would not be disabled and no report would be filed to NCMEC.
Will CSAM detection in iCloud Photos falsely report innocent people to
law enforcement?
No. The system is designed to be very accurate, and the likelihood that
the system would incorrectly identify any given account is less than
one in one trillion per year. In addition, any time an account is
identified by the system, Apple conducts human review before making a
report to NCMEC. As a result, system errors or attacks will not result
in innocent people being reported to NCMEC.
---
That's comforting, because the trolls here are definitely *not*
interested in having a rational, factual discussion about this.
Thank you.
It's telling how the apologists always claim Apple is as bad as Facebook
and Google are (and, in this case, far worse).
that's quite the twist.
Jolly Roger <jollyroger@pobox.com> asked
No thanks. Most of Apple's customers don't want Apple to decrypt their
data on Apple's servers - and neither does Apple.
And yet decrypt your personal data is what Apple does every single day
nospam <nospam@nospam.invalid> asked
i do, but other than rod speed, they're usually correct in their facts.
You claim Jolly Roger, who couldn't even earn his high school diploma, is always correct in his facts?
Jolly Roger <jollyroger@pobox.com> asked
That's comforting, because the trolls here are definitely *not*
interested in having a rational, factual discussion about this.
Someone pinch me please...
You clearly don't know a thing about me, old fart.
Lies are all you have, loser.
Wilf <wilf@postingx.uk> wrote:
On 16/09/2021 at 19:10, nospam wrote:
In article <shvvpm$11i8$3@gioia.aioe.org>, Wilf <wilf@postingx.uk>
wrote:
do they classify as a "back door", and why do some here say there is no "back door"? TIA.
A back door is a hidden feature or a flaw in software that allows unknown and unauthorized access to a computer's data. There is no hidden feature or flaw, no unknown access, no unauthorized access, no back door.
Might that just be semantics, though?
nope. the definition is clear, except to those with agenda and the
usual trolls.
Ok, so if a back door is something
hidden, then this is not a back door per se.
exactly the point.
Maybe a 'Front Door'? But
it could allow governments to insist that other hashes be included in
the database used to identify offending pictures.
multiple governments would need to conspire against the same images
(good luck getting multiple governments to agree on anything) *and* get
the manual review team to go along with it.
<https://i1.wp.com/9to5mac.com/wp-content/uploads/sites/6/2021/08/Screen-Shot-2021-08-13-at-2.01.20-PM.jpeg>
Thank you, that does make sense.
Except it doesn't when considering Islamic countries
And it doesn't even have to be images
The iMessage scanning has the potential to be the one focused on by governments.
nospam <nospam@nospam.invalid> wrote:
In article <si04l5$v6o$1@dont-email.me>, badgolferman
<REMOVETHISbadgolferman@gmail.com> wrote:
Since these are all optional services, opting out is very easy to do.
How do you opt out of malware code & government use of the new back doors?
what part of there is no backdoor is unclear?
did you even read the white papers? (no, you didn't).
do you know what a backdoor is? (no, you don't).
are you going to keep repeating the same bullshit? (yes, you are).
So you are saying there is no way in hell this new csam scanning proposal could ever be used nefariously by anyone?
nothing is 100% perfect so it can't be described as 'no way in hell',
but the number of obstacles to overcome is extraordinarily high to be
able to do it.
So then why do it? No one wants it but Apple.
Do your csam scanning on iCloud just like everyone else.
There will be a lot less outrage from the general public than scanning
their personal devices.
nospam <nospam@nospam.invalid> wrote:
In article <slrnsk6n6p.22a.g.kreme@m1mini.local>, Lewis
<g.kreme@kreme.dont-email.me> wrote:
do they classify as a "back door", and why do some here say there is no "back door"? TIA.
A back door is a hidden feature or a flaw in software that allows unknown and unauthorized access to a computer's data. There is no hidden feature or flaw, no unknown access, no unauthorized access, no back door.
yep. it's the very opposite.
It is the juiciest piece of meat ever dangled in front of hackers. Even if
it doesn't meet the definition of a back door in your mind now it won't be
long before it will become one.
nospam <nospam@nospam.invalid> wrote:
In article <si0781$i3a$1@dont-email.me>, badgolferman
<REMOVETHISbadgolferman@gmail.com> wrote:
So you are saying there is no way in hell this new csam scanning proposal could ever be used nefariously by anyone?
nothing is 100% perfect so it can't be described as 'no way in hell',
but the number of obstacles to overcome is extraordinarily high to be
able to do it.
So then why do it?
unfortunately, there are sexual predators in this world
No one wants it but Apple.
no one wants to remove sexual predators from society? really?? is that
your claim?
No one wants that system on their personal devices.
put simply, apple's system is significantly more difficult for someone
to use it for nefarious purposes.
It's basically the same thing as turning on the camera and microphone to look in on your private matters.
It's an unauthorized scan of your personal device.
Ant <ant@zimage.comANT> wrote:
https://support.apple.com/en-us/HT212807 for v14.8's release notes.
Next, v15? ;)
After the csam debacle I'm suspicious of any future updates slipping that back door in without telling the user base.
You don't seem to mind the constant insults being spewed by Jolly Roger, Lewis, Rod Speed. Why don't you chastise them too?
What's more likely is that they end up doing is what Johns Hopkins
University cryptographer Matthew Green suggested in an article in Wired:
"If they feel they must scan, they should scan unencrypted files on
their servers," which is the standard practice for other companies, like Facebook, which regularly scan for not only CSAM but also terroristic
and other disallowed content types.
Green also suggests that Apple
should make iCloud storage end-to-end encrypted, so that it can't view
those images even if it wanted to."
In article <si1mbn$u1i$1@dont-email.me>, sms
Green also suggests that Apple
should make iCloud storage end-to-end encrypted, so that it can't view
those images even if it wanted to."
matthew green is not very smart, nor are you.
icloud is already encrypted, which prevents scanning server-side. they
are two incompatible concepts.
apple's csam system is designed so that images can be checked *and*
still continue with icloud being fully encrypted.
no, the core issue is that there are a small but vocal group of people
who do *not* care about facts nor do they have any interest in
learning, or they have an agenda, and then get very upset when they are presented with easily verifiable facts that show them to be completely wrong.
For so-called law enforcement it is quite easy to blackmail Apple.
And they will obey immediately. They already do.
iCloud encryption is nothing that really matters. It is PR-bullshit.
The core issue is that you have a very small group of people that accept anything a company says as gospel, and get very upset when anyone
challenges their beliefs with facts.
no, the core issue is that there are a small but vocal group of people
who do *not* care about facts nor do they have any interest in
learning, or they have an agenda, and then get very upset when they are presented with easily verifiable facts that show them to be completely wrong.
Why does that confession reveal what you have been doing all along?
On Sep 17, 2021, sms wrote
(in article<news:si1tmt$hjg$1@dont-email.me>):
The core issue is that you have a very small group of people that accept
anything a company says as gospel, and get very upset when anyone
challenges their beliefs with facts.
Can anybody make sense of why that "small group of people" accept anything Jobs said or Cook says "as gospel" & then they get upset when others don't?
RonTheGuy wrote:
Can anybody make sense of why that "small group of people" accept
anything Jobs said or Cook says "as gospel" & then they get upset
when others don't?
Apple is a religion, much like environmentalism and many other social positions today.
And the digital keys that unlock information on those computers are
stored in the data centers
In any case, the CSAM on-device scanning idea is almost certainly not
coming back. Apple will end up doing much the same thing that Facebook
does when it comes to detecting and reporting CSAM.
can anybody make sense of why some people accept what's demonstrably
false and then get upset when presented with actual facts?
another thing, jobs hasn't said much of anything in the past decade.
Jolly Roger <jollyroger@pobox.com> asked
You clearly don't know a thing about me, old fart.
I know enough to know you never earned your high school GED, Jolly
Roger.
You tried, three times, and failed each time.
No wonder you became an Apple apologist.
It is nothing like that unless your private matters included KNOWN and CATALOGED Child Sexual Abuse Material.
On Sep 17, 2021, Joerg Lorenz wrote
(in article<news:si294q$qg9$1@dont-email.me>):
For so-called law enforcement it is quite easy to blackmail Apple.
And they will obey immediately. They already do.
Wasn't it Apple's own Tim Cook who said they obey all the laws of the land?
iCloud encryption is nothing that really matters. It is PR-bullshit.
Does Apple give governments your encrypted files or your unencrypted files?
Can anybody make sense of why that "small group of people" accept
anything Jobs said or Cook says "as gospel" & then they get upset
when others don't?
This is not that uncommon in various forums. Someone buys a specific
product or service, be it a car, a phone, a computer, a credit card, a wireless provider etc., and feels compelled to go to fantastic fantasy
in order to defend their buying decision.
With the whole Android device and operating system, versus iOS device
and operating system discussion, there are people that get extremely
upset every time someone points out a limitation of one or the other.
If you are concerned about your kiddie porn, turn off iCloud photos.
Because of idiots like you and irresponsible "journalists" and
clickbait sites we may well end up with a much worse and much less
secure system. Thanks, dumbshits. You all suck.
And once again, you show your total ignorance of the Messages system
which has NOTHING to do with the CSAM system at all,
In message <si01ja$9km$1@dont-email.me> badgolferman <REMOVETHISbadgolferman@gmail.com> wrote:
nospam <nospam@nospam.invalid> wrote:
In article <slrnsk6n6p.22a.g.kreme@m1mini.local>, Lewis
<g.kreme@kreme.dont-email.me> wrote:
do they classify as a "back door", and why do some here say there
is no "back door"? TIA.
A back door is a hidden feature or a flaw in software that allows
unknown and unauthorized access to a computer's data. There is no
hidden feature or flaw, not unknown access, no unauthorized access,
no back door.
yep. it's the very opposite.
It is the juiciest piece of meat ever dangled in front of hackers.
Even if it doesn't meet the definition of a back door in your mind
now it won't be long before it will become one.
And you base this nonsensical statement on what? It is utter nonsense.
On Sep 17, 2021, Lewis wrote
(in article<news:slrnsk8f3v.2j62.g.kreme@m1mini.local>):
And once again, you show your total ignorance of the Messages system
which has NOTHING to do with the CSAM system at all,
Why do you think Apple was planning on simultaneously releasing CSAM & iMessage on-phone scans under the same timed Public Relations blitz?
On Sep 17, 2021, Joerg Lorenz wrote
(in article<news:si294q$qg9$1@dont-email.me>):
For so-called law enforcement it is quite easy to blackmail
Apple. And they will obey immediately. They already do.
Wasn't it Apple's own Tim Cook who said they obey all the laws of the
land?
Am 17.09.21 um 16:08 schrieb nospam:
In article <si1mbn$u1i$1@dont-email.me>, sms
Green also suggests that Apple should make iCloud storage end-to-end
encrypted, so that it can't view those images even if it wanted to."
matthew green is not very smart, nor are you.
And you are the most stupid of them all.
End-to-end encryption of iCloud is the only way to restore confidence
in Apple.
For so-called law enforcement it is quite easy to blackmail
Apple. And they will obey immediately. They already do.
icloud is already encrypted, which prevents scanning server-side.
they are two incompatible concepts.
apple's csam system is designed so that images can be checked *and*
still continue with icloud being fully encrypted.
iCloud encryption is nothing that really matters. It is PR-bullshit.
Why would Apple want to become an active branch of any local government?
On Sep 17, 2021, Joerg Lorenz wrote
(in article<news:si294q$qg9$1@dont-email.me>):
For so-called law enforcement it is quite easy to blackmail Apple.
And they will obey immediately. They already do.
Wasn't it Apple's own Tim Cook who said they obey all the laws of the land?
iCloud encryption is nothing that really matters. It is PR-bullshit.
Does Apple give governments your encrypted files or your unencrypted files?
Apple is a religion, much like environmentalism and many other social positions today.
The key thing for Apple is not build features into the device or
operating system than can be used to violate the privacy of users,
because there are governments that will demand access to such features.
Cloud storage encryption matters very much for those with an expectation
of privacy. As the New York Times investigation stated: "Apple abandoned
the encryption technology it used elsewhere after China would not allow
it. And the digital keys that unlock information on those computers are stored in the data centers they're meant to secure."
In any case, the CSAM on-device scanning idea is almost certainly not
coming back. Apple will end up doing much the same thing that Facebook
does when it comes to detecting and reporting CSAM.
can anybody make sense of why some people accept what's demonstrably
false and then get upset when presented with actual facts?
Isn't that a good description of you?
Did you disagree with the "holding it wrong" accusation from Steve Jobs?
* only accounts with safety vouchers that exceed a threshold of multiple
(over 30) matches to known child sexual abuse photos are able to be
reviewed by Apple employees; until this threshold is exceeded, the
encrypted vouchers cannot be viewed by anyone, thanks to an encryption
technology called private set intersection (also known as threshold
secret sharing)
* only vouchers that are reviewed and verified by a human being to be
actual child abuse material are forwarded to authorities
* this feature is only available for accounts set up as families with
iCloud Family Sharing
* parent/guardian accounts must opt in to turn on the feature for their
family group
* parental notifications can only be enabled by parents/guardians for
child accounts age 12 or younger
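The "threshold secret sharing" mentioned in the first bullet can be illustrated with Shamir's classic scheme: a secret is split into shares such that any `threshold` of them reconstruct it, while fewer reveal nothing. This toy Python sketch is not Apple's protocol (Apple combines the idea with private set intersection and per-image vouchers); it only demonstrates the underlying threshold property, with all parameters chosen for illustration:

```python
import random

PRIME = 2**61 - 1  # a Mersenne prime used as the field modulus

def make_shares(secret, threshold, n_shares, seed=0):
    """Split `secret` into n_shares points on a random degree-(threshold-1)
    polynomial over GF(PRIME); any `threshold` points reconstruct it."""
    rng = random.Random(seed)
    coeffs = [secret] + [rng.randrange(PRIME) for _ in range(threshold - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, n_shares + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x=0 recovers the secret from any
    `threshold` (or more) shares."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        # modular inverse of den via Fermat's little theorem
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret

secret = 123456789
shares = make_shares(secret, threshold=30, n_shares=40)
print(reconstruct(shares[:30]) == secret)  # True: any 30 of 40 shares suffice
print(reconstruct(shares[:29]))  # 29 shares yield almost surely a wrong value
```

This is why, in the described design, vouchers below the 30-match threshold are cryptographically unreadable rather than merely "not looked at".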
On Sep 16, 2021, sms wrote
(in article<news:si0081$v96$1@dont-email.me>):
it's a virtual certainty that they would insist.
Governments & malware writers are probably still working on expanding the back doors
Apple said they always follow the law in every country they sell phones.
In article <si2j47$hel$1@dont-email.me>, sms
<scharf.steven@geemail.com> wrote:
The key thing for Apple is not build features into the device or
operating system than can be used to violate the privacy of users,
because there are governments that will demand access to such
features.
apple's csam system was designed so that it *can't* violate the
privacy of their users and so that governments *can't* demand access
to such features either.
your suggestion that apple do it the way facebook does it means they
*would* violate user privacy and *would* allow governments to demand
access.
Cloud storage encryption matters very much for those with an expectation
of privacy. As the New York Times investigation stated: "Apple abandoned
the encryption technology it used elsewhere after China would not allow
it. And the digital keys that unlock information on those computers are
stored in the data centers they're meant to secure."
that's a gross misrepresentation of what it stated.
And the parent is ONLY notified if the child confirms TWICE that they
want to view the image, and then the image is sent from the child's
phone to the parents, with the knowledge of the kid. Again, this is an
opt-in feature and no one gets secret messages forward to anyone.
Governments & malware writers are probably still working on expanding the
back doors
You have no clue what you are talking about.
You have decided to cloak
yourself in ignorance and refuse to do anything but spew the same
ignorant misinformed and completely wrong information over and over.
Apple said they always follow the law in every country they sell phones.
And as you have been asked over and over again, which law are you
talking about? An imaginary law?
You are aware that CSAM scanning is already being done by all hosts,
right? Apple has been the outlier because they wanted a system that
preserves privacy, but we have these anti-privacy cunts screaming about
how bad Apple's system is.
It's very simple, if you think scanning photos on the server is "better"
you are an advocate for no-privacy.
On Sep 17, 2021, Lewis wrote
(in article<news:slrnska16n.2j62.g.kreme@m1mini.local>):
Governments & malware writers are probably still working on expanding the back doors
You have no clue what you are talking about.
How can you still be unaware
On Sep 17, 2021, Lewis wrote
(in article<news:slrnsk8f3v.2j62.g.kreme@m1mini.local>):
And once again, you show your total ignorance of the Messages system
which has NOTHING to do with the CSAM system at all,
Why do you think Apple was planning on simultaneously releasing CSAM &
iMessage on-phone scans under the same timed Public Relations blitz?
It would be truly amazing if anyone at Apple believed that this whole
mess was going to be a Public Relations win.
apple's csam system was designed so that it *can't* violate the privacy
of their users and so that governments *can't* demand access to such
features either.
Can anybody make sense of why that "small group of people" accept
anything Jobs said or Cook says "as gospel" & then they get upset
when others don't?
Apple is a religion, much like environmentalism and many other social positions today.
can anybody make sense of why some people accept what's demonstrably
false and then get upset when presented with actual facts?
Isn't that a good description of you?
nope. it's a perfect description of you.
Did you disagree with the "holding it wrong" accusation from Steve Jobs?
i disagree with the claim that it's only the iphone 4 that had a
problem.
For so-called law enforcement it is quite easy to blackmail
Apple. And they will obey immediately. They already do.
Wasn't it Apple's own Tim Cook who said they obey all the laws of the
land?
You still haven't been able to show a single law that requires Apple to report non-CSAM photos to authorities of any country.
On Sep 17, 2021, Jolly Roger wrote
(in article<news:iqk5v6Forn9U7@mid.individual.net>):
For so-called law enforcement it is quite easy to blackmail
Apple. And they will obey immediately. They already do.
Wasn't it Apple's own Tim Cook who said they obey all the laws of the
land?
You still haven't been able to show a single law that requires Apple to
report non-CSAM photos to authorities of any country.
laws will be made in the future
Ron, the drunkest guy in town.
* only vouchers that are reviewed and verified by a human being to be
actual child abuse material are forwarded to authorities
And the images that are viewed by an Apple employee are NOT the CSAM
images, nor the user's image. They are VERY low resolution versions that
are just enough that a human can tell there is probably a match (or
not).
I know a lot of adults (mostly women) who would very much like to opt-in their own accounts for this AI/ML scanning of photos, but it's only for children's accounts.
apple's csam system was designed so that it *can't* violate the privacy
of their users and so that governments *can't* demand access to such features either.
Once these new back doors are in billions of devices they can't be closed.
Then how does Apple prevent a government from passing laws like China did?
And how does Apple prevent malware actors from exploiting these back doors?
Ron, the dumbest guy in town.
On Sep 17, 2021, Jolly Roger wrote
(in article<news:iqkloeFrq8pU2@mid.individual.net>):
Jolly Roger <jollyroger@pobox.com> asked
How can you still be unaware
Projection from a willfully ignorant fool.
all my responses are based on awareness
Ron, the most fearful guy in town.
How can you still be unaware
Projection from a willfully ignorant fool.
Is it only Lewis who is still unaware both types of on the phone scanning were to be released by Apple under the same public relations blitz campaign?
No, he, as well as all the others trying to defend this, are well aware
that what they are saying is untrue.
there are no backdoors.
Then how does Apple prevent a government from passing laws like China did?
already explained. you aren't interested learning how.
And how does Apple prevent malware actors from exploiting these back doors?
already explained. you aren't interested learning how.
How can you still be unaware of everything all the experts have said?
iCloud encryption is nothing that really matters. It is PR-bullshit.
Does Apple give governments your encrypted files or your unencrypted files?
What would be the difference? None.
iCloud end to end encryption is only PR.
False.
On Sep 17, 2021, nospam wrote
(in article<news:170920211604371367%nospam@nospam.invalid>):
apple's csam system was designed so that it *can't* violate the privacy
of their users and so that governments *can't* demand access to such
features either.
Once these new back doors are in billions of devices they can't be closed.
Then how does Apple prevent a government from passing laws like China did?
And how does Apple prevent malware actors from exploiting these back doors?
No, he, as well as all the others trying to defend this, are well aware
that what I am saying is untrue.
ftfy
On Sep 17, 2021, badgolferman wrote
(in article<news:si2jlb$pvp$1@dont-email.me>):
Can anybody make sense of why that "small group of people" accept
anything Jobs said or Cook says "as gospel" & then they get upset
when others don't?
Apple is a religion, much like environmentalism and many other social
positions today.
Don't most religious people accept that others should be allowed to have privacy?
What is it about their religion that makes people like Lewis call us "cunts" for caring about our privacy?
you don't like being proven wrong, so you resort to attacks.
you don't like being proven wrong, so you resort to attacks.
You think that's a sort of "attack" to know you would have agreed with Jobs?
What did Lewis and nospam hope to gain by denying both types of on the phone scanning were to be released under the same public relations blitz campaign?
Ron, the most dishonest guy in town.
What is it about their religion that makes people like Lewis call us "cunts" for caring about our privacy?
They behave like that because of their own insecurities. They are
lashing out, not realizing that lying is not the way to convert others
to their point of view.
Have you never encountered evangelical Christians, Mormons, or Jehovah's Witnesses? Part of their belief system is to proselytize.
They behave like that because of their own insecurities. They are
lashing out, not realizing that lying is not the way to convert others
to their point of view.
On Sep 17, 2021, nospam wrote
On Sep 17, 2021, Lewis wrote
And once again, you show your total ignorance of the Messages system
which has NOTHING to do with the CSAM system at all,
Why do you think Apple was planning on simultaneously releasing CSAM &
iMessage on-phone scans under the same timed Public Relations blitz?
What did Lewis and nospam hope to gain by denying both types of on the phone scanning were to be released under the same public relations blitz campaign?
nobody denied that.
On Sep 17, 2021, nospam wrote
apple's csam system was designed so that it *can't* violate the privacy
of their users and so that governments *can't* demand access to such
features either.
Once these new back doors are in billions of devices they can't be closed.
Then how does Apple prevent a government from passing laws like China did?
And how does Apple prevent malware actors from exploiting these back doors?
Yes, that's the precise concern of human rights groups, child advocates,
the LGBTQ+ community, and privacy advocates.
How can you still be unaware of everything all the experts have said?
logical fallacy.
how is that even apple's problem? should apple require parenting
classes as a prerequisite for buying an iphone?
you don't like being proven wrong, so you resort to attacks.
You think that's a sort of "attack" to know you would have agreed with Jobs?
you asked a question and i gave an extensive answer which showed you to
be wrong, which you snipped and continued attacking.
one could claim ignorance, except that you've been told that they're
false and with extensive proof, yet you continue to say the same
things, thereby making it deliberate intentional lies.
On Sep 17, 2021, nospam wrote
(in article<news:170920212024015207%nospam@nospam.invalid>):
one could claim ignorance, except that you've been told that they're
false and with extensive proof, yet you continue to say the same
things, thereby making it deliberate intentional lies.
Why is it that they are describing themselves in almost every message now?
Do you at least now understand Apple released both campaigns simultaneously?
Ron, the most dishonest guy in town.
There are people who for all the evidence presented to them, do not have
the ability to understand,
How does you agreeing with Jobs (as I said you would) "prove me wrong?"
reread the part you snipped where i proved you wrong and stop twisting
what was said to deny it.
Do you at least now understand Apple released both campaigns
simultaneously?
that was never in dispute.
Read the quote you snipped out from Lewis who also disputed it as you did.
On Sep 17, 2021, Lewis wrote
(in article<news:slrnsk8f3v.2j62.g.kreme@m1mini.local>):
And once again, you show your total ignorance of the Messages system
which has NOTHING to do with the CSAM system at all
Now you're not even making sense to yourself.
With that I'm going to cut the cord.
Ron, the trolliest guy in town.
Do you at least now understand Apple released both campaigns simultaneously?
that was never in dispute.
And once again, you show your total ignorance of the Messages system
which has NOTHING to do with the CSAM system at all
On Sep 17, 2021, nospam wrote
(in article<news:170920212012535114%nospam@nospam.invalid>):
How can you still be unaware of everything all the experts have said?
logical fallacy.
What do you gain
In article <slrnska0fu.2j62.g.kreme@m1mini.local>, Lewis
<g.kreme@kreme.dont-email.me> wrote:
* only vouchers that are reviewed and verified by a human being to
be actual child abuse material are forwarded to authorities
And the images that are viewed by an Apple employee are NOT the CSAM
images, nor the user's image. They are VERY low resolution versions
that are just enough that a human can tell there is probably a match
(or not).
that brings up a very interesting legal question for which i've not
yet seen a good answer.
what amount (and type) of degradation is sufficient such that an
illegal image is no longer considered illegal, yet still recognizable
so that it can accurately be matched to an actual illegal csam image?
i've seen all sorts of speculation, but nothing that is particularly convincing.
presumably apple has a very good answer, but i've yet to see it.
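Apple has not published how the "visual derivative" is produced, so any answer here is speculation. One plausible building block is aggressive downscaling by block averaging, which destroys fine detail while keeping coarse structure a reviewer could compare; the Python sketch below is purely illustrative and is not Apple's method:

```python
def downscale(pixels, factor):
    """Block-average a grayscale image (list of rows of 0-255 values) by
    `factor` in each dimension -- a crude stand-in for producing a
    low-resolution derivative that discards most visual detail."""
    h, w = len(pixels), len(pixels[0])
    out = []
    for by in range(0, h - h % factor, factor):
        row = []
        for bx in range(0, w - w % factor, factor):
            block = [pixels[y][x]
                     for y in range(by, by + factor)
                     for x in range(bx, bx + factor)]
            row.append(sum(block) // len(block))  # mean of the block
        out.append(row)
    return out

# A 32x32 gradient reduced to 4x4: fine detail is destroyed, but the
# coarse dark-to-light structure survives for a human reviewer.
img = [[(x + y) * 4 for x in range(32)] for y in range(32)]
small = downscale(img, 8)
print(len(small), len(small[0]))  # 4 4
```

Whether such a derivative is degraded enough to lose legal status while staying recognizable is exactly the open question raised above; this sketch only shows the kind of transformation being discussed.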
On 2021-09-18, RonTheGuy <ron@null.invalid> wrote:
On Sep 17, 2021, nospam wrote
(in article<news:170920212012535114%nospam@nospam.invalid>):
How can you still be unaware of everything all the experts have said?
logical fallacy.
What do you gain
Sanity.
Meanwhile your fear-driven world is based on the opposite.
On Sep 17, 2021, nospam wrote
(in article<news:170920212012535114%nospam@nospam.invalid>):
How can you still be unaware of everything all the experts have said?
logical fallacy.
What do you gain by calling a hundred different privacy organizations wrong?
On Sep 17, 2021, sms wrote
Yes, that's the precise concern of human rights groups, child advocates,
the LGBTQ+ community, and privacy advocates.
What does nospam gain by saying everyone who disagrees with Apple is wrong?
On Sep 17, 2021, Jolly Roger wrote
(in article<news:iqko6jFsmruU3@mid.individual.net>):
iCloud end to end encryption is only PR.
False.
Does Apple give governments the encrypted files or the unencrypted files?
Ron, the humblest guy in town.
In misc.phone.mobile.iphone Lewis <g.kreme@kreme.dont-email.me> wrote:
In message <4audnWR8JIV_EKL8nZ2dnUU7-N3NnZ2d@earthlink.com> Ant <ant@zimage.comANT> wrote:
https://support.apple.com/en-us/HT212807 for v14.8's release notes.
Next, v15? ;)
Yes, we already know iOS 15 will be released soon as some of us have
been running the beta for months. Could be tomorrow, more likely Friday
or next Week, depends on which schedule Apple chooses and when the new
phones will actually be in customer's hands,
How's v15 so far? Stable enough?
* only vouchers that are reviewed and verified by a human being to
be actual child abuse material are forwarded to authorities
And the images that are viewed by an Apple employee are NOT the CSAM
images, nor the user's image. They are VERY low resolution versions
that are just enough that a human can tell there is probably a match
(or not).
that brings up a very interesting legal question for which i've not
yet seen a good answer.
what amount (and type) of degradation is sufficient such that an
illegal image is no longer considered illegal, yet still recognizable
so that it can accurately be matched to an actual illegal csam image?
i've seen all sorts of speculation, but nothing that is particularly convincing.
All we know is that the secure safety vouchers contain a "visual
derivative" of the original photo. It wouldn't be all that hard to have
AI trained to obscure sensitive portions of the human anatomy in a given photo. And I suspect that's exactly what Apple is doing.
presumably apple has a very good answer, but i've yet to see it.
I doubt anyone outside of Apple will actually see it.
But I have little
doubt Apple can and will do it in a way that preserves privacy by
obscuring sensitive portions of potential matches.
should not buy a new device until a few months after it's released, you
now should probably wait a bit prior to installing iOS updates until
they can be vetted by independent sources.
On 9/13/2021 6:20 PM, Ant wrote:
In misc.phone.mobile.iphone Lewis <g.kreme@kreme.dont-email.me> wrote:
In message <4audnWR8JIV_EKL8nZ2dnUU7-N3NnZ2d@earthlink.com> Ant
<ant@zimage.comANT> wrote:
https://support.apple.com/en-us/HT212807 for v14.8's release notes.
Next, v15? ;)
Yes, we already know iOS 15 will be released soon as some of us have
been running the beta for months. Could be tomorrow, more likely Friday
or next Week, depends on which schedule Apple chooses and when the new
phones will actually be in customer's hands,
How's v15 so far? Stable enough?
I used to have automatic updates enabled on all my Apple devices
(iPhone, iPad, Watch). Now I'm a little more cautious. Just like you
should not buy a new device until a few months after it's released, you
now should probably wait a bit prior to installing iOS updates until
they can be vetted by independent sources.
If they're being uploaded to iCloud then check them there. No need to check them on your private personal device and use up system resources.
On Sep 17, 2021, Lewis wrote
(in article<news:slrnska0fu.2j62.g.kreme@m1mini.local>):
And the parent is ONLY notified if the child confirms TWICE that they
want to view the image, and then the image is sent from the child's
phone to the parents, with the knowledge of the kid. Again, this is an
opt-in feature and no one gets secret messages forward to anyone.
How can you still be so unaware of the real issue here with bad parents?
On 9/16/2021 7:11 AM, badgolferman wrote:
Wilf <wilf@postingx.uk> wrote:
On 16/09/2021 at 03:23, Lewis wrote:
In message <16y8ioruep95q$.dlg@news.solani.org> RonTheGuy <ron@null.invalid> wrote:
On Sep 15, 2021, nospam wrote
(in article<news:150920210158598871%nospam@nospam.invalid>):
there are multiple ways to opt-out of csam checking,
How did you opt out to hackers & governments using the back doors?
There is no back door.
I've been watching this conversation and I am really am perplexed, so
I'm hoping that one of you knowledgeable folk here can clarify for me
and for others, maybe.
Why do some people think there is or is going to be a "back door"? What
do they classify as a "back door", and why do some here say there is no
"back door"? TIA.
It's a matter of perspective. Call it a back door, side door, or full
frontal attack. What it comes down to is Apple is introducing a method
which will scan your personal device for files which someone has determined
are illegal. Who that "someone" is will be ambiguous forever.
Exactly. Those that insist that it's not a "back door" are really
arguing semantics.
EFF called it a "narrow back door," which is probably the most accurate description.
"All it would take to widen the narrow backdoor that Apple is building
is an expansion of the machine learning parameters to look for
On 9/13/2021 6:20 PM, Ant wrote:
In misc.phone.mobile.iphone Lewis <g.kreme@kreme.dont-email.me> wrote:
In message <4audnWR8JIV_EKL8nZ2dnUU7-N3NnZ2d@earthlink.com> Ant <ant@zimage.comANT> wrote:
https://support.apple.com/en-us/HT212807 for v14.8's release notes.
Next, v15? ;)
Yes, we already know iOS 15 will be released soon as some of us have
been running the beta for months. Could be tomorrow, more likely Friday
or next week, depends on which schedule Apple chooses and when the new
phones will actually be in customers' hands,
How's v15 so far? Stable enough?
I used to have automatic updates enabled on all my Apple devices
(iPhone, iPad, Watch). Now I'm a little more cautious. Just like you
should not buy a new device until a few months after it's released,
you now should probably wait a bit prior to installing iOS updates
until they can be vetted by independent sources.
In message <si5c04$oia$1@dont-email.me> sms <scharf.steven@geemail.com> wrote:
On 9/13/2021 6:20 PM, Ant wrote:
In misc.phone.mobile.iphone Lewis <g.kreme@kreme.dont-email.me> wrote:
In message <4audnWR8JIV_EKL8nZ2dnUU7-N3NnZ2d@earthlink.com> Ant <ant@zimage.comANT> wrote:
https://support.apple.com/en-us/HT212807 for v14.8's release notes.
Next, v15? ;)
Yes, we already know iOS 15 will be released soon as some of us have
been running the beta for months. Could be tomorrow, more likely Friday
or next week, depends on which schedule Apple chooses and when the new
phones will actually be in customers' hands,
How's v15 so far? Stable enough?
I used to have automatic updates enabled on all my Apple devices
(iPhone, iPad, Watch). Now I'm a little more cautious. Just like you
should not buy a new device until a few months after it's released,
Nonsense.
you now should probably wait a bit prior to installing iOS updates
until they can be vetted by independent sources.
More nonsense. Millions of people have been using iOS 15 for months. If
you want to be super cautious, you could wait a day or two. Beyond that
is paranoia and silliness.
On 16/09/2021 at 16:03, Lewis wrote:
do they classify as a "back door", and why do some here say there is no
"back door"? TIA.
A back door is a hidden feature or a flaw in software that allows unknown
and unauthorized access to a computer's data. There is no hidden
feature or flaw, no unknown access, no unauthorized access, no back
door.
Might that just be semantics, though? Ok, so if a back door is something hidden, then this is not a back door per se. Maybe a 'Front Door'? But
it could allow governments to insist that other hashes be included in
the database used to identify offending pictures.
On 9/17/2021 4:16 PM, RonTheGuy wrote:
<snip>
Is it only Lewis who is still unaware both types of on the phone scanning
were to be released by Apple under the same public relations blitz campaign?
No, he, as well as all the others trying to defend this, are well aware
that what they are saying is untrue.
In article <al56vbi12fdr$.dlg@news.solani.org>, RonTheGuy
<ron@null.invalid> wrote:
multiple governments would need to conspire against the same images
(good luck getting multiple governments to agree on anything) *and* get
the manual review team to go along with it.
Malware writers don't have to agree to anything to open these back doors.
what part of there is no backdoor is not clear?
read the white papers on how it actually works and why there is no
backdoor.
On Sep 17, 2021, Lewis wrote
(in article<news:slrnska16n.2j62.g.kreme@m1mini.local>):
Governments & malware writers are probably still working on expanding the
back doors
You have no clue what you are talking about.
How can you still be unaware of everything the security experts have said?
You have decided to cloak
yourself in ignorance and refuse to do anything but spew the same
ignorant misinformed and completely wrong information over and over.
How can you still be unaware what a hundred privacy organizations have said?
Apple said they always follow the law in every country they sell phones.
And as you have been asked over and over again, which law are you
talking about? An imaginary law?
How can you still be unaware Tim Cook said Apple follows all local laws?
You are aware that CSAM scanning is already being done by all hosts,
right? Apple has been the outlier because they wanted a system that
preserves privacy, but we have these anti-privacy cunts screaming about
how bad Apple's system is.
Why is it that it's only you who is unaware of Apple's broken promises?
It's very simple, if you think scanning photos on the server is "better"
you are an advocate for no-privacy.
Are you still unaware Apple acted as an active branch of local authorities?
I used to have automatic updates enabled on all my Apple devices
(iPhone, iPad, Watch). Now I'm a little more cautious. Just like you
should not buy a new device until a few months after it's released,
Nonsense.
On Sep 17, 2021, Lewis wrote
(in article<news:slrnsk8fik.2j62.g.kreme@m1mini.local>):
If you are concerned about your kiddie porn, turn off iCloud photos.
How does anything you do stop malevolent governments & malware writers?
Because of idiots like you and irresponsible "journalists" and
clickbait sites we may well end up with a much worse and much less
secure system. Thanks, dumbshits. You all suck.
Why do you think everyone who cares about privacy is a dumbshit who sucks?
I have to say that I don't normally wait before applying iOS updates
and, so far, have never had a problem. Probably tempting fate now !!
You and the other troll morons are intentionally conflating two entire different things.
How can you still be so unaware of the real issue here with bad parents?
Which is why the parental notification only exists for children who are
12yo and under.
Back door has several criteria, not a single one is matched by the CSAM
system:
*) A hidden feature or bug
*) Allows unauthorized access
*) Allows surreptitious access
*) Allows accessing the data on the computer or device
HE CAN'T READ! All those words, some of them are SIX LETTERS LONG!
Some are even longer!
The EFF has become a fund-raising organization. They have consistently
and repeatedly overstated and lied about numerous things in order to fearmonger to fill their coffers.
"All it would take to widen the narrow backdoor that Apple is building
is an expansion of the machine learning parameters to look for
And there is the lie. Right there, in plain sight. They are
intentionally and knowingly mixing up two things to mislead their readers
into panic and donations to the EFF.
On 9/16/2021 1:49 PM, badgolferman wrote:
<snip>
If they're being uploaded to iCloud then check them there. No need to
check them on your private personal device and use up system
resources.
We all know what the eventual outcome of all this is going to be.
Apple will scan unencrypted files on their servers, which is what
other companies, like Facebook, already do.
Apple can still use the CSAM hashes, but on their servers, not on
users' devices.
In a few months this whole debacle will fall out of the news cycle and dipshit trolls like me will have to find a new chew toy.
We all know what the eventual outcome of all this is going to be.
Apple
will scan unencrypted files on their servers, which is what other
companies, like Facebook, already do.
Apple can still use the CSAM hashes, but on their servers, not on users'
devices. They can state something like: "our researchers invented a new
way to scan for CSAM using our servers so we no longer have a need to
scan on devices."
Apple scares me
Apple already scans uploaded photos.
except that icloud is encrypted, so no.
True.
nospam <nospam@nospam.invalid> asked
Apple already scans uploaded photos.
they do not.
How do you think Apple reports CSAM in your iCloud Mail attachments?
Apple already scans uploaded photos.
they do not.
Does anyone notice conspicuously missing from that E2EE list above of what Apple doesn't have the key for, based on Apple's own documents, are:
iMessages (in your iCloud Backup)
iCloud Photo Library
The lack of security for iMessages is because the end-to-end encryption key for your Messages data is actually stored in your iCloud Backup. Only if you disable iCloud Backups is a new key automatically generated (which only then would make Messages in the Cloud more secure, but only if you leave iCloud Backups permanently off).
Your iCloud Backups and your iCloud Photo Library are merely 'encrypted at rest' which means that although they are stored on Apple's servers in a generic encrypted form, Apple has full and complete access to that generic encryption key which they can use for any purpose they want to use it for.
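The "encrypted at rest" distinction above can be sketched in a few lines. This is a toy illustration only (a SHA-256 counter-mode keystream standing in for real AES; never use it for actual encryption): the point is that when the provider generates and keeps the key, "encrypted" storage is still fully readable by the provider.

```python
import hashlib
import secrets

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy stream cipher (SHA-256 in counter mode) -- illustration only."""
    out = bytearray()
    for block in range(0, len(data), 32):
        pad = hashlib.sha256(key + block.to_bytes(8, "big")).digest()
        chunk = data[block:block + 32]
        out.extend(b ^ p for b, p in zip(chunk, pad))
    return bytes(out)

# "Encrypted at rest": the provider generates and KEEPS the key...
provider_key = secrets.token_bytes(32)
stored = keystream_xor(provider_key, b"user photo bytes")

# ...so the provider can decrypt the ciphertext whenever it chooses.
recovered = keystream_xor(provider_key, stored)

# End-to-end encryption differs only in key custody: the same ciphertext
# sits on the server, but only the user's devices hold the key.
```

The crypto is interchangeable; what separates "at rest" from "end to end" is purely who holds the key.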
Apple already scans uploaded photos.
they do not.
How do you think Apple reports CSAM in your iCloud Mail attachments?
The usual way to detect CSAM is to scan uploaded photos. This is what
Google and Facebook already do, and what Apple is expected to do soon.
How do you think Apple reports CSAM in your iCloud Mail attachments?
Apple scans iCloud Mail attachments for CSAM, but they do not yet scan
iCloud Photos for CSAM. What we will likely see once the "delay" is
over, is server side scanning of photos for CSAM, using the hash
technique that they initially wanted to put on people's phones.
The usual way to detect CSAM is to scan uploaded photos. This is what
Google and Facebook already do, and what Apple is expected to do soon.
It's likely that the NCMEC was upset that they were receiving so few
reports of CSAM from Apple (since Apple is not scanning iPhoto storage).
Instead of following the usual practice of scanning uploaded photos and comparing them against a database of CSAM images, Apple decided that the
CSAM scanning should occur on users' phones.
There are theories of why they thought this was better, including
reducing the load on their servers, but no explanation from Apple why
they could not follow the lead of Google and Facebook and do server-side
scanning. They didn't think through the potential for abuse of the
client side scanning.
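For context, the server-side practice described above reduces to a hash-set membership test. A minimal sketch, where KNOWN_HASHES is a hypothetical database and plain SHA-256 stands in for the perceptual hashes (PhotoDNA and similar) real providers actually use:

```python
import hashlib

# Hypothetical database of known-bad hex digests. Real systems match
# perceptual hashes, not SHA-256, but the lookup step is identical:
# set membership on the uploaded file's hash.
KNOWN_HASHES = {
    # sha256(b"test"), standing in for a flagged file
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def scan_upload(data: bytes) -> bool:
    """Return True if the uploaded bytes hash to a known-bad entry."""
    return hashlib.sha256(data).hexdigest() in KNOWN_HASHES
```

The whole server-side pipeline is just this check applied to each decrypted upload, which is why it scales but also why it requires the server to see plaintext.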
there was a very good explanation directly from apple.
It's likely that the NCMEC was upset that they were receiving so few
reports of CSAM from Apple (since Apple is not scanning iPhoto storage).
On Sep 18, 2021, Lewis wrote
(in article<news:slrnskcm1l.ot1.g.kreme@m1mini.local>):
Back door has several criteria, not a single one is matched by the CSAM
system:
*) A hidden feature or bug
*) Allows unauthorized access
*) Allows surreptitious access
*) Allows accessing the data on the computer or device
When will you realize real privacy experts & privacy organizations have
shown all those criteria will happen by malevolent governments, parents,
& malware developers after Apple puts these two new back doors into billions of iPhones around the world?
Matthew Green suggested in an article in Wired: "If they feel they
must scan, they should scan unencrypted files on their servers,"
which is the standard practice for other companies, like
Facebook, which regularly scan for not only CSAM but also terroristic
and other disallowed content types. Green also suggests that Apple
should make iCloud storage end-to-end encrypted, so that it can't view
those images even if it wanted to."
See: <https://www.wired.com/story/apple-icloud-photo-scan-csam-pause-backlash/>.
In article <si1mbn$u1i$1@dont-email.me>, sms
<scharf.steven@geemail.com> wrote:
What's more likely is that they end up doing is what Johns Hopkins
University cryptographer Matthew Green suggested in an article in Wired:
"If they feel they must scan, they should scan unencrypted files on
their servers," which is the standard practice for other companies, like
Facebook, which regularly scan for not only CSAM but also terroristic
and other disallowed content types.
wanting apple to be like facebook is abhorrent.
scanning on the servers violates the user's privacy, which is what
apple wants to avoid.
Green also suggests that Apple
should make iCloud storage end-to-end encrypted, so that it can't view
those images even if it wanted to."
matthew green is not very smart
icloud is already encrypted, which prevents scanning server-side. they
are two incompatible concepts.
apple's csam system is designed so that images can be checked *and*
still continue with icloud being fully encrypted.
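Checking images against a list while keeping both sides' data private is the job of private set intersection (PSI), which comes up later in this thread. A toy Diffie-Hellman-style PSI sketch follows; it is not Apple's actual protocol (which adds NeuralHash, safety vouchers, and threshold secret sharing), just the core trick that blinded modular exponentiation commutes:

```python
import hashlib
import secrets

# 1024-bit MODP prime (RFC 2409, Oakley Group 2). Any large prime keeps
# this toy running; a real deployment needs a proper prime-order group.
P = int(
    "FFFFFFFFFFFFFFFFC90FDAA22168C234C4C6628B80DC1CD129024E088A67CC74"
    "020BBEA63B139B22514A08798E3404DDEF9519B3CD3A431B302B0A6DF25F1437"
    "4FE1356D6D51C245E485B576625E7EC6F44C42E9A637ED6B0BFF5CB6F406B7ED"
    "EE386BFB5A899FA5AE9F24117C4B1FE649286651ECE65381FFFFFFFFFFFFFFFF",
    16,
)

def h(item: bytes) -> int:
    """Hash an item into the multiplicative group mod P (squared digest)."""
    return pow(int.from_bytes(hashlib.sha256(item).digest(), "big"), 2, P)

def psi(client_items, server_items):
    a = secrets.randbelow(P - 2) + 1  # client's secret exponent
    b = secrets.randbelow(P - 2) + 1  # server's secret exponent
    # Client blinds its items with a; server blinds them again with b.
    double_blinded = {pow(pow(h(x), a, P), b, P) for x in client_items}
    # Server blinds its own items with b and returns them.
    server_blinded = {y: pow(h(y), b, P) for y in server_items}
    # Exponentiation commutes: h(x)^(a*b) == h(x)^(b*a), so shared items
    # collide in the doubly blinded set without revealing anything else.
    return {y for y, v in server_blinded.items() if pow(v, a, P) in double_blinded}

shared = psi([b"cat.jpg", b"dog.jpg"], [b"dog.jpg", b"bird.jpg"])
```

Neither side ever sees the other's unmatched items in the clear, which is the property that lets matching coexist with encrypted storage.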
iCloud encryption is nothing that really matters. It is PR-bullshit.
Back door has several criteria, not a single one is matched by the CSAM
system:
*) A hidden feature or bug
*) Allows unauthorized access
*) Allows surreptitious access
*) Allows accessing the data on the computer or device
When will you realize real privacy experts & privacy organizations have shown all those criteria will happen by malevolent governments, parents,
& malware developers after Apple puts these two new back doors into billions
of iPhones around the world?
They've not shown anything.
They've theorised based on hypotheticals.
It isn't an absolute truth that those things will ever actually happen.
sms <scharf.steven@geemail.com> asked
It's likely that the NCMEC was upset that they were receiving so few
reports of CSAM from Apple (since Apple is not scanning iPhoto storage).
Knowing full well the only time Apple ever tells the truth is when they're forced to, in court, we may never know what Apple was thinking they'd gain with this stupid idea of scanning iMessages and photos on your iPhone.
All we know is a hundred privacy organizations and many privacy experts around the world have criticized Apple for what is clearly a dumb move.
on the other hand, it's close to an absolute truth that they won't
happen because the obstacles needed to overcome are very, very high.
it's not any type of backdoor, which means they didn't look at reality.
they *did* look at perception and that by claiming it was, they could
gain popularity and profit from it.
The reason is pretty clear. When they scan iCloud photos on the server,
they have to decrypt them prior to applying the hash. It would be less
burdensome to scan the unencrypted photos on the device to look for
CSAM. We're talking about hundreds of millions of photos per day to be
scanned. Meanwhile you have the Bionic processor in the phone with a
huge surplus of processing power.
It was not CSAM, per se, that the privacy and human rights organizations
were concerned about, it was the potential for abuse by other entities
that would no doubt demand on-device scanning for other things.
Assurances by Apple that they would never comply with such requests fell
on deaf ears, due to past history, see
<https://www.nytimes.com/2021/05/17/technology/apple-china-censorship-data.html>.
Perhaps the belief was that since the photos would be scanned no matter
what, no one would care if it was done on the phone versus on the server.
What's prescient is how nospam described Apple marketing motives so well.
1. *Apple marketing "_didn't look at reality_"*, and,
2. *Apple marketing wanted to "_gain popularity and profit from it_.*"
The reason is pretty clear. When they scan iCloud photos on the server,
they have to decrypt them prior to applying the hash. It would be less
burdensome to scan the unencrypted photos on the device to look for
CSAM. We're talking about hundreds of millions of photos per day to be
scanned. Meanwhile you have the Bionic processor in the phone with a
huge surplus of processing power.
All we know is a hundred privacy organizations and many privacy experts
around the world have criticized Apple for what is clearly a dumb move.
It was not CSAM, per se, that the privacy and human rights organizations
were concerned about, it was the potential for abuse by other entities
that would no doubt demand on-device scanning for other things.
Assurances by Apple that they would never comply with such requests fell
on deaf ears, due to past history, see <https://www.nytimes.com/2021/05/17/technology/apple-china-censorship-data.html>.
Perhaps the belief was that since the photos would be scanned no matter
what, no one would care if it was done on the phone versus on the server.
What's prescient is how nospam described Apple marketing motives so well.
1. *Apple marketing "_didn't look at reality_"*, and,
2. *Apple marketing wanted to "_gain popularity and profit from it_.*"
i was talking about the eff.
These concerns also need to be balanced against the potential benefit of
both reducing the dissemination of CSAM and additional harms to children.
On Sep 19, 2021, Chris wrote
(in article<news:si79th$f4q$2@dont-email.me>):
These concerns also need to be balanced against the potential benefit of
both reducing the dissemination of CSAM and additional harms to children.
What evidence exists anywhere this will have any effect on children at all?
Ron, the humblest guy in town.
facebook doesn't care about privacy, so they scan server-side, and for
much more than just csam.
And then use your images to create advertisements for you, leaving you to
believe that your contacts are endorsing products Facebook is shilling.
Why anyone uses Facebook for anything is beyond me, they are easily the
worst company in the world, and dedicated to violating every user's
privacy in every possible way by any possible means. The fuckers even
created a fake VPN so they could intercept every bit of data from people
dumb enough to use that VPN.
In article <si7oh8$gpe$1@dont-email.me>, sms
<scharf.steven@geemail.com> wrote:
The reason is pretty clear.
yet you fail to see it.
When they scan iCloud photos on the server,
they have to decrypt them prior to applying the hash.
thereby violating the user's privacy.
facebook doesn't care about privacy, so they scan server-side, and for
much more than just csam.
apple does not want to do that.
It was not CSAM, per se, that the privacy and human rights organizations
were concerned about, it was the potential for abuse by other entities
that would no doubt demand on-device scanning for other things.
only because they didn't bother reading how it works and decided to
grift the stupids.
why don't those privacy and human rights organizations go after
facebook and google, who can *easily* expand their search to anything
at all.
Assurances by Apple that they would never comply with such requests fell
on deaf ears, due to past history,
past history shows that apple does *not* comply and has *zero*
intention of complying.
past history also shows that facebook, google and others *do* comply
with such requests, and have *already* done so.
Perhaps the belief was that since the photos would be scanned no matter
what, no one would care if it was done on the phone versus on the server.
nope. it's because it is the only way to keep the user's photos
private, which you said was important.
sms <scharf.steven@geemail.com> asked
The reason is pretty clear. When they scan iCloud photos on the server,
they have to decrypt them prior to applying the hash.
This is logical.
However... other things are as logical (IMHO)... given that...
There are at least two competing arguments we'd need to hash out before we can come to that conclusion that "the reason is pretty clear" why Apple
chose to do something so stupid (without even asking security experts!)...
... given it was "said" (yes, I know it was later found out to be incorrect) that Apple "can" scan the hash of an encrypted file, as long as it was Apple who did the initial encryption. (Whether or not that is technically feasible would matter greatly if this is to be an option).
The second argument we'd have to hash out is why should Apple actively do anything?
While I certainly "get" that any company would love to offload its computer resources to billions of their captive devices around the world, I can't believe that, if and when Apple decides to become an active arm of law enforcement, that Apple can't spare the server resources. I just can't.
Yeah, I wondered about that too, but I don't think that scanning the
hash would work. A completely different hash could be generated from essentially the same image based on very minor differences in the image.
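That point is easy to demonstrate. A cryptographic hash changes completely when a single pixel changes, which is exactly why matching systems use perceptual hashes instead; average_hash below is a deliberately oversimplified stand-in for PhotoDNA or NeuralHash:

```python
import hashlib

def average_hash(pixels):
    """Toy perceptual hash: one bit per pixel, set if above the mean.
    Real perceptual hashes are far more sophisticated, but share the
    goal: near-identical images produce near-identical hashes."""
    mean = sum(pixels) / len(pixels)
    return "".join("1" if p > mean else "0" for p in pixels)

image   = [10, 200, 30, 190, 50, 180, 70, 170]  # fake 8-pixel grayscale image
tweaked = [11, 200, 30, 190, 50, 180, 70, 170]  # one pixel changed by 1

# Cryptographic hashes differ completely after the tiny change...
crypto_a = hashlib.sha256(bytes(image)).hexdigest()
crypto_b = hashlib.sha256(bytes(tweaked)).hexdigest()

# ...but the perceptual hash is unchanged, so matching still works.
percep_a = average_hash(image)
percep_b = average_hash(tweaked)
```

This is also why scanning the hash of an already-encrypted file fails: ciphertext behaves like the cryptographic case, destroying any perceptual similarity.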
On 9/19/2021 11:50 AM, Robin Goodfellow wrote:
nospam <nospam@nospam.invalid> asked
The EFF doesn't look at perception, they look at reality. And to be
fair, they did at least state that it is a "narrow back door."
it's not any type of backdoor, which means they didn't look at
reality.
they *did* look at perception and that by claiming it was, they
could gain popularity and profit from it.
For once, nospam got it right...
What's prescient is how nospam described Apple marketing motives so
well. 1. *Apple marketing "_didn't look at reality_"*, and, 2.
*Apple marketing wanted to "_gain popularity and profit from it_.*"
He got #1 right anyway
but I doubt if anyone ever believed that they would gain popularity
from their proposal. If they didn't anticipate the negative reaction
nospam <nospam@nospam.invalid> asked
it's not any type of backdoor, which means they didn't look at reality.
they *did* look at perception and that by claiming it was, they could
gain popularity and profit from it.
For once, nospam got it right...
What's prescient is how nospam described Apple marketing motives so well.
1. *Apple marketing "_didn't look at reality_"*, and,
2. *Apple marketing wanted to "_gain popularity and profit from it_.*"
A completely different hash could be generated from
essentially the same image based on very minor differences in the image.
Because other cloud service providers already do CSAM detection on their servers.
A "compute server" or "GPU server" is a different animal than a storage
server; it's designed with multiple high-power Xeon or EPYC processors
and Nvidia Ampere GPUs. It's understandable why Apple wanted to offload the
photo scanning to users' devices.
If everyone jumped off the Brooklyn Bridge, why should Apple follow them?
The second argument we'd have to hash out is why should Apple actively do
anything?
Because other cloud service providers already do CSAM detection on their servers.
It's understandable why Apple wanted to offload the photo scanning to users' devices.
We all know what the eventual outcome of all this is going to be. Apple
will scan unencrypted files on their servers, which is what other
companies, like Facebook, already do.
Apple can still use the CSAM hashes, but on their servers
Repairing the bruised image of NCMEC needs to also be attempted. Their ill-advised message to Apple employees, labeling those who favor privacy
"the screeching voices of the minority" should never have happened (this
was an internal memo from Apple quoting a separate memo from NCMEC).
Hopefully Apple will put procedures in place that will prevent this kind
of thing in the future. Perhaps set up an advisory committee with participants from leading privacy and human rights organizations to vet anything they plan to add to their devices that affect privacy.
In a few months this whole debacle will fall out of the news cycle
Admit to the world that private set intersection is completely over your
head without saying "private set intersection is completely over my
head". : D
Why didn't Apple test this idea out with any privacy experts first?
What evidence exists anywhere this will have any effect on children at all?
Define "this".
Why didn't Apple test this idea out with any privacy experts first?
they did, along with cryptographic experts and many others.
How did you confuse the issue of on the phone scanning with that of
PSI?
The only confusion is coming from you, dipshit troll.
On Sep 20, 2021, Jolly Roger wrote
(in article<news:iqq382Ftcq0U2@mid.individual.net>):
Admit to the world that private set intersection is completely over
your head without saying "private set intersection is completely over
my head". : D
How did you confuse the issue of on the phone scanning with that of
PSI?
Ron, the dizziest troll in town.
On Sep 19, 2021, Chris wrote
(in article<news:si8bek$c0h$1@dont-email.me>):
What evidence exists anywhere this will have any effect on children at all?
Define "this".
How can you still be unaware of what Apple did?
Since we already know Apple never asked any privacy organizations nor any privacy experts, then it's obvious that Apple wasn't concerned with privacy.
What was Apple concerned about then if it isn't privacy?
What evidence is there that implementing these scans on the phone will have any effect on the global amount of illegal photographs in people's possession?
They did. Which is why they published their white papers carefully describing the technology. What they perhaps didn't anticipate was how many didn't read them or didn't understand them.
On 9/19/2021 11:50 AM, Robin Goodfellow wrote:
nospam <nospam@nospam.invalid> asked
it's not any type of backdoor, which means they didn't look at reality.
they *did* look at perception and that by claiming it was, they could
gain popularity and profit from it.
For once, nospam got it right...
What's prescient is how nospam described Apple marketing motives so well.
1. *Apple marketing "_didn't look at reality_"*, and,
2. *Apple marketing wanted to "_gain popularity and profit from it_.*"
He got #1 right anyway, but I doubt if anyone ever believed that they
would gain popularity from their proposal. If they didn't anticipate the negative reaction from hundreds of privacy organizations and millions of
end users, then that's something they need to look into.
They did. Which is why they published their white papers carefully describing the technology. What they perhaps didn't anticipate was how many didn't read them or didn't understand them.
What they actually didn't anticipate was how many people read them, understood them, but didn't believe the assertions that the technology
would not be expanded beyond CSAM.
There are lots of governments that
would love to scan for images of various things, including some very
wealthy countries that buy a lot of iPhones.
The whole patronizing attitude of "let us explain to you why what we're
doing is wonderful," when in fact all the privacy experts, human rights experts, and child protection organizations explained why it's _not_ wonderful, needs to change.
And it did change when the "delay" was
announced. You know that they're not going to bring back on-device
scanning and will simply do what other cloud service providers already do--scan on the server side.
You can get the facts here:
<https://www.eff.org/deeplinks/2021/08/apples-plan-think-different-about-encryption-opens-backdoor-your-private-life>.
They did. Which is why they published their white papers carefully describing the technology. What they perhaps didn't anticipate was how many didn't read them or didn't understand them.
The point is that law enforcement all over the world are doing everything they can to stop this dissemination and catch those involved. Many online services are indirectly involved and they should try to do as much as they can to help.
It's all part of the jigsaw.
There won't be any evidence until the system is turned on, so impossible to answer right now.
On 9/20/2021 1:19 AM, Chris wrote:
<snip>
They did. Which is why they published their white papers carefully
describing the technology. What they perhaps didn't anticipate was how many
didn't read them or didn't understand them.
What they actually didn't anticipate was how many people read them, understood them, but didn't believe the assertions that the technology
would not be expanded beyond CSAM. There are lots of governments that
would love to scan for images of various things, including some very
wealthy countries that buy a lot of iPhones.
The whole patronizing attitude of "let us explain to you why what we're
doing is wonderful," when in fact all the privacy experts, human rights experts, and child protection organizations explained why it's _not_ wonderful, needs to change. And it did change when the "delay" was
announced. You know that they're not going to bring back on-device
scanning and will simply do what other cloud service providers already do--scan on the server side.
You can get the facts here: <https://www.eff.org/deeplinks/2021/08/apples-plan-think-different-about-encryption-opens-backdoor-your-private-life>.
On Sep 20, 2021, Chris wrote
(in article<news:si9g6b$vql$1@dont-email.me>):
They did. Which is why they published their white papers carefully
describing the technology. What they perhaps didn't anticipate was how many
didn't read them or didn't understand them.
How can you still be unaware malevolent governments and malware writers will not voluntarily restrict themselves to what Apple puts in their white paper?
He wasn't talking about Apple, and you both know that. And anyone who
can read can see he was talking about the *EFF* above. Your trolls are
just plain lame.
Again, as anyone who can read knows, he was talking about the *EFF* who looked at public perception and thought they could gain popularity.
If everyone jumped off the Brooklyn Bridge, why should Apple follow them?
if 'everyone jumped off the brooklyn bridge', then who would be left to follow? nobody.
you haven't thought this through.
Admit to the world that private set intersection is completely over your
head without saying "private set intersection is completely over my
head". : D
In article <slrnskfdjm.2ii8.g.kreme@m1mini.local>, Lewis <g.kreme@kreme.dont-email.me> wrote:
facebook doesn't care about privacy, so they scan server-side, and for
much more than just csam.
And then use your images to create advertisements for you, leaving you to
believe that your contacts are endorsing products Facebook is shilling.
Why anyone uses Facebook for anything is beyond me, they are easily the
worst company in the world, and dedicated to violating every user's
privacy in every possible way by any possible means. The fuckers even
created a fake VPN so they could intercept every bit of data from people
dumb enough to use that VPN.
which apple quickly shut down, going so far as to pull their enterprise
cert, which caused their own internal apps to stop working.
On 9/19/2021 11:36 AM, Robin Goodfellow wrote:
sms <scharf.steven@geemail.com> asked
The reason is pretty clear. When they scan iCloud photos on the server,
they have to decrypt them prior to applying the hash.
This is logical.
However... other things are as logical (IMHO)... given that...
There are at least two competing arguments we'd need to hash out before we
can come to that conclusion that "the reason is pretty clear" why Apple
chose to do something so stupid (without even asking security experts!)...
... given it was "said" (yes, I know it was later found out to be incorrect)
that Apple "can" scan the hash of an encrypted file, as long as it was Apple
who did the initial encryption. (Whether or not that is technically feasible
would matter greatly if this is to be an option).
Yeah, I wondered about that too, but I don't think that scanning the
hash would work. A completely different hash could be generated from essentially the same image based on very minor differences in the image.
The second argument we'd have to hash out is why should Apple actively do
anything?
Because other cloud service providers already do CSAM detection on their servers.
While I certainly "get" that any company would love to offload its compute
resources to billions of their captive devices around the world, I can't
believe that, if and when Apple decides to become an active arm of law
enforcement, Apple can't spare the server resources. I just can't.
A "compute server" or "GPU server" is a different animal from a storage
server: it's built around multiple high-power Xeon or EPYC CPUs and Nvidia
Ampere GPUs. It's understandable why Apple wanted to offload the
photo scanning to users' devices.
And it did change when the "delay" was announced. You know that
they're not going to bring back on-device scanning and will simply do
what other cloud service providers already do--scan on the server
side.
In article <si8k4b$d4m$1@dont-email.me>, sms
<scharf.steven@geemail.com> wrote:
A completely different hash could be generated from
essentially the same image based on very minor differences in the image.
yet another thing you don't understand about apple's csam.
apple's neural hash is designed specifically to be robust against minor
alterations to the images.
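For the curious, the difference between a cryptographic hash and a perceptual one is easy to see with a toy "average hash" (a much simpler cousin of NeuralHash, shown here only to illustrate the robustness property, not Apple's actual algorithm):

```python
import hashlib

def ahash(pixels):
    """Toy average hash: one bit per pixel, set if pixel > image mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming(h1, h2):
    """Count differing bits between two hash bit-lists."""
    return sum(x != y for x, y in zip(h1, h2))

img     = [[10, 200, 30], [40, 250, 60], [70, 80, 90]]
tweaked = [[11, 200, 30], [40, 250, 60], [70, 80, 90]]  # one pixel nudged

# A cryptographic hash changes completely on a one-value tweak...
print(hashlib.sha256(bytes(sum(img, []))).hexdigest()[:12])
print(hashlib.sha256(bytes(sum(tweaked, []))).hexdigest()[:12])

# ...while the perceptual hash barely moves (here, not at all).
print(hamming(ahash(img), ahash(tweaked)))  # 0
```

A perceptual hash captures the *structure* of the image, so minor edits, recompression, or resizing land at a small Hamming distance from the original, which is what lets a match survive trivial alterations.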
other providers scan for much more than just csam
apple's system is designed so that other entities *can't* force them to
scan for other stuff.
completely wrong and an absurd conspiracy theory.
Only anti-privacy morons think this is a good idea.
There are organisations urging Apple to implement their system.
And no, the state of iCloud RIGHT NOW is not TNO encryption, and Apple
right now CAN decrypt the data, but despite the frothing idiots here,
Apple does NOT do that.
Lewis <g.kreme@kreme.dont-email.me> asked
And no, the state of iCloud RIGHT NOW is not TNO encryption, and Apple
right now CAN decrypt the data, but despite the frothing idiots here,
Apple does NOT do that.
*Apologists' belief systems are based on exactly _zero_ (0) actual facts.*
Can Lewis not be aware that Apple provides decrypted data all the time to
law enforcement (remember, Apple claims they follow all the local laws)?
If not, then _everything_ Lewis just claimed is another imaginary belief.