• WhatsApp head says Apple's child safety update is a 'surveillance system'

    From Jim_Higgins@21:1/5 to All on Sat Aug 7 14:27:52 2021
    WhatsApp head says Apple's child safety update is a 'surveillance system' https://www.engadget.com/whatsapp-head-apple-child-safety-surveillance-system-201956787.html

    --
    Psalm 12:8 NIV 1984
    The wicked freely strut about when what is vile is honored among men.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jolly Roger@21:1/5 to gordian240@hotmail.com on Sat Aug 7 21:01:43 2021
    On 2021-08-07, Jim_Higgins <gordian240@hotmail.com> wrote:

    WhatsApp head says Apple's child safety update is a 'surveillance system'

    Gullible idiots like you will fall for anything that fits your skewed
    world view.

    Facebook / WhatsApp has already been scanning their users' photos with
    the same PhotoDNA database for *decades* now. Here's just one report
    from an arrest made back in *2015*, for instance:

    <https://www.wired.co.uk/article/whatsapp-encryption-child-abuse>
    ---
    PhotoDNA is used on WhatsApp group images, for example, to find known
    instances of child sexual abuse material. In the second half of last
    year, WhatsApp also started using a Google machine learning system to
    identify new instances of child sexual abuse in photos.

    Group names are also scanned, using AI, for potential words related to
    child sexual abuse. WhatsApp takes data from child safety groups about
    the language predators use. Paedophiles often use codewords to try and
    disguise their behaviour. As a result, WhatsApp has designed its systems
    to try and identify if criminals are intentionally misspelling “child
    porn” as “child pr0n”, for example. The company says its machine
    learning systems are designed to find people highly likely to be
    violating its policies. Humans then review what has been flagged to
    make sure that accounts are only banned or sent to the NCMEC when there
    is sufficient evidence. When WhatsApp finds concrete evidence of abuse,
    such as through PhotoDNA, it aut[...]
  • From Your Name@21:1/5 to All on Sun Aug 8 09:53:51 2021
    On 2021-08-07 18:27:52 +0000, Jim_Higgins said:

    WhatsApp head says Apple's child safety update is a 'surveillance system' https://www.engadget.com/whatsapp-head-apple-child-safety-surveillance-system-201956787.html


    A. If you're not doing anything wrong, then you've got nothing to whine
    about. If you are doing something wrong, then you get what you deserve.

    B. You've been under surveillance nearly constantly since about the 1970s ...
    unless you live in a wooden hut or a cave in the woods and don't use any
    device or have any contact with the civilised world.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Calum@21:1/5 to Jolly Roger on Sun Aug 8 14:43:02 2021
    On 07/08/2021 22:01, Jolly Roger wrote:
    On 2021-08-07, Jim_Higgins <gordian240@hotmail.com> wrote:
    Gullible idiots like you will fall for anything that fits your skewed
    world view.

    Facebook / WhatsApp has already been scanning their users' photos with
    the same PhotoDNA database for *decades* now.

    They have, but in a different, "more secure" way... at least according
    to them.

    <https://www.businessinsider.com/whatsapp-head-slams-apple-iphone-scanning-for-child-abuse-images-2021-8?r=US&IR=T>

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jolly Roger@21:1/5 to Calum on Sun Aug 8 17:18:56 2021
    On 2021-08-08, Calum <com.gmail@nospam.scottishwildcat> wrote:
    On 07/08/2021 22:01, Jolly Roger wrote:
    On 2021-08-07, Jim_Higgins <gordian240@hotmail.com> wrote:
    Gullible idiots like you will fall for anything that fits your skewed
    world view.

    Facebook / WhatsApp has already been scanning their users' photos with
    the same PhotoDNA database for *decades* now.

    They have, but in a different, "more secure" way... at least according
    to them.

    <https://www.businessinsider.com/whatsapp-head-slams-apple-iphone-scanning-for-child-abuse-images-2021-8?r=US&IR=T>

    Bullshit. He said no such thing. In fact, the words "more secure" aren't
    even mentioned in that article or his lame tweet.

    He also said the Apple software would allow access to "scan all of a
    user's private photos on your phone — even photos you haven't shared
    with anyone", which is a blatant lie. Only photos being transferred to
    Apple servers are examined.

    Nice try. No cigar. Facebook, WhatsApp, Google, and a slew of others
    have been using the same PhotoDNA system to monitor their users' photos
    for many years without a peep from people, and only now that Apple has
    announced they are doing it too is there a major "outcry". Hypocrisy at
    its finest on display.

    And the fact is the way Apple is doing this actually preserves the
    privacy of people who are not in violation better than Facebook,
    WhatsApp, Google, and others, because:

    * only photos uploaded or transferred to Apple’s iCloud servers are
    examined; they are examined by generating a hash of the photo and
    comparing that hash to a list of hashes of known child sexual abuse
    photos

    * only those hashes that match the hashes of known child sexual abuse
    photos are flagged as potential violations by generating encrypted
    safety vouchers containing metadata and visual derivatives of matched
    photos

    * Apple employees know absolutely nothing about images that are not
    uploaded or transferred to Apple servers — nor do they know anything
    about photos that do not match hashes of known child sexual abuse
    photos

    * the risk of the system incorrectly flagging a given account is
    extremely low (a 1 in 1 trillion probability, according to Apple)

    * only accounts with safety vouchers that exceed a threshold of multiple
    matches are able to be reviewed by Apple employees - until the
    threshold is exceeded, the encrypted vouchers cannot be viewed by
    anyone

    * end users cannot access or view the database of known child sexual
    abuse photos - nor can they identify which images were flagged

    * only photos that were reviewed and verified to be child sexual abuse
    are forwarded to authorities
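
    The hash-matching and threshold logic described in the bullets above
    can be sketched roughly in Python. This is a simplified illustration
    only: Apple's actual system uses NeuralHash together with private set
    intersection and threshold secret sharing, so neither the device nor
    the server learns anything about non-matching photos. The function
    names and the threshold value here are hypothetical stand-ins.

```python
import hashlib

def image_hash(data: bytes) -> str:
    # Stand-in for a perceptual hash such as NeuralHash or PhotoDNA;
    # a cryptographic hash is used here only to keep the sketch runnable.
    return hashlib.sha256(data).hexdigest()

def flagged_for_review(uploaded_images, known_hashes, threshold):
    # Count uploads whose hash appears in the known-CSAM hash list; the
    # account becomes reviewable only once the match count reaches the
    # threshold, mirroring the "multiple matches" rule described above.
    matches = sum(1 for img in uploaded_images
                  if image_hash(img) in known_hashes)
    return matches >= threshold
```

    In the real system each per-photo match result is wrapped in an
    encrypted safety voucher, and the threshold is enforced
    cryptographically rather than by a simple counter.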

    Apple customers who are concerned by this system can opt out by
    refraining from uploading photos to iCloud (by disabling iCloud Photos,
    My Photo Stream, and iMessage). Since these are all optional services,
    this is very easy to do.

    Claims stating that Apple is supposedly scanning your entire device 24/7
    are unfounded.

    Claims that Apple is scanning every single photo on your device are
    also unfounded.

    --
    E-mail sent to this address may be devoured by my ravenous SPAM filter.
    I often ignore posts from Google. Use a real news client instead.

    JR

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Joerg Lorenz@21:1/5 to All on Tue Aug 10 17:44:21 2021
    On 07.08.21 at 20:27, Jim_Higgins wrote:
    --
    Psalm 12:8 NIV 1984
    The wicked freely strut about when what is vile is honored among men.

    Are you a bigot?


    --
    De gustibus non est disputandum

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Calum@21:1/5 to Jolly Roger on Wed Aug 11 02:31:51 2021
    On 08/08/2021 18:18, Jolly Roger wrote:
    On 2021-08-08, Calum <com.gmail@nospam.scottishwildcat> wrote:
    On 07/08/2021 22:01, Jolly Roger wrote:
    On 2021-08-07, Jim_Higgins <gordian240@hotmail.com> wrote:
    Gullible idiots like you will fall for anything that fits your skewed
    world view.

    Facebook / WhatsApp has already been scanning their users' photos with
    the same PhotoDNA database for *decades* now.

    They have, but in a different, "more secure" way... at least according
    to them.

    <https://www.businessinsider.com/whatsapp-head-slams-apple-iphone-scanning-for-child-abuse-images-2021-8?r=US&IR=T>

    Bullshit. He said no such thing. In fact, the words "more secure" aren't
    even mentioned in that article or his lame tweet.


    You're correct, he did not use the words "more secure". He said it
    works "without breaking encryption", which I took to mean that he
    believed it more secure than Apple's approach. But I should have
    double-checked the quote before posting.

    (I also said "at least according to them" because I don't believe it any
    more than you do.)

    He also said the Apple software would allow access to "scan all of a
    user's private photos on your phone — even photos you haven't shared
    with anyone." which is a blatant lie. Only photos being transferred to
    Apple servers are examined.

    And that's certainly true for the CSAM detection bit that everyone's
    talking about. Although Apple did also announce the new "Communication
    Safety in Messages" feature at the same time, which uses "on-device
    machine learning" to analyze any photo that a Family Sharing account
    belonging to a child sends or receives, and warns their parents about
    any that appear to contain nudity. Still short of scanning "all of a
    user's private photos", but certainly some that may never leave your phone.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jolly Roger@21:1/5 to Calum on Thu Aug 12 14:34:05 2021
    On 2021-08-11, Calum <com.gmail@nospam.scottishwildcat> wrote:
    On 08/08/2021 18:18, Jolly Roger wrote:
    On 2021-08-08, Calum <com.gmail@nospam.scottishwildcat> wrote:
    On 07/08/2021 22:01, Jolly Roger wrote:
    On 2021-08-07, Jim_Higgins <gordian240@hotmail.com> wrote:
    Gullible idiots like you will fall for anything that fits your skewed
    world view.

    Facebook / WhatsApp has already been scanning their users' photos with
    the same PhotoDNA database for *decades* now.

    They have, but in a different, "more secure" way... at least according
    to them.

    <https://www.businessinsider.com/whatsapp-head-slams-apple-iphone-scanning-for-child-abuse-images-2021-8?r=US&IR=T>

    Bullshit. He said no such thing. In fact, the words "more secure" aren't
    even mentioned in that article or his lame tweet.

    You're correct, he did not use the words "more secure". He said it
    works "without breaking encryption", which I took to mean that he
    believed it more secure than Apple's approach. But I should have
    double-checked the quote before posting.

    (I also said "at least according to them" because I don't believe it any
    more than you do.)

    He also said the Apple software would allow access to "scan all of a
    user's private photos on your phone — even photos you haven't shared
    with anyone." which is a blatant lie. Only photos being transferred to
    Apple servers are examined.

    And that's certainly true for the CSAM detection bit that everyone's
    talking about. Although Apple did also announce the new "Communication
    Safety in Messages" feature at the same time, which uses "on-device
    machine learning" to analyze any photo that a Family Sharing account belonging to a child sends or receives, and warns their parents about
    any that appear to contain nudity. Still short of scanning "all of a
    user's private photos", but certainly some that may never leave your phone.

    That scan never leaves your phone either.

    Meanwhile Facebook and WhatsApp access and scan *ALL* of their users'
    photos.

    WhatsApp / Facebook are full of shit.

    --
    E-mail sent to this address may be devoured by my ravenous SPAM filter.
    I often ignore posts from Google. Use a real news client instead.

    JR

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)