
Roskachestvo commented on the Apple scandal

Apple has announced that, starting with iOS 15, the operating system will automatically check messages sent by children and scan users’ photos uploaded to iCloud in order to detect child pornography and prevent the spread of such files. The initiative was met with mass protests in the West: tabloids accuse the corporation of wanting to rummage through users’ phones and personal photos under a plausible pretext.


Sergey Kuzmenko, senior expert at the Roskachestvo Digital Expertise Center for testing digital products:

“The problem of child sexual exploitation that Apple is trying to solve is severe and acute, both for the U.S. and for the rest of the world. In addition, the implementation of the image-analysis technology – at least as the company currently describes it – promises maximum privacy and security for owners of Apple devices.”

As far as can be understood, it would work like this:

1. At the messaging stage, the company will check whether a child is sending someone an explicit photo, and if so will display several warnings and notify the parents. Child protection in iMessage will work only on family accounts with children aged 12 or younger, and will need to be activated manually. No information will be passed to the authorities.

2. The Siri and Search services will be provided with lists of code words and queries related to child pornography; users who make such requests will see a warning that such interests are harmful, along with links to partner sites where they can get help. Whether information about such requests will be passed to the authorities is not yet clear.

3. The most scandalous part: when photos are uploaded to iCloud (if the feature is activated, this happens automatically), the company’s neural network will scan them before uploading and compare them with an existing database of child pornography. The photos are first hashed, i.e. converted into a sequence of symbols that no one else can decode, and the algorithm compares these hashes against a database that likewise consists of such chains of symbols (see the sketch after this list). The process is automatic, and a human moderator gets involved only if there is a match with known patterns.

4. If a user is found to hold multiple known images, or incidents involving the downloading of such photos are repeated, Apple reserves the right to hand his or her data over to the competent authorities. The company emphasizes that the system does not work if the user has disabled iCloud uploads, and the feature does not check the private photo library on the device.
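
To make the hash-matching logic in points 3 and 4 more concrete, here is a minimal sketch in Python. It is an illustration under stated assumptions, not Apple’s actual implementation: Apple’s published design uses NeuralHash, a perceptual hash that tolerates small image edits, together with cryptographic matching protocols, while the sketch substitutes an ordinary SHA-256 digest. KNOWN_HASHES, MATCH_THRESHOLD and scan_upload_queue are hypothetical names, and the threshold value is made up.

    import hashlib

    # Hypothetical database of known hashes. In Apple's design this is
    # supplied by child-safety organizations; here it holds one placeholder.
    KNOWN_HASHES = {"0" * 64}

    # Hypothetical threshold: per point 4, escalation happens only after
    # repeated matches, never after a single hit.
    MATCH_THRESHOLD = 3

    def hash_photo(photo_bytes: bytes) -> str:
        """Convert a photo into a fixed-length chain of symbols that
        cannot be decoded back into the original image."""
        return hashlib.sha256(photo_bytes).hexdigest()

    def scan_upload_queue(photos: list[bytes]) -> bool:
        """Return True if enough queued uploads match the database to
        warrant involving a human moderator."""
        matches = sum(1 for p in photos if hash_photo(p) in KNOWN_HASHES)
        return matches >= MATCH_THRESHOLD

Note that a cryptographic hash such as SHA-256 matches only byte-identical files; a perceptual hash like NeuralHash is designed to survive resizing and re-encoding, which is why real photo-matching systems use the latter.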

Of course, questions remain: will the company actually follow the stated privacy rules, how foolproof is the recognition technology in practice, and could it be repurposed for other ends in the future? In its current form, however, the potential detection of paedophile accounts can only be counted as a plus.

It should also be noted that the feature is launching only in the US for now; in other countries it will be introduced gradually, taking local legal frameworks into account.

Comments: 2
  1. Rhiannon

    What specific scandal is Roskachestvo referring to in relation to Apple? What are the details of the scandal, and how does it impact consumers and Apple’s reputation?

  2. Penelope Johnson

    What specific issues or aspects of the Apple scandal did Roskachestvo comment on?
