
Apple's plans to fight child porn by scanning your phone

18 August 2021 7:15 PM
Tags:
Digital technology
BusinessUnusual

It will only affect US users and it is not all photos, but it has some asking whether it is justified.

If you own an iPhone, the first question is what owning it even means. Apple will not allow you to load unapproved apps, it only allows you to have it repaired by Apple-approved service agents, and in the US, when the next iOS update is released, it will scan your messages and photos.

These might not sound like good reasons to want to buy one of the most expensive phones on the market, but the reasons for the update do make sense even if not everyone agrees.

Privacy versus Safety

Technology has created some great new things, but it has also complicated things that used to be simple.

If you wrote something in a diary, no one would get to see it unless you let them or they did so without your permission. Simple.

Your phone is like a diary, but it needs to connect to the internet, and some of the things you put on it get saved to a server that is not on your phone.

When you click okay on those long lists of requirements, permissions and rules, you also agree that some content stored on servers may be checked for what is on it. The checks are only allowed to look for harmful content, which is either child sexual abuse material (CSAM) or content they are required by law to search for.

Neither is a reason for most people to rethink having clicked okay, but most were probably not aware of it. Many would not care even if the checks were for less serious reasons.

That is a fair view, but it has in part led to the issue of how much information about you is stored, checked and then shared. The info is used to influence you, at the very least to try to get you to buy more things. You are probably not aware of what someone looking to sell you something already knows about you, and so you may not get the best deal.

It may not be a big deal, as you can decide whether or not to buy, but increasingly, in countries where the state is mindful of keeping its citizens in line, the same rules may be used to legally gain access to your private information, or at the very least to regulate what you may or may not see.

China has built sophisticated tools to monitor and manage what Chinese citizens are able to see. When you don't know that it is possible, you would also not know when it is happening. Ignorance may be bliss, but we don't have any examples of countries that chose to restrict basic speech and did not go on to extend that to the detriment of citizens' basic freedoms.

Fighting to stop something is usually a lot harder than fighting to keep something.

But things don't change overnight; they change just a little at first, yet over time the effect can be dramatic. Privacy advocates see each concession that erodes personal privacy as a slippery slope that is easy to go down but almost impossible to climb back up.

What Apple is planning

This is a simplified version of all the elements that will go into protecting children, finding CSAM and tracking down those that are making and sharing it.

I agree with the attempt and think it will be a worthwhile project, but I also think all the discussion and scrutiny is good.

The changes will arrive with the next iOS update and will affect messaging and the storing of photos on iCloud.

For those with children on a family account, there is an option to switch on the scanning of sent and received messages for images that contain nudity. The phone will remind the child that viewing a received image may not be advisable and, should they choose to view it anyway, it will notify the parent that an image with suspected nudity was received and viewed, provided the child is under 13. It will do the same when an attempt is made to send such an image. For those over 13, only the child will get the notification.

Parents will not be sent the images, just the notification.
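
To make that flow concrete, here is a rough Python sketch of the decision logic as I understand it from Apple's description. The names and structure are my own stand-ins, not Apple's code, and the real feature runs on-device inside Messages as an opt-in for family accounts.

from dataclasses import dataclass


@dataclass
class Child:
    age: int
    alerts_enabled: bool  # the scanning option a parent switches on for a family account


def handle_suspect_image(child: Child, child_chooses_to_view: bool) -> list[str]:
    """Return the notifications this flow would produce for one flagged image."""
    events = ["warn_child_image_may_contain_nudity"]  # the child is always warned first
    if child_chooses_to_view and child.age < 13 and child.alerts_enabled:
        # Only for under-13s is the parent notified, and the parent never receives the image itself
        events.append("notify_parent_suspected_nudity_viewed")
    return events


# An over-13 who views the image triggers only the warning to themselves
print(handle_suspect_image(Child(age=15, alerts_enabled=True), child_chooses_to_view=True))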

Should a user search for CSAM material or wish to report it, Siri will show special results explaining the danger and consequences of the search. It will also make reporting easier by loading the details of how to report the material, rather than just links to do so.

Warning someone before they do something they may regret, or that has consequences they did not fully understand, is a great option.

Some make good arguments that in homes where the parent might be part of the problem, this is not a good solution. Apple notes that by never sharing the images with parents and by always alerting the child before alerting the parent, such a situation can be avoided.

Scanning for nudity and CSAM

The two may seem related but are treated very differently. Most social networks have a variety of tools that check uploaded text, images and video for a range of issues. Scanning for certain words or phrases makes sense, as it can indicate whether someone wants to incite violence or threaten someone. In a similar way, machines can scan an image for patterns, much as they look for words, which lets them tell the difference between a cat and a dog and detect people. They can then determine whether a person might be naked and whether that is against the platform's rules, in which case the image will not be shown, or it will be sent to a moderator to confirm whether it is allowed.
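
As a rough illustration of that kind of pipeline, here is a toy Python sketch. The classifier is a placeholder that returns a made-up score, so the sketch runs on its own; real platforms use trained image models, but the decision logic around them looks roughly like this.

def nudity_score(image_bytes: bytes) -> float:
    """Placeholder for a trained image classifier: returns the probability
    that the image contains nudity. Hard-coded so the sketch is self-contained."""
    return 0.72


def moderate(image_bytes: bytes, block_above: float = 0.9, review_above: float = 0.6) -> str:
    score = nudity_score(image_bytes)
    if score >= block_above:
        return "block"              # confidently against the rules: never shown
    if score >= review_above:
        return "send_to_moderator"  # uncertain: a human confirms
    return "allow"


print(moderate(b"fake-image-bytes"))  # send_to_moderator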

It is not always easy. Facebook has had issues with allowing men to appear topless but not women. For humans it is easy to tell the difference, but imagine trying to explain it to something that does not know what a male or a female is or looks like; spelling out the actual difference proves to be far more challenging.

Facebook had a blanket rule against showing female nipples in images, but accepted that banning every image of a mother breastfeeding was not justified either.

The process is improving all the time. The accuracy is at least good enough to catch most of what is not allowed most of the time, to refer borderline content to moderators, and to only rarely block something it should not have. That is not to say mistakes do not happen a lot, but relative to just how much content is being uploaded, the percentages are small and getting smaller.

Videos are a series of images and so scanning works in a similar way.

For child abuse images, the test and the consequences are more technical. Certain authorities, both state and civilian, are tasked with intercepting and tagging known CSAM images. In the past someone had to review the images manually, which is not something those tasked with protecting children would like to do.

Now, once an image has been identified, a special identifier is created for it and compared with any potential copy or version of the image; it is either a match or it is not. Machines can create the identifier, which Apple refers to as a NeuralHash, and it will identify the image as being the same even if it has been cropped, its colours have been changed or it has been made blurrier.
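
To give a sense of how a fingerprint can survive small edits, here is a short Python sketch of a very basic perceptual hash. To be clear, this is not NeuralHash, which is a far more sophisticated, neural-network-based system; it only illustrates the underlying idea that similar images produce similar fingerprints, so near-copies can be matched without comparing pixels directly. The file names are hypothetical.

from PIL import Image  # pip install Pillow


def average_hash(path: str, size: int = 8) -> int:
    """Shrink and grey-scale the image, then record which pixels are brighter
    than the average as a 64-bit fingerprint."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for pixel in pixels:
        bits = (bits << 1) | (1 if pixel > mean else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Count how many bits differ between two fingerprints."""
    return bin(a ^ b).count("1")


# A cropped or recoloured copy of the same photo should land within a few bits
# of the original; an unrelated photo should not.
original = average_hash("photo.jpg")
edited = average_hash("photo_recoloured.jpg")
print("match" if hamming_distance(original, edited) <= 5 else "no match")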

Making it accurate is important not only for finding the actual images, but perhaps more importantly for not incorrectly flagging an image that is not one of those in the database.

There are two further steps to limit the chances of something innocent being identified as a problem. Images are only scanned if they are being saved to iCloud, and the next step is only triggered if at least 30 images match those in the CSAM database. I was surprised that it would require that many, but given the implications, getting it wrong can be very damaging to the person accused of having CSAM, and potentially costly for Apple should it be sued over a false allegation.

Should the images trigger the response, Apple will be notified, a reviewer will check the match and, if it is correct, the authorities will be alerted and the phone locked.
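
Conceptually, the threshold works something like the Python sketch below. This only illustrates the counting logic; as I understand it, Apple's actual system wraps the matches in cryptographic "safety vouchers" so that nothing about the photos can be read until the threshold is crossed.

REVIEW_THRESHOLD = 30  # matches required before anything is flagged for review


def count_matches(upload_hashes: set[int], known_csam_hashes: set[int]) -> int:
    """Count how many photos being saved to iCloud match the CSAM database."""
    return len(upload_hashes & known_csam_hashes)


def should_escalate(upload_hashes: set[int], known_csam_hashes: set[int]) -> bool:
    """Only once the matches reach the threshold is the account flagged for human review."""
    return count_matches(upload_hashes, known_csam_hashes) >= REVIEW_THRESHOLD


print(should_escalate({1, 2, 3}, {2, 3, 4}))  # False: only two matches, well below 30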

The details of how, and whether, it will be effective enough go beyond the scope of this piece, but it is important to at least be aware of the efforts to combat CSAM, which is far easier to share digitally, and to balance those efforts against the consequences for people who will never do this but could be affected negatively by the erosion of privacy.

Discussion, and listening to all sides, is the best way to find a solution that will probably not be perfect but will keep improving on the status quo.



