It’s Personal

If you want to check out the new video edition of the podcast, please go to:

https://youtu.be/FBBmA9YDNfQ

where you can subscribe, give thumbs up and ring bells like YouTubers have been asking you to do for years. You know the drill.

Also, our apologies for the hum in the audio throughout the entire episode. The problem has been identified and the source (Dan) has been taken out back and schooled on the difference between mic-level and line-level audio feeds. He promises it won’t happen again… often.

Now, on to the show.

This week, Dan, Brian and Erik tackle Apple's recently announced measures to protect children from online predators and to curb the distribution of illegal images of children. The initiative has three parts, each with its own benefits and concerns, and we cover each one individually:

First, the scanning of messages inbound to minors (under 18) on an Apple Family Sharing account, in which images are tested for sexually explicit content, blurred, and the child alerted that they may be about to look at something they may want to reconsider. If a child under 13 decides to view the image anyway, the parents are notified. This is an opt-in programme, and parents decide whether or not to enable it for the family.

Next comes the proactive scanning of iCloud Photo Libraries stored at Apple. Many have long wondered why end-to-end encryption has not come to iCloud, and this is a likely factor. The photos are tested against the hashes of a set of known child sexual abuse images, and matches are reported to the authorities. Other cloud photo services, including Microsoft's and Flickr, have been doing this for some time.

Finally, and most controversially from a privacy perspective, Apple is implementing a proactive test of the hashes of photos stored on customers' Apple devices against this same set of known images. In the US there is no law that prevents this, but it runs counter to the marketing emphasis Apple has placed on the privacy of data on its devices. The method is rather intricate and strives to prevent Apple from seeing anything unless a threshold of matches suggests a systemic child sexual abuse problem on an account.
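
To make the hash-matching idea concrete, here is a deliberately simplified Python sketch. The hash function, threshold value, and names are all illustrative assumptions: Apple's actual design uses a perceptual "NeuralHash" and a private set intersection protocol with threshold secret sharing, so that neither the device nor Apple learns anything about sub-threshold matches. None of that cryptography is modelled here.

```python
import hashlib
from pathlib import Path

# Hypothetical set of hashes of known illegal images, as supplied by a
# clearinghouse such as NCMEC. Apple's real design distributes blinded
# perceptual hashes; a plain SHA-256 set here is illustration only.
KNOWN_HASHES: set[str] = set()  # entries would come from the reference database

# Matches are only surfaced for human review once an account crosses a
# threshold, to avoid acting on one-off false positives. This number is
# made up for the sketch.
MATCH_THRESHOLD = 30

def hash_photo(path: Path) -> str:
    """Hash a photo's raw bytes. A production system would use a
    perceptual hash that survives resizing and recompression, not an
    exact byte-level hash like this one."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def count_matches(library: list[Path]) -> int:
    """Count how many photos in a library appear in the known-image set."""
    return sum(1 for photo in library if hash_photo(photo) in KNOWN_HASHES)

def should_escalate(library: list[Path]) -> bool:
    """Escalate for human review only past the threshold, mirroring the
    'systemic problem' idea described above."""
    return count_matches(library) >= MATCH_THRESHOLD
```

The threshold is the key design choice: it trades a small delay in detection for a much lower chance that a single hash collision flags an innocent account.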

These technological approaches change the game for prosecutors and law enforcement, and they expose issues earlier. But what happens when this capability gets expanded, or is mandated by law for use against citizens who speak out politically, or is taken over by bad actors? Look at the link in the show notes about the master keys the TSA had made for airport luggage locks: every hole is a potential future vulnerability. Does the end justify the means? We discuss in depth on this week's Great Security Debate!

If you want to support the efforts of The Great Security Debate, please feel free to become a patron and get some cool benefits for supporting this independent show – https://www.patreon.com/securitydebate


Links: