Apple Is Scanning Every Single iPhone for Images of Child Sexual Abuse

The world is changing, fast.

Apple has revealed plans to scan all iPhones in the United States for images of child sexual abuse, according to a blog post on Apple’s official website. The move has drawn praise from child protection groups, but it has also raised serious concerns about entrusting private information to systems that are not exactly subject to public consent (smartphones being essential in modern society), potentially opening the door to a new sphere of legitimized surveillance of ordinary citizens.

Apple’s new tool will scan images and encrypted messages for signs of child sexual abuse

Called “neuralMatch,” the new tool will scan every image before it is uploaded to iCloud, and, if it finds a suggestive match, a real-life human will review it. If the reviewer decides the image qualifies as child pornography, the company will disable the user’s account and send a notification to the National Center for Missing and Exploited Children, according to an initial report from NPR. Crucially, Apple will also scan encrypted messages (presumably those stored on one’s phone or sent from it) for signs of sexually explicit content as a preventative measure against child abuse, and, understandably, this has alarmed staunch privacy advocates.
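Taken at face value, the reported process is a short decision pipeline: match on the device before upload, send any match to a human reviewer, and only on a confirmed match disable the account and notify the National Center for Missing and Exploited Children. The sketch below illustrates that reported sequence only; every name in it is a hypothetical placeholder rather than Apple’s actual API, and the fingerprint-matching step is described in the next paragraph.

```python
# Illustrative sketch of the reported flow only; all names are hypothetical.
def pre_upload_check(image_fingerprint: int,
                     known_fingerprints: set[int],
                     reviewer_confirms_csam: bool) -> str:
    """Return the reported outcome for one image before it is uploaded to iCloud."""
    if image_fingerprint not in known_fingerprints:
        return "upload proceeds"            # no match against the known database
    if not reviewer_confirms_csam:          # a human reviews every automated match
        return "upload proceeds"            # the reviewer rejects the flag
    return "account disabled, NCMEC notified"
```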

Notably, the new detection tool will only flag images that the company’s database already has stored away as “known” child pornography. Parents who take pictures of their children in, say, a bathtub are probably not in any danger. But researchers warn that the image-matching tool (which does not literally “see” images, and only approximates what is or isn’t illegal based on mathematical “fingerprints”) might open the door to abuse, whether by Apple, the government, or any associated party. For example, Matthew Green, a top cryptography researcher at Johns Hopkins University, has said that Apple’s new neuralMatch system isn’t foolproof. In fact, the system, despite its noble intentions, might easily be used to frame innocent iPhone users.
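Apple has not published neuralMatch’s internals, so as a rough illustration of what fingerprint-based matching of this kind looks like, here is a toy “average hash” in Python: an image is reduced to a coarse 64-bit fingerprint, and a candidate is flagged when its fingerprint lies within a few bits of an entry in a database of known fingerprints. The hash scheme, the 8×8 grid, and the distance threshold are all assumptions for illustration, not Apple’s actual algorithm.

```python
import numpy as np

HASH_SIZE = 8          # 8x8 grid -> a 64-bit fingerprint (toy scale)
MATCH_THRESHOLD = 5    # max differing bits still counted as a match (assumed)

def fingerprint(gray_image: np.ndarray) -> int:
    """Reduce a 2-D grayscale array to a 64-bit average-hash fingerprint."""
    h, w = gray_image.shape
    # Crop to a multiple of the grid, block-average down to 8x8, then
    # threshold each cell against the overall mean brightness.
    cropped = gray_image[:h - h % HASH_SIZE, :w - w % HASH_SIZE]
    blocks = cropped.reshape(HASH_SIZE, cropped.shape[0] // HASH_SIZE,
                             HASH_SIZE, cropped.shape[1] // HASH_SIZE).mean(axis=(1, 3))
    bits = (blocks > blocks.mean()).astype(int).flatten()
    return int("".join(map(str, bits)), 2)

def matches_known(candidate: int, known_fingerprints: set[int]) -> bool:
    """Flag an image when its fingerprint is within a few bits of a known one."""
    return any(bin(candidate ^ known).count("1") <= MATCH_THRESHOLD
               for known in known_fingerprints)
```

A scheme like this never inspects image content semantically; it only measures how close a derived number is to numbers already in the database, which is exactly why near-misses and deliberate collisions matter.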

Such an attack could work by sending a target a seemingly innocent image crafted to trigger the tool’s matching function, flagging a harmless user as a sexual abuser of children. “Researchers have been able to do this pretty easily,” said Green of how simple it is to trick systems like Apple’s neuralMatch, in the NPR report. In case it isn’t obvious, “child sexual abuser” and “sexual predator” are extremely stigmatizing accusations that even the most well-behaved citizens could spend a lifetime trying to shake off, not only from their publicly shared records and social media, but in the court of public opinion. It should go without saying that catching predators is important.

But at what cost?
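To make Green’s concern concrete: the toy fingerprint sketched above collides trivially, because two images whose pixel values differ everywhere can still produce the same 64-bit fingerprint when only the pattern of above-average versus below-average cells survives the hashing. The snippet below is a deliberately contrived numpy demonstration of that effect, not a claim about how hard collisions are to engineer against neuralMatch itself; the quote from Green refers to researchers producing that kind of engineered collision against real systems.

```python
import numpy as np

def toy_fingerprint(pixels: np.ndarray) -> int:
    """64-bit average-hash style fingerprint of an 8x8 grayscale array (toy only)."""
    bits = (pixels > pixels.mean()).astype(int).flatten()
    return int("".join(map(str, bits)), 2)

# Two clearly different 8x8 "images": every pixel value differs between them,
# but each pixel sits on the same side of its image's mean, so the coarse
# fingerprints are identical.
image_a = np.where(np.eye(8, dtype=bool), 200, 50).astype(np.uint8)
image_b = np.where(np.eye(8, dtype=bool), 180, 10).astype(np.uint8)

assert toy_fingerprint(image_a) == toy_fingerprint(image_b)
```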


Apple is under increased pressure to enable mass surveillance

Apple assuming the right to monitor all photos on an iPhone, not just the ones that match the “fingerprints” of known abuse material, puts a lot of control in the company’s hands, and it seems to contradict the firm’s own assertions about how it should interact with law enforcement. Further abuses of power could involve Apple enabling government surveillance of dissidents or protesters, regardless of political persuasion. “What happens when the Chinese Government says, ‘Here is a list of files that we want you to scan for,’” asked Green in the report, rhetorically. “Does Apple say no? I hope they say no, but their technology won’t say no.”

And he has a point. For years, Apple has faced growing governmental pressure to enable higher levels of surveillance of encrypted data. This has placed the company in a tenuous position: balancing a legal imperative to crack down on the abuse and exploitation of children while maintaining its image as a company resolutely committed to protecting user privacy. The Electronic Frontier Foundation, an online civil liberties organization, sees Apple’s latest move as “a shocking about-face for users who have relied on the company’s leadership in privacy and security,” according to the NPR report. While we can’t say we’re living in a cyberpunk dystopia, it does seem that big tech is beginning to exhibit some of the basic markers of invasive surveillance of ordinary citizens.


This was a breaking story and was regularly updated as new information became available.
