Thousands Of Women Have No Idea A Telegram Network Is Sharing Fake Nude Images Of Them

Lionel Bonaventure / Getty Images

More than 680,000 women have had their photos uploaded to a bot on the messaging app Telegram and used to produce photo-realistic simulated nude images without their knowledge or consent, according to tech researchers.

The tool allows people to create a deepfake, a computer-generated image, of a victim from a single photo.

Sensity, a visual threat intelligence company headquartered in Amsterdam, discovered the Telegram network of 101,080 members, 70% of whom appeared to reside in Russia or Eastern Europe.

“This one’s unique because it’s not just people talking or people sharing content, it’s actually embedded in Telegram, and we have not found something similar,” said Giorgio Patrini, CEO and chief scientist at Sensity.

A total of 104,852 images of women have been posted publicly to the app, with 70% of the photos coming from social media or private sources. A small number of these victims appeared to be underage. The rest of the images were likely shared privately, researchers say.

Unlike the algorithms that make deepfake videos — including nonconsensual sexual videos — the Telegram bot doesn’t need thousands of images to work. It only needs one, “which is really a reason why so many private individuals are attacked, because only one profile picture from Facebook is enough to do this,” said Patrini.

This harassment appears to be happening without the knowledge or consent of the photographed women, the vast majority of whom are private citizens rather than celebrities or influencers.

“As soon as you share images or videos of yourself and maybe you’re not so conscious about the privacy of this content, who can see it, who can steal it, who can download it without you knowing, that actually opens the possibility of you being attacked,” Patrini said.

Nina Jankowicz, author of How to Lose the Information War, said this app shows that deepfakes go beyond politics. For Jankowicz, worries about the national security implications of a convincing fake video sidestep the reality that the technology is largely deployed to abuse women.

“It’s really disturbing how accessible it is,” Jankowicz said. “Frankly, the thing that we’ve seen shared as evidence of the growing deepfake phenomenon, the little silly videos of Joe Biden that President Trump has shared, for instance, that stuff is far less sophisticated than what you’re talking about.”

Jankowicz said the app has huge implications for women everywhere, especially in more socially conservative countries like Russia. Victims could be at risk of losing their job and livelihood if a convincing but fake nude photo were made public. Some could face partner violence.

“Essentially, these deepfakes are either being used in order to fulfill some sick fantasy of a shunned lover, or a boyfriend, or just a total creepster,” Jankowicz said. “Or they’re used as potential blackmail material.”

The app and the ease of accessing it speak to the broader harassment and abuse women face online, something Jankowicz has experienced firsthand.

“This is all part and parcel of the broader abuse and harassment that women have to deal with in the online environment, whether that’s just trolling or whether it’s the gendered and sexualized abuse coming from all sides of the political spectrum,” Jankowicz said. “It’s used as a weapon of trying to push women out of the public sphere. This is just an extension of that.”

While Patrini and his team didn’t find proof of these images being used for extortion of women, they fear that possibility is fast approaching.

Deepfake nudes often target celebrities, but this network appears to be more focused on people who are not famous. In a poll Sensity conducted of the tool’s users, 63% said they were interested in women they knew.

“In the industry, at least, it is a known problem to some extent,” said Patrini, “but I really struggle to believe that at the level of private citizens it’s known by anyone.”

According to Sensity, “All sensitive data discovered during the investigation detailed in this report has been disclosed with Telegram, [Russian social media site] VK, and relevant law enforcement authorities. We have received no response from Telegram or VK at the time of this report’s publication.”

Patrini pointed out that so-called “porn bots” go against Telegram’s terms of service.

The bot has been advertised on VK, with Sensity finding activity on 380 pages on the site.

“VK doesn’t tolerate such content or links on the platform and blocks communities that distribute them. Also, please note that such communities or links were not promoted using VK advertising tools,” a VK spokesperson told BuzzFeed News after this story was published. “We will run an additional check and block inappropriate content and communities.”

This tool, which BuzzFeed News is declining to name, allows people to produce deepfakes on cellphones, remotely generating the images before sending them back to the user. The bot, which only works on images of women, provides watermarked images at no cost, and images without watermarks for a fee of about $1.50. Customers can also earn money by referring others to the service.

“That’s the phenomenon of this technology becoming a commodity,” Patrini said. “No technical skill required, no specialized hardware, no special infrastructure or accessibility to specific services that are hard to reach.”

According to Sensity, seven Telegram channels using the bot had attracted a combined 103,585 members by the end of July, a year since the tool was launched, with the most-populous channel having 45,615 people in it.

Patrini reiterated that while many people fear how deepfake technology could be used in politics, its actual widespread use is the exploitation of women online.

“This is not a problem of a high system of democracy, at least primarily. This is not a problem only for public figures and celebrities, but it’s going to be a problem for everybody, unfortunately quite soon,” Patrini said. “It’s already today a problem for hundreds of thousands of people.”
