Topic: WhatsApp has a child porn problem  (Read 852 times)

WhatsApp has a child porn problem
« on: December 21, 2018, 01:04:12 AM »

  • WhatsApp has become a platform for users to "openly" share pictures and videos of child pornography, the Financial Times reports.
  • A group of Israeli researchers found dozens of active WhatsApp groups where people frequently sent messages with media depicting sexual abuse of children.
  • While WhatsApp's end-to-end encryption keeps the company and governments from being able to see the contents of messages sent and received on the app, the researchers point out that many of the groups made their intentions clear with publicly visible references to child porn or explicit profile photos.

Child pornography is "openly shared" in dozens of groups on WhatsApp, even after researchers brought the problem to the company's attention, the Financial Times reports.

Israeli researchers told FT that they discovered "extensive child abuse material" in dozens of WhatsApp groups earlier this year. The researchers reported their findings in September to Facebook, which owns WhatsApp, but FT was able to find "several" of these groups this week that were still extremely active.

"It is a disaster: this sort of material was once mostly found on the darknet, but now it's on WhatsApp," one of the researchers told FT.

The illegal content, videos and pictures of children being subjected to sexual abuse, was discovered in WhatsApp groups that were "easy to find and join" by the researchers, who work at Israeli charities trying to improve online safety.

WhatsApp told FT it has "techniques" to scan users and groups for illegal content, and that it bans thousands of accounts a day. However, the Israeli researchers say that some of the groups they monitored made their purpose clear, with names like "cp," an abbreviation for child porn, and explicit profile pictures.

WhatsApp has had end-to-end encryption since 2016. The feature gives users an extra layer of privacy against potential cybersecurity threats and government surveillance. However, that same protection means that WhatsApp and law enforcement are unable to see the contents of messages suspected of containing illegal or abusive material. In other words, the detection tools that Facebook uses to monitor its own site and Instagram can't be used on WhatsApp.

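To make that point concrete, below is a minimal, illustrative sketch of end-to-end encryption using the PyNaCl library. This is not WhatsApp's actual implementation (WhatsApp uses the Signal Protocol); it only shows why a relaying server that never holds the private keys sees nothing but ciphertext, leaving content-scanning tools nothing readable to inspect.

[code]
# Minimal end-to-end encryption sketch using PyNaCl (pip install pynacl).
# Illustrative only: WhatsApp actually uses the Signal Protocol, not plain NaCl boxes.
from nacl.public import PrivateKey, Box

# Each user generates a keypair; private keys never leave their device.
sender_key = PrivateKey.generate()
recipient_key = PrivateKey.generate()

# The sender encrypts with their own private key and the recipient's public key.
sending_box = Box(sender_key, recipient_key.public_key)
ciphertext = sending_box.encrypt(b"hello, this stays between us")

# A relaying server only ever sees ciphertext like this, so it cannot
# run content-matching tools against the message body.
print(ciphertext.hex())

# Only the recipient, holding the matching private key, can decrypt.
receiving_box = Box(recipient_key, sender_key.public_key)
print(receiving_box.decrypt(ciphertext))  # b"hello, this stays between us"
[/code]

This is why, as the article notes, detection on WhatsApp has to rely on unencrypted signals such as group names, profile photos, and account behaviour rather than on message contents.
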
Police and government officials have long criticized hardware and apps that use strong encryption as a hindrance to their investigations, while privacy activists tout the technology as one of the only ways to communicate truly privately in an era of widespread government surveillance.

A moderation problem

While Facebook employs thousands of content moderators, WhatsApp has only 300 employees to monitor its 1.5 billion users globally, FT reports.

WhatsApp has come under fire this year for enabling the spread of hoaxes and viral fake news with severe real-world consequences. In India, the killings of 31 people this year have been attributed to false rumors and fake videos that spread on WhatsApp and incited lynch mobs to violence. In Brazil, WhatsApp was used to spread disinformation and misleading information amid the country's contentious election.

But WhatsApp isn't the only social platform that has come under fire for failing to keep child porn off its network. Tumblr was removed from Apple's App Store for a month because child porn was slipping past the platform's filters.

Source: WhatsApp has a child porn problem

- gist culled from pulseng