WhatsApp failed to stop circulation of child sexual abuse videos
The abuse of messaging platforms like WhatsApp and Telegram for illegal activities is increasing day by day. While these tech companies have been trying hard to stop it, criminals keep finding new ways to exploit their services. According to a new finding by the Delhi-based Cyber Peace Foundation (CPF), WhatsApp chat groups are still being used to share child sexual abuse material (CSAM).
Those WhatsApp groups have since been shut down after the researchers alerted WhatsApp, but while they were active, they reportedly operated as public abuse groups discoverable by anyone.
Nitish Chandra, a cybersecurity expert and Manager of Training and Policy at the Cyber Peace Foundation, stated:
“The members are first solicited publicly using these invite links and then the trusted ones (in this case, who would actually engage in sharing CSAM) are called on to join a more private group. Numbers used to create groups are often virtual numbers used by Indian users. There are several such tactics being used by group admins and members.”
In addition, CPF is helping WhatsApp identify these groups, as mentioned earlier. However, WhatsApp's end-to-end encryption, which exists for the sake of privacy and security, makes it harder to proactively filter such content. Most of the information cited in the report is not from personal or group chats; it is public information, such as group icons and descriptions, that has been called out.
CPF’s investigation, completed in just a couple of weeks in March, is not the first time WhatsApp has been linked to the circulation of child sexual abuse material. Meanwhile, fighting problems such as misinformation has become even more of a challenge for WhatsApp than it is for its parent company Facebook.