Cyberbullying and unsolicited obscene photos are among the biggest problems users face on social media. Instagram has confirmed that it has started work on an "obscene photo filter" feature that will prevent users from being exposed to nude photos sent via private message.
Instagram develops obscene photo filter
Meta highlighted that the optional user controls, which are still in development, will help protect people from nude photos and other unwanted messages. The tech giant stated that these controls will be similar to the "Hidden Words" feature, which lets users automatically filter message requests containing certain words.
Software developer Alessandro Paluzzi shared a screenshot of the new feature on Twitter, stating that he obtained it by reverse engineering the app.
Meta confirmed the authenticity of the screenshot. "We are working closely with experts to ensure that these new features allow people to have control over the messages they receive while protecting their privacy," said Meta spokesperson Liz Fernandez.
Underlining that the content processed by the technology cannot be viewed by third parties, Meta said it will announce the new feature in the next few weeks. The feature arrives against a troubling backdrop: last year, the Pew Research Center released a report finding that 33 percent of women under the age of 35 had been sexually harassed online.
Professor Clare McGlynn, an image-based sexual abuse expert at Durham Law School, told HuffPost: "Some will come forward and say that explicit photos are harmless. The debate centers on the fact that it does not happen face-to-face, but you cannot categorize sexual crimes in this way. The psychological damage of sexual crimes is very significant, and different forms of crime can have the same effect on different people."
What do you think about this subject? You can share your thoughts in the comments section and on the SDN Forum.