The aim is to "divert" or "disrupt" someone who may be looking for child sexual abuse material, and to do so within just a few clicks. There is precedent for this kind of intervention: as far back as 2013, Google worked with the Lucy Faithfull Foundation to introduce warning messages when people search for terms that could be linked to CSAM. There is already some evidence that this kind of technical intervention can make a difference in diverting people away from potential child sexual abuse material and in reducing the number of searches for CSAM online.

John Perrino, a policy analyst at the Stanford Internet Observatory who is not connected to the project, says there has been a push in recent years to build new tools that use "safety by design" to combat harms online. "It's an interesting collaboration, in a line of policy and public perception, to help users and point them toward healthy resources and healthy habits," Perrino says. He adds that he has not seen a tool exactly like this developed for a pornography website before.

Those involved in the chatbot project say Pornhub volunteered to take part and isn't being paid to do so, and that the system will run on Pornhub's UK website for the next year before being evaluated by external academics. "The IWF chatbot is yet another layer of protection to ensure users are educated that they will not find such illegal material on our platform, and referring them to Stop It Now to help change their behavior," a spokesperson for Pornhub says, adding that it has "zero tolerance" for illegal material and clear policies around CSAM.

Pornhub has a checkered reputation for the moderation of videos on its website, and reports have detailed how women and girls had videos of themselves uploaded without their consent. In December 2020, Pornhub removed more than 10 million videos from its website and started requiring people uploading content to verify their identity. Last year, 9,000 pieces of CSAM were removed from Pornhub.
The chatbot appears when someone searches Pornhub for any of 28,000 terms the IWF has identified that it believes could be linked to people looking for CSAM, and those searches can include millions of potential keyword combinations. The popup, which has been designed by the anti-child abuse charity the Lucy Faithfull Foundation alongside the IWF, will then ask people a series of questions and explain that what they are searching for may be illegal. The chatbot tells people it is run by the Lucy Faithfull Foundation and says it offers "confidential, nonjudgmental" support. People who click a prompt saying they would like help are offered details of the organization's website, telephone help line, and email service.

"We realized this needs to be as simple a user journey as possible," says Dan Sexton, the chief technology officer at the IWF. Sexton explains that the chatbot has been in development for more than 18 months and involved multiple groups in its design.