Google releases free AI tool to help companies identify child sexual abuse material

Stamping out the spread of child sexual abuse material (CSAM) is a priority for big internet companies. But it's also a difficult and harrowing job for those on the front line: the human moderators who have to identify and remove abusive content. That's why Google is today releasing free AI software designed to help them.

Most tech solutions in this domain work by checking images and videos against a catalog of previously identified abusive material. (See, for example, PhotoDNA, a tool developed by Microsoft and deployed by companies like Facebook and Twitter.) This sort of hash-matching software, which fingerprints files and compares those fingerprints against a database of known content, is an effective way to stop people from sharing previously identified CSAM. But it can't catch material that hasn't already been added to such a catalog.
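
The catalog-matching idea is simple enough to sketch. The Python snippet below is only an illustration of that general approach, not PhotoDNA's actual algorithm: the KNOWN_HASHES set, the function names, and the file path are hypothetical, and an exact SHA-256 digest stands in for the perceptual fingerprints that real systems use so they can survive resizing and re-encoding.

```python
import hashlib
from pathlib import Path

# Hypothetical catalog of fingerprints for previously identified material.
# In a real deployment this would be a vetted, centrally maintained database,
# not a hard-coded set.
KNOWN_HASHES: set[str] = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}


def fingerprint(path: Path) -> str:
    """Return a SHA-256 digest of the file's bytes.

    Production tools use perceptual hashes that tolerate re-encoding and
    cropping; a cryptographic hash is used here only to keep the sketch
    self-contained.
    """
    return hashlib.sha256(path.read_bytes()).hexdigest()


def is_previously_identified(path: Path) -> bool:
    """True if the file's fingerprint appears in the known-material catalog."""
    return fingerprint(path) in KNOWN_HASHES


if __name__ == "__main__":
    # "upload.jpg" is a hypothetical path standing in for newly uploaded content.
    print(is_previously_identified(Path("upload.jpg")))
```

As the sketch makes clear, a lookup like this can only flag content whose fingerprint is already in the catalog, which is exactly the gap Google's new classifier is meant to address.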
