Facebook has announced new detection technology and an online resource hub to proactively fight against revenge porn on the platform.
TheNewsGuru (TNG) reports that revenge porn, also known as the sharing of non-consensual intimate images, is the distribution of someone’s intimate images without their permission.
“Finding these images goes beyond detecting nudity on our platforms.
“By using machine learning and artificial intelligence, we can now proactively detect near-nude images or videos that are shared without permission on Facebook and Instagram,” Antigone Davis, Facebook’s Global Head of Safety, said in a blog post.
This means that Facebook can now find revenge porn content before anyone reports it.
Davis, however, said a specially trained team member will review content flagged by the technology before it is removed.
“If the image or video violates our Community Standards, we will remove it, and in most cases we will also disable an account for sharing intimate content without permission,” she said.
Meanwhile, there is an appeals process for anyone who believes Facebook made a mistake in removing a particular piece of content.