Researchers building new forensic tools to quickly identify children in child sexual exploitation material

Researchers at the New York University Tandon School of Engineering and the digital intelligence tech company Griffeye have begun building a sophisticated suite of tools, to be provided pro bono to law enforcement officials, for identifying children in child sexual exploitation material (sometimes referred to as child pornography) and rescuing victims. The National Institute of Justice, a division of the U.S. Department of Justice, awarded the project $465,000 over three years.

Professor of Computer Science and Engineering Nasir Memon leads the software development along with Griffeye Director Johann Hofmann. Their team is consulting with the U.S. Department of Homeland Security Child Exploitation Investigations Unit.

The volume of child sexual exploitation material is growing dramatically, according to the National Center for Missing and Exploited Children. In 2015, NCMEC's CyberTipline received more than 4.4 million reports of child sexual exploitation, and since the center's inception, law enforcement has alerted NCMEC to more than 12,000 identified victims of child sexual exploitation.

The value of automating searches becomes particularly striking in light of how much material police must examine to bring a case or find a child who has been exploited. According to the NetClean Report 2016, many police officers reported that a typical case contains between 1 and 3 terabytes (TB) of data, representing 1 million to 10 million images and thousands of hours of video. A few officers told the report's authors they had sorted through as much as 100 TB, representing over 100 million images and 100,000 hours of video.

The researchers from NYU Tandon and Griffeye are automating the analysis of video by using advanced machine learning techniques to identify both nudity and children. Memon developed filtering techniques that can detect skin tones -- even in low light or poor-quality video -- then map connected regions containing skin tones to determine if a subject is nude. This technique will be paired with systems that can extract facial features and perform spatial and textural analysis to determine whether a face belongs to an adult or a child. In testing, Memon reports that his algorithms accurately detected explicit images 83 percent of the time and were 96.5 percent accurate in distinguishing children's faces from adults'.
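The article does not disclose which color space, thresholds, or region criteria Memon's filters use. The short Python sketch below illustrates the general approach it describes -- skin-tone filtering followed by mapping of connected skin-toned regions -- using OpenCV; the YCrCb bounds, minimum region size, and area threshold are illustrative assumptions, not the team's actual parameters.

```python
# Minimal sketch of skin-tone filtering plus connected-region mapping.
# All color bounds and size/area thresholds here are illustrative
# assumptions, not the NYU Tandon team's published values.
import cv2
import numpy as np

def skin_region_fraction(frame_bgr, min_region_px=500):
    """Return the fraction of the frame covered by sizable skin-toned regions."""
    # YCrCb separates luminance from chrominance, which helps under
    # low-light or poor-quality footage.
    ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)
    # Commonly cited Cr/Cb bounds for skin tones (assumed, not the paper's).
    lower = np.array([0, 133, 77], dtype=np.uint8)
    upper = np.array([255, 173, 127], dtype=np.uint8)
    mask = cv2.inRange(ycrcb, lower, upper)
    # Remove speckle noise before mapping connected regions.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
    # Map connected skin-toned regions and keep only sizable ones.
    num_labels, labels, stats, _ = cv2.connectedComponentsWithStats(mask)
    skin_px = sum(stats[i, cv2.CC_STAT_AREA]
                  for i in range(1, num_labels)
                  if stats[i, cv2.CC_STAT_AREA] >= min_region_px)
    return skin_px / mask.size

# Example use: flag a frame for further analysis if skin regions dominate it.
# frame = cv2.imread("frame.jpg")
# if skin_region_fraction(frame) > 0.3:   # threshold is a placeholder
#     print("frame flagged for nudity analysis")
```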

If a video is determined to contain both nudity and children, the system will automatically assess the body motions to determine whether the content is explicit.
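As a rough illustration of how such a pipeline might gate the more expensive motion analysis behind the cheaper nudity and age classifiers, the sketch below composes the stages described above. The function names, arguments, and return values are hypothetical placeholders, not the team's actual software.

```python
# Hypothetical orchestration of the stages described above. The detector
# functions and their return types are placeholders, not the actual
# NYU Tandon/Griffeye implementation.
def triage_video(frames, detect_nudity, detect_child_face, analyze_motion):
    """Run the costly motion analysis only when both cheaper checks fire."""
    has_nudity = any(detect_nudity(frame) for frame in frames)
    has_child_face = any(detect_child_face(frame) for frame in frames)
    if has_nudity and has_child_face:
        # Motion-based analysis decides whether the content is explicit.
        return analyze_motion(frames)
    return False
```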

Even the most advanced digital forensics tools available today fall short in their ability to quickly identify children and analyze content to determine whether it meets the criteria for child sexual exploitation material, Memon explained. "The two most important qualities of a child sexual exploitation detection system are accuracy and speed, and it's very hard to achieve a good trade-off between the two," he said, noting that video content, which can vary in quality, is especially difficult to analyze.

The researchers will integrate their tools into Griffeye Analyze, a digital media investigation platform used by law enforcement agencies around the world in criminal investigations, including child sexual exploitation cases. Organizations involved with Project VIC -- a collaboration between law enforcement, industry, and nongovernmental organizations to improve data sharing in child exploitation and sexual abuse cases -- will also get free access.

"The scale of the problem, and the horrific nature of the material, means law enforcement officers are in great need of technology that can help them quickly prioritize data and detect contraband material. Using these novel child exploitation classifiers, developed by Professor Nasir and his team, together with the Griffeye Analyze platform, guarantees increased efficiency and better results in processing cases and identifying victims," said Hofmann.
