The fond childhood memories of Australians could be key to ending the pain and suffering of minors who are being sexually abused and exploited.
A trove of at least 100,000 childhood photographs, drawn from family snaps, is being collected by the Australian Federal Police in collaboration with Monash University.
Photos gathered by the project, run by the AiLECS Lab, will help train artificial intelligence systems to distinguish between children in safe and unsafe circumstances, so that child exploitation material can potentially be flagged.
Researchers are sidestepping the ethical dilemma of feeding the system photos scraped from the internet without consent by asking people aged 18 and over to submit their own childhood photos.
The ultimate goal is for AI to rapidly trawl through photos and identify child sexual abuse victims and material not previously seen by law enforcement, Lab co-founder and AFP officer Janis Dalins said.
Last year, the Australian Centre to Counter Child Exploitation received more than 33,000 reports of online child exploitation.
Each report can contain large volumes of pictures and videos of children being sexually assaulted and exploited.
“Reviewing this horrific material can be a slow process and the constant exposure can cause significant psychological distress to investigators,” Dr Dalins said.
“AiLECS Lab’s initiatives will support police officers and the children we are trying to protect, and researchers have thought of an innovative way to ethically develop the technology behind such initiatives.”
The only data collected from those submitting photos to the My Pictures Matter crowdsourcing project is an email address, which is stored separately from the images.
Participants can withdraw their consent later if they choose, data ethics expert and project lead Nina Lewis said.
“While we are creating AI for social good it is also very important to us that the processes and methods we are using are not sitting behind an impermeable wall,” she said.
The researchers aim to have a database of at least 100,000 images by the end of 2022.