A database of childhood photos of everyday Australians could help artificial intelligence detect signs of abuse, experts say.

Monash University launched its My Pictures Matter campaign last June in the hopes of crowd-sourcing 100,000 photos of Australian children.

So far only 1,000 images have been collected and the Australian Federal Police are appealing for more people to send submissions.

The campaign will assist in the creation of a new AI that can detect child sexual abuse material on the dark web and on suspects’ electronic devices.

It will not trawl through all parts of the internet; instead, it will be used to home in on specific persons of interest. 

Monash University and the Australian Federal Police are calling for everyday Aussies to submit their childhood photos to the My Pictures Matter campaign

The campaign is aiming to collect 100,000 ‘safe’ childhood photos to help develop an AI that can detect child sexual abuse material on the dark web and suspects’ electronic devices

The AFP believes the new technology will protect the mental wellbeing of hundreds of officers who are regularly exposed to the horrific material.

AFP Deputy Commissioner Lesa Gale is ‘deeply concerned’ about the high volume of child abuse material circulating in Australia.

‘This project is so important because it may not only save a child from being abused a day longer but it also will help our members who day-in, day-out are required to watch children being sexually abused for investigative purposes,’ she said.   

‘The AFP and Monash University are asking adults to provide pictures of themselves in their youth, not images of their children because consent is important. 

‘This enables development of technology that is both ethically accountable and transparent.

‘We also do not want to source images from the internet because children in those pictures have not consented for their photographs to be uploaded or used for research.’

Deputy Commissioner Gale urged Australians to consider the children who could be saved from harmful situations by the new technology.

‘The creation of child abuse material is a horrendous crime,’ she said.

‘The victims in these images are children and they are being used as a commodity for the sexual gratification of others, including those who try to make money from the abuse.

‘The children in this material are not actors, they are real children being abused.

‘As someone who is dedicated to protecting our most vulnerable, I will do my small part to encourage the crowdsourcing of images, and I hope other Australians will do so as well.’ 

Australians looking to submit their photos must be over 18 and give formal consent for their use in the research

The My Pictures Matter campaign was launched in June last year but has only collected 1,000 photos

For the AI to best learn the difference between ‘safe’ situations and instances of abuse, it needs a database of 100,000 ‘safe’ childhood photos from people of different ethnicities aged from infancy to 17.

The AI will be able to start operating with just 10,000 photos. 

Monash University is accepting ‘safe’ photos from anyone over the age of 18. 

The photos can show multiple people, so long as everyone pictured consents to their use in the research.

The project is being developed by Monash University in partnership with federal police but the university alone will be responsible for the safe storage of the photos.

The university will only share the photos with police or other research agencies if contributors consent to this while uploading, and renewed permission will be sought at the time of sharing.

‘Consent isn’t just saying “yes”, it means understanding what you’re agreeing to,’ My Pictures Matter Project Head, Dr Nina Lewis, said.

‘We can reassure people that this dataset is wholly owned and managed by Monash University, with use by our AFP colleagues subject to the same transparency and accountability measures that apply for all researchers on the team working to combat child abuse.

‘People are also free to withdraw their childhood photos from the dataset if they change their mind.’

Several AFP officers have already uploaded their own photos and hope to get the project off the ground as soon as possible.

AFP Deputy Commissioner Lesa Gale (above) said the AI tool will help save children and protect the wellbeing of officers who are regularly exposed to child abuse material

The renewed push is part of the AFP’s effort for National Child Protection Week, which started last Sunday and ends Saturday.

Since the agency created the Australian Centre to Counter Child Exploitation in 2018 it has removed 637 children from harm, identified 570 victims and charged 877 alleged offenders with 7,029 offences.

To upload your photos or find out more, visit mypicturesmatter.org.

