Purveyors of Change: Confronting the New Challenges of Child Exploitation
Introduction
In recent years, we have seen the rapid growth of technology and, at its centre, Artificial Intelligence (AI), with its innovative and alarming implementations in society. On September 5th 2023, Monash DeepNeuron’s Law and Ethics Committee collaborated with Monash University and the AiLECS Lab, which is at the forefront of developing AI to assist law enforcement and promote community safety, to host the industry event Protecting the Future: Combatting the Online Surge of Child Exploitation. The event sought to leverage AI for the greater good and to raise awareness of the current failings of our justice system in protecting children from online crimes. However, despite AI’s untapped potential in the legal sector, it has also caused controversy due to various violations of data and privacy online.
Setting the Scene
The event kicked off with a keynote from eSafety Commissioner Julie Inman Grant, who explained that one of the biggest challenges we face with online activity is the lack of regulation and transparency. Online coercion has become an increasingly prevalent issue, with reports of sexual coercion having tripled over the last year, and 12% of reports to the eSafety Commission involving self-produced Child Sexual Abuse Material (CSAM). Hilda Sirec, Assistant Commissioner with the Australian Federal Police, offered insight into the landscape in which online perpetrators operate, noting that CSA perpetrators are a notoriously tech-savvy offender base, always at the forefront of technological developments.
One of the most important takeaways from this event was the importance of amplifying the survivor voice: identifying the issues victim-survivors of child exploitation currently face, and asking what we, as a community, can do for them. From a legal perspective, our current systems of justice do not sufficiently protect victim-survivors in court. Victim-survivors experience secondary victimisation, in which they are repeatedly forced to relive their trauma as their case moves through the legal process. Furthermore, their cases may never be heard at all, owing to the confusion of navigating the legal system: determining the tangible evidence required to substantiate a case, working out which jurisdiction should hear it (family, civil or criminal), and a general unwillingness to bring cases to law enforcement.

This also raises issues of privacy and identification. Victim-survivors and their CSAM become highly traded and often idolised within the online CSA perpetrator community, making it increasingly difficult to remove the explicit content from online platforms once it has been disseminated on the clear web. Beyond this, several cultural factors contribute to the diminishing of the survivor voice. In many cases the perpetrator is a close contact (for girls, the perpetrator is someone they know 88% of the time), which often leads to assaults going unreported due to familial ties and relationship dynamics.
However, the most pressing concern is an unwillingness by society at large to elevate these voices. Rather than turn away from these atrocities, we must, as a society, learn to confront and empathise with the struggles of victim-survivors; if we do not, how long must they wait to receive justice?
Student Debate
The Law and Ethics Committee was heavily involved in the debate portion of the event, which featured Elizabeth Perry, Leah Martinez, Merryn Cagney and Parmeeta Siddique. The debate topic was whether the benefits of using Clearview AI and similar technologies in law enforcement outweigh the risks.
Context
Clearview AI is a private corporation offering facial recognition technology. Its system scrapes images from across the web into a proprietary database containing over 30 billion facial images drawn from public sources such as social media and news outlets. When a user uploads a facial image, within seconds the system returns links to matching images of that person across the web.
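Clearview's actual architecture is proprietary, but facial search systems of this kind are generally understood to encode each face as a numeric "embedding" vector and rank database entries by similarity to a query. The sketch below illustrates only that lookup step; the vectors, URLs and function names are invented toy stand-ins, not real data or a real API:

```python
import math

def cosine_similarity(a, b):
    """Similarity between two embedding vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy "database" mapping source URLs to embeddings, a stand-in for the
# billions of scraped images described above. Real systems derive these
# vectors from deep neural networks, not hand-picked numbers.
database = {
    "https://example.com/profile/alice": [0.9, 0.1, 0.2],
    "https://example.com/news/bob":      [0.1, 0.8, 0.3],
    "https://example.com/forum/carol":   [0.2, 0.2, 0.9],
}

def search(query_embedding, db, top_k=2):
    """Return the top_k database URLs most similar to the query embedding."""
    ranked = sorted(db.items(),
                    key=lambda item: cosine_similarity(query_embedding, item[1]),
                    reverse=True)
    return [url for url, _ in ranked[:top_k]]

# A query embedding close to "alice" ranks her URL first.
print(search([0.85, 0.15, 0.25], database))
```

The key design point is that matching happens in embedding space, which is why one uploaded photo can surface visually similar images from entirely unrelated sources.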
However, Australian Information Commissioner and Privacy Commissioner Angelene Falk found that Clearview AI, Inc. contravened Australians’ privacy by collecting biometric data without consent, breaching the Australian Privacy Act 1988. Clearview AI was subsequently banned in Australia following the decision in Clearview AI Inc and Australian Information Commissioner [2023] AATA 1069.
Conclusion of Debate
The affirmative argued largely that the current procedures in place for law enforcement are insufficient. This is especially so for online child exploitation, where Detective Inspector Jon Rouse of Queensland Police has said that victim identification is the most under-resourced skill in combatting CSA. Using Clearview AI could greatly enhance and expedite victim identification, and with it the discovery of perpetrators, allowing law enforcement to redirect resources elsewhere instead of pursuing an endless wild goose chase. Although regulation around AI remains under-developed, with proper oversight it could become a valuable tool that promotes public security and safety.
In response, the negative argued that the use of Clearview AI would fundamentally undermine our democratic rights, as it completely disregards user consent and would remove the online anonymity that is essential for living an authentic life. The technology could be abused and would ultimately be a gross invasion of one’s right to privacy. Furthermore, its improper use has already led to false arrests and false positives, owing to prejudice in law enforcement and discriminatory bias in the algorithm.
It was a strong debate, and although the general consensus from the audience favoured the use of AI facial recognition technology, the negative emerged victorious.
Conclusion
When it comes to integrating AI into law enforcement, there is still much work to be done, especially given the considerable public distrust of its application. But even before that, there are social issues worth examining, discussing further and acting on. In 2021, the Australian Centre to Counter Child Exploitation received more than 33,000 reports of online child sexual exploitation, yet only 237 people were charged that year.
That is a charge rate of roughly 0.7%. This is completely unacceptable.
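The arithmetic behind that figure is straightforward to check, using the numbers cited above (with 33,000 taken as the lower bound on reports received):

```python
# Figures cited above: reports received vs. people charged in 2021.
reports = 33_000   # "more than 33,000" reports to the ACCCE
charged = 237      # people charged that year

charge_rate = charged / reports  # fraction of reports resulting in a charge

print(f"{charge_rate:.3%}")  # → 0.718%
```

Because the report count is a lower bound, the true charge rate is, if anything, even smaller.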
We encourage you to read further on CSA (links are provided below), and if you are comfortable and want to help in the development of facial recognition technology for investigative purposes, you can upload your childhood photos to My Pictures Matter (run by the AiLECS Lab) and help out today.
It is completely understandable to find these topics harrowing and difficult to confront, but as a society, if we cannot face this reality, then these children will never be saved.
Additional Resources
My Pictures Matter - https://mypicturesmatter.org/
DNA Doe Project - https://dnadoeproject.org/
What is Child Sexual Abuse Material? - https://www.rainn.org/news/what-child-sexual-abuse-material-csam
Secondary Victimisation - https://medium.com/@okonagata/beyond-the-crime-understanding-secondary-victimization-and-its-impact-on-victims-dd25e071a486
Increasing Trend of CSAM - https://www.aic.gov.au/sites/default/files/2021-02/ti616_production_and_distribution_of_child_sexual_abuse_material_by_parental_figures.pdf