
Opinion: Encryption vs Law Enforcement – Technology and its role in the CSAM Epidemic

CONTENT WARNING: Please take care. This article discusses child sexual abuse material. Before proceeding, please consider carefully whether this article is appropriate for you.

You may not be aware, but the link between encryption technologies and the proliferation of child sexual abuse material (CSAM) online is increasingly well established.

The estimated volume of CSAM and the number of related reports are growing exponentially. In 2019, 69.1 million distinct images were reported to US companies alone; Facebook generated more than 90% of these reports.

Research from the US National Center for Missing and Exploited Children highlights the extreme growth in the number of child sexual abuse reports in recent history. The research shows that in 1998, only 3,000 reports of CSAM were received in the US. In 2019, that number grew to 26.9 million.

In Australia, the Australian Federal Police (AFP) has intercepted more than 250,000 instances of CSAM in the last 12 months.  In 2020, the AFP’s Australian Centre to Counter Child Exploitation received more than 20,000 reports, each of which contained up to thousands of different instances of CSAM.  Only 191 people were charged by the AFP that year.  Fewer were convicted.

Before smartphones hit the market, the instant camera was the greatest enabler of CSAM: it removed the need for a third party to develop the film, and with it any prospect of the developer reporting the material to law enforcement authorities.

The exchange of CSAM is no longer confined to instant photos or even the darknet. It now occurs on mainstream platforms, where end-to-end encryption makes the material extremely difficult for providers to discover and report, or for law enforcement to detect and intercept. Technology companies struggle to monitor their own servers for CSAM, and, practically, law enforcement agencies could never obtain enough warrants to intercept and analyse the data in every instance to adequately address the issue.

Issues relating to end-to-end encryption platforms

Today, the greatest enabler of CSAM is end-to-end encrypted messaging and video conference platforms. These platforms facilitate the commission and sharing of CSAM on a scale that is near incomprehensible, with global offending rapidly increasing in reach and severity each year.

CSAM commonly arises in an active marketplace. The platforms match demand to supply and allow, for example, paid live streaming of CSAM, where the production of the material itself is predicated on the availability of an encrypted platform.

According to the Australian Institute of Criminology, Australians who regularly engage in live CSAM streaming typically spend small amounts (under $55) to access a stream and do so roughly every three weeks. The fee is nearly always paid in cryptocurrency.

For the last decade, the world has witnessed a failure of technology companies and policy makers to deal with the exponential rise of CSAM.  The servers of technology companies are riddled with encrypted CSAM and the push for personal privacy is fuelling the demand for the safe harbour they provide.

What role should technology companies be made to play?

In Australia, technology companies are not obliged to do much at all. Some mandatory reporting requirements are engaged if CSAM is found; however, those requirements disincentivise active monitoring, since costs are incurred in meeting the regulatory obligations. This is a straightforward case of moral failure.

While some technology companies have in fact deployed AI systems to scan for CSAM on their platforms, these AI systems are far from perfect, and huge amounts of CSAM slip through the cracks.  Despite that, this technology is better placed to identify CSAM than anything currently available to law enforcement agencies.

In March 2020, the US, UK, Canada, Australia and New Zealand released voluntary guidelines for technology companies to encourage cooperation with law enforcement to address the CSAM epidemic.  The guidelines do little to assure us the rise of CSAM will be curtailed any time soon. They do not even mention the word ‘encryption’.

Facebook recently added its ‘secret’ (encrypted) chat feature, and other tech giants are integrating messaging across a range of their services, too. The proliferation of this encrypted messaging technology should be contrasted with recent comments from FBI Director Christopher Wray, who has warned that more encryption would gravely hinder law enforcement efforts to track down child sexual abusers.

While Australia’s abhorrent violent material laws represented a productive step in the direction of mandatory reporting, more needs to be done.  The ‘remove and report’ obligations under the legislation do not apply to technology companies unless the CSAM depicts a rape in Australia, and the technology company is already aware of it.  Further, these laws do not impose any monitoring obligations whatsoever.

There is no silver bullet on offer; however, it is clear that technology companies need to do the hard work of assessing how they apply encryption to, and monitor the use of, their services. Meanwhile, policy makers must consider other avenues to procure the cooperation and assistance of technology companies in addressing the CSAM epidemic.

Max Slattery is the VSCL’s Policy Officer.

If any of the material covered in this article has raised concerns for you, support is available.  You can call Lifeline on 13 11 14 for confidential telephone support at any time.
