Safety Tech Challenge Fund
The Safety Tech Challenge Fund will drive the development of innovative technologies that help keep children safe in end-to-end encrypted environments, such as online messaging platforms, without compromising user privacy. The UK Government is awarding five organisations from across the world up to £85,000 each to prototype and evaluate innovative ways in which sexually explicit images or videos of children can be detected and addressed within end-to-end encrypted environments.
Successful applicants will use the funding to develop innovative technologies which demonstrate how tech companies could continue to detect images or videos showing sexual abuse of children while ensuring end-to-end encryption is not compromised. Applicants must demonstrate how their solutions protect the privacy of legitimate users, whilst preventing services being used by child sexual abuse offenders to facilitate their crimes.
The UK, other governments and child safety organisations have raised concerns about the introduction of end-to-end encryption by social media, messaging and other tech services without sufficient safety measures in place. Without greater investment in safety solutions, there will be serious detrimental consequences for tech companies’ ability to reduce the proliferation of child sexual abuse material on their platforms, protect children from being groomed for sexual abuse, and help law enforcement to safeguard victims and arrest offenders.
The Fund, which will run for five months from November 2021, is part of the Government’s wider effort to tackle harmful and illegal behaviours taking place on social media and other online platforms.
End-to-end encryption will prevent law enforcement from securing lawfully authorised access to vital content as part of their investigations and will undermine existing safety measures. This means that fewer victims will be safeguarded, and fewer criminals will be brought to justice.
The Fund presents a way in which various sectors including NGOs, technology companies and academics, can come together to share best practice around this growing threat and ensure tech companies continue to address it.
Technologies developed will be evaluated by independent academic experts to measure effectiveness and privacy safeguards and ensure that learnings are shared.
Funded projects will be expected to begin in early November 2021 and finish by late March 2022. The goal of the Fund is to stimulate innovation, and so we are looking to applicant organisations to identify the types of activity that would have the greatest impact.
However, all proposals must:
- make innovative use of technology to enable more effective detection and/or prevention of sexually explicit images or videos of children (within scope are tools which can identify, block or report either new or previously known child sexual abuse material, based on AI, hash-based detection or other techniques)
- address the specific challenges posed by E2EE environments, considering the opportunities to respond at different levels of the technical stack (including client-side and server-side)
- demonstrate that protecting user privacy is at the forefront of their approach.
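To illustrate the hash-based detection approach mentioned above, the sketch below shows the basic matching step: an image's digest is compared against a list of hashes of previously known material before the content is encrypted. This is a minimal, hypothetical sketch only; the function names and the hash list are illustrative, and production systems typically use perceptual hashes (which tolerate resizing and re-encoding) rather than the cryptographic SHA-256 used here for simplicity.

```python
import hashlib

# Hypothetical known-hash list, as might be supplied by a child-safety
# organisation. Real deployments use perceptual hashes; SHA-256 only
# matches byte-identical files and is used here purely for illustration.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def hash_image(data: bytes) -> str:
    """Return the SHA-256 hex digest of an image's raw bytes."""
    return hashlib.sha256(data).hexdigest()

def is_known_material(data: bytes) -> bool:
    """Check an image against the known-hash list, e.g. client-side
    before the content enters the encrypted channel."""
    return hash_image(data) in KNOWN_HASHES
```

Where such a check runs (on the client before encryption, or on the server for unencrypted metadata) is exactly the design space the Fund asks applicants to explore, alongside the privacy safeguards each placement requires.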
The subject matter covered by this Fund is complex and sensitive, and will require a collaborative, multidisciplinary approach to deliver successfully. The strongest bids are therefore likely to come from organisations who:
- are able to demonstrate skills and expertise across a number of disciplines – for example, social science, data science, knowledge of online harms, privacy and security issues;
- can demonstrate the effectiveness of these prototypes by testing within an E2EE environment (although this is not an essential requirement).
We would therefore encourage potential applicants to strongly consider collaborative bids, which combine existing specialist expertise from different organisations. These collaborations could, for example, involve:
- different safety tech companies;
- safety tech companies and demand-side partners (eg gaming companies);
- academic partners or others with insights into the nature of harms;
- companies in parallel sectors (eg cybersecurity); or
- any combination of the above.
Incorporated organisations from any geography are welcome to apply; applicants are not required to be incorporated in the UK or European Union.
For collaborative bids, we ask that each partner individually meets the data privacy and security requirements outlined in the Supplier Requirements; the remaining requirements can be met collectively.
Further information about the Fund, and its underlying technical principles, can be found in the Supplier Guidelines.