Image Analyzer wins UK Government Safety Tech Challenge Fund to develop CSAM-detection technology for E2EE environments

Image Analyzer provides AI-powered visual content moderation technology that helps to protect online communities from illegal and harmful images, video and livestreamed footage

- Safety Tech Challenge Fund awarded to AI-powered visual content moderation pioneer, working in partnership with Galaxkey and Yoti -

“Image Analyzer is delighted to be collaborating with Galaxkey and Yoti to deliver technology that recognizes users’ privacy whilst addressing inherent risks to children from end-to-end encryption.”
— Cris Pikes, CEO, Image Analyzer

GLOUCESTER, UNITED KINGDOM, November 17, 2021 /EINPresswire.com/ -- Visual content moderation software company, Image Analyzer, has been selected to receive a share of the UK Government’s Safety Tech Challenge Fund to find new ways to detect Child Sexual Abuse Material (CSAM) sent via encrypted channels, without compromising citizens’ privacy. Image Analyzer will work in partnership with content encryption technology provider, Galaxkey, and digital identity and age verification technology company, Yoti, to develop AI-powered visual content analysis technology that works within messaging services that employ end-to-end encryption.

Cris Pikes, CEO and founder of Image Analyzer commented, “Image Analyzer is delighted to be collaborating with Galaxkey and Yoti to deliver this exciting, first-of-a-kind technology pilot that recognizes the importance of protecting users’ data and privacy whilst addressing the inherent risks to children associated with end-to-end encryption. As a ground-breaking technology collaboration, the Galaxkey, Yoti and Image Analyzer solution will enable users to access all of the benefits related to encryption whilst enabling clean data streams and offering reassurance within specific use case scenarios such as educational sharing.”

End-to-end encryption (E2EE) is already built into the WhatsApp and Signal apps. In April, the UK Home Secretary and the UK National Society for the Prevention of Cruelty to Children (NSPCC) decried Facebook’s plans to implement E2EE within Instagram and Messenger, citing the increased risks to children if law enforcement agencies cannot gather evidence of illegal images, videos and messages sent by child abusers. The government and child safety organisations warned that E2EE drastically reduces technology companies’ ability to detect and prevent the proliferation of CSAM on their platforms and prevents law enforcement agencies from arresting offenders and safeguarding victims. The NSPCC has estimated that up to 70% of digital evidence will be hidden if Facebook goes ahead with its plans to introduce E2EE in Instagram and Messenger.

Child protection experts warn that encrypted messages could shroud evidence of grooming and coercion of children and the sharing of indecent and illegal images and extremist material.

The NSPCC also observed that encryption could create a technical loophole once the Online Safety Bill becomes law, allowing technology companies to ‘engineer away’ their duty of care to monitor and remove harmful content.

In response to these heightened risks, the UK Government announced the Safety Tech Challenge Fund in September. It has awarded five organisations up to £85,000 each to prototype and evaluate new ways of detecting and addressing CSAM shared within E2EE environments, such as online messaging platforms, without compromising the privacy of legitimate users. The Safety Tech Challenge Fund provides a mechanism for government, the technology industry, non-profit organisations and academics to discover solutions and share best practice. Fund recipients have until March 2022 to deliver their proofs of concept.

In September, John Clark, NCMEC President & CEO, was quoted as saying, “The National Center for Missing & Exploited Children (NCMEC) applauds the launch of the UK’s Safety Tech Challenge Fund to support development of technology solutions that prioritize child safety online while also protecting consumer privacy. Last year, NCMEC’s CyberTipline received more than 21 million reports relating to child sexual exploitation, and the numbers of reports this year are likely to be even higher. Time is of the essence to develop safety measures that can operate in encrypted environments to protect children who are being enticed online and whose horrific images of child sexual abuse are circulated online. The Safety Tech Challenge will be crucial to enabling the tech industry, academic experts, non-profits, and government agencies to collaborate together on global solutions to keep children safer online without compromising consumer privacy.”

Image Analyzer holds European and U.S. patents for its automated, artificial intelligence-based content moderation technology for image, video and streaming media, including live-streamed footage uploaded by users. Its technology helps organizations minimize their corporate legal risk exposure caused by people abusing their digital platform access to share harmful visual material. Image Analyzer’s technology has been designed to identify visual risks in milliseconds, including illegal content, and images and videos that are deemed harmful to users, particularly children and vulnerable adults, with near zero false positives.

Pikes continues, “Law enforcement agencies depend on evidence to bring child abuse cases to court. There is a delicate balance between protecting privacy in communications and drawing a technical veil over illegal activity which jeopardises children. Hidden doesn’t mean it’s not happening. The UK Online Safety Bill will bring messaging apps into scope for the swift removal of harmful text and images. However, where content is encrypted end-to-end, this will significantly reduce Ofcom’s ability to prevent the most serious online harms. Working in partnership with encryption specialists at Galaxkey and identity verification experts at Yoti, we’ll help organisations to address online harms, while protecting the privacy of their law-abiding users.”

###

About Image Analyzer

Image Analyzer provides artificial intelligence-based content moderation technology for image, video and streaming media, including live-streamed footage uploaded by users. Its technology helps organizations minimize their corporate legal risk exposure caused by employees or users abusing their digital platform access to share harmful visual material. Image Analyzer’s technology has been designed to identify visual risks in milliseconds, including illegal content, and images and videos that are deemed harmful to users, especially children and vulnerable adults.
The company is a member of the Online Safety Tech Industry Association (OSTIA).

Image Analyzer holds various patents across multiple countries under the Patent Cooperation Treaty. Its worldwide customers typically include large technology and cybersecurity vendors, digital platform providers, digital forensic solution vendors, online community operators, and education technology providers, which integrate its AI technology into their own solutions.

For further information please visit: https://www.image-analyzer.com

References:

GOV.UK press release, 17th November 2021, ‘Government funds new tech in fight against online child abuse’, https://www.gov.uk/government/news/government-funds-new-tech-in-the-fight-against-online-child-abuse

Safety Tech Network, 8th September 2021, ‘Government launches Safety Tech Challenge Fund to tackle online child abuse in end-to-end encrypted services’, https://www.safetytechnetwork.org.uk/articles/government-launches-safety-tech-challenge-fund-to-tackle-online-child-abuse-in-end-to-end-encrypted-services

Safety Tech Network, ‘Safety Tech Challenge Fund’, https://www.safetytechnetwork.org.uk/innovation-challenges/safety-tech-challenge-fund

Josie Herbert
Phiness PR