Canadian Firm Developing Child-Porn-Fighting Image Matching Software

Canadian tech firm BlueBear Network International announced May 18 that it is developing an image matching program to help law enforcement compare and catalogue images drawn from child porn and other video and photographic evidence.

"BBNI is dedicated to helping law enforcement identify and convict exploiters of children and the creators of child pornography," said chief executive Andrew Brewin announcing the project, known as ImageMatcher. "Through the creative application of biometrics, image processing, distributed search and secure instant messaging, we continue to put more tools in the hands of law enforcement."

BlueBear chairman Sal Khan tells AVNOnline.com that the project began with a recent pilot program linking three Canadian police departments and a courthouse, and that its success prompted one of those departments to adopt ImageMatcher in its sex crimes unit.

"When they arrest somebody, they have multiple videos," Khan says. "Our software extracts faces, small and large, from video, and it will also identify the picture in the sequence of faces that's closest to a mug shot. And it will digitally sign it and match it to the video frame it came from."

And the extracts can be stored on the ImageMatcher server, allowing law enforcement agencies and courts to search the image database from remote locations. That, Khan says, will spare investigators long trips and time-consuming hours spent trying to isolate a child-porn victim or perpetrator, or suspects in other kinds of crimes.
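The server-side catalogue described above amounts to a content-addressed index of extracts that remote agencies can query. BlueBear's actual server design is not public, so this `ExtractIndex` class is a hypothetical minimal sketch of that idea: extracts are keyed by digest, and a remote search returns every video and frame where the same extract appears.

```python
import hashlib

class ExtractIndex:
    """Hypothetical content-addressed index of face extracts."""

    def __init__(self):
        # digest -> list of (video_id, frame_number) occurrences
        self._by_digest = {}

    def add(self, face_bytes: bytes, video_id: str, frame_number: int) -> str:
        """Store one extract's provenance and return its digest."""
        digest = hashlib.sha256(face_bytes).hexdigest()
        self._by_digest.setdefault(digest, []).append((video_id, frame_number))
        return digest

    def search(self, face_bytes: bytes):
        """Return every (video_id, frame_number) where this exact extract appears."""
        return self._by_digest.get(hashlib.sha256(face_bytes).hexdigest(), [])
```

A real system would index perceptual templates rather than exact byte digests, but the query pattern (search once, get back every source location) is the same.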

The program will allow secure searches through databases maintained by other investigators, with its LACE application letting child-exploitation units and other investigative units hunt for common suspects, victims, and even witnesses, the company said. The process involves biometric-like templates made to identify unique images and compare location, character, and lighting changes.
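One way such a "biometric-like template" can tolerate lighting changes is an average-hash scheme: each pixel is recorded only as above or below the image's mean brightness, so adding a constant brightness shift to every pixel leaves the template unchanged. BlueBear's actual templates are proprietary; this is a simplified sketch of the general technique, not the company's method.

```python
def template(pixels):
    """Build a bit template from a flat list of grayscale pixel values:
    1 where the pixel is brighter than the image mean, else 0."""
    mean = sum(pixels) / len(pixels)
    return tuple(1 if p > mean else 0 for p in pixels)

def distance(t1, t2):
    """Hamming distance: number of positions where two templates disagree."""
    return sum(a != b for a, b in zip(t1, t2))

def is_match(pixels_a, pixels_b, threshold=2):
    """Declare a match when the templates differ in at most `threshold` bits."""
    return distance(template(pixels_a), template(pixels_b)) <= threshold
```

Matching on template distance rather than exact pixel equality is what lets the comparison survive the "lighting changes" the company mentions, while still flagging genuinely different images.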

Would ImageMatcher have helped, for example, the law enforcement officers involved recently in a months-long search for a child-porn victim who eventually turned up alive in Pittsburgh? That case began when Toronto police published an image of a hotel room where the girl was likely assaulted and photographed, but it still took months of work by investigators from Toronto, Florida, and the FBI to find her alive.

"You know, I have no idea. It would have been a different kind of investigation which they should have done, if they had access to the software prior to that," Khan says. "But it was a last resort, what they did," he continues, referring to southeastern U.S. deputies’ hunt for the girl in the Carolinas based on a film image in which the hotel was identified.

"We would hopefully be the easier way," Khan says of ImageMatcher. "If you were an arresting officer, and you got 300 videos from a certain website, you have to look through all of them, and you may want to do that, but how are you going to detect all the faces? All of this is derived from the ability to do distributed search."

No technology is perfect, and even a sophisticated program can return a false positive identification. Can technology alone address that problem? Brandon Shalton, a technology consultant to the Association of Sites for the Advancement of Child Protection, says it can't.

"False positives – finding matches that aren't true – and false negatives (not finding any matches) where [it] should have found a match are certainly issues that technology alone can't address," he tells AVNOnline.com. "Software can be used to scan millions of images to find potential matches, and it does require a human being to do the last bit of validation/confirmation. There are dedicated law enforcement agents around the world who do this kind of visual validation, and usage of technology certainly helps them to be able to find the "needle in a haystack"

But Shalton also said technology won't play judge and jury, either. "It will be up to law enforcement that uses these tools to help them to make the initial flagging of a suspect bad image," he says, "but it will require a human eye to do the final viewing. After that, it goes into the established procedures."