In 2015, Thorn CEO Julie Cordua received a call from two Department of Homeland Security officials. They needed help developing a technology solution to find a little girl whose sexual abuse videos had been dispersed across the globe.
The Department of Homeland Security had been looking for this little girl for over eight months, to no avail. No images of her were in the National Center for Missing and Exploited Children database. No one had identified her, or even been looking for her, during the two years her abuse videos had been circulating online.
Julie didn’t have a lot of answers. Still, her Thorn team went to work to develop a technology solution for finding this abused child. Frustratingly, in the end, their efforts produced no results.
After five years of abuse, the little girl was found, but not because of advanced technology. She was found by narrowing down where two items visible in the background of the abuse videos had been manufactured and sold. Then, working from invoices at nearby retail locations, officials searched social media for the names of people who had purchased one of the items and found the abused child’s image on her mother’s Facebook page. From there, they tracked down her address and rescued her from her mother’s boyfriend.
It would seem that the advent of machine learning (ML), artificial intelligence (AI), machine vision, the Internet of Things (IoT), and facial recognition should have made solving a case like this quick and easy.
Child Rescue Increasingly Happens at the Intersection of Tech and Law
Still, there’s good news: tech brands are partnering with law enforcement and, together, they’re getting better at applying emerging technologies — like machine-learning-powered cognitive automation — in innovative ways to stop child sex trafficking and the spread of child sexual abuse material. Let’s look at how tech brands are leaning on ML-powered technologies to solve complex cases and what capabilities are still needed. Then, we’ll talk about how you can help.
Image matching tools go mainstream for greater tech and law-enforcement collaboration.
Five hundred images of sexually abused children are traded online every minute. Worse, it’s hard to stop. Simple photo tweaks made by traffickers — even a small added mark or a resize — create what amounts to a new, distinct image to track. Traffickers know that these small changes cause analysts following abuse images across platforms to lose track of an image’s journey and the people behind it.
Until recently, manual image analysis and matching were the only way to track child abusers and the children they exploit. “Finding these known child sex abuse images in that huge universe is like finding a needle in a haystack,” says Courtney Gregoire, senior attorney at Microsoft’s Digital Crimes Unit. In our opening story, for example, it took two years before the child was identified as a victim.
To solve this problem, Microsoft’s PhotoDNA became a go-to tool for matching altered images to their originals. The tool uses cognitive-intelligence-powered hash matching: it converts an image to greyscale, divides it into a grid, and assigns a numerical value to each grid square, creating a unique signature, the image’s “DNA.”
By matching those signatures against hashes of known images in the National Center for Missing and Exploited Children (NCMEC) database, the tool recognizes altered variations of the same image. From there, it pinpoints where they’ve appeared online, tracking the online footprint of the trafficker behind them.
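To make the idea concrete, here is a minimal sketch, in Python, of grid-based image hashing in the same spirit. It is not PhotoDNA itself (Microsoft’s algorithm is proprietary and far more robust); the file names and the matching threshold are illustrative assumptions only.

```python
# A minimal sketch of grid-based perceptual hashing, in the spirit of the
# "image DNA" idea described above. NOT PhotoDNA itself -- Microsoft's
# algorithm is proprietary and more robust. File names and the matching
# threshold are illustrative assumptions.
from PIL import Image

def grid_signature(path, grid=8):
    """Convert an image to greyscale, shrink it to a small grid, and mark
    each cell as brighter (1) or darker (0) than the image's mean."""
    img = Image.open(path).convert("L").resize((grid, grid))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    return [1 if p > mean else 0 for p in pixels]

def cells_that_differ(sig_a, sig_b):
    """Count the grid cells where two signatures disagree."""
    return sum(a != b for a, b in zip(sig_a, sig_b))

# Small edits (a resize, a minor mark) change only a few cells, so a low
# cell-difference count still flags the upload as a variant of a known image.
known = grid_signature("known_image.png")          # hypothetical file
candidate = grid_signature("uploaded_image.png")   # hypothetical file
if cells_that_differ(known, candidate) <= 5:
    print("Likely a variant of a known image: flag for review.")
```

Because the signature summarizes the whole image rather than its exact bytes, a resized or lightly marked copy still lands close to the original, which is exactly the property that traffickers’ small tweaks defeat in exact, byte-for-byte comparisons.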
To grant free access to qualifying organizations, Microsoft made PhotoDNA available as a cloud service. This means more organizations can come on board to match and track abuse images so that no child endures abuse with no one looking for her.
Microsoft partners with Thorn for greater efficiency in collaboration efforts.
The partnership between Microsoft and Thorn (founded by Ashton Kutcher and Demi Moore) has become a powerful force against child sex traffickers. Until 2012, companies scanned their platforms separately, so if the same child sexual abuse image was uploaded to several different platforms, it was hashed and reported to the NCMEC several times as if it were several distinct images. The result: a cluttered database and delayed victim assistance, because redundant hashes had to be purged before rescue efforts could proceed efficiently.
But it gets better: Thorn’s Industry Hash Sharing Platform is the first initiative that allows participating organizations to work together to stop child sex trafficking. Because it provides a centralized, accessible hash-sharing database, companies can use the hashes already identified by other companies via PhotoDNA Cloud to scan their own image-sharing platforms.
In doing so, they locate previously identified sexual abuse content and help track it. And if a scanned image has not already been hashed, PhotoDNA Cloud reports it as belonging to a new victim.
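Conceptually, that triage flow reads like the short sketch below. The set name and the return messages are placeholders for illustration; they are not the actual PhotoDNA Cloud or NCMEC interfaces.

```python
# A simplified sketch of the hash-sharing triage flow described above.
# SHARED_HASHES and the return messages are illustrative placeholders,
# not the actual PhotoDNA Cloud or NCMEC interfaces.

# Hashes already contributed by participating companies.
SHARED_HASHES = {"hash_of_known_image_1", "hash_of_known_image_2"}

def triage(image_hash: str) -> str:
    """Route a scanned image based on whether its hash is already known."""
    if image_hash in SHARED_HASHES:
        # Previously identified content: track where else it has spread.
        return "known content: add to the existing case"
    # Never-before-seen content: escalate so a search for a potential
    # new victim can begin as quickly as possible.
    SHARED_HASHES.add(image_hash)
    return "new content: report as a potential new victim"

print(triage("hash_of_known_image_1"))
print(triage("hash_of_a_never_seen_image"))
```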
The result: time-saving efficiency. Instant recognition of images not previously reported alerts law enforcement to a new victim as quickly as possible. Faster victim discovery means children like the girl in our opening story may endure fewer years of abuse before a search case is opened to find them.
As a result of the Thorn-Microsoft partnership and its shared database and technology, eight of the United States’ most popular photo-sharing platforms — including Facebook and Google — joined in the hash-sharing program’s first year, and 90,000 hashes were shared among them for faster victim identification.
But, there’s more…
Tech matches image background to locate abused children for law enforcement.
In our opening story, after identifying the child as a victim, law enforcement officials used items in the video background to narrow down her location. It was a long and tedious process, and all the while, a little girl was living a nightmare. Since then, tech firms have been developing solutions to make matching image backgrounds to locations instant.
TraffickCam is the first step in automating this narrowing-down process for real-time results. Instead of tracing where background items were manufactured and sold, if a child is being abused in a hotel room and that image is then used for online advertising, police can lean on ML and a user-generated database of hotel interiors to match the room to a location.
Here’s how it works: TraffickCam asks consumers to take snapshots of the interior of hotel rooms they stay in and upload them to its hotel-image database. Thus far, millions of images have been uploaded, and every new image means another child could be saved. It isn’t perfect, because it can only identify a child’s location if the abuse takes place in a hotel. But it’s a promising start.
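As a rough illustration of the matching step, the sketch below embeds each submitted room photo as a simple feature vector and returns the closest stored photo for a query image. The coarse colour-histogram embedding, file names, and database entries are stand-ins; TraffickCam’s production models are far more sophisticated and are not public in this form.

```python
# A rough sketch of image-to-hotel matching: embed each traveller-submitted
# room photo as a feature vector, then return the stored photo most similar
# to a query image. The colour-histogram embedding, file names, and database
# entries are illustrative stand-ins, not TraffickCam's actual system.
import numpy as np
from PIL import Image

def embed(path, bins=8):
    """Very coarse embedding: a normalised 3D colour histogram."""
    img = np.asarray(Image.open(path).convert("RGB").resize((64, 64)))
    hist, _ = np.histogramdd(img.reshape(-1, 3).astype(float),
                             bins=(bins, bins, bins),
                             range=((0, 256),) * 3)
    vec = hist.flatten()
    return vec / np.linalg.norm(vec)

# Hypothetical database of traveller-submitted photos keyed by hotel and room.
database = {
    "Hotel A, Room 204": embed("hotel_a_room_204.jpg"),
    "Hotel B, Room 17": embed("hotel_b_room_17.jpg"),
}

def best_match(query_path):
    """Return the room whose stored photo is most similar to the query."""
    query = embed(query_path)
    return max(database, key=lambda room: float(np.dot(query, database[room])))

print(best_match("evidence_frame.jpg"))  # hypothetical query image
```

A real system of this kind would presumably use learned deep-network embeddings and approximate nearest-neighbour search over millions of photos, but the lookup pattern is the same: every new traveller photo adds another candidate the query can match.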
Futuristic child recognition technologies bring hope of instant victim and location identification.
New technology is on the horizon to make victim identification instant and rescue efforts immediate. Marios Savvides, director of Carnegie Mellon University’s CyLab Biometrics Center, has been hard at work developing iris-scanning capabilities.
“Right now law enforcement has only photos of missing children to work with, but appearance can change,” Savvides said in an interview with Fox News. An iris scan is superior to image matching because children’s appearances change over time, but their irises don’t. “We’re giving them a biometric that really cannot be altered,” he said.
With these scanners, it takes only three seconds to scan a child’s face, locate her iris, and get an identification reading. Savvides explains that irises are more useful than fingerprints: they are just as distinct, but while a fingerprint requires the child to actually touch something, an iris can be read from 40 yards away, even if the child is inside a car.
The idea is to install iris scanners at airport and border checkpoints. But first, the technology needs data to compare scans against. So, before law enforcement can use the scanners, parents would have to scan their young children’s irises. Then, if a child is abducted, a scanner can match her iris to the original scan.
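Conceptually, the enrol-then-match flow could look like the sketch below. Real iris recognition relies on specialised segmentation and encoding; the toy 16-bit codes, identifiers, and threshold here are assumptions for illustration, not Carnegie Mellon’s actual system.

```python
# A conceptual sketch of the enrol-then-match flow described above.
# Real iris recognition uses specialised segmentation and encoding; the toy
# 16-bit codes, identifiers, and threshold here are illustrative assumptions,
# not Carnegie Mellon's actual system.

def fraction_differing(code_a: str, code_b: str) -> float:
    """Fraction of bits that differ between two equal-length iris codes."""
    return sum(a != b for a, b in zip(code_a, code_b)) / len(code_a)

# Step 1: parents enrol their children's iris codes ahead of time.
enrolled = {
    "child_001": "1011010011101001",
    "child_002": "0010110101100110",
}

# Step 2: a checkpoint scanner produces a fresh code and looks for a match.
def identify(scanned_code: str, threshold: float = 0.2):
    for child_id, stored_code in enrolled.items():
        if fraction_differing(scanned_code, stored_code) <= threshold:
            return child_id
    return None  # no enrolled child matches this scan

print(identify("1011010011101101"))  # one flipped bit still matches child_001
```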
Whether you’re part of an organization or a consumer, you can join the fight.
Today’s victim identification tools may not mean the end of child sex trafficking, but they no doubt help rescue more children. And because they’re powered by cognitive intelligence, their performance on unstructured data like images will continue to improve as more data is added.
This is where you come in. Whether you represent an organization or act as an individual, you can help build the databases these intelligent technologies rely on and increase their matching capabilities.
If you represent an organization with a photo-sharing platform, contact NCMEC to participate in the Industry Hash Sharing Platform. And if you need assistance implementing the technology, connect with Thorn via [email protected].
If you’re a consumer, download the TraffickCam app. The next time you travel, take photos of the inside of any hotel or motel room you rent and upload them to the app. From there, law enforcement can use them to match images of children being abused in hotel rooms and determine their location.
Lastly, copy, paste, and share the following text on your social media pages:
Text HELP to 233733 (BEFREE): To get help for victims and survivors of human trafficking or to connect with local services.
Your involvement could save the life of a child.