
ConnectSafely Podcast

Listen to the podcast interview about this topic with National Center for Missing & Exploited Children COO Michelle DeLaune

By Larry Magid

I’ve been on the board of directors of the National Center for Missing & Exploited Children (NCMEC) for more than 20 years, and since I joined, I’ve seen a steady increase in the number of reports to the organization’s CyberTipline about online child sexual abuse images (child porn) and other crimes against children.

Since the CyberTipline began receiving reports in 1998, it has processed more than 37 million reports, including more than 10 million in 2017 alone. These reports are typically about apparent child sexual abuse images, but they can also concern online enticement (including “sextortion”), child sex trafficking and child sexual molestation.

The organization employs analysts who have the difficult job of reviewing abusive images and attempting to identify the victims. Over the years, these analysts have reviewed more than 249 million images and videos and have identified more than 15,000 victims, according to a fact sheet on the organization’s website.

NCMEC also works with law enforcement to help find missing children and runs a number of education and prevention programs, including NetSmartz, which uses animation and other media to educate children and teens.

Unintended consequences of technology

There was a time when printed pornographic images were sent via the postal service. Even though the printed images have never gone away, enforcement efforts by NCMEC, the U.S. Postal Inspection Service and other law enforcement agencies put a damper on this illegal trade.

But then came the internet and the ability to instantly and anonymously share these images. The advent of digital photography and videography made it even easier and cheaper to create such images and avoid detection.

As the internet evolved, we saw more avenues for distribution, including anonymous peer-to-peer file sharing services, the “dark web,” and encryption tools that make it harder to detect the material and find those who are distributing it. I’m not blaming the internet any more than I would blame Johannes Gutenberg for illegal printed images, but the facilitation of this crime has been one of the unintended consequences of some otherwise wonderful technology.

Social media has been another method of distribution, though most of it happens in private groups or between individuals, which explains why I’ve never run across these types of images, and I suspect you haven’t either. But just because you and I haven’t seen them doesn’t mean they aren’t there.

In fact, Facebook just announced that “in the last quarter alone, we removed 8.7 million pieces of content on Facebook that violated our child nudity or sexual exploitation of children policies.” Thanks in part to new AI and machine learning technology, the company said, 99 percent of that content “was removed before anyone reported it.”

New technology

Facebook didn’t disclose details on how it identifies images, but in general, machine learning allows software to become increasingly adept at finding items that meet specified criteria.

Detection could be based on image characteristics (such as skin exposure) and other clues that suggest an image is suspect. Of course, any such technology will produce false positives, which is why it’s essential that any suspected material be reviewed by a human moderator before action is taken.
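To make that flow concrete, here is a minimal, hypothetical sketch in Python of how a classifier score might route content to human review rather than to automatic removal. The scoring function, threshold and labels are illustrative assumptions on my part; Facebook has not published how its systems actually work.

```python
# Hypothetical triage sketch -- not Facebook's actual system.
# An automated model scores each image; anything above a threshold
# is queued for a human moderator rather than removed automatically.

REVIEW_THRESHOLD = 0.8  # assumed cutoff; real systems tune this carefully


def suspicion_score(image_bytes: bytes) -> float:
    """Stand-in for a trained model's probability that an image violates policy."""
    return 0.0  # illustrative stub; a real system would run an ML classifier here


def triage(image_bytes: bytes) -> str:
    """Route an image based on its score; a human makes the final call."""
    if suspicion_score(image_bytes) >= REVIEW_THRESHOLD:
        return "queue_for_human_review"  # false positives are expected
    return "no_action"


print(triage(b"placeholder image bytes"))  # prints "no_action" with the stub
```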

My hope is that this type of technology can also be used for other problem areas, including fake news, inauthentic posts, trolling, harassment and bullying. But you have to start somewhere, and I can’t think of a more important cause than trying to stamp out child sex abuse images.

One of the reasons experts are phasing out the term “child pornography” is that it doesn’t convey the full horror of these images. They’re not porn; they are a memorialization of a horrific crime.

Machine learning is not the only technology used in the fight against child sexual abuse images. In 2009, Microsoft partnered with Dartmouth College to develop PhotoDNA, which it describes as “a technology that aids in finding and removing known images of child exploitation.” Unlike earlier approaches that matched photos against known “hashes,” PhotoDNA doesn’t need an exact match to recognize that an image is based on a previously identified illegal one. The image can be resized or modified in other ways and still be picked up by PhotoDNA.
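PhotoDNA’s algorithm is proprietary, but the general idea behind this kind of “perceptual” hashing can be illustrated with a much simpler scheme, the average hash: shrink the image to a tiny grayscale thumbnail, set one bit per pixel depending on whether it is brighter than the average, and compare two hashes by counting the bits where they differ. The Python sketch below (using the Pillow imaging library) is purely illustrative and is not PhotoDNA.

```python
# Illustrative "average hash" -- a toy perceptual hash, NOT PhotoDNA.
# Small edits to an image change only a few bits, so near-duplicates
# can be matched even when a cryptographic hash would no longer agree.

from PIL import Image


def average_hash(img: Image.Image, size: int = 8) -> int:
    """Shrink to a size x size grayscale thumbnail; one bit per pixel."""
    small = img.convert("L").resize((size, size))
    pixels = list(small.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits


def hamming_distance(h1: int, h2: int) -> int:
    """Count the bits on which two hashes differ."""
    return bin(h1 ^ h2).count("1")


# Usage: a resized copy of an image hashes close to the original.
original = Image.linear_gradient("L")  # built-in test image, standing in for a photo
modified = original.resize((100, 100))  # a routine modification
d = hamming_distance(average_hash(original), average_hash(modified))
print("near-duplicate" if d <= 5 else "different image")  # prints "near-duplicate"
```

Because routine modifications move the hash only slightly, a service can flag likely copies of known illegal images for review, something the exact-match hash systems that preceded PhotoDNA could not do.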

In an interview, NCMEC Senior Vice President and COO Michelle DeLaune said, “Over the last 20 years of the internet, the material that’s been distributed online continues to grow as old material continues to recirculate and new material is introduced.” She said it’s critical to understand the psychological impact this has on those who are depicted in the images. You can listen to the full interview with DeLaune at Connectsafely.org/podcast.

In my capacity as an NCMEC board member, I’ve met with victims of child pornography. It can be a horrible burden to carry. I also have adult friends who were sexually abused as children, and even though they function well, the trauma haunts them.

Technology has created some problems, but it can also help solve them.

Disclosure: Larry Magid serves without compensation as a board member of the National Center for Missing & Exploited Children and he is CEO of ConnectSafely.org, a nonprofit internet safety organization that receives financial support from Facebook and other tech companies.

