Pain That Hurts and Pain That Alters

Image: Technologically Speaking, The Official Podcast (Patricia Wolfhope)

This episode discusses sensitive topics including violence against children. Listener discretion is strongly advised.

This is the first episode in a two-part series on how S&T is working to combat online child sexual exploitation and abuse. Host John Verrico is joined by Patricia (Patty) Wolfhope, subject matter expert in digital forensic systems, for an eye-opening discussion of the scope of this insidious threat and why developing tools to enhance law enforcement effectiveness is a top priority for the Department. Though some of the descriptions of what takes place on the dark web are very dark indeed, audiences will also leave this episode more aware of how to recognize signs of online exploitation and sextortion. The second part of this two-part series features Patty’s colleague, Shane Cullen, who will delve deeper into additional aspects of S&T’s digital forensics portfolio.

 
Run time: 25:15
Release Date: April 19, 2023

Show Notes

Guest: Patricia Wolfhope, Subject Matter Expert in Digital Forensics

Host: John Verrico, Chief of Media & Community Relations

[00:00:00] Verrico: Before we get started, this episode will discuss sensitive topics including violence against children. Listener discretion is strongly advised.

[00:00:09] Dave: This is Technologically Speaking, the official podcast for the Department of Homeland Security, Science and Technology Directorate, or S&T, as we call it. Join us as we meet the science and technology experts on the front lines, keeping America safe.

[00:00:25] Verrico: Hello and welcome to this episode of Technologically Speaking, the official podcast of the Department of Homeland Security Science and Technology Directorate. I'm your host today, John Verrico, and I've got an extremely special guest, Ms. Patty Wolfhope, who is our subject matter expert in digital forensics. I almost don't want to say this, because people will say, what about me? But you are one of my favorite people here. And it's because you are so passionate about the work you do. How long have you been with S&T now?

[00:00:56] Patty: Hey, John, thank you for having me today. I think you and I have been working together at S&T for about, maybe 12 to 14 years. I'd have to look back, but you're one of my favorite people too.

[00:01:06] Verrico: Well, thank you. I appreciate you and the stuff that you do is so meaningful. So, tell me just a little bit, digital forensics, what is that? How would you describe that to the average citizen at home?

[00:01:19] Patty: You think about forensics and most people think about a fingerprint, right? And nowadays, we think about faces or irises. But digital forensics has everything to do with what is digital in the digital world. And we're living in that world now more than ever. Back when I became an engineer, everything was analog. Now there's so much digital around us that it's being used for really good purposes and really, really dark purposes. And that's where we come into play. The Science and Technology Directorate's digital forensics work is looking to take advantage of the digital output from criminals so that we can help law enforcement track down these people and make a difference.

[00:02:02] Verrico: Let's delve into that. There are some, let's say, some really truly evil people out there that are doing some really horrible things. And I know that because of this, a lot of the work that you do falls into that category of law enforcement sensitive. So I'd like to give you the opportunity to talk about what you can talk about today. Tell us how we're using digital forensics and what kinds of crimes we're actually being able to address with it.

[00:02:32] Patty: Thank you for that question, John. A lot of people don't know how dark, dark can be, and that's okay. I would have preferred not to know; I didn't realize until I really got into this world with our operational folks and understood what their problems were. The kind of crimes that we're looking at are transnational crimes. In the realm of transnational organized crime, specifically the child exploitation and human trafficking realm, we applied our digital forensics work, from a Science and Technology perspective, initially to those use cases. Let's talk about child exploitation. There are pedophiles that record their illicit activities and then share them with other pedophiles, for free. The problem is, it used to be that pedophiles had to use the U.S. mail, back in the day, to share this kind of illicit material. Now they can share it anonymously over the internet, in the blink of an eye. Another use case is that kids, believe it or not, are self-producing sexual activity online just like a pedophile would do with a minor. Kids are recording themselves and then sharing it online, unbeknownst to their parents or their guardians. And this often leads to what we call sextortion. A lot of times, kids that are sextorted feel there's nowhere to turn. They lose hope.

[00:03:58] Patty: They're afraid. This is where a perpetrator will pretend to be somebody that is their peer. And many times, mostly boys will share images of themselves, and then the perpetrator will begin to extort them. What does that mean? It means that the perpetrator will tell the innocent child, I'm going to share your images now with all your friends on Facebook, all your social media contacts, the school that you go to, I'm going to share it with your teachers, and it gets out of hand. And some sextortionists will then demand money from these kids. And it's leading to suicide. It's really unfortunate. The law enforcement community believes that suicides among kids due to sextortion are at a pandemic level. And a third way is that children are sold online. The perpetrator never actually makes physical contact with that child, but will direct what the adults are doing to that child through a livestream event. And many times that's pay-per-view, and you'll have thousands of perpetrators joining a pay-per-view of a rape of a child online.

[00:05:10] Verrico: It is staggering just how sick these people really are. And how prolific is this problem? You're saying thousands of viewers. It's insane.

[00:05:23] Patty: Yeah. What happens is, on these livestream events, for example, the interconnection of the pedophiles online is so prolific that they'll put out a message to all their buddies, saying, hey, we're going to have a livestream event at x, y, z time. And everybody comes. It's really sickening.

[00:05:44] Verrico: I have no words. I'm just absolutely staggered by how horrific this is, and the way it makes me feel about humanity. So obviously, you're doing what you can in the digital forensics world and how can we help to fight this kind of a crime?

[00:06:04] Patty: One of the things that we've had to do from an S&T perspective is to understand what an agent or an analyst needs to do in order to accomplish their job. And so, what is their job? They need to determine who the victim is, where in the world a crime took place, and who the perpetrators are. Those are the questions they're trying to answer for all of those scenarios that we talked about a moment ago. And so, how do you do that through digital images? In working with law enforcement, I immediately understood that the average criminal pedophile has millions and millions of videos and pictures available to them via hard drives, thumb drives, cell phones. And so your average criminal investigation has terabytes of data. You can imagine how quickly that grows. We're talking millions and millions of videos and pictures, tens of millions. So, what can we do to help? What we immediately did from an S&T perspective was, A, try to understand that problem and, B, understand what kind of tools we currently had and look forward to tools that we could develop to help them cull down that massive amount of data so that they could at least start to identify these victims.

[00:07:31] Patty: So the first thing I put my focus on was the face recognition work. If we could somehow get a picture of a face of one of these children, could we use that to look at the rest of the repository of tens of millions of pictures? Because keep in mind, these perpetrators will share a picture of Jim Doe or Jane Doe once, and then that gets shared and shared repeatedly online. So these kids are re-victimized. It turns out, back in the day, face recognition performance wasn't good enough to implement in the use case for the analysts. But the IARPA program JANUS, J-A-N-U-S, really took a leap forward and allowed us to use their facial recognition algorithms to make such a big difference for our law enforcement worldwide. So that was my initial step. Using that, agents have been able to cull down their data set using face recognition when they have a face. Another piece of technology that I found extremely useful is the ability to match pictures from the same camera. I won't say too much about that other than what a useful tool it has been for our worldwide law enforcement capabilities.

[00:09:00] Verrico: Looking, then, for repeat perpetrators.

[00:09:03] Patty: Yes. Or pictures just from that same camera, so that when you're looking at a repository of 10 million, you can cull that down to a hundred pictures that came from that same camera. And that's very helpful. Many of these perpetrators have multiple victims. And we don't even realize that until we realize that, oh, all these pictures came from that same camera; therefore, it's all interconnected in some way.
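
To make the culling idea from this exchange concrete, here is a minimal, hypothetical sketch of matching by face similarity: a known face is turned into an embedding vector and compared against embeddings of every image in a repository, keeping only close matches for human review. The `embed_face` function is a stand-in for a real face-recognition model, and nothing here represents S&T's actual tooling or the JANUS algorithms.

```python
# Hypothetical illustration only: not S&T's tooling and not the JANUS
# algorithms, just the general shape of culling a large repository by
# comparing face embeddings against a known reference face.
import numpy as np

def embed_face(image: np.ndarray) -> np.ndarray:
    """Stand-in for a real face-recognition model that maps a face image
    to a fixed-length, unit-norm embedding (random demo data here)."""
    rng = np.random.default_rng(abs(hash(image.tobytes())) % (2**32))
    vec = rng.normal(size=256)
    return vec / np.linalg.norm(vec)

def cull_by_face(reference: np.ndarray, repository: list[np.ndarray],
                 threshold: float = 0.6) -> list[int]:
    """Return indices of repository images whose embedding has cosine
    similarity above the threshold with the reference face."""
    ref_vec = embed_face(reference)
    return [i for i, img in enumerate(repository)
            if float(ref_vec @ embed_face(img)) >= threshold]

# Demo with synthetic arrays standing in for images.
reference_face = np.zeros((8, 8), dtype=np.uint8)
repo = [np.random.default_rng(i).integers(0, 256, (8, 8), dtype=np.uint8)
        for i in range(1000)]
hits = cull_by_face(reference_face, repo)
print(f"Culled {len(repo)} images down to {len(hits)} candidates for review.")
```

In a real pipeline, the embedding model and the similarity threshold determine how aggressively the repository is narrowed; the point is that an analyst reviews a short candidate list rather than tens of millions of images.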

[00:09:30] Verrico: This is amazing. Who are you working with on these projects?

[00:09:33] Patty: We work nationally and internationally with our research counterparts within government agencies. And we also work with the operational folks in those same agencies. So, in the United States, it's our federal, state, and local agencies that are currently using the tools that I mentioned earlier and some others that we've developed.

[00:09:55] Verrico: Tell me how the forensics really help here. I know that there have actually been children rescued because of the technology brought to bear. And like I said, a lot of this is law enforcement sensitive. What can you tell us about how this stuff works and some of the successes and milestones that we've reached?

[00:10:15] Patty: With all of the tools that we develop, our goal is to help law enforcement spend less time having to view the images, these horrific images that are produced by these monsters.

[00:10:30] Verrico: I imagine just having to view that stuff is extraordinarily traumatic.

[00:10:35] Patty: Yeah, you know, this causes post-traumatic stress for agents in the field having to view this material. So applying, for example, artificial intelligence and machine learning to digital forensics tools will eventually lead us to the capability of reducing the amount of time that law enforcement has to look at these pictures.

[00:11:03] Verrico: What's the biggest challenge that you face in being able to accomplish that goal?

[00:11:08] Patty: Our biggest challenge, and I know that a lot of people in the machine learning world will understand this, is data. Data is always a problem, especially annotated data. In this case we're talking about child exploitation digital images and videos. Nobody wants to sit around and view this horrific material. But that's what's required. We can only go so far with an automated annotation process. Each image and video has to be looked over by an agent in order to verify and validate that the annotations are correct and accurate. So, obtaining this data for machine learning purposes is extremely difficult. And we are working nationally and internationally with our partners to develop guidance documentation on annotation for child exploitation material. This way, law enforcement worldwide is working together, annotating data that we can all use to train these machine learning algorithms.
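
As a purely illustrative sketch of the verification step Patty describes, the record below pairs machine-suggested labels with what a trained reviewer actually confirmed. The field names and labels are assumptions for illustration; they are not the annotation guidance documentation S&T and its partners are developing.

```python
# Hypothetical sketch only: field names, labels, and workflow are assumed
# for illustration; this is not the international annotation guidance.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class AnnotationRecord:
    """One reviewed item in a machine-learning training set."""
    item_id: str               # hash or evidence identifier, never raw content
    machine_labels: list[str]  # labels proposed by an automated pre-annotator
    verified_labels: list[str] # labels a trained reviewer confirmed as correct
    reviewer_id: str           # which reviewer verified the item
    reviewed_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

    def agreement(self) -> float:
        """Fraction of machine-proposed labels the reviewer confirmed."""
        if not self.machine_labels:
            return 0.0
        confirmed = set(self.machine_labels) & set(self.verified_labels)
        return len(confirmed) / len(self.machine_labels)

# Example: the automated pass proposed two labels; the reviewer kept one.
record = AnnotationRecord(
    item_id="sha256:demo",
    machine_labels=["indoor_scene", "face_visible"],
    verified_labels=["indoor_scene"],
    reviewer_id="reviewer_0142",
)
print(f"Reviewer agreement: {record.agreement():.0%}")
```

Tracking where automated annotation and human review disagree is one way a shared guidance document could keep labels consistent across agencies before the data is used to train models.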

[00:12:12] Verrico: I know you can't get too much into the weeds on how law enforcement goes about their actual duties, but how could the types of information that are generated through these forensics actually be used by law enforcement agents to maybe rescue children, pull them out of these situations, or bring some of these perpetrators to bear the responsibility for what they've done?

[00:12:38] Patty: Lawfully seized digital evidence is extremely useful. But like we were saying, the magnitude of that evidence makes a manual search through it for clues clearly impossible. Using the digital forensics capabilities that S&T is able to provide to law enforcement, to help them cull through that data in an automated fashion and find clues that will help them eventually locate and rescue a child, is paramount.

[00:13:11] Verrico: And they have in fact been able to rescue some. Is that not correct?

[00:13:15] Patty: Absolutely. They've used our tools to save and rescue a number of children, and to locate the perpetrators and get them behind bars. So those are success stories that feed our development process and really make us happy that S&T is able to make such a big difference in a kid's life. The kids will never know where all of the work came from, and we're just one small cog in a bigger wheel for the Department of Homeland Security, but without our work, agents have told us that they would not be able to accomplish the number of rescues that they have. So we're really happy, and I'm personally happy, to be involved in these efforts. As you know, John.

[00:14:02] Verrico: Patty, as we're talking here about all this really dark and horrible stuff, I'm looking at your face, and as you talk about rescuing children and putting the perpetrators behind bars, for the first time in this discussion light has come back into your face, and I'm so happy to see it there. Can you share with me, is there anything you're particularly proud of, some particular accomplishment that you want to be able to highlight and say, hey, I was part of that?

[00:14:31] Patty: Yeah. Absolutely. Being an engineer, I never thought I'd be doing what I'm doing today. And so just being a part of helping to save kids around the world is the legacy that I would like to leave. You know, there's pain that hurts and there's pain that alters. And these kids, abused so severely, tortured at such a young age, their lives have been altered forever. And if there's anything we can do to help stop this, that's the goal. We've all said we can't law enforcement our way out of this. Educating the public as to what the problem is, and helping kids realize that if they do get themselves into a rut there is a way out, a safe way out, is paramount as well. So, I think working with law enforcement has been fantastic, and I will continue to do so. And just helping to save kids. I love it when I can foresee a technology that will make a difference for law enforcement and help them do their job better, safer, quicker. And we have some of those on the horizon. And so what really makes me happy is to know that in this next year, FY23, we have some goals set, and we can see the horizon where those goals are going to become reality and make a huge difference, a big impact, in this world of child exploitation for law enforcement. I'm really excited.

[00:16:02] Verrico: I can tell. I can hear it in your voice, I can see it in your face, and I'm very excited to see what's coming down the pike. Do you have children?

[00:16:10] Patty: I do, I have two girls and one granddaughter.

[00:16:13] Verrico: Oh my goodness. I have known you a long time.

[00:16:18] Patty: Yeah, you probably, yeah.

[00:16:20] Verrico: So how does that make you feel then that you are able to protect them in this?

[00:16:27] Patty: You know, my kids are in their thirties now, but early on, when they first were able to get on a computer or have a cell phone, I told them that they couldn't get on unless I had their usernames and passwords. And that upset them: hey, what do you mean? No, I want my privacy. But the really good thing was having that look ahead to say, you know what? It's not that I didn't trust you; it's that I knew that there was a bigger world out there that might impact you in a negative way. And someone needs to watch over our kids, right? Parents, guardians, and so on. I'll tell you a quick story about one of the saves that we had down in the Philippines.

When my coworkers read about that, they called me up one day and said, oh, thank God you saved those kids and got them back to their parents. And I said, oh, no, no. It was their parents that were selling them.

[00:17:19] Verrico: Oh.

[00:17:21] Patty: Okay? Yeah. So, this is not unusual. These kids know their perpetrators. It's aunts, uncles, neighbors; it's priests, it's police officers, it's anybody, right? And so what people don't understand is that this particular crime doesn't look at how much you make. It doesn't look at what job you do. It doesn't care about the color of your face. None of that matters. Anybody can be a perpetrator, and that's what's really hard, John. When you look around you and you think, wow, anybody around me could potentially be a perpetrator.

[00:18:01] Verrico: Wow, that is really scary to say, Patty. Like I said, it changes the view of the world, doesn't it? What do you say to people, parents and kids, who are not doing these things but want to protect their children or want to protect themselves? What advice can you give?

[00:18:20] Patty: Yeah, I think that, number one, usernames and passwords are really important. Limiting the amount of time that your kids spend online, and knowing who they're spending that time with and who they're talking with online, is also extraordinarily important. There are a number of tools out there. S&T doesn't necessarily put out that any particular tool is better for monitoring your child's use online. But monitoring your kid's use online, as far as these social media sites, is extremely important. One of the ways perpetrators will get online and begin to groom a child is just in these chat rooms, innocent chat rooms that you would never think would have a pedophile in there. And so, monitoring your kids' use and who they're talking to, what they're saying, and so on. I know it's a lot to ask, but it's paramount that we all do that for our kids.

[00:19:17] Verrico: As an engineer, did you ever think that you'd be applying engineering to this kind of work?

[00:19:25] Patty: No, exactly. As an engineer, when I first got into the S&T world here at DHS, I was working in biometrics, which is really a subset of digital forensics work. And so my biometrics work is really what led me to do this work. I was sitting at a conference out at the National Institute of Standards and Technology, NIST, and an HSI, Homeland Security Investigations, agent got up and spoke about this problem, and that was over 10 years ago. And after he spoke, I went and spoke with him and asked, what can we do to help you? This is crazy. What he had spoken about just shattered my brain. It made me look around and wonder, ooh, this is deep and dark. What can S&T do? That's really what got me involved in this work, and I've never looked back.

[00:20:18] Patty: I don't know if our listeners can really relate to this, and how many people out there have been touched by something similar to this, a child exploitation or child abuse problem of their own. But once you've had that experience, you can't unsee it. You can't unhear it. You have to do something about it. And I believe that's part of what S&T, the Science and Technology Directorate, has taken on in a big way, and it's one of the goals of our Secretary for DHS as well. And so I'm really happy to see that more is being done for the agents who are trying to fight this problem, but also the kids that are victims of these horrific crimes.

[00:20:59] Verrico: You know, people always wonder about the impact of the work that we do, and I don't think anything is as impactful as what you've just described here today. And so I so greatly appreciate your taking the time to come on to describe this situation, and even more so, telling us about how we can actually make a difference. What kinds of things can we be looking for in the future?

[00:21:27] Patty: You know, there's a fair amount that's happening in the tech world that everyone walking the planet is pretty aware of. Things like the internet are going to be really nifty, cool items that are going to be used for really good purposes, but they're also going to be used for nefarious purposes. And what we're doing at S&T is trying to stay one step ahead of those nefarious actors. And I think we're doing a pretty good job of that.

[00:21:56] Verrico: I know you are. I know you are. And I can see it every day. I'm so happy that you're here, and I'm so happy that you're working on the right side.

[00:22:05] Patty: I wouldn't be anywhere else, John. You know it.

[00:22:07] Verrico: I'm so glad about that. I'm so glad about that. So, let me just ask you, after dealing with this kind of stuff and seeing this, evil is the only word that comes to mind.

[00:22:19] Patty: Yep.

[00:22:20] Verrico: But knowing that you're working hard to make a difference, at the end of the day, how do you unwind? How do you wash all of this out of your brain and have a normal life afterward?

[00:22:30] Patty: Yeah. Yeah. And I think that's a good question for the law enforcement analysts as well. How do you not just curl up in the fetal position and cry? I have a lot of hobbies, as you know, John. I have a pretty decent-sized garden that I eat from pretty much year-round, even in the winter here. And I love to paint, love to sew. The hobbies just go on and on: raising butterflies. Wow, what joy that brings, right?

[00:22:53] Verrico: Oh, wow. I didn't know you did butterflies.

[00:22:56] Patty: Yeah, specifically Monarchs. We like to tag them, and they fly south, and then their tags are read down in Mexico and California. And then of course they fly north again in the spring, and we'll read their tags again to see how many made it. So, it's a lot of fun.

[00:23:10] Verrico: That's amazing. I'll connect you with my wife. She's a big butterfly lover as well.

[00:23:14] Patty: I can get her some chrysalises.

[00:23:16] Verrico: That would be fantastic.

[00:23:18] Patty: Yeah, so there, I have a lot of hobbies. I have a lot of things that I really, really enjoy doing that just take my mind off of all of the horrificness in the world. But I also hold near and dear the kids that we've been able to save. I don't know them by name. I know them sometimes by face and location. But I look forward to that next kid that we're going to save. And I think that's what keeps a lot of us working in this field going: knowing that there will be a next kid that we'll be able to save, that needs us. So that brings real joy.

[00:23:51] Verrico: You know what, I certainly understand that. And it's unfortunate that there will be a next kid, but it's fortunate that there'll be a next kid that we’ll rescue.

[00:24:03] Patty: Yes, sir.

[00:24:03] Verrico: You've been listening to Technologically Speaking with our special guest, Patty Wolfhope, an engineer who has applied her skills to digital forensics focused on saving children from the most horrific types of abuse that you can possibly imagine. So thank you so much for everything that you do every day.

[00:24:25] Patty: Thank you John, and thank you S&T, for your support of this work. I truly appreciate it.

[00:24:32] Verrico: There's more to come on this topic. Be sure to tune in next episode when we speak with Shane Cullen, a colleague of Patty Wolfhope's, and we'll talk more about digital forensics. Thank you for listening to Technologically Speaking. To find out more, visit our website and follow us at DHS SciTech. And to learn more about how you can help stop child trafficking and exploitation, visit the National Center for Missing and Exploited Children at www.missingkids.org. To report online child sexual exploitation, call 1-800-THE-LOST or 1-800-843-5678.

 
