Every Parent Should Be Concerned About Online Sexual Exploitation of Children. Here's Why.
In-depth analysis on the explosion of online child sexual abuse material
I know most parents believe that bad things happen only to others, but the reality of online child sexual abuse should compel them to think twice: 1 in 5 children is sexually solicited while on the internet.
We live in a world where the abuse of minors, sexual or otherwise, is nothing short of a humanitarian crisis, and parents need to understand that technology has given predators direct access into the homes of children.
“Technology has created new mechanisms for all-digital abuse, including online grooming, user-generated image abuse and live-streamed abuse, which are impacting millions of children who were previously safe from online abuse,” read a 2019 report by the Bracket Foundation, an organization that leverages technology for social good.
A 2016 study by the Justice Department identified online grooming of kids, commonly known as “sextortion,” as “by far the most significantly growing threat to children.”
The abuse happens on mainstream social media platforms like Facebook, YouTube and Instagram; on lesser-known sites like Kik and Omegle, which are popular with the younger generation; within the gaming world; and virtually anywhere on the internet where children can send and receive messages.
A teenage girl in Tennessee thought she had made a new friend on Kik Messenger, and the two chatted for six months, bonding over their love of volleyball.
Then the high school student sent a partially nude photo of herself to her online friend, and things turned sinister. Her "friend" demanded that she record a video of herself performing explicit acts or bear the consequences.
"You literally have no choice but to obey unless u want ur pics spread to your friends," he wrote, according to court records.
The girl told her mother, and the offender, Matthew Chaney Walker, a 24-year-old man from Louisiana, was arrested. The police investigation revealed that Walker had tricked over 50 girls into sending him nude and sexually explicit images.
"I thought for a long time that there was something wrong with me or that I was a bad person," said the girl, whose identity is being withheld to protect her privacy. "Now that I've gotten to college, I'll talk to my friends about it, and there have been so many girls who have said, 'That exact same thing happened to me.'"
In 2018, there were 1,500 cases of sextortion reported to the National Center for Missing & Exploited Children (NCMEC), a federally mandated clearinghouse that works with families and law enforcement agencies to combat child abuse. Authorities, however, say that this is just the tip of the iceberg. In many cases the offenders are never reported to the police let alone prosecuted for their crimes.
The gaming world is a hunting ground for predators. Kate, the mother of a 13-year-old boy who spends his time on free-to-play games like Fortnite, was shocked to find videos of children being abused, including bestiality involving a young boy, on her son's Discord account, a messaging platform where gamers can chat while playing.
Hoping to identify the culprits, Kate asked her son about the usernames of fellow gamers. “And he’s saying, ‘That’s so-and-so who goes to this school.’ And they all think it’s a friend of somebody,” she said, “but then they realize it’s not a friend of anybody.”
Kate alerted Discord and received a message saying that the company could not take action because her son had deleted the messages.
"First I was very sad, but I'm really angry," she said. Kate, like all the victims and family members interviewed, agreed to share only part of her name to protect her family's privacy.
Posing as children, predators gradually build trust with kids, often by sharing stories of hardship or self-loathing.
“Their goal, typically, is to dupe children into sharing sexually explicit photos and videos of themselves — which they use as blackmail for more imagery, much of it increasingly graphic and violent,” according to a New York Times investigation in 2019.
“The first threat is, ‘If you don’t do it, I’m going to post on social media, and by the way, I’ve got a list of your family members and I’m going to send it all to them,’” said Special Agent Matt Wright.
“If they don’t send another picture, they’ll say: ‘Here’s your address — I know where you live. I’m going to come kill your family.’”
The Times reviewed court records, police reports and academic studies and found that some perpetrators are grooming hundreds and even thousands of children.
Do you still think your child is somehow magically safe in the digital age?
Rise of child sexual abuse content online
The enormous increase in sextortion cases coincides with an explosion of child sexual abuse material, or CSAM, on the internet. In 2018, the NCMEC identified 18.4 million URLs containing child abuse content, including 45 million images and 22 million videos: a huge jump from just 6,000 URLs in 1998.
CSAM is sexual abuse of children that is recorded on video or captured in images and then posted on the internet. Often, the abuser is a close relative or friend of the victim, and the children are told that what is happening to them is completely normal.
Court documents reveal the disgusting nature of the abuse. In one video, a woman orally forced herself on a girl before tying her upside down by the ankles so that another child could urinate on her face. I know this is difficult to read, but we need to confront it to understand the dangers our children face. Take a moment to think about the trauma that this young girl will carry for the rest of her life (more on that later).
Another video showed a woman inserting an ice cube into the vagina of a young girl, tying her legs together, sealing her mouth with tape and then suspending her upside down. The girl was repeatedly beaten and burned with a candle or a match.
“The predominant sound is the child screaming and crying,” according to the documents.
The abuse of children is nothing new, but modern technology has not just made abusive content easier to share; it has also created new avenues of abuse.
Culprits are exploiting images innocently posted by children to create "deepfake" porn films in which a child's face is digitally edited into existing CSAM. Four of every ten children remove privacy settings to attract more followers.
Sexting gives abusers another source of self-generated, sexually explicit material to repost elsewhere on the internet. Up to 40% of adolescents engage in sexting, using messaging apps and live-streaming technology to indulge their sexual curiosity, and according to one report, 88% of such content is later uploaded elsewhere, including to websites on the dark web.
A police operation against the dark web forum PlayPen led to over 540 arrests internationally and the rescue of close to 300 children. Before it was taken down, PlayPen had 150,000 members. The now-shuttered Child's Play website, also on the dark web, had one million user accounts.
Love Zone, another dark web website, had 30,000 members. An exclusive section of the site was reserved for members who shared videos and pictures of children they had abused themselves; they were known as "producers." The site was managed by a man in Ohio, and his arrest revealed 3 million videos and photos on his hard drives.
Children who are exploited in CSAM are from all around the world but the majority of CSAM, 95%, is hosted on servers based in Europe and North America. The Netherlands hosts 47% of all CSAM.
The pain of abuse
I devote much of my time to studying and reporting on child abuse because abuse of any kind leaves a profound mark on a child's mind, one they carry for the rest of their lives.
One F.B.I. study revealed that a quarter of sextortion cases led to suicide or attempted suicide.
For sisters F. and E., it is a never-ending struggle. Their father, a man from the Midwest who is now in prison, posted several videos and pictures of the girls on the internet when they were 7 and 11 years old. In one, the father and another man drugged and raped the 7-year-old. A decade later, the two sisters live in fear of being recognized, because online abusers are known to seek out victims, even in adulthood. That is why the sisters cannot speak publicly about the crimes committed against them.
“You get your voice taken away,” E. said. “Because of those images, I don’t get to talk as myself. It’s just like, Jane Doe.” Their mother shares their anguish.
“Every hope and dream that I worked towards raising my children — completely gone,” she said. “When you’re dealing with that, you’re not worried about what somebody got on a college-entrance exam. You just want to make sure they can survive high school, or survive the day.”
The cruel reality of the modern era is that content of abuse uploaded online is seemingly preserved forever and the repeated sharing of the content “revictimizes children, intensifying feelings of shame and powerlessness that cause long term psychological damage.”
In 2019 alone, pictures and videos of F. and E. turned up in 130 police investigations. In the case of a teenage girl who now lives on the West Coast and was raped by her father when she was 4 years old, her images were found in 350 police investigations over a four-year period, in states across the country including Florida, Kansas, Kentucky, Michigan, Minnesota and Texas.
Whenever the F.B.I. finds images in a child abuse investigation, it notifies the family if the child is under 18. The family of the West Coast girl has not told her that recordings of her abuse are online because they do not want her to be hurt even more.
“We’re just afraid of all the negative impacts that it might have — because I’ve spoken with other moms whose daughters know their images are online and they’re train wrecks,” her mother said. “She doesn’t need to be worrying about most likely the worst part of her life available on the internet.”
It is illegal to look up images of child sexual abuse, regardless of intent, and doing so on the open web can earn you a visit from the F.B.I. Most dark web sites, however, are not accessible via commonly used browsers like Google Chrome or Firefox; they require alternative browsers like Tor, which masks a user's IP address and makes their location extremely difficult to trace.
The important thing to note is that while predators use dark web forums to share tips and chat with peers, and 80% of dark web traffic goes to websites containing child abuse material, the content itself is often hosted on mainstream platforms like Google Drive, Microsoft OneDrive, Dropbox, Facebook Messenger, Amazon Cloud and the like. These platforms handle millions of uploads and downloads every second, yet their scanning for child abuse content is patchy: Dropbox, Google and Microsoft scan only when content is shared, not when it is uploaded. Evading detection then becomes an insignificant additional step of sharing the account login with fellow abusers.
Social networks are complicit
“... police and social networks are of one mind when it comes to the mission of wiping CSAM off the internet,” wrote Chris Priebe, founder of Two Hat, a firm that uses artificial intelligence to moderate online content.
I do not agree with Priebe. I believe that social media companies are not invested in fighting online child abuse.
“The companies have known for years that their platforms were being co-opted by predators, but many of them essentially looked the other way,” according to interviews and a review of internal emails by the New York Times.
Facebook is by far the most notorious when it comes to child abuse material: the company accounted for over 90% of imagery flagged by tech companies in 2018. The social media giant, with 1.85 billion daily active users, "thoroughly scans its platform" but is not using all available tools, and Facebook Messenger, the main source of illegal content, lets users encrypt their conversations, making detection impossible.
The Canadian counterpart to the NCMEC, also a federally designated clearinghouse, notified Google about two images: one with semen covering the face of a young girl, the other showing a young girl exposing her genitals. Google wrote back saying that the images did not meet its "reporting threshold," but eventually agreed to remove them.
In another instance, an image was found on Google’s search engine in which a woman was touching the genitals of a naked 2-year-old girl. The company wrote back to the Canadian investigators that pedophilia is “not illegal in the United States.”
“It baffles us,” said Lianna McDonald, executive director of the Canadian center.
This environment emboldens perpetrators to continue abusing children with impunity. Gregory Householder of Florida was arrested on child pornography charges. He told investigators that he knew his activities were illegal but continued to use online platforms for eight years because he believed he would never get caught. It will not come as a shock to you that 99% of CSAM goes undetected.
Companies just “don’t want to advertise that they are open for business” to criminals, said Alex Stamos, former security chief for Facebook and Yahoo.
“If they’re saying, ‘It’s a security problem,’ they’re saying that they don’t do it,” he added.
YouTube's algorithm steered pedophiles to innocently shared videos of children. Newspaper investigations found communities of pedophiles making sexually explicit comments and reacting to posts by other pedophiles.
Tech companies actively scan their products for copyright infringement and malware and run facial recognition, but they look the other way when it comes to abuse-related content, citing very real privacy concerns.
Amazon, for example, said that the “privacy of customer data is critical to earning our customers’ trust.”
We cannot use the prevention of child abuse to justify laws that disregard people's privacy. We have to find a balance between the two, but for that to happen, the conversation around child abuse has to change.
“The problem of child sexual abuse imagery faces a particular hurdle: It gets scant attention because few people want to confront the enormity and horror of the content, or they wrongly dismiss it as primarily teenagers sending inappropriate selfies. Some state lawmakers, judges and members of Congress have refused to discuss the problem in detail, or have avoided attending meetings and hearings when it was on the agenda,” per The New York Times.
Only six people showed up to a room that seated at least 100 when Ben Halpert organized a talk on sextortion at the 2019 gaming festival in Atlanta, which had 35,000 registered attendees.
"People don't want to talk about it," said Halpert, who runs the nonprofit Savvy Cyber Kids.
Most people do not know that over one billion kids, or half of all children worldwide, experience violence every year. That, combined with the fact that the children most vulnerable to online sexual abuse also face hardships like homelessness, mental disorders and parental conflict, creates an urgent need to address the crisis.
In my opinion, we need a public health approach that is not limited to the public health system but holistically addresses violence through changing of the laws, social change, and understanding the context that creates the space for violence. Experts around the world agree that violence occurs in communities that are plagued with substance abuse, infectious diseases and poverty.
We also need to educate parents and policymakers who, compared to their children, have rudimentary digital skills and lack basic awareness of the dangers that lurk online. Some 80% of the world's population has access to the internet today, and one in three internet users is under 18 and often unsupervised.
Attitudes will not change until laws do. Broadcasters have to adhere to a strict standard for the content they publish on their television channels or radio stations and utility companies are fined billions of dollars when they hurt our environment but there are no real consequences for tech companies, the richest firms in the world, when they pollute our online ecosystem.
In Germany, for example, if social media companies do not remove hate speech or CSAM within 24 hours, they can be fined up to €50 million for each instance.
Content uploaded to social media sites is reviewed by moderators. Facebook, for instance, employs 15,000 content moderators in the U.S. alone. These moderators are paid minimum wage, are barely trained and have to sift through thousands of pieces of content every day. They face the same hurdle as law enforcement agencies when it comes to CSAM.
“Investigators are, after all, human beings. Hardwired for empathy and compassion, we aren’t built for processing endless images of horror and abuse,” Priebe observed.
On average, 500,000 images and videos are found in each child abuse investigation. That takes a huge emotional toll on investigators who have to review each piece of CSAM. The NCMEC alone has reviewed 192 million CSAM related materials. Studies have found that their work causes burnout and compassion fatigue and may trigger mental illnesses like depression, anxiety and insomnia.
In one study, 36% of law enforcement officers said they experience moderate to high levels of secondary trauma from seeing disturbing images. Officers report unwanted thoughts about victims of child abuse outside of work hours and are predisposed to use alcohol as a coping mechanism.
Meanwhile, the flow of CSAM shows no signs of slowing down, and the coronavirus pandemic has accelerated it even more.
The NCMEC has said the system is at a "breaking point," with the amount of abusive imagery "exceeding the capabilities of law enforcement to take action." A former employee of the now-defunct video service Vine disclosed that gigabytes of illegal videos appeared faster than they could be removed.
One in every 10 agents at Homeland Security Investigations, a division that deals with threats of all kinds including terrorism, is now working on child sexual exploitation cases.
"We could double our numbers and still be getting crushed," said Jonathan Hendrix, a Homeland Security agent. The caseload has forced agents to prioritize cases involving infants and toddlers.
Going back to sextortion, police and tech companies have had some success. Operation Game Over in 2012 was a collaboration between Microsoft, Apple, Electronic Arts, Disney Interactive Media Group, Warner Brothers and Sony. The operation led to the removal of 3,500 accounts from gaming platforms like Xbox and PlayStation that belonged to registered sex offenders.
Reacting to complaints from parents, police in New Jersey began chatting with pedophiles under the identities of children and, in one week, arrested 24 people. Operations in Bergen and Somerset counties led to 36 arrests. Police hoped to uncover a pattern to help with future investigations, but they didn't. Among those arrested were a police officer, a teacher, a minister, a nurse, a bank manager, a mechanic, a waiter, a dental hygienist, a college student and a delivery man.
“It cuts across all social and racial lines, across class lines — it cuts across every line,” said Christine Hoffman, assistant attorney general. “There is no profile.”
In the second installment of this series, I will look at how AI can help prevent online child sexual abuse.
Check out my recent post on pedophilia in the Islamic world where a lack of awareness is exacerbating the problem.