I was sexually groomed online at age 12 - here's what I experienced that parents and kids must know


A victim of online grooming has come forward to warn parents and kids about the dangers of the internet. 

Harrison Haynes was 12 years old when he befriended a 'teenager' while playing video games online — only to be exposed to pornographic content, self-harm videos and manipulative messages.

The 'friendship' took Harrison, now 20, down a dark path of isolation, secrecy and shame for more than a year.

Now, Haynes is telling his story in an effort to show that dangerous people aren't in white vans handing out candy, but are 'coming from inside our iPhones.'

Above, Harrison Haynes as a 12-year-old. Back then, Haynes thought he had found a 'best friend' when he began chatting with a stranger whom he met while playing video games online. But the stranger soon began exposing him to pornography and videos of self-harm

Above, Haynes today attending a protest at Apple headquarters in Cupertino, California. As a 20-year-old college student, Haynes has come forward to tell his story in an effort to pressure Apple to build stronger child protection features into its tech products

'I think for almost every generation in America right now, everyone was told that there's going to be a stranger in a white van handing you candy, and you should say no to that stranger,' Haynes told ABC's Good Morning America on Tuesday.

'But I think for us and for my generation, the danger isn't the stranger in the white van. That's not where the call is coming from. The call is coming from inside our pockets. It's coming from inside our iPhones.' 

Haynes' story began when he was a child struggling to make friends, and he found that gaming was a way to ease the loneliness.

And that is when Haynes connected with the '19-year-old' he said would go on to sexually groom him.

'I don't think anyone in our world had the language yet for grooming,' Haynes said of his ordeal.

The relationship formed on a gaming platform, but once the 'teenager' had his hooks in Haynes, he moved the conversation to iMessage.

'When we moved over to iMessage, there was no way to report him,' Haynes said of the encounter. 'For him, he was safe on iMessage.'  

Using Apple's messaging service, the stranger was able to send Haynes pornography and content depicting self-harm.

And the stranger began bombarding the child with messages at school and while he was with his family. 

Haynes explained that over time he felt 'trapped against a wall,' as what began as a few unobjectionable messages a week grew into four or five abusive and wildly inappropriate messages a day, which he could not block on his iPhone

He told GMA that this dark and private suffering came to a head when his virtual abuser began to threaten to commit suicide if Haynes did not continue to comply with his demands.

In response to his abuser's manipulative suicide threat, Haynes said he 'cried so loud that I woke up my parents down the hall.'

When his parents raced in and went through his phone, learning for the first time of their son's abusive relationship with the virtual stranger, Haynes said he was surprised to find that his fears of shame and punishment had been misplaced.

'They didn't seem mad at me like I thought they would,' Haynes remembered. 'They sat me down and told me I was being manipulated in some sort of way.

'When [...] he started exposing me to pictures and videos of self-harm and internet pornography,' he added, 'I didn't think I could reach out to an adult anymore.'

Today, Haynes (above) is a college student and child safety activist at James Madison University

Now a college student at James Madison University, Haynes has joined forces with the nonprofit advocacy group Heat Initiative.

The group is described as 'a collective effort of concerned child safety experts and advocates encouraging leading technology companies to detect and eradicate child sexual abuse images and videos on their platforms.'

This June, Haynes participated in the group's protest outside Apple's headquarters in Cupertino, California.

He and Heat Initiative are calling on the tech giant to develop and introduce features that will help parents, children and other concerned parties 'report inappropriate images and harmful situations.'

The group also wants Apple to make it harder for sexual predators to store and spread known images and videos of child sexual abuse on its iCloud platform.

For its part, Apple pointed reporters to the child safety features it has already added to its operating system, iOS, and apps over the past eight years. 

The company told Good Morning America that since the release of iOS 15 in 2021, Apple devices have included additional Communication Safety features for minors.

Those features include a warning when underage users try to either send or receive images or videos containing nudity. 

Above, another image of Haynes as a 12-year-old around the time of the online abuse

According to the company, the feature has been the default setting for child accounts under the age of 13 since the release of iOS 17 in September 2023.

Apple's attempts to identify and report child sexual abuse material hidden on its iCloud servers have proved trickier and more controversial, prompting an outcry from privacy advocates worried about a large corporation surveilling all of its users' data.

But Haynes also has a message directly for parents and children, who he says must do their own work to navigate digital spaces more carefully.

'Parents, I cannot emphasize this enough,' Haynes said, 'do not be afraid to talk to your kids about uncomfortable things.'

The child safety activist said it was his reliance on his abuser as a lifeline for discussing his feelings, coping with being bullied at school, and navigating the typical challenges and questions of adolescence that left him feeling trapped.

'There was this weird back and forth,' Haynes explained. 'In my head, like, I want to get out for myself, because now I'm self-harming and now I'm consuming pornography as a 12, 13-year-old boy.'

But at the same time, the false sense of trust his predator had built up in him carried a deep emotional weight.

Apple launched its Communication Safety tool in the US in late 2021. The tool, which parents can choose to opt in or out of, scans images sent and received by children in the Messages app for nudity and automatically blurs them

If Apple's Messages app detects that a child has received or is attempting to send a photo containing nudity, it will blur the image, display a warning that the photo may be sensitive and offer ways to get help

'It was someone that I really did deeply care for, and I knew that if I had reached out for help, that potentially would put him in danger,' he said.

But, another part of the issue, Haynes noted, 'was the taboo and the stigmatization' of the graphic images and videos that he'd been given on his iPhone. 

'I didn't feel like I could reach out to a principal or a counselor or a teacher or my parents, because I felt like I was going to get in trouble,' Haynes explained.  

'If parents can engage with their own kids in the space that makes them feel comfortable, in their own homes,' the 20-year-old believes today, 'I think we can have a much better future.'

'If I'd had that conversation with my parents, I wouldn't have needed to find solace in an online stranger.'
