Red Hot Cyber

Cyber security, cybercrime, hack news, and more

Hunting Hydra: ETA-W1nterSt0rm #OpChildSafety discovers a huge CSAM network

Olivia Terragni : 2 April 2024 07:51

#OpChildSafety: as in every story, there is more than what a reader or a spectator can see and, above all, know. This story is about the discovery of a huge network of CSAM (Child Sexual Abuse Material), from which emerged a thousand-headed monster, Hydra. Those who uncovered it are the Electronik Tribulation Army (ETA) and the W1nterSt0rm conglomerate, an OSINT and threat intelligence group committed to educating about, and combating, online sexual predators targeting children.

The #OpChildSafety (Operation Child Safety) campaign has a long history on the Internet: many of you will remember #OpPedoHunt, an operation created by Anonymous to protect those who have not yet developed the ability to judge and evaluate the world. It reminds us that some “wars” – based on moral and ethical rules – are still worth fighting, when there are those – above all pre-adolescent children – who can easily become objects or commodities of a sprawling network, one that in some cases becomes a real business, as demonstrated by the millions of child pornography videos and photos that can be found on the Internet.

Now “Turtles all the way down” comes to mind: an expression of the problem of infinite regress, based on the mythological image of a great turtle holding up an infinite column of other turtles that support the world. As ETA-W1nterSt0rm keeps digging, deeper and deeper, a great dark chasm has opened up, forcing us to re-evaluate and reformulate our premises.

O: “Before telling this story, can you explain the culture behind ETA-W1nterSt0rm and the ethical and moral rules that regulate the hunting of predators and forbid the use of illegal hacking techniques, and above all why it is important to remember them?”

Ghost Exodus: “There’s no short answer to this question. I’ve witnessed Anonymous doing the same things over and over, expecting different results when it comes to their online child safety initiatives. In many cases, I saw groups making misinformed claims about their success at eliminating threats against kids. I witnessed them doing this kind of work out of selfish ambition, putting themselves and others at risk of being arrested.

Not all, but the majority of these groups do not seem to understand the harm they’re causing to investigative work that could lead to the arrest of individuals feeding the monster that is CSAM. One person makes a report to the Internet Watch Foundation (IWF) or the National Center for Missing and Exploited Children (NCMEC); another launches DDoS attacks on the websites just reported, undermining any investigative effort. Still another reports accounts to have them banned, thinking they’re doing the world a service while doing nothing meaningful to get the CSAM operators arrested; and others publicly dox targets, alerting them that they’ve been found out. It’s fruitless chaos.

I don’t believe most of these pedo hunters understand how dangerous this territory is. I spent 11 years in federal prison for hacking, and I was forced to live with pedophiles and men who contributed to feeding the CSAM monster. I understand these kinds of criminal cases well, and how the FBI catches these people, which can put honest pedo hunters at risk. W1nterSt0rm therefore had to create a culture to protect hunters and teach them how to maximize their effectiveness. We developed a set of laws called The Charter, and it is through these laws that we guide fellow hunters seeking direction. #WeGuardTheCharter”.

#OpChildSafety – The Charter
Source: OpChildSafety, The Charter – GhostExodus.org
#OpChildSafety – The Charter, 5-13
Source: OpChildSafety, The Charter – GhostExodus.org

The tip of the iceberg

The Internet has become an uncertain and often dangerous place for children who use social media, especially at pre-adolescent age. There would be nothing wrong with this at all if predators, always lurking, did not have precise luring strategies, and if the social media platforms themselves adopted effective countermeasures. Unfortunately, social media still contains sections with adult content that do not seem so difficult to access.

At this point, we must also note that, in light of the recent lawsuit filed against Meta Platforms and CEO Mark Zuckerberg by New Mexico Attorney General Raúl Torrez, the discovery made by a member of ETA-W1nterSt0rm on February 28th was extremely significant. A member of Anonymous – a former military intelligence analyst – discovered a link in a public Facebook group they were investigating and building a case against for distribution of CSAM. At some point the group’s privacy settings switched to private, but not before its WhatsApp and Telegram groups were discovered.

But Raúl Torrez’s accusations are not alone: a report by researchers from the Wall Street Journal, Stanford University, and the University of Massachusetts Amherst has highlighted how the Meta universe has allowed the distribution of child pornography, failing to detect predatory networks. The Wall Street Journal investigation showed how the Instagram algorithm would suggest videos of a sexual or child pornography nature among its “reels”, and how the algorithm worked in such a way that even minor users were shown “reels” unsuitable for them.

However, reporting certain content is not enough, a problem also highlighted by ETA-W1nterSt0rm: once banned, a user simply creates a new account and continues to distribute CSAM. Instead, these users should be reported to the appropriate bodies, such as the NCMEC in the United States and Child Exploitation and Online Protection (CEOP), a command of the National Crime Agency in the United Kingdom.

O: “Here’s another interesting question: can you explain how the process of capturing and reporting CSAM material takes place, and to whom ETA-W1nterSt0rm members forward the final report?”

Ghost Exodus: “The first rule of thumb is that OPSEC is religion. This stands for operational security and denotes the zeal and devotion we must have toward protecting our online identity. While I cannot go into specifics on the process of capturing evidence without aiding the enemy with knowledge of our methods, I can describe it in the general sense.

It usually begins with someone accidentally stumbling upon a link to CSAM on Facebook, which says a lot about the social media giant, since the algorithms aren’t detecting this illegal content. This is why Facebook is being sued. One link leads to another link – and another. It’s so flagrant that Facebook has become consumed by child predators. From this point, my team works the data side of these investigations using Open Source Intelligence (OSINT) techniques: pulling historical WHOIS records, using tools to uncover backend IP addresses, web scrapers to extract payment methods and personal identifiers, reverse image searches, breach report databases, and pretty much any data item that could lead to an identity. The methods we use depend on the circumstances.

The most important element is reporting. However, it angers me how unclear many reporting platforms are about prosecution. In my opinion, the IWF uses vague language that completely sidesteps questions about investigating and prosecuting based on the information they receive from reports; they do not make their purpose very clear. With a lot of Googling, I learned that they merely get CSAM material removed, which gives the criminals behind it a free pass to create new sites and accounts. This doesn’t make sense to me. People are told to go elsewhere with reports that seemingly don’t meet their criteria, yet people largely don’t even know where to go, and legitimate cases are turned away.

You can’t contact the FBI, because they have completely outsourced criminal complaints about CSAM to the NCMEC. While FBI agents are assigned to work with the NCMEC to identify and prosecute these people, they offer little reassurance to the individuals sending reports that their reports are being looked into, which gives the public the impression that reports are simply thrown in the trash. Nor is it clear exactly what the NCMEC’s role is upon receiving a report, without vigorous research to find the answers.

I’ve found that Child Exploitation and Online Protection (CEOP) is very clear about what they do. They are a police unit of the National Crime Agency. They’ve talked with me, asked for the evidence my team collected, and have the appropriate connections to collaborate with other law enforcement, which civilians don’t have. I’ve seen them counsel an individual and offer empathy and emotional support. I highly recommend them”.

Before delving into the story, we should also highlight the method of analysis and investigation the group follows. While no law prohibits the use of OSINT gathering techniques, any type of illegal hacking is never condoned, such as a DDoS attack on the target or the use of illegal means to obtain information.

In addition, deceptive operations designed to ‘capture’ and ‘report’ a sexual predator are not always easy and, above all, sometimes sit on the edge of legality. For many, seeing this cruel world up close can be truly traumatic. GhostExodus, for example, explains: “I am the type of person who cannot even so much as glimpse things of this nature without being traumatized, which is why I work exclusively in data analysis”.

O: “Now, starting from your statement, we should talk about how everything is organized within the ETA-W1nterSt0rm group in Operation Child Safety. You said that one person was designated to provide visual confirmations, while another individual was selected to perform OSINT on usernames and phone numbers – a thoughtful and well-organized group”.

Ghost Exodus: “Yes, it’s quite simple actually. We run multiple teams, which helps lighten the workload. One team uses OSINT to de-anonymize CSAM website users; another does data analysis on the sites themselves, looking through records and sifting through source code to extract useful information. Another compiles the data and escalates it to a reporting platform”.

Turtles all the way down: a huge CSAM network that unravels throughout the Internet


While members of ETA-W1nterSt0rm were investigating a website hosting child pornography material, they discovered a vast network spread across the Internet, hiding a real nightmare.

It all began – as we said before – on February 28, 2024, when a member of Anonymous discovered a link in a public Facebook group that ETA-W1nterSt0rm was investigating.

The first steps of the investigation

The first step in understanding who was hosting the target was to obtain the WHOIS records: these yielded the first information about who hosted the website and who owned it. Then, using VirusTotal, the hosts were passively scanned, mapping every relationship connected to the target: host information, ASN numbers, WHOIS data, domain certificates, subdomains, mirrors, and of course malicious files or code.

“That is when,” GhostExodus stressed, “I discovered a mirror run by the same hosting provider”. From that moment the group was working on two websites instead of one. “The domain name,” GhostExodus said in his report, also published on CyberNews, “belonged to a Google Domains subscriber registered in Germany, but the hosting server was located in Nevada, USA. The subscriber’s information was marked as private.

In this case, the scan generated a large cluster of malicious code. However, inside one of these clusters was a malicious IoC (Indicator of Compromise) pointing to a TOR onion link on the dark web. It indicated that Hydra subscribers could access the content using a bridge from the dark web to the clearnet, which is easily accomplished with tor2web, an HTTP proxy software designed to do just that. An IoC acts as a red flag, indicating that something suspicious might be taking place on a network or endpoint: anything from strange activity such as outside scans to data breaches. Any digital evidence left behind by an attacker can serve as an IoC”.
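To illustrate the bridging mechanism: tor2web gateways serve a hidden service on the clearnet by appending a gateway suffix to the onion hostname, so the same link exists in both worlds. The minimal Python sketch below performs that rewrite. The gateway suffixes are illustrative examples only (real gateways come and go), and this is a description of how tor2web works in general, not of the group's tooling.

```python
from urllib.parse import urlsplit, urlunsplit

# Example tor2web gateway suffixes; real gateways appear and disappear over time.
GATEWAY_SUFFIXES = ("to", "ws", "ly")

def tor2web_urls(onion_url: str, suffixes=GATEWAY_SUFFIXES) -> list[str]:
    """Rewrite a .onion URL into the clearnet forms served by tor2web proxies,
    which append a gateway suffix to the onion hostname
    (e.g. abcdef.onion -> abcdef.onion.to)."""
    parts = urlsplit(onion_url)
    host = parts.hostname or ""
    if not host.endswith(".onion"):
        raise ValueError("not a .onion address")
    return [
        urlunsplit(("https", f"{host}.{s}", parts.path, parts.query, parts.fragment))
        for s in suffixes
    ]
```

This is also why such an IoC is useful to investigators: a clearnet mirror of an onion address immediately ties the hidden service to infrastructure that ordinary OSINT tools can see.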

On February 29th, both sites were reported to the Internet Watch Foundation, and to the National Center for Missing and Exploited Children.

O: “At this point, we need to explain why it is important not to launch cyberattacks against the websites under investigation. I’ll also give you the floor to tell us what happened on March 2nd, after the two websites were reported”.

Ghost Exodus: “This is the truth: DDoSing websites that are being investigated for hosting media depicting victims of child molestation is insane, thoughtless, and the stupidest thing I’ve ever seen. I’m not sorry for saying that. This view isn’t shared just by me, but also by professional hunters who work with the FBI. People who do this need to be sat down and dismissed from OpChildSafety volunteer work until they learn that this is not the way.

All too often, overly ambitious hacktivists want to give the appearance of being swift, proactive, and powerful enough to produce “immediate results”. However, attacking sites under investigation effectively disrupts any further investigative initiative and forces it to a halt.

Not only is the website taken offline, but the CSAM operators get a fair warning that they’re under investigation, shut down their operations, and move somewhere else where they can continue to host media depicting children being molested for profit. Forgive me for using such graphic terms. This issue is so persistent, and few people acknowledge the Charter because of Anonymous’ mob mentality. People who attack these sites are usually childish, selfish, self-promoting idiots who don’t deserve to wear the Mask. They are aiding the enemy, even if it is unintentional. Some have the right heart, even if their actions are misguided. Ultimately, when we lose access to the evidence, so does law enforcement. This gives CSAM operators and all who feed the monster a free pass to continue committing more crimes against a child’s innocence”. 

After the events of March 2nd, the data was reconstructed. “More importantly,” GhostExodus pointed out, “after performing a reverse image search on the ‘model’ the operators are using as a website preview image, we learned that the victim in question is featured all over clearnet CSAM sites”. “By employing the same method, I ran a reverse image search on a screenshot of their user login page and discovered that one of the discontinued mirrors had been cached by Yandex, which pointed to a single snapshot of the site on Archive.org”.

Then, by searching for the domain names on Google, over a dozen mirrors were discovered, many of them linked to illicit password-protected content shared on Mega and on other mirrors, including links posted on Facebook.

Source code analysis and the use of artificial intelligence

One of the group’s researchers then began scanning Hydra’s sites and visually inspecting their source code. Here is what was discovered:

  • The sites were all exact duplicates
  • The owner was using a Discord bridge to host web content
  • The owner’s Telegram ID was exposed
  • Cryptocurrency payment options were present
  • A PayPal email address was exposed
  • The website was sloppily written
ChatGPT was then used to better understand the relationship between the different sections of the code: “I used ChatGPT to better understand the relationship between different sections of code” GhostExodus said “and had it search for useful investigative items, such as payment information, and how it hosted content. This is great for generating summaries of complex information because it can parse through programming languages you might be unfamiliar with”.
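As a rough illustration of this workflow, the sketch below wraps scraped source code in an analysis prompt and sends it to a chat model via the OpenAI Python client. The prompt wording, the model name, and the `summarize` helper are assumptions made for illustration; the article does not reveal the group's actual prompts or settings.

```python
PROMPT_TEMPLATE = (
    "Summarize what the following website source code does. "
    "Point out any payment information (crypto wallets, PayPal addresses), "
    "third-party hosting bridges, and identifiers such as Telegram IDs:\n\n{code}"
)

def build_prompt(source_code: str) -> str:
    """Fill the analysis prompt with a snippet of scraped source code."""
    return PROMPT_TEMPLATE.format(code=source_code)

def summarize(source_code: str, model: str = "gpt-4o-mini") -> str:
    """Ask a chat model to explain unfamiliar code and flag investigative items."""
    # Third-party dependency (pip install openai); imported lazily so the
    # pure prompt-building helper above works without it installed.
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    resp = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": build_prompt(source_code)}],
    )
    return resp.choices[0].message.content

if __name__ == "__main__":
    print(summarize("<html>...scraped page source...</html>"))
```

The value here is exactly what GhostExodus describes: the model can parse languages the analyst is unfamiliar with and surface candidate leads, which the analyst then verifies by hand.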

But ChatGPT was not the only AI tool used in the investigation. When one of the researchers infiltrated Hydra’s WhatsApp group, he gained access to the users’ phone numbers and profile pictures. Artificial intelligence proved useful here too: through an application, they managed to regenerate a photo belonging to a high-profile target that had been cut in half. Unfortunately, a reverse image search found no match.

The discovery of an insecure endpoint

One of the researchers discovered that Hydra suffers from an interesting server vulnerability: an endpoint allowed the researcher to obtain “GET requests from the server in real-time, exposing a log consisting of time stamps, user names, user IDs, which tier the user bought, how many users they invited to the site, and so on. This allowed us to dig even further into the network, understanding its criminal structure”.
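As an illustration of what such a leak makes possible, the sketch below aggregates records of the kind described (timestamps, usernames, user IDs, purchased tiers, invite counts) into an overview of a site's user base. The record format, field names, and sample values are entirely hypothetical; the article does not reveal the real endpoint or its schema.

```python
import json
from collections import Counter

# Hypothetical record shape and sample data -- the article describes
# timestamps, usernames, user IDs, tiers, and invite counts, not the format.
SAMPLE_LOG = [
    {"ts": "2024-03-02T10:15:00Z", "user": "alice", "uid": 101, "tier": "gold", "invites": 4},
    {"ts": "2024-03-02T10:16:12Z", "user": "bob",   "uid": 102, "tier": "free", "invites": 0},
    {"ts": "2024-03-02T10:17:30Z", "user": "alice", "uid": 101, "tier": "gold", "invites": 4},
]

def summarize_log(records: list[dict]) -> dict:
    """Aggregate an exposed request log into an overview of the user base:
    unique users, paid-tier distribution, and top inviters (likely recruiters)."""
    uniq = list({r["uid"]: r for r in records}.values())  # one record per user ID
    return {
        "unique_users": len(uniq),
        "tiers": Counter(r["tier"] for r in uniq),
        "top_inviters": sorted(uniq, key=lambda r: r["invites"], reverse=True)[:3],
    }

if __name__ == "__main__":
    print(json.dumps(summarize_log(SAMPLE_LOG), default=str, indent=2))
```

Invite counts in particular map the network's recruitment structure, which is presumably what let the researchers "dig even further" into how the operation was organized.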

#OpChildSafety: this is not the end

O: “After all these discoveries, this is not the end of the story: every day more and more mirrors are found. How many have been discovered to date and how many CSAM networks have come to light?” 

Ghost Exodus: “One of our data analysts has uncovered somewhere around 60 mirrors operated by Hydra. We are uncovering more mirrors every day”.

O: “In your research, you highlighted how Hydra rests on the shoulders of a common denominator: the technology industry. It is not directly a promoter, I think; however, criminals thrive on various online platforms, often without consequences. Is it right to think more transparency can be offered in this regard? Is deleting content enough?”

Ghost Exodus: “Merely deleting content without a prosecuting element is fruitless. The common denominator in this fight against predators is irrevocably the tech industry. Meta is a perfect example, but so is the video game industry. There are no actionable safeguards for detecting this material, even though the same technology is used to censor free speech, flag suspicious news sources, fact-check what you post, and so on. Child safety is not a component of the industry, even if they claim the opposite. The evidence speaks for itself.

Take Cloudflare, for example. Their services not only protect cybercriminals; they are also protecting pedophiles in such vast numbers that you’d imagine heads would roll. Hydra has been using their services for over 2 years now, and those services have provided security and protection for unknown scores of Hydra domain names on the clearnet. There’s no obfuscation; their material is in full view of the public.

Protecting young users was born in the backseat. It never ‘took’ a backseat, because it was never a priority in the first place. We know this is the case because CSAM continues to proliferate on the open web and shows no signs of slowing down.

This is because the industry has provided the infrastructure with no actionable oversight, no guards standing watch at the gate, no security checkpoints to limit bad actors from using it, and most importantly – as with Facebook – no actionable detection technology. If we were going after child molesters with the same determination as the US government’s war on terror, we wouldn’t be here having this interview.

If I were caught running a multi-million-dollar Ponzi exit scam on Cloudflare, or discussing terrorist plots, they wouldn’t merely delete my account: they would work with law enforcement to ensure I was brought to justice. However, when children are victimized, even the IWF simply removes the sites and calls it a day. It’s going to take legislation to twist the arms of the tech industry into making this a top priority”.

O: “After some research, ETA-W1nterSt0rm has learned that the Child Exploitation and Online Protection (CEOP) law enforcement unit will work with the public to investigate and arrest the people behind CSAM crimes. What other progress do you hope there will be in this regard?” 

Ghost Exodus: “The only progress I hope we will see is the Hydra CSAM operators put in prison, along with its many users, and the children rescued. Saving the child is the ultimate goal”.

O: “Why is it important for ETA-W1nterSt0rm to maintain a good reputation, and what does teaching ethical and conscious hacking mean to you?”

Ghost Exodus: “Law enforcement does not work with cyber vigilantes, especially Anonymous. This has more to do with the way Anonymous carries itself, which does not have a positive reputation. Furthermore, organizations will not work with cyber vigilantes. Even worse, Anonymous attacks itself relentlessly, so we try to stay above reproach so we can continue to work unhindered. As for hacking, we don’t use hacking techniques. We use publicly available resources“.

Olivia Terragni
Author, former journalist, graduated in Economic History - Literature and Philosophy - and then in Architecture - great infrastructure - she deepened her studies in Network Economy and Information Economics, concluded with a Master in Cyber Security and Digital Forensics and a Master in Philosophy and Digital Governance. She is passionate about technological innovation and complex systems and their management in the field of security and their sustainability in international contexts. Criminalist. Optimistic sailor. https://www.redhotcyber.com/post/author/olivia-terragni/