Sunday, June 18, 2023

Exercise Choice


According to a comprehensive investigation by The Wall Street Journal, the Stanford Internet Observatory, and the University of Massachusetts Amherst, Instagram has emerged as the new preferred outlet for pedophiles.

The platform is rife with child abusers, human traffickers and other predators, and allows them to buy and sell child pornography and, in some cases, arrange meet-ups between pedophiles and children held by human traffickers.

Shockingly, the study unveiled that Instagram facilitated the search for explicit content through hashtags such as “[inappropriate language]” and “#preeteensex,” enabling perverted users to find accounts selling child pornography.

Even more distressing, numerous accounts posing as children themselves surfaced, with handles like “little ------ for you.” Such identities added another layer of deception and exploitation within this heinous network.

Remarkably, rather than openly sharing illicit sexual material, these accounts displayed ‘menus’ showcasing their available content, allowing potential buyers to select their preferred exploitative material.

Adding to the gravity of the situation, a disheartening revelation emerged that several of these accounts also provided customers with the abhorrent option to arrange in-person meetings with the exploited children, perpetuating the cycle of abuse and trafficking.

How the research was conducted

In a striking experiment, researchers established test accounts to gauge how swiftly Instagram’s “suggested for you” algorithm would recommend accounts involved in the sale of child sexual content. Shockingly, within a remarkably short period, the algorithm inundated these accounts with content that sexualizes children, some of which even directed users to off-platform sites dedicated to content trading.

Relying solely on hashtags, the Stanford Internet Observatory discovered a staggering 405 individuals selling what they referred to as “self-generated” child-sex material. These accounts purported to be operated by children themselves, with some claiming to be as young as 12 years old.

What further raised concerns was that Instagram allowed users to search for terms its own algorithm recognized as potentially linked to illegal content. Rather than blocking access to such material, the platform permitted users to discover and explore it.

Interestingly, when researchers employed specific hashtags to locate illicit material, a pop-up occasionally appeared on the screen, cautioning users about the presence of child sexual abuse images. It emphasized the extreme harm caused to children through the production and consumption of such material. However, this warning did not hinder the availability or circulation of the content.

The cunning and intricate system

Pedophiles operating on Instagram cunningly employed an intricate system of emojis to hold coded discussions about the illicit material they facilitated.

For instance, an innocuous-looking map emoji took on a sinister connotation, representing “MAP” or “Minor-attracted person,” serving as a covert reference to their deviant interests.

Equally troubling, a seemingly harmless cheese pizza emoji was used as shorthand for “CP,” or “child porn,” effectively evading detection by those unfamiliar with the coded language.

In their deceptive attempts to conceal their activities, these accounts often adopted monikers such as “seller” or creatively spelt variants like “s3ller.” Furthermore, rather than explicitly stating the ages of the exploited children, they resorted to euphemistic language, using phrases such as “on Chapter 14” instead of directly indicating the age of their victims.

This calculated employment of emojis and disguised language not only enabled pedophiles to communicate discreetly but also allowed them to operate within the shadows, perpetuating their heinous activities on the platform.

Meta knows about this, says it’s working tirelessly on it

Meta, the parent company of Instagram, has acknowledged the issues within its enforcement operations. Recognizing the severity of child exploitation as a horrific crime, Meta has established an internal task force dedicated to addressing the concerns raised.

The company has emphasized its commitment to actively defending against such abhorrent behavior and continuously exploring avenues to combat it effectively. Over the past two years, Meta claims to have dismantled 27 pedophile networks and intends to intensify its efforts with additional removals in the future.

-----

In the words of Rabbi Hillel Goldberg [Intermountain Jewish News]:


If social media foster bullying, why not choose never to turn on the device, remaining blithely unaware of the malicious attempts to bully?

If one cannot muster the discipline to turn off the phone rather than waste hours on it, why not choose not to own the phone?

If the government issues warnings about teens’ mental health due to excessive time on the screen, why suggest a maximum limit on social media? Why not suggest never turning it on to begin with, at least for a long period of mental healing?

If social media foster isolation, why not choose to back away from them rather than exacerbate the loneliness by going back to the device incessantly?

If students’ attention span falls radically, or academic achievement drops, why do parents and schools not choose to excise the cause?

If news accessed via social media is rarely significant, why not choose to ignore it? If the rare, significant piece of news is learned the next morning from a newspaper, or in a conversation with a friend, neighbor or colleague, what is the loss? If tornado or other vital warnings need to be issued, why not choose to pay attention only to the noisy alerts?

If hate is spread on the internet, why not choose to defeat it by denying it the validation it craves, i.e., by paying it no attention?