
    ‘Vast Pedophile Network’ Discovered on Instagram – Seemingly Harmless Emojis Used as Sick Code


    In yet another sobering reminder that children should avoid most forms of social media, a devastating new report has uncovered a “vast pedophile network” on the increasingly popular Instagram platform.

    Instagram, which is owned by Facebook’s parent company Meta, is a more visually oriented form of social media (Twitter initially concentrated on text, famously capped at 140 characters, while Facebook falls somewhere in between).

    The Wall Street Journal discovered that a great deal of this perversion was hiding in plain sight in the captions under photographs, where the majority of communication occurs.

    Wednesday’s report stated that those involved used seemingly innocent emojis as code to help spread their sick pedophilia network.

    Those emojis included a map (“MAP” is the acronym for “minor-attracted person”), a cheese pizza (“CP” also being the initials of “child pornography”) and a reverse arrow next to a person’s age (signaling the digits should be read backward, so “Age 31” in a bio actually refers to a 13-year-old).

    It’s all disgusting, but the report’s description of the network’s depth makes it all the more ominous.

    The Journal, in conjunction with researchers from Stanford University and the University of Massachusetts Amherst, discovered that Instagram has enabled this community of pedophiles to fester, develop, and network, whether through willful ignorance or something more disturbing.

    Worse still?

    Due to how social media algorithms operate, it only takes one visit to one of these pedophile profiles for the app to begin promoting and bombarding your feed with relevant recommendations.

    Yes, Instagram’s automation has made it possible for pedophiles to game the system.

    These promotions and “follow” suggestions make the situation even more horrifying.

    The report observed that despite the fact that pedophiles have long used the darkest corners of the internet to sate their desires, it has always been a deliberate decision and something the pervert would have to seek out.

    By automating and streamlining the entire process, Instagram’s algorithms essentially eradicate the conscious element.

    The Journal added another sickening layer to this: “The researchers found that Instagram enabled people to search explicit hashtags such as #pedowhore and #preteensex and connected them to accounts that used the terms to advertise child-sex material for sale. Such accounts often claim to be run by the children themselves and use overtly sexual handles incorporating words such as ‘little slut for you.’”

    These identifiers aren’t even subtle, yet Instagram makes it easy to locate them.

    Alex Stamos, the director of the Stanford Internet Observatory and until 2018 the chief security officer of Meta, criticized how simple it was for individuals with “limited access” to delve so deeply into this pedophilia-related network.

    More on this story via The Western Journal:

    “That a team of three academics with limited access could find such a huge network should set off alarms at Meta,” Stamos told the Journal. “I hope the company reinvests in human investigators.”
