ElsaGate: The Problem With Algorithms

Caleb Townsend
Staff Writer, United States Cybersecurity Magazine


Social engineering can take many forms. Sometimes it’s a phone call attempting to extract your mother’s maiden name. Other times it is an email attempting to build rapport and get you to click on a sketchy link. However, sometimes you get something much more bizarre. Something like Elsagate.

In 2017, many news sources started reporting on a bizarre trend taking over YouTube. Seemingly out of nowhere, YouTube was flooded with thousands of videos with very repetitive themes, animation styles, and content. These videos depict popular animated characters like Peppa Pig, Spider-Man, Mickey Mouse, Elsa from Frozen, and many others engaging in bizarre and obscene acts. The videos target young children, around the ages of 2-6 years old. Common themes found in these videos include adultery, needle injections, bondage, violence, and pregnancy.

One popular video, as described by a BBC Trending story, depicts Peppa Pig having her teeth pulled out by a sadistic dentist. None of these videos feature any dialogue. Instead, the sound design features screeching, crying infant sounds, and peppy upbeat music. Other videos feature intense gore, frightening images, and overtly sexual content.

Elsagate Spreads Like a Virus 

The nightmare fuel of the infamous Peppa Pig video is far from unique. It is very easy to stumble upon hour-long compilations of popular characters eating feces, getting their fingers cut off, being buried alive, etc. Thousands upon thousands of videos with titles like “Frozen Elsa gets BRAIN BELLY” and “Superhero Baby vs Nail Crying & Doctor Treats” are constantly posted, taken down, re-uploaded, and redistributed to a multitude of channels. These channels are often new and will host 20-40 videos within one week. Despite the short shelf life of these videos, many of them have accumulated millions of views. In the summer of 2017, concerned Redditors who sought to investigate the disturbing nature of these videos coined the term Elsagate.

The Algorithm 

While YouTube hosts a variety of content, many parents let their kids watch YouTube through an app called YouTube Kids. YouTube Kids offered a version of YouTube with parental-control features, pre-approved content, and intense video filtering to purge objectionable content. However, the app would often host Elsagate videos, triggering a strong backlash against YouTube. YouTube’s head of family and children’s content, Malik Ducard, admitted that not all of the content was specifically curated by humans. In fact, much of YouTube Kids’ filtering process relies on an automated algorithm to screen content. Of course, Elsagate content intentionally bypasses this sort of algorithm. Many videos slip under the radar by featuring colorful animation and keywords like “education” and “learn numbers”.
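To see why keyword stuffing works, consider a minimal sketch of a naive metadata-only filter. This is an illustrative assumption, not YouTube’s actual moderation pipeline; real platform filters are far more sophisticated. The blocklist, allowlist, and `looks_child_safe` function are all hypothetical.

```python
# Hypothetical sketch: a filter that inspects only video metadata,
# never the actual video frames. Elsagate-style uploads simply keep
# objectionable words out of titles and tags.

BLOCKLIST = {"gore", "violence", "needle", "injection"}
KID_FRIENDLY_SIGNALS = {"education", "learn numbers", "nursery rhymes"}

def looks_child_safe(title: str, tags: list[str]) -> bool:
    """Return True if the metadata alone raises no red flags."""
    text = (title + " " + " ".join(tags)).lower()
    if any(word in text for word in BLOCKLIST):
        return False
    # Benign keywords nudge the video toward kid-oriented feeds,
    # even though nothing here examines the content itself.
    return any(word in text for word in KID_FRIENDLY_SIGNALS)

# A disturbing video with sanitized metadata sails through:
print(looks_child_safe("Elsa Learn Numbers Fun", ["education", "colors"]))  # True
```

The failure mode is structural: any filter that trusts uploader-supplied metadata can be gamed by an uploader who controls that metadata.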

How Did This Happen? 

Many theories have popped up surrounding Elsagate. Some people suggest that the videos were insidious attempts at child grooming. Others have theorized that these creators use bots to artificially inflate views as a method of spreading malware via the comment section.

Segmented market research has shown that kids respond to videos that are mysterious, scary, or taboo. The type of videos that parents disapprove of are the exact type of videos that children desire to see. South Park, for example, was the 90’s equivalent of this: adult-oriented content that imitates the aesthetic of kids’ entertainment is an immediate draw for children. Elsagate videos capitalize on this idea by escalating the appeal. In addition to the animated quality, these videos marry taboo keywords (feces, urine, needles, sex) to recognizable characters that children will endlessly consume. This hybrid of brand recognition and taboo curiosity gives these videos an addictive nature.

YouTube “Takes Action” 

Children are easy targets for manipulation, and they are a metaphorical gold mine for both content creators and YouTube itself. YouTube’s main goal is to keep people on its site as long as possible. Kids do not often skip ads, they don’t mind simple animation, and they will watch repetitive content ad nauseam. Therefore, creators can cheaply recycle any conceivable combination of characters, stock music, actions, and backgrounds to make a large library of content in a short amount of time. This content can be posted on any channel and, due to the brand recognition, keywords, and addictive nature of these videos, it will generate a ton of money off of ad revenue. Because YouTube wants to keep people on its site longer, its algorithm will place these videos into the recommended auto-play feed, effectively blasting these kids with an infinite loop of Elsagate content.
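The feedback loop above can be sketched in miniature. The scoring rule here (an estimated watch time weighted by keyword overlap with the last video) is an assumption chosen to illustrate engagement-maximizing autoplay; it is not YouTube’s actual ranking algorithm, and all names below are hypothetical.

```python
# Hedged sketch: an autoplay recommender that picks whichever candidate
# is expected to keep the viewer watching longest. Keyword-stuffed
# uploads that mimic the last video dominate the queue.

def recommend_next(last_video: dict, candidates: list[dict]) -> dict:
    """Pick the candidate expected to maximize continued watch time."""
    def score(video: dict) -> float:
        # Assumed heuristic: overlap with the last video's keywords
        # multiplies the video's average watch time.
        shared = len(set(video["keywords"]) & set(last_video["keywords"]))
        return video["avg_watch_minutes"] * (1 + shared)
    return max(candidates, key=score)

last = {"title": "Elsa Doctor", "keywords": ["elsa", "doctor", "kids"]}
pool = [
    {"title": "Nature Documentary", "keywords": ["wildlife"],
     "avg_watch_minutes": 4.0},
    {"title": "Spiderman Elsa Needle", "keywords": ["elsa", "doctor", "kids"],
     "avg_watch_minutes": 3.0},
]
print(recommend_next(last, pool)["title"])  # Spiderman Elsa Needle
```

Note that the copycat video wins even with a lower standalone watch time, because similarity to what the child just watched is rewarded, which is exactly the loop the article describes.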

The Dust has Settled but the Problem Remains

After the media coverage around November 2017, the conversation surrounding Elsagate died down. YouTube deleted many videos and responded with new guidelines on monetization. The policy seeks to demonetize videos that people flag as inappropriate, especially within the YouTube Kids app. However, a quick search on YouTube will reveal that these kinds of videos are still floating around. It is common to hear reports of inactive channels being hacked, sold, and resurrected as Elsagate channels.

There is no way to permanently purge these videos from YouTube. Additionally, there is no reason to assume that algorithms have the best interests of kids, or any users, at heart. The only true way to keep your child from watching videos like this is to moderate what they see. Avoid letting your kid surf YouTube unsupervised, and review content before they watch it. Report child-oriented videos that include adult content. Continue to hold YouTube accountable for its actions and its content.
