Mods on Holiday

Content Warning: While this post does not contain sexual images, it discusses sexual material that could be triggering or inappropriate for work settings. Please use your own discretion when choosing to open and/or read this post. 

Like (probably) many of you, I spent a sizable portion of my 2025 holiday season aimlessly scrolling on social media between glasses of wine and homemade desserts. There was little else for me to do, especially during that odd liminal period between Christmas and New Year’s, when calories and screen time don’t matter and shouldn’t be held against you. I was certainly not alone in my holiday-induced social media binge — various research suggests that people’s average screen time greatly increases on or around holidays. 

All of this well-deserved downtime poses a challenge for Trust & Safety teams, which must maintain content moderation standards through a busy holiday season even as the staff responsible for enforcing those standards take time off. This doom-scroll season, I stumbled upon TikTok discourse indicating that some users are not only aware of this perceived holiday moderation gap, but are actively exploiting it.

“May the Force Be With You” Incident

My foray started with a video claiming that TikTok moderators routinely "peace out" in December, leaving the platform with diminished moderation capacity. In the comment section of this initial video, numerous users provided links to accounts they claimed had uploaded specific pornographic videos in the midst of this alleged content moderation power vacuum. Comments like "[Linked TikTok username] has it" or "see my reposted videos" were everywhere. In these users' minds, the parents were out and the babysitter was asleep, so why not post porn?

Unfortunately, I identified one such TikTok about three minutes after seeing the aforementioned video. Using the links posted in the video's comment section, I was able to surface additional related content and comments. One pornographic TikTok, for example, showed Chris Wade (an OnlyFans model) masturbating in a gaming chair before being joined by a second character (also played by him). The video consists of the first character using the "Force" from Star Wars to give the second character an erection and make him ejaculate, all of which is readily apparent in the video. To be clear, Star Wars parodies that put the "Force" to sexual use are all over the normie Internet; when I mentioned this piece, hampton immediately supplied a handful of YouTube videos from a decade ago, when such comedy was at its peak.

Last I saw, one repost of this specific TikTok had over 433,000 likes before being taken down, while discussion of the "May the Force be with you" incident had spread to other parts of the Internet and now has a dedicated explainer on Know Your Meme. However, the video continues to be reuploaded and reposted (apparently evading detection by TikTok's automated moderation models), suggesting that the poor TikTok moderators, or at least the ones not on PTO, spent their holiday season playing whack-a-mole against a portion of the platform's users determined to post pornography for clout and the amusement of others.

Let sleeping mods lie

That platform users are emboldened to post violative content around the holidays (or at least during periods of perceived moderator absence) would likely not surprise anyone who has ever held a moderator role for an online space or community. There are even colloquialisms to describe this sudden absence of moderating authority. The phrase “mods are asleep,” for example, first emerged in 2006 and has been used to encourage Internet users of all kinds to post annoying, violative, inflammatory, pornographic, and/or outright illegal content to online forums and social media platforms. In 2011, for example, when 4chan moderators de facto banned the posting of “My Little Pony” content to the site’s forums, users repeatedly encouraged each other to post My Little Pony characters on the site’s “Mods are asleep” threads. 

To be clear, My Little Pony characters are definitely among the tamer forms of content that online users may post when they think moderators are absent. The Advice Dog meme below (likely circa 2008), for example, encourages users to post child porn (CP).

Like many things on the Internet, this Advice Dog meme was a bad joke, not actually meant to encourage the posting of illegal or violative online content. However, within minutes of digging into this trend on TikTok, I was able to identify multiple instances of recently posted pornographic content that explicitly violates the platform's terms of service, implying that there may have been a concerted user effort to break the platform's rules this holiday season. Notably, this is not even the first porn-related controversy TikTok has faced in the last six months: in an October 2025 investigation, journalists with Global Witness determined that the platform's algorithm was suggesting pornographic content to 13-year-olds.

Thin Mod Line — literally braver than the troops

All of this drives home the fact that active content moderation (or perhaps the perception that moderators are online and working) keeps some online users from acting like the Internet is a Purge-style free-for-all. It reminds me of how high schoolers behave when the teacher leaves the room: the sudden absence of authority, especially among young people, makes misbehaving (or in this case, violating platform policies) particularly alluring. Just like in a teacherless classroom, users egg each other on for clout and mutual amusement. 

Beyond acting as bulwarks against total online lawlessness, moderators keep people from seeing content that can take a severe emotional and mental toll on exposed users, even if those users actively sought out the content in the first place. Twitter (now X), for example, saw a surge in app downloads after Charlie Kirk was assassinated, almost certainly driven by users wanting to see the video of Kirk getting shot in the neck. Indeed, research suggests that one in three American adults saw graphic images of Kirk's death. While perhaps not as traumatizing as seeing someone shot to death, unexpectedly stumbling onto full frontal nudity or other pornographic content mid-doomscroll, with your nephew looking over your shoulder, isn't necessarily a pleasant experience either. Further, it seems highly likely that minors are encountering this content: research suggests that over 25 percent of TikTok's global user base is 10 to 19 years old. Thus, given humans' tendency toward morbid curiosity and our era of escalating political violence, moderators protect users from themselves, and more specifically from seeing all the gory details of the latest violent incident or pornographic video. This makes it all the more unfortunate that moderators can't take a well-deserved holiday without people rushing to post cock and balls.