Long before I found myself in marketing, I worked as a nightclub DJ.
There are many parallels between nightclubs and social media marketing: I provided the content and entertainment that attracted the audience into the space; the bar staff converted the engaged audience into customers with a variety of calls to action; and the bouncers kept a watchful eye for any objectionable or rule-breaking behaviour, ejecting troublemakers without hesitation.
Bouncers are the social media moderators of the nightclub. While I entertained the crowd, the bouncer usually had a less enjoyable evening arguing with drunks, handling complaints and taking the occasional punch.
Like a nightclub bouncer, a moderator’s job is to protect the online community from the worst aspects of social media. They absorb the abuse and roll with the punches so that everyone else can continue to have a great time, often completely unaware there was even an altercation.
Moderating social media can be a mentally bruising profession. Unfortunately, injuries to mental health aren’t as easy to spot as a black eye or torn shirt.
In September 2018, a group of Facebook moderators in the U.S. sued Facebook, claiming that the social media giant failed to provide a safe workplace, causing them to develop mental health issues, including PTSD.
Day in, day out, these moderators review all of the posts caught by Facebook’s algorithms or flagged by users, including extreme and distressing forms of content such as images of rape, murder and suicide – as well as every shade and type of abuse, vilification and all-round nastiness.
Last month, Facebook agreed to pay $52 million in damages in a preliminary settlement, along with an undertaking to improve the tools used by content moderators to reduce the impact of such content, as well as greater access to mental health support and counselling.
Most companies are unlikely to encounter the same sorts of extreme content as Facebook’s moderators – but that doesn’t mean the risk of trauma or harm to mental health isn’t there.
In many businesses, the moderator is also the marketer and/or content producer and/or whatever else is required to run an effective Facebook page, LinkedIn group or Twitter campaign. But whatever the job description is in your company, there will be someone responsible for ensuring everyone who visits and posts comments within the brand’s social media spaces remains on the right side of acceptable.
And controversial topics or extreme content/views can pop up almost anywhere. I’m pretty sure Vegemite never expected its Facebook page to be targeted by anti-Halal trolls in 2015, for example.
Then there’s the New Zealand subreddit on Reddit: “… for content and discussion surrounding Aotearoa, the land of the long white cloud.” It’s primarily New Zealanders expressing pride in their beautiful country – with some gorgeous photography as well. Other than telling users to knock it off if things get a little too political or having to deal with the occasional troll like most social media pages, it’s hard to imagine the moderators having too taxing a time of it.
But then last year’s Christchurch mosque shootings – and the widely-shared Facebook Live video – led to a horrific few days for social media moderators across the internet, and the New Zealand subreddit was definitely a flashpoint. A few days later, the moderators released statistics to demonstrate the scale of the issue: approximately two mod actions per minute, on average, during the busiest period. These included the banning of 460 users and the removal of 866 posts and 2,304 comments.
Even if you or your team never have to deal with content as extreme as Facebook or an event as unexpected as the Christchurch shootings, a moderator’s role is still to actively seek out, review and take action against negative or unacceptable comments and content.
And then there are the everyday activities a larger brand needs its social media team to handle – such as responding to legitimate complaints and other customer support dramas. Customer support teams have long understood how exhausting and demoralising it can be when every call is another problem to solve, another poor experience to turn around.
The social media moderator can feel like a punching bag at times, absorbing the regular hits with professional courtesy and attention. Over time, this drip-drip-drip of negativity can gradually erode the moderator’s mental fortitude and stamina. Left unchecked, it can eventually lead to burnout or even more damaging mental health issues.
These days, people talk about burnout as if it’s a normal byproduct of the high-pressure workplace most of us will experience at some point. But burnout is far from trivial. It doesn’t just mean feeling worn out after a hectic few days.
The World Health Organisation describes burnout as “a syndrome conceptualized as resulting from chronic workplace stress that has not been successfully managed.”
Burnout can have long term impacts for the employer as well as the employee – such as reduced effectiveness and increased staff turnover.
Every year, over 7,000 Australians receive compensation for work-related mental health conditions. According to SafeWork Australia, employee depression, distress and disengagement lead to increased absenteeism and presenteeism (when employees attend but are less productive), costing employers $6.3 billion per annum.
The first step to protecting your team (or yourself) from developing mental health issues is to recognise the risk exists.
Unfortunately, most people won’t willingly ask for help or admit when they may be struggling, for fear of appearing weak or unsuited to the job. Instead, managers should be proactive in offering support where needed.
This might mean having regular and routine catchups with each member of the team to discuss what they’ve encountered and identify any red flags. When the whole point of moderation is to prevent as much negativity as possible reaching other people, you may not always be aware how much each person is having to deal with unless you make a point of finding out.
Also, make sure any member of the team knows they are allowed to “tap out” of a stressful situation for someone else to take over – such as when moderating a thread about race, gender or religion that risks attacking the team member’s own identity and beliefs.
And remember that everyone’s emotional resilience varies from day to day. One day, a team member might be fully capable of handling whatever is thrown at them without the slightest worry. But that doesn’t mean their emotional resilience will be the same tomorrow, particularly as life outside the workplace also takes its toll. Flexibility is essential.
Social media practitioners also need to practise some self-care, such as staying off social media when it’s time to rest and watching for red flags such as increased anxiety or sudden mood changes. Sometimes, mental distress may not be the result of a specific comment or event but an incremental build-up over time. So being aware of your own mind is vital to identifying when things may be getting a little too much and burnout may be around the corner.
To outsiders, working in social media may appear fun, playful, even glamorous. After all, everyone else on the planet goes to social media to catch up with friends and spend some leisure time.
But on the inside, the story can be quite different. Facebook’s preliminary settlement is a major admission that social media can exact a significant toll on those employees tasked with keeping our online environments safe for everyone else.
As Sergeant Esterhaus said at the end of every roll call in the classic TV series Hill Street Blues, “Let’s be careful out there.”