[WARNING: DISTRESSING CONTENT TO READ.]

Content can be amazing. Like this viral video of an Indian boy's anti-war message shutting down a reporter pushing him on his stance on Pakistan: "long live everyone in their own space. Long live you in your place."

But, content can be traumatising. For users, for moderators, and for people searching for more when news breaks. How often have you "doom-scrolled" and felt traumatised?

And how do social media platforms (for example, Meta, with 3.5 billion short videos, 1 billion Stories and 300 million photos posted daily) moderate and safeguard users?

Let's talk about that... [WARNING: DISTRESSING CONTENT TO READ.]

How do we deal with traumatising content?

Charlie Kirk was shot at a public event, in front of hundreds of people at a Utah college campus, many of them holding up phones to record the celebrity right-wing speaker. Explicit footage of the shooting was instantly available online, from several angles, even in slow motion. Millions of people watched. The aftermath - memes, comments, graphic imagery, comparisons - was instantly at the top of the feeds for anyone with a smartphone.

Before Kirk was even confirmed dead, videos of his death were not only easy to find, but for many, unavoidable to watch.

Footage of the shooting spread on social media platforms X, Facebook, TikTok, Instagram, YouTube – and Truth Social, where US president Donald Trump posted official word of the conservative activist’s death.

Because of the feed-based nature of these platforms - designed to keep users hooked with a constant flow of new content (users don't choose what they see next) and videos on auto-play - people are suddenly and unexpectedly shown graphic, violent or disturbing content, without warning, choice, or the chance to prepare themselves.

Videos were posted and reposted at lightning speed. Users urged people not to spread the images: "For the love of God and Charlie's family, just stop."

On the same day, there was a shooting at a Colorado high school. Young people expressed their shock and dismay at the overload. "It's come to a point where we've normalised learning from one shooting to another shooting," said Leslie, a student at Santa Ana College.

The lasting effect on our psyche from seeing such content can be traumatising.

Dr Ricardo Whyte, a psychiatrist with Dignity Health Community Hospital, says Charlie Kirk connected with millions of people through his videos online. "For many, he was part of the social media fabric," he says. "What we experience, even through a screen, can be absolutely profound and impactful."

So how do we process these disturbing images?

Whyte says that, as human beings, a horrific event like this elicits feelings of fear, vulnerability and despair in all of us.

His advice is to own how you feel.

"Giving yourself time to ponder, 'How is this impacting me? What does it mean to me?' Sometimes journaling, sometimes it might even be getting professional help to really get at, 'What did this mean to me?'" Whyte said.

He added that an important part of healing is to understand how you're feeling and to channel it into positive action. Working through trauma takes time, so Whyte advises being kind to yourself and to each other.

Rituals like vigils and memorials allow us to reflect, examine our feelings and remember not to lose hope. "We can put intention into not being desensitized. That's the reason we have rituals. Rituals allow us to understand, take a moment to understand the significance of this thing," Whyte said.

"We have the loss of an American here, and as a nation, we should all be able to rally around that. An important component of our healing is taking on the agency to do something positive and productive," he said.

"Politics is part of the human expression... but I think there are times to put politics aside and retreat to our humanity. The bottom line is, Charlie was human," Whyte added.

How can we help children?

It's likely kids who see these videos aren't sure how to process them, says Whyte. The language you use depends on their age, but you should mostly listen; "hear where they're at, what they're experiencing, because the most therapeutic thing is when we can give language to our feelings," Whyte says.

Catherine Knibbs, Cybertrauma and Online Harms Consultant, Speaker, Author and Psychotherapist, posted this video about symptoms to look out for, what happens in 'cyber trauma' - when someone has been exposed to violent or disturbing content online - and how to help children.

"Pay particular attention to your children and their online activities during this time because unfortunately videos like this can often be shared in group chats, on Snapchat, without children knowing what they're watching until it's too late," she says. "Whether intentional or not, videos that show violence... like these can have a profound impact on our nervous system and you might already be noticing the symptoms I describe in this video... be kind to yourself and also know that your reactions are not 'over the top' or 'sensitive' as it's very common after viewing such distressing content."

I'll end with further writing on this topic from Catherine Knibbs' blog:

"Videos that depict violent images, graphic scenes, torture, child, people and animal cruelty should have been banned in the first place. Not because I disagree with the content per se and not because I disagree with free speech or condemnation of this type of violence in society.

It's the psychological, emotional, social and furthermore the physical damage to the brain that has probably occurred to people who would not necessarily be exposed to these types of videos throughout the last 7 years, unless they made the active choice to go looking for it.

The likelihood is younger people will find a way to access these videos and this will have an effect on the population of the younger generation of the Facebook community who see these videos. This is because children's brains are not able to handle this kind of trauma. To be honest no-one’s brain is really equipped for this."

Dysregulated society, online.

BY JENNIFER ANN FALANDYS | FEBRUARY 2025

Right now, a lot of people are experiencing triggers. They may not be able to tell you that, and they don't have to unless they want to. Some people may not even realize this for themselves. 

We live in a highly dysregulated society at a time of volatile change. There is a lot of uncertainty, which leads to a lack of predictability; ambiguity has increased due to panic posting; fact versus fiction is hard to decipher; the use of AI is on the rise; and there are different types of biases to contend with.

You may be struggling with wondering, "What is true?" You may be navigating the difference between personal and collective truth when there is a lack of context or nuance. This may lead to increased trust issues. You may also be holding tightly to what is your truth, and that is understandable and a normal human response. 

All of this is coming at people at a volume and rate I can't say I have personally ever experienced, and some days it feels like a bombardment.

People are highly dysregulated and are experiencing trauma and re-traumatization while trying to navigate under the weight of toxic stress, adversity, and overwhelm. For many this is not necessarily new but adds to a cumulative effect.

Some people already have PTSD or c-PTSD from trauma, adversity, or repeated patterns of toxic stress and nervous system overwhelm. They may not have enough room to participate or respond the way people expect or want them to, especially right now. There are days that feel heavy, like there is no end in sight and we cannot get off this ride.

Some people will have no grace to give. Some people will scramble for power and control out of the need for self-preservation among other things. Others will be extremely anxious, extra emotional and may panic post. Many of these things start happening when we do not have space between a stimulus and a response. There is no room for a pause to breathe, reset, or contemplate and apply logic and reasoning to a situation. These are ways that humans cope and deal with things. 

What I do know is that I cannot control how people are thinking, behaving, emoting, and communicating (no matter how much I would like to). This is a human thing.

What can I do? 

  • If I need to keep scrolling, I keep scrolling. 

  • I have pulled back in some areas of advocacy.

  • If I need to take a social media break, I get up, get some movement in, or ground myself with water by either taking a shower or doing dishes. 

  • Movement has been shown to be helpful, aiding in grounding and regulation. 

  • We can also take time to sit in silence and embrace moments of stillness in a chaotic world. 

Many of us are trauma survivors, and many of us are struggling right now. I still believe that we will get through this. I try to give grace where possible and a little room to be a messy human when I/we struggle.

Where and when I need to, I do hold my boundaries when it comes to dignity and dehumanization of self.

If you need help with this for yourself, please reach out. I know that there are so many of us that have no choice but to be online for socialization, advocacy, work, or other personal purposes.  

If you have struggled with how to navigate all of this when you tend to live life in the grey area, you are not alone, and your needs matter too.

In case you need to hear this, you are not uncaring or "doing it wrong" by just needing to live your own life and get through the day.

[AD] Pawsitive recommendation:

Recharge your energy.

In the midst of trauma, dysregulation, volatility and overwhelm - whether from these past few days, a longer stretch, or adversity you deal with privately - we might not see where it's all heading.

But if you're overextending yourself and you have no idea how to keep it up, you're on a one-way track to burnout.

You need a way to stop, reassess, and recuperate.

One fun, gamified way to do this is inspired by spoon theory, Dungeons and Dragons, and energy work.

Just 20 minutes chatting to Melissa Cox gives you a framework to assess your actual capacity each day. The Arcane Energy System is designed to protect and recharge your spell slots so you can tackle the day.


> Chat to Melissa about how to cast energy spells each day. <

Global alliance for content moderators.

Content moderators who comb through harmful material uploaded to online platforms (workers for Meta, TikTok, Google and more) have formed a global trade union alliance to fight for better working conditions.

The Global Trade Union Alliance of Content Moderators (GTUACM), launched in Nairobi, Kenya in April, aims to “hold Big Tech responsible” for failing to address workers’ issues like low wages, trauma, and lack of union representation across the industry.

Mophat Okinyi, founder/CEO of Techworker Community Africa (TCA), was part of the groundbreaking, historic moment:

"For the first time ever, content moderators from across the globe, came together to form a united front. This alliance is a collective demand for justice, dignity, and accountability in an industry that too often treats its most essential workers as disposable.

I’ve seen first hand the trauma, exploitation, and silence that surrounds content moderation. I’ve stood with workers punished for organizing, suffering in isolation, carrying the burden of the internet’s darkest content, without adequate support or recognition.

But that era is ending. We are organizing. We are global. We are rising.

Together, we’re holding Big Tech and their outsourcing partners accountable for the harm in their supply chains."

GTUACM provides a global platform to bargain with tech companies, alongside coordinating collective campaigns and researching occupational health. Content moderators will be part of the alliance through their trade unions, with unions in Ghana, Kenya, Turkey, Poland, Colombia, Portugal, Morocco, Tunisia, and the Philippines currently forming the alliance. Unions from other countries, including Ireland and Germany, are expected to join in the near future.

Companies like Meta, Bytedance, and Alphabet outsource content moderation on their platforms to contract workers. The job requires these workers to analyse and flag violent videos, hate speech, child-abuse imagery, and other harmful content. Content moderation is what a hazardous 21st-century job looks like.

“The pressure to review thousands of horrific videos each day – beheadings, child abuse, torture – takes a devastating toll on our mental health, but it’s not the only source of strain. Precarious contracts and constant surveillance at work add more stress,” said Michał Szmagaj, a former Meta content moderator who is now helping workers to unionize in Poland. “We need stable employment, fair treatment, and real access to mental health support during work hours.”

Meta is facing three lawsuits relating to psychological distress inflicted by the contracted role and the treatment of these outsourced workers in Kenya and Ghana. A group of former content moderators who flagged graphic and violent videos on TikTok has also filed a lawsuit against their former contractor, Telus Digital, over claims they were fired for trying to unionize and improve their working conditions.

“The content we see doesn’t just disappear at the end of a shift. It haunts our sleep and leaves permanent emotional scars,” Özlem, a former Telus worker, said in a statement to the UNI Global Union. “When we raise it with our managers, they say these are the conditions TikTok, the client, requires. When we stand up for better conditions at our jobs, our coworkers get fired.”

GTUACM says many moderators in the industry experience “depression, post-traumatic stress disorder, suicidal ideation, and severe mental health consequences” due to being exposed to such content without adequate support. Workers also face unrealistic performance targets, employment uncertainties, and fear of being punished for speaking out.

For Emmanuel (not his real name), becoming a human firewall for violent and extreme content was initially a point of pride, recalling it felt like “a very big deal and a very nice thing to protect users from receiving such disturbing content. It was a big responsibility for me.”

But the cost of the work and his experience will stay with him for ever, which is why he is speaking out. He hopes future moderators will not face the same circumstances. “All the flashbacks, all the content, all the incidents of what happened to me are still with me.”

He would work through an average of 500 to 600 pieces of content at speed, moderating them to 85 per cent accuracy to confirm whether or not each post should be removed. “The first weeks I didn’t come across any disturbing things, [it was more] hate speech and comments, but gradually I started coming across some disturbing and graphic content."

In hindsight, he feels as if he was being brainwashed and did not realise how abnormal his response to the disturbing content was for a long time. “I did not intend to normalise [what I saw], but the system brought me to the level that I normalised it… For the first few weeks we [Emmanuel and colleagues] were very afraid. We were very scared of watching our screens. But gradually things became normal… we started normalising it.”

He was shocked and disturbed by what he had to watch, but over time became desensitised and sucked into a culture of black humour in his office. “I even started enjoying the content. I was very happy to see people being slaughtered and skinned alive. I was making fun out of it, which is not normal and against humanity."

“Companies like Facebook and TikTok can’t keep hiding behind outsourcing to duck responsibility for the harm they help create,” says Christy Hoffman, General Secretary of UNI Global Union. “This work can - and must - be safer and sustainable. That means living wages, long-term employment contracts, humane production standards, and a real voice for workers.”

Social prescription saves lives.

"Social prescribers" are helping patients in Leicestershire.

When Tracy Moore was about to lose her job, she says her mental health hit rock bottom - she felt abandoned and even considered taking her own life. "I needed to work to keep my brain active," she said. "I'd worked since I was 15 so the thought of not having a job devastated me."

"One in five GP appointments are not for patients' medical health but for their social health and mental wellbeing," says Lucy Moore, manager of the Social Prescribing team at the Hinckley and Bosworth Medical Alliance, which represents 12 GP surgeries. Social prescribing is a person-centred approach that connects individuals to community activities, groups, and services to improve their mental health. This can include joining a choir, volunteering with animals or support with household bills.

In Tracy's case, volunteering for the NHS as a patient advisor gave her the sense of purpose she needed to improve her mental health.

"My social prescriber, Molly, listened to me, to find out what was at the root of my depression and suicidal thoughts. Having someone to listen to me was just amazing."

"It changed my life completely, I can't thank them enough," she said.

Reporter: Aren't you a little ashamed…?

"There are people over there too, here too. Muslims there too, Muslims here too. Hindus there too, Hindus here too. Everyone is human. Then why kill everyone? Tell me! Tell me why it should be destroyed? Everyone has a right to live. Everyone has that right. Then why destroy them? You're saying destroy that country, destroy that country, but everywhere people live."

Reporter: …who taught you this?

"Bro, I have a brain, man."

Muhammad Kaif.

Feature your business in Pawsitive News

Reply to meee

Amanda

View this email in your browser.

Copyright © {{right_now.year}}  {{location.name}}, All rights reserved.

Not so pawsitive? >> unsubscribe

Pawsitive Newsroom Facebook Group
Threads Amanda
Amanda Facebook