In a world of trolls and shamers, would you share your story?

Facebook comments can be a dark and disturbing place.

OPINION: Telling your story to the world is a brave decision. The courage and honesty of those who agree to be part of our reporting, whether it's following the loss of a loved one, facing adversity or discrimination, or even sharing private financial failures or successes, never ceases to amaze me. I admire these New Zealanders, and I feel a huge sense of responsibility in publishing their stories.

Which is why the impact of social media on individuals' willingness to be part of stories, and the swift viciousness with which social reaction can take on a life of its own, really concerns me and the Stuff team.

Social media is a strange beast, where misinterpretation can start a worldwide pile-on, and where users often make judgments based on a headline or photo rather than clicking through to read the whole story (as happened with this mother in an airport).

Molly Lensing and her daughter Anastasia were stuck at an airport for more than 20 hours when this picture was taken. Getting slammed on social media for being a bad mum was the last thing she needed.

A New Zealand journalism group I belong to recently discussed a noticeable increase in case study subjects declining to be named or photographed because of concerns about social media reaction, and how to handle it.

We've certainly experienced this in our newsrooms. Our reporters are often asked by interviewees that a story not be posted on Facebook once it's published.

We regularly handle requests from people asking for stories to be removed from Facebook or from the website, or both, because of social media backlash.

I've been involved with Stuff's social accounts for more than five years now. I don't think users are getting meaner; it's that the scale of reach and engagement on Facebook has skyrocketed. Not only do stories reach thousands more eyes than before, but the ability to discuss content is more public than ever.

Every man and his mother-in-law's dog is on Facebook now. They all have opinions, and with the safety of posting from behind a computer in their own room, they aren't afraid to express them.

Previously, if you took part in a news story, you couldn't overhear the pub conversations about the topic, about your actions, or even about your appearance (yes, people still think this is fair game for debate). Importantly, you couldn't personally be searched for, downloaded, shared and contacted.

The harm done by vicious social media posts can reach far beyond the screen.


Our duty of care to those who share their stories with us includes managing the potential harm that may come to them from being part of our coverage.

When our reporters and editors know that a story has the potential to get a lot of flak online, they'll often discuss this with the subject, so they're prepared for it once we publish.

Sadly, there are important stories about vulnerable communities and people that we're hesitant to share on social due to concerns about hurtful comments that further put someone at risk. This means we end up limiting the exposure our audience gets to some vital issues. Simply put, the benefit of allowing the public to debate and discuss some topics is outweighed by the detrimental impact of that debate on those directly involved.

There are also stories we simply don't get to tell because those involved are too fearful of the feedback they'll get to proceed. We all lose when that happens.

Facebook doesn't allow comments to be pre-moderated (reviewed before they are published publicly), as we do with comments on our own website. We can only moderate Facebook comments after they have been posted. We post up to a hundred stories a day, and each garners hundreds of comments, so we simply don't have eyes on all the conversations our fans are having. This means that to some degree we rely on the community to let us know when a thread is getting out of hand. When making decisions about what is and isn't acceptable discussion, we have our own terms and conditions on community standards, and the Harmful Digital Communications Act, to guide us.

In exceptional cases, we have taken steps to remove posts if the comments turn toxic. It's not censorship; it's being decent human beings to those who can't protect themselves.

Not cool, guys, not cool.

Facebook itself has policies on bullying and abuse, and when necessary we use the network's channels for blocking and reporting horrible users (here's how to do that if you see one).

We can take these steps to protect our sources as much as possible, but we can only manage our own networks: our website, our comments, our Facebook page. We don't have any control over users sharing content from the site to their own networks.

While we can do our part as a media organisation to limit the harm that comes to those in the news, you can also do your bit by being kind to each other, getting the full story before you pass judgment, and thinking before you hit 'post'.

 - Stuff
