Stopping the harms without muzzling free speech

MIKE O'DONNELL
Last updated 05:00 05/04/2014


OPINION: In the late 19th century a brilliantly irreverent Irish playwright developed a philosophy of aesthetics that remains popular today.

Oscar Wilde reckoned life imitates art more than art imitates life. In other words, the real world seeks expression, and art provides the avenue.

Wilde's words came back to me as I considered the unfortunate series of real-world events that have unfolded as the new Harmful Digital Communications Bill passed through gestation.

The real world has provided a conveyor belt of examples over the last year illustrating the need for meaningful redress for victims of online speech harms.

Six months ago the Roastbusters, a group of young males with too much testosterone and too little sense, gained infamy for their online bragging about sex with inebriated minors.

At the same time, Justice Minister Judith Collins tabled the Bill, which would provide the Roastbusters' victims with quick and efficient redress.

Then in late February, as Parliament's justice select committee started to receive submissions on the Bill, entertainer and TV host Charlotte Dawson was tragically found dead in her Sydney home.

As well as battling depression, Dawson had been the target of sustained attacks on social media websites, and had played a leading role in lobbying the likes of Facebook to better police inappropriate content.

The Bill sets the rules for how New Zealanders can express themselves online and seek amends when expression causes serious harm.

Those who suffer significant emotional distress as a result of someone posting something online that breaches a communications principle will have a legal remedy and an "approved agency" to help them put things right.

There will be recourse to the District Court to get content taken down.

In addition, the offence of aiding and abetting suicide no longer requires a suicide to have actually occurred before the Police can act.

There's also welcome clarity for companies, amateur bloggers and Facebook users whose pages let third parties post comments or questions.

Until now, content hosts have operated in the dark, not knowing where the line is drawn and potentially taking on a blind publishing risk as a result.

To shine light into this murky area, the Bill includes a sensible "safe harbour" clause which gives content hosts clarity on where the legal line is.

The provision operates in the established manner of "notice and take down", echoing similar arrangements in the US Communications Decency Act and the EU E-Commerce Directive.


The safe harbour provision means that if you run a website which enables user-generated content and some clod publishes a harmful comment, you won't be held responsible for it, so long as you act once you know about it.

To qualify for this protection the host must provide an accessible mechanism to allow people to complain, and then act quickly once the complaint lands.

However, as it stands, the wording fails to recognise the situation where a comment might be true, albeit unwelcome.

For instance, a whistle-blower might use a website to expose unlawful or unethical practices. Or a forum may discuss events of general public interest, and one person may not like the way their involvement has been portrayed.

Under the current provisions, if a person complains to the host, the host may take the safe route and simply remove the content to avoid losing their safe harbour protection.

This could result in a dodgy situation being buried, or in debate on public issues being shut down, as no content host is likely to take on the risk.

There is a way to avoid this chilling effect. For example, the content host could get the author's confirmation that they stand behind their comment, along with contact details for any legal action.

The host would leave the material live without surrendering their safe harbour. If the author is prepared to stump up in this way, then forcibly gagging them runs against the principles of free speech.

To be clear, in most cases content hosts will be well motivated to take the easy option and nuke the content.

But where the content has value and the author will accept service, hosts should have the option of letting the conversation continue.

Oscar Wilde was no stranger to controversy. His play Salome was banned for 40 years, and his comments on social issues were cutting and unpopular with the powers that be.

In fact, many would have preferred them to be hidden from the public.

If Wilde were around today, he would be writing in an online forum. If his words weren't blatantly harmful and he was prepared to stand behind them, it would be a crying shame for a third-party content host to remove them.

Mike "MOD" O'Donnell is an ecommerce manager at Trade Me and a professional director. His Twitter handle is @modsta and he peer reviewed the original Law Commission digital communication proposals in 2012.
