Two weeks ago, the Law Commission delivered its final briefing paper on harmful digital communications - what originating minister Simon Power described as a way to regulate the wild west of the web.
This final paper was fast-tracked by current minister Judith Collins. She gave the commission clear direction to crystallise its advice on the right regulatory regime for the interface between people and the web.
The likely motivator here was not so much the broader question of harm caused by statements made on the web as cyber bullying, which picked up momentum after the chief coroner identified it as a suicide factor. Regardless, the two got lumped in together in May with advice to hit the gas - not something that comes naturally to the commission.
The resulting report got a good whack of mainstream media coverage, but the main focus was again the cyber bullying stuff. The key recommendations - the creation of a new criminal offence around harmful digital communications, the expanded use of information contracts and a new legal requirement for schools to combat bullying - seem a reasonable approach to a gutless crime and surprised few except, apparently, the Secondary Principals' Association.
The broader piece around web regulation was largely ignored, and it's these changes that will have the biggest impact on the millions of Kiwis who generate online content, along with content hosts and the service providers.
The report provides the rules of engagement for how New Zealanders can express themselves and seek redress when expression causes serious harm.
Perhaps what is most interesting is what has changed from the original “News Media Meets New Media” report delivered in December.
The original offered two models for the main regulatory body - a communications commissioner and a communications tribunal. The final report has chosen the tribunal model, effectively a disputes tribunal for online speech harms. I reckon this is a good call because the commissioner model felt like a toothless wonder.
The proposed tribunal will have the power to require takedowns, issue "cease and desist" orders, and request apologies. While it can't make offenders pay money, its orders would be binding, so non-compliance would put offenders in contempt of court.
The suggestion is for a District Court judge with web smarts to lead the tribunal, complemented by expert advisers and with several designated judges.
The tribunal will target the authors of offending content, but if they can't track them down the ISP or hosting web entity may be required to take action.
And you'd be a mug not to think Judge David Harvey is the natural digital native to head the new entity.
While the original report made tribunal decisions subject to District Court appeal, the final version opts for a specialist appeal vehicle. Again, this makes good sense.
The new tribunal will be led and run by web-savvy people who know a troll from a phisher, and understand how caching makes it all permanent. The thought of having its decisions reviewed by Luddites is ludicrous.
The report recommends internet safety group NetSafe becomes the “approved agency” to provide a necessary first port of call before cases are escalated to the tribunal. To them goes the job of sorting the wheat of substance from the chaff of noise, as well as trying to resolve disputes through mediation and discussion.
Putting up NetSafe for the role of approved agency was probably the right call, but don't underestimate the amount of work that will be required to turn this largely school-focused educator and facilitator into a first-tier regulator and compliance agency.
While NetSafe has the credentials for the education part of the role, it is going to need a substantial reorganisation to deliver what's needed, not to mention beefed-up resources.
Apart from reassuring libertarians that the new structures are not a Trojan horse for net fascism, the two new structures face a couple of huge challenges.
The first is dealing with the amount of humour and criticism raining down on them that doesn't come within a bull's roar of real harm. The second is providing the speed of response that a 24/7 publishing machine demands.
Operationalisation will be everything - delivering the one-two punch in sync and quickly will be vital. Equally vital will be publishing results to provide clear flags to the rest of the local digital world as to what's fair and what's not.
At the moment, no local content host has a clear picture of where the line is drawn, and who to believe when it comes to notice and takedown.
Talking of speed, the next question is, what happens next and when? And let's hope the glacial approach taken to acting on the Privacy Act recommendations doesn't extend to digital harms.
- Mike “MOD” O'Donnell is head of operations for Trade Me. Disclosure of interest - MOD is a former director of NetSafe and participated in the Law Commission's consultation process.