Web slurs can mean deep doo-doo
Our dear old dog is nearing the end of her days. Dallas the English pointer has hunted pheasant and quail everywhere from Central Otago to the East Cape. Though originally bought for her hunting prowess, her defining feature turned out to be how bulletproof she is with children.
But now her body is starting to pack in, so she's finding it hard to keep weight on and her organ control is going. As a result we went online to find the best way to get rid of the smell of dog poop from our laundry floor.
The way that something as mind-blowing as the web has moved from the miraculous to the mundane is breath-taking, if a little sobering.
Its creative disturbance has knocked the living daylights out of industries such as postal companies and CD shops. It has also become the way to order groceries, pay the power bill and find out how to get rid of the smell of dog poop.
Surprisingly, many existing laws have done a pretty good job of coping with these changes, particularly the principle-based statutes. The Fair Trading Act has proven itself useful in bringing the bevy of group-buying websites to heel.
The Criminal Justice Act has managed to deal convincingly with maverick bloggers breaking name suppression laws. Meanwhile, the Commerce Act proved no slouch when taming internet service providers.
However, one area that's proven thorny is unknowingly distributing potentially unlawful content. This can be as simple as a neo-Nazi group harnessing free web publisher Wordpress to distribute offensive material.
Or it can be as complex as a search engine returning an old (and previously deleted) piece of offending content that has been unintentionally preserved through the black magic of caching.
To date, New Zealand's digerati have skirted nervously around the question of who is responsible for material put up by third parties on their websites, or indexed and stored. It's equally murky in social media, where a person may have private information posted about them, or be the target of obscenities or organised defamation.
Accepted practice has seen "notice and takedown" as the reasonable response. This sees the victim notify the host of the material, and the host investigates and takes it down if they determine it to be unlawful.
But the actual implementation of notice and takedown is as varied as it is unrecognised by law. A trio of recent events has started to clarify this area.
One emerged last month when the Advertising Standards Authority issued guidance on social media. The key question was how the ASA should treat user-generated content on social media websites.
The ASA decided that if a company proactively solicits content from people on a product or service, or if the company reproduces user-generated content in its advertisements, then this can fairly be seen as advertising and be accountable to the Advertising Code.
So if you post a comment about how great your Toyota is, it's not advertising. But if Toyota runs a competition inviting people to post comments about their Corollas, it is advertising.
Two months prior to this the High Court heard a case brought by a New Zealand psychiatrist against Google after defamatory comments about him were posted on an American website. When a person searched on his name, Google's search algorithm ensured the search result referenced the defamatory material and gave a link to the offending website.
Google's defence included the argument that search results generated automatically by its algorithms do not amount to publication by the search engine provider.
In the High Court, Associate Judge Abbott found against the psychiatrist (because the case should have been brought against Google Inc rather than Google NZ), but noted it was reasonably arguable that the search engine was a publisher.
In Australia last week Google was less fortunate, when a Victorian Supreme Court jury found Google was a publisher and potentially liable where its programmatically produced results include defamatory material.
The court awarded damages of $253,000 to Michael Trkulja after search engine results implied he was associated with the criminal underworld and Google didn't stop this from occurring when asked.
Last year the Law Commission's report The News Media Meets 'New Media' noted that it was impossible and undesirable to proactively censor or moderate online discussions, and also underlined the importance of notice and takedown.
The consequent final report recommended a two-tier regulator for speech harms, the second tier being a communications tribunal chaired by a judge, which would make legally binding determinations between victims, perpetrators and site hosts.
One of the first things the judge will have to do is define how notice and takedown should work, and whether it must be permanent. Until it does, there's a good chance other websites will find themselves in the poo - kind of like we did in our laundry last week.
Mike "MOD" O'Donnell is a professional director, eCommerce manager and author. He reckons English pointers are the world's best upland gun dogs.