16 July 2019
The risks associated with operating a public Facebook page have increased under Australian (state-based) defamation law following a recent decision of the Supreme Court of New South Wales in Voller v Nationwide News Pty Ltd; Voller v Fairfax Media Publications Pty Ltd; Voller v Australian News Channel Pty Ltd [2019] NSWSC 766 (Voller).
The decision has deemed certain administrators of public Facebook pages ‘primary publishers’ of third party comments, ruling out the defence of innocent dissemination. Although the defendants in Voller were media companies, the reasoning could apply to any company or person operating a public Facebook page.
In reaching its decision, the court relied on the degree of control that an administrator has in operating a public Facebook page and the purpose for which the page exists, which in this case was for commercial gain.
Fundamentally, the court found that each defendant could:
However, in our view and with all due respect, this finding appears to be open to challenge. We say this because the Facebook functionality affording administrators ‘control’ is either not always available or not easily accessible. The ability to exercise absolute control in this context therefore depends on the tools made available for:
Currently, the blocking and moderation tools available for public Facebook pages (eg compared with group pages) appear to differ for posts by third parties versus comments by third parties, a distinction which, respectfully, does not appear to have been addressed by the court.
While the Voller decision may ultimately be appealed, it has immediate and significant implications for administrators of public Facebook pages and raises important considerations when using social media as a business tool, for these reasons:
What has changed?
The decision in Voller has expanded the number of ‘primary publishers’ on social media. This matters because businesses operating on social media may previously have assumed they were protected against defamation claims by the defence of ‘innocent dissemination’, or may simply not have considered it a risk at all.
Until now, businesses seeking to engage with their online community and customers on social media have done so without needing to invest heavily in the monitoring and removal of such content. Following Voller, however, the mere capability to exercise control over third party comments, including through the existence of Facebook’s tools, is enough to put companies in the firing line as a primary publisher.
Although the Voller case did not determine whether the comments themselves were defamatory, the preliminary question considered by the court (i.e. whether the media companies involved were ‘primary publishers’) has much wider significance for all companies running public Facebook pages or providing online ‘discussion forums’. Indeed, the emphasis given to the ability to exercise control could mean that Voller has paved the way for an ever-widening group of primary publishers.
This may cause concern for social media platform providers themselves. For example, Facebook and administrators each have an ability to exercise some control over third party content on Facebook. They each play a part in facilitating discussion, whether by providing ‘public pages for business use’, providing the ‘tools of control’ or activating the available tools. Both parties also derive commercial benefit from maximum end user activity on the platform (from advertising revenues generated by unique visitors and click-through rates). An administrator may ‘prompt’ discussion through the original content it posts, but Facebook’s algorithms, which control what users see in their newsfeeds, may equally prompt discussion. It would not be surprising if this critical question of control, and consequent liability, is tested further in the courts in the context of social media for business use.
Are the tools offered by Facebook enough to eliminate risk?
In our view, Facebook’s tools themselves will help but may not be enough to eliminate all risk relating to defamation when using Facebook for business. Rather, reducing risk effectively is likely to involve actively and manually monitoring all comments, establishing clear guidelines for the removal of defamatory content, activating Facebook’s filtering tools and considering blocking new posts.
The reason Facebook’s tools alone may not be enough is partly due to limitations which, with respect, do not appear to have been identified in the Voller judgment. In our view, the terminology used in the judgment, which predominantly refers to 'comments', obscures exactly what third party content can be prevented from being published in the first place. In summarising the expert evidence of Ryan Shelly (given in cross-examination), the court concluded that “all comments” could be blocked “totally”. However, it is unclear whether this is a reference to:
The judgment suggests that an administrator can “forbid all comments” from being published entirely, which seems to encompass both posts and comments. However, it seems clear from Facebook’s tools (including guidance on Facebook’s Help Centre) that the level of control currently differs depending on the type of third party content (namely, posts by third parties versus comments by third parties).
While our own independent investigations have demonstrated that it is possible to disable the capacity of third parties to publish any new posts, this functionality does not appear to be available for comments. If this is correct, it would mean that the highest level of control over comments will ultimately depend on the effectiveness of the filtering tools and any manual monitoring undertaken.
In using Facebook’s filtering tools, an administrator can ‘hide’ content containing specified words. However, this method is unlikely to be foolproof: it would be virtually impossible to anticipate every word that might be used, particularly given the potential for variant spellings and the fact that defamation arises from the imputations conveyed by the words, not merely the words themselves.
Further, hidden comments remain visible (greyed out) to the owner of the public Facebook page, to the third party who posted the comment, and to that third party’s Facebook friends. Given that publication to a wide audience is not required for a defamation action, owners of public Facebook pages still face a risk of liability for defamation even when these tools are activated.
Given this apparent distinction between posts and comments is not addressed directly in the judgment, there is some uncertainty as to an administrator’s ability to moderate all third party content. As a result, the functionality of Facebook’s moderation tools (and how easily and readily available they are) may need further interrogation as this is not, with respect, set out conclusively either in the judgment or in wider commentary on the implications of this case. This might be an area that is explored further should the defendants appeal.
Key takeaways and tips
As with all case law, application of this decision will depend on the facts. A material consideration in Voller, as stated at the outset, was the commercial purpose for which the public Facebook pages were used. Therefore, an assessment of the particular public Facebook page in question is highly recommended in order to determine a proportionate response.
The information in this publication is of a general nature and is not intended to address the circumstances of any particular individual or entity. Although we endeavour to provide accurate and timely information, we do not guarantee that the information in this publication is accurate at the date it is received or that it will continue to be accurate in the future. We are not responsible for the information of any source to which a link is provided or reference is made and exclude all liability in connection with use of these sources.