Our Community Answers:
Community Implications of Article 14 of the European E-commerce Directive?
What would you say are the implications of Article 14 of the European E-commerce Directive for community moderation, and how should it affect our approach to managing an online community?
The directive states that:
Where an information society service is provided that consists of the storage of information provided by a recipient of the service, Member States shall ensure that the service provider is not liable for the information stored at the request of a recipient of the service, on condition that:
(a) the provider does not have actual knowledge of illegal activity or information and, as regards claims for damages, is not aware of facts or circumstances from which the illegal activity or information is apparent; or
(b) the provider, upon obtaining such knowledge or awareness, acts expeditiously to remove or to disable access to the information.
You raise an interesting and important question -- thank you!
From our years of experience in the community-building industry, we have found that moderation by a staff of well-trained moderators is critical to any community initiative. With regard to Article 14, there are two key areas to consider:
1. Where there is no moderation, a community site can claim in an official sense that it does NOT have knowledge of material which might be illegal, defamatory or otherwise inappropriate. However, liability issues would not be completely removed, as explained below.
By adopting a laissez-faire or "hands off" attitude, the community owner could sidestep Article 14 exposure, but this would create a number of other problems. The most important is that the community itself would suffer from negative content. This matters a great deal, since your brand, your members' loyalty and the overall quality of your community are to a large extent defined by the communication and content within it. Second, the site owner would more than likely be approached by members asking them to "clean up" the community. At that point, the members' reports of illegal or inappropriate content would constitute knowledge of that content (as defined by Article 14), and the owner would therefore have to act upon it "expeditiously" or be liable for it.
In practice, it would be difficult for a community site owner to claim an absolute lack of knowledge of illegal activity or content, since it is reasonable to assume that they visit their own site to assess and report on its success, to measure activity, and to rectify technical issues. Since members will also contact the site owner about content, the protection of Article 14 may prove very thin.
2. Where there is moderation, it can be argued that the community owner DOES have knowledge of illegal content and is therefore liable. In such cases it is imperative that the moderation be as complete and rigorous as possible, so that the fact of moderation does not itself create Article 14 liability.
Moderation promotes community growth and quality control. A moderation team can:
- Increase brand awareness,
- Serve as a site resource to assist with member queries,
- Alleviate potential problems or issues through visibility within the community, providing peace of mind for the site owner or company,
- Free the company or site owner to spend more time growing the business or services without having to worry about maintaining the community,
- Establish a rapport with members, thus increasing traffic and encouraging positive interaction,
- Most importantly, steer conversations away from potential "hot spots" and, if necessary, remove content or members that may create problems or the legal exposure described in Article 14 of the European E-commerce Directive.
With a strong moderation presence, members are more likely to follow the site guidelines in their own participation and to ensure that others do the same. It is not uncommon, based on the relationship they develop with the site through its moderators, for members to assist with potential community issues, heading off problems where possible.
To ensure that moderation staff are present and visible within a message board community, and that any removal of illegal or offensive content is "expeditious", we recommend that moderators check their message boards regularly -- often twice or more each day.
However, even with a comprehensive and complete moderation system in place, truly protecting oneself from Article 14 liability may require a pre-publication moderation system.
Reviewing every posting BEFORE it is published would give you, as a site owner, knowledge of illegal or offensive material before it ever appears on the site, thus avoiding any possible contravention of Article 14. In Article 14 terms, this form of community moderation is the only genuine means of providing cover for the content your community site might generate.
In sum, it is early days for internet law: whilst Article 14 sets out in principle how liability is to be judged, the practice is not quite so clear. What is certain is that community sites have recently found themselves in the spotlight over liability for content and communication. Understandably, this poses questions for all of us involved in online community. A common-sense approach, backed up by a moderation system as rapid, complete and rigorous as is practicable, would seem at this point to be the most appropriate way of ensuring that one's community is safe, viable and effective, whilst also securing a sound legal footing for you as a site owner or community manager.
- Jon Nix and Pam Thomas