As more organisations realise the importance of gathering their people online, they need to consider their legal and moral duty to create a safe space where those people can thrive around their shared purpose.
In her book The Art of Gathering, Priya Parker writes about the responsibilities of bringing people together. She describes the fallacy of the “chill host”, who, fearing they will make it all about themselves if they step into a guiding role, plays it ‘chill’ and leaves guests to mingle. This may feel like a smart community-building move, but it’s actually a killer. Members feel unmoored, dominant voices fill the void, and psychological safety, an essential community building block, becomes precarious.
Don’t fall into the trap of ‘chill hosting’; instead, be an engaged host who co-creates a rewarding culture and critical guardrails.
Central to proactive hosting is moderation.
Often misunderstood as merely ‘removing the bad stuff’, moderation is holistic, strategic work that informs both the experience and effectiveness of our communities.
In many countries, it’s also a matter of regulatory compliance (such as here in Australia, where we have a new Online Safety Act and legal precedents that mean community owners can face liability for their users’ behaviour).
Moderation is central to a safe, effective and value-generating online community experience; it should never be an afterthought.
We can frame moderation in two ways: regulatory, and cultural.
What is regulatory moderation?
Regulatory moderation means keeping your digital community or platform compliant with legal and regulatory requirements around online content and behaviour (such as hate speech, threats of harm, defamation or misleading advertising).
What is cultural moderation?
Cultural moderation is when we moderate to instil or protect the tone and social norms of a group, community, audience or brand.
For example, a certain type of content or behaviour might be perfectly legal, but not fit the social norms or standards of the specific group or community we are working with. Swearing is a good example: it’s not against the law, but your members or users may not feel it’s appropriate for their shared digital social setting, so it’s prohibited by guidelines and moderated if it occurs.
Your users or members can and often will play an important role in regulating your online space, helping it stay healthy and productive for all. But this doesn’t happen in a vacuum: it takes careful nurturing to steward the psychological safety necessary for others to play their part.
For too many folks out there, the internet is a hostile, dangerous place. This is an opportunity for all community builders and managers to lean into creating spaces that counter those forces.
How can you create alternative social experiences that tap into the positive potential of connecting technologies so many of us know and appreciate?
At its best, moderation is an opportunity for our communities to learn out loud how to be better humans, and create a better internet.
Here’s a simple community moderation checklist to help you stay on top of risk, and build a safe, thriving community:
- Ensure you have clear Community Guidelines that spell out prohibited behaviour (noting any regulatory requirements specific to your region or industry), and call out what it means to be a great community member. These should be agreed to as part of your member onboarding journey. (They don’t have to be a boring old list: you can be very creative in their presentation!)
- Create a Risk Matrix or similar document that ranks your leading risks, and for each, what actions to take, who to notify or escalate to (including contact details and back-ups), and timeframes to abide by.
- Engage in regular moderation record keeping, which you’ll need if things go sideways. Some platforms, like Guild, offer moderation audit trails or admin note-taking features, or you can use a simple spreadsheet as a shift report or incident log. Keep it simple and consistent.
- Consider a Response Guide to act as a playbook for common and high-risk scenarios that includes templates, guidance around tone and links to relevant resources (such as support services for mental health).
- Ensure you have moderation by a human happening - consistently and consequentially. Automated tools are increasingly helpful, but nothing beats a trained community professional or moderator who understands your key responsibilities and the unique context and culture of your people.
- Include yourself in your moderation and risk planning. Your well-being is crucial to doing the work of community and governance. If you’re burned out, experiencing vicarious trauma or just feeling overwhelmed, you can’t show up for your people. Make sure your plans allow you to take regular time away, share the burden, and access professional support (such as resilience training or counselling). Increasingly, organisations are treating online safety as a matter of OH&S (occupational health and safety): if you’re employed as a community professional or moderator, you have a right to institutional support for the challenging aspects of helping keep others safe.
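A spreadsheet is all most teams need for the Risk Matrix and incident log above, but if you prefer something scriptable, a minimal sketch might look like the following. All risk names, contacts and timeframes here are hypothetical illustrations for the shape of the records, not part of any platform’s features; your real matrix should reflect your own community and escalation paths.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# One row of a Risk Matrix: the risk, the first action, who to
# escalate to (with a back-up), and the timeframe to abide by.
@dataclass
class Risk:
    name: str
    severity: int        # 1 (low) to 5 (critical)
    action: str          # first response to take
    escalate_to: str     # contact, including a named back-up
    response_hours: int  # maximum time to respond

# A simple incident log: the consistent record keeping you'll
# need if things go sideways.
@dataclass
class IncidentLog:
    entries: list = field(default_factory=list)

    def record(self, risk: Risk, notes: str) -> dict:
        entry = {
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "risk": risk.name,
            "severity": risk.severity,
            "escalated_to": risk.escalate_to,
            "notes": notes,
        }
        self.entries.append(entry)
        return entry

# Hypothetical example rows.
matrix = [
    Risk("Hate speech", 5, "Remove content, suspend account",
         "Lead moderator (back-up: ops manager)", 1),
    Risk("Spam / misleading ads", 2, "Remove content, warn member",
         "Community manager", 24),
]

log = IncidentLog()
log.record(matrix[0], "Reported by three members; content removed within timeframe.")
```

The point isn’t the tooling; it’s that every incident gets a timestamped, consistent record with a named escalation contact and an agreed response window.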
About the author
Venessa Paech is Australia’s leading expert in online communities. She is Director of Australian Community Managers, the professional body for online community practitioners, and teaches community management at the University of Sydney. She founded Swarm, the APAC conference for community managers, and All Things in Moderation, the only professional conference dedicated to online moderation, launching May 2023.
Attend the All Things in Moderation Conference
All Things in Moderation (May 11-13) is a global online conference about the art and science of online moderation, gathering diverse experts from around the world to share knowledge and best practices across platforms and industries.
It's a must-attend for online community professionals and anyone gathering people online.
Available at: http://bit.ly/allthingsinmoderation
Join Venessa Paech at our AI + Community: Opportunity or risk? webinar - Tues 18 April
In Guild's community trends for 2023 we highlighted that AI would impact community building.
But could anyone have predicted just how disruptive AI would be in such a short time frame?
*Join global community experts Richard Millington, Venessa Paech, Blaise Grimes-Viort and Gregor Young at this free, 45 min webinar/event.*
Tues April 18th 2023
14:00 - 14:45 BST
15:00 - 15:45 CEST
09:00 - 09:45 EDT
Share ideas on Community Moderation with fellow Community Managers
Come and join this free online community for community and social media professionals, however experienced you are.
If you’re a community strategist, community builder, community manager or social media professional, join Guild Community Collective.
Share best practices, ideas, inspiration, interesting content and resources. Get feedback from the group on ideas and initiatives, develop partnerships and make useful industry contacts and connections.
We run virtual, in-person and hybrid events where we hope many of you will be able to meet.
More community strategy training
Join Guild 🤝
See for yourself how the Guild experience is different to WhatsApp, Slack, LinkedIn or Facebook Groups.
Guild is a safe space to connect, communicate and collaborate with others.
Join us on a platform that is purpose-built for creating groups, communities and networks on mobile.
- Just want to join some groups? Simply join Guild and then look through the discoverable groups and communities to find relevant ones to join
- Thinking of running your own community? With an elegant, simple-to-use, mobile-first UX you’ve got everything you need to start a community: custom branding, analytics, group and user management and support. Get started with your own community here with our free and paid options
Contact us if you want to know more or have any questions