When a young person shares something personal for the first time, it’s a moment that can shape whether, and how, they ask for help again. We see firsthand how significant that first step into support can be and how the right environment can make all the difference.
From working with thousands of young people, we know that technology and moderation are important, but it’s also about creating a space where someone feels able to open up for the first time, even if they’re not ready to do so face-to-face.
Confidential, but personal
Kooth is often the first place young people share their experiences, and for many, it may be the only place they feel safe doing so. Our model protects their identity while never compromising their safety.
Every user interaction, whether confidential or named, includes:
- A risk assessment and formulation, carried out by a trained practitioner
- Identification of risk indicators, protective factors, and available resources
- A safety plan, updated at each interaction
Where suicide risk or other serious safeguarding concerns are present, users are asked to share personally identifiable information (PII) so we can connect them with local services such as CAMHS, GPs, schools, or social care. If a user chooses not to provide PII, we remain with them in the space they’re comfortable with, offering clear signposting to appropriate crisis services.
This model reflects the reality that disengagement is a right, online or offline. But by offering a non-judgemental safe space, many young people choose to open up sooner than they would elsewhere.
Pre-moderated content, always age-appropriate
Everything on Kooth, from stories to discussion board posts, is fully pre-moderated. Not a single word is published without being checked by a trained moderator. This ensures:
- No triggering or harmful content is ever shared (e.g. methods of self-harm or suicide)
- Content is age-gated using evidence-based brackets: 10–12, 13–15, 16–17 and 18+
- Users only see what is developmentally appropriate and relevant
Our moderators are qualified professionals, trained in safeguarding and aligned with NHS best practice.
Built on clinical best practice and safeguarding frameworks
Every practitioner on Kooth is a qualified mental health professional, not an AI or chatbot. This is a conscious choice. We believe meaningful support requires real human connection, guided by training, experience, and compassion.
Kooth adheres to strict NHS-aligned safeguarding frameworks and maintains robust policies and procedures to ensure the safety of all users. Our training programme ensures everyone, from counsellors to content moderators, is equipped to respond to risk, provide support and take the right action when needed.
Why this matters to you
Whether you’re a teacher supporting a young person in distress, a commissioner evaluating digital mental health provision, or a safeguarding lead exploring safe spaces online, Kooth offers an evidence-based, rigorously safe environment.
We understand that change can be hard to manage. For young people going through transitions, such as school moves and identity exploration, Kooth is a constant, safe, and supportive presence. And for professionals, it offers peace of mind, knowing that when a young person logs in, they’re entering a space that puts their safety first.