TUTORIAL · 12 min read

Implementing SafeComms in Node.js

A comprehensive guide to adding content moderation to your Express.js and Next.js applications using the SafeComms Node.js SDK.

Overview

This tutorial covers two common Node.js integration patterns:

Express.js

Build a reusable moderation middleware that automatically scans incoming request bodies before they reach your route handlers.

Next.js (App Router)

Integrate moderation into Route Handlers and Edge Middleware for server-side validation, with a React client component for real-time feedback.

By the end of this guide, you will have a working moderation layer that checks all user-submitted content before it reaches your database.

Prerequisites

Node.js v18 or higher
A SafeComms API key
npm or yarn package manager

Environment Setup

Store your API key as an environment variable. Never commit API keys to source control.

.env
# .env
SAFECOMMS_API_KEY=sk_live_your_api_key_here
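
Environment variables are not loaded automatically in plain Node.js. On Node.js 20.6+ you can pass the --env-file flag; on earlier versions, the dotenv package is the usual approach:

Terminal
# Node.js 20.6+ can read .env natively
$ node --env-file=.env server.js

# Older versions: install dotenv and load it before anything else
$ npm install dotenv

server.js
// At the very top of server.js, before the SafeComms client is created
import 'dotenv/config';

Next.js loads .env files automatically, so this only applies to the Express server in Part 1.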

Part 1: Express.js Integration

1. Install & Initialize

Install the SafeComms SDK and set up the client in your Express application.

Terminal
$ npm install safecomms
server.js
import express from 'express';
import { SafeCommsClient } from 'safecomms';

const app = express();
app.use(express.json());

// Initialize SafeComms with your API key
const safecomms = new SafeCommsClient({
  apiKey: process.env.SAFECOMMS_API_KEY,
});
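
Before wiring up any routes, it can be worth a one-off call to confirm the key works. This sketch assumes the moderateText call and result shape used throughout this guide (top-level await requires ESM, which the import syntax above already implies):

server.js
// Temporary sanity check — remove once the key is confirmed working
const check = await safecomms.moderateText({ content: 'Hello, world!' });
console.log('SafeComms reachable — isClean:', check.isClean);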

2. Create Moderation Middleware

Build a reusable middleware function that scans any request body field. This pattern lets you apply moderation to any route with a single function call.

middleware/moderate.js
// Reusable moderation middleware. This file initializes its own client
// so it can be imported anywhere; in a larger app you might share a
// single instance from a dedicated module instead.
import { SafeCommsClient } from 'safecomms';

const safecomms = new SafeCommsClient({
  apiKey: process.env.SAFECOMMS_API_KEY,
});

export function moderateBody(field = 'content') {
  return async (req, res, next) => {
    const text = req.body[field];
    if (!text) return next();

    try {
      const result = await safecomms.moderateText({ content: text });

      if (!result.isClean) {
        return res.status(400).json({
          error: 'Content flagged by moderation',
          severity: result.severity,
          categories: result.categoryScores,
        });
      }

      // Attach result for downstream use
      req.moderationResult = result;
      next();
    } catch (err) {
      console.error('SafeComms moderation error:', err);
      res.status(500).json({ error: 'Moderation service unavailable' });
    }
  };
}

Tip: The middleware attaches the moderation result to req.moderationResult so downstream handlers can access the full analysis (e.g., for logging or analytics).
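
For example, a handler can log the full analysis before saving — useful for spotting content that passed moderation but scored close to the threshold. Whether clean results include a severity score is an assumption here; check your own response payloads:

server.js
app.post('/api/comments', moderateBody('content'), (req, res) => {
  // moderationResult may be undefined if the scanned field was absent
  const { severity, categoryScores } = req.moderationResult ?? {};
  console.log('Moderation passed:', { severity, categoryScores });
  res.json({ success: true, message: 'Comment posted' });
});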

3. Apply to Routes

Apply the middleware to any route that accepts user content. Specify which body field to scan for each route.

server.js
import { moderateBody } from './middleware/moderate.js';

// Apply moderation to specific routes
app.post('/api/comments', moderateBody('content'), (req, res) => {
  // Content is clean — save to database
  res.json({ success: true, message: 'Comment posted' });
});

app.post('/api/posts', moderateBody('body'), (req, res) => {
  // Content is clean — save post
  res.json({ success: true, message: 'Post created' });
});

app.post('/api/reviews', moderateBody('review'), (req, res) => {
  // Content is clean — save review
  res.json({ success: true, message: 'Review submitted' });
});

app.listen(3000, () => console.log('Server running on port 3000'));
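
Routes that accept several user-editable fields (say, a title and a body) would otherwise need one middleware per field. A small variant can scan them all in one API call — moderateFields below is a local helper, not part of the SDK, and concatenating fields trades per-field error detail for fewer requests:

middleware/moderate.js
// Hypothetical variant: moderate several body fields in one call
export function moderateFields(fields = ['content']) {
  return async (req, res, next) => {
    const text = fields
      .map((f) => req.body[f])
      .filter(Boolean)
      .join('\n');
    if (!text) return next();

    try {
      const result = await safecomms.moderateText({ content: text });
      if (!result.isClean) {
        return res.status(400).json({
          error: 'Content flagged by moderation',
          severity: result.severity,
          categories: result.categoryScores,
        });
      }
      req.moderationResult = result;
      next();
    } catch (err) {
      console.error('SafeComms moderation error:', err);
      res.status(500).json({ error: 'Moderation service unavailable' });
    }
  };
}

Usage: app.post('/api/posts', moderateFields(['title', 'body']), handler).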

Complete Express Example

The full Express.js server with moderation middleware, ready to copy and run.

server.js
import express from 'express';
import { SafeCommsClient } from 'safecomms';

const app = express();
app.use(express.json());

const safecomms = new SafeCommsClient({
  apiKey: process.env.SAFECOMMS_API_KEY,
});

function moderateBody(field = 'content') {
  return async (req, res, next) => {
    const text = req.body[field];
    if (!text) return next();

    try {
      const result = await safecomms.moderateText({ content: text });
      if (!result.isClean) {
        return res.status(400).json({
          error: 'Content flagged by moderation',
          severity: result.severity,
          categories: result.categoryScores,
        });
      }
      req.moderationResult = result;
      next();
    } catch (err) {
      console.error('SafeComms moderation error:', err);
      res.status(500).json({ error: 'Moderation service unavailable' });
    }
  };
}

app.post('/api/comments', moderateBody('content'), (req, res) => {
  res.json({ success: true, message: 'Comment posted' });
});

app.listen(3000, () => console.log('Server running on port 3000'));

Part 2: Next.js Integration (App Router)

1. API Route Handler

Create a dedicated Route Handler that performs moderation. This endpoint can be called from your client-side form components.

app/api/moderate/route.ts
// app/api/moderate/route.ts
import { NextRequest, NextResponse } from 'next/server';
import { SafeCommsClient } from 'safecomms';

const safecomms = new SafeCommsClient({
  apiKey: process.env.SAFECOMMS_API_KEY!,
});

export async function POST(request: NextRequest) {
  const { content } = await request.json();

  if (!content) {
    return NextResponse.json(
      { error: 'Content is required' },
      { status: 400 }
    );
  }

  try {
    const result = await safecomms.moderateText({ content });

    if (!result.isClean) {
      return NextResponse.json(
        {
          error: 'Content flagged',
          severity: result.severity,
          categories: result.categoryScores,
        },
        { status: 400 }
      );
    }

    return NextResponse.json({ success: true, moderation: result });
  } catch (error) {
    console.error('Moderation failed:', error);
    return NextResponse.json(
      { error: 'Moderation service unavailable' },
      { status: 500 }
    );
  }
}
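
You can exercise the route directly before building the form. With the dev server running (next dev serves on port 3000 by default):

Terminal
curl -X POST http://localhost:3000/api/moderate \
  -H "Content-Type: application/json" \
  -d '{"content": "Hello from Next.js"}'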

2. Edge Middleware (Optional)

For global protection, use Next.js Edge Middleware to intercept POST requests before they reach your API routes. This acts as a blanket safety net.

middleware.ts
// middleware.ts (Next.js Edge Middleware)
import { NextRequest, NextResponse } from 'next/server';
import { SafeCommsClient } from 'safecomms';

const safecomms = new SafeCommsClient({
  apiKey: process.env.SAFECOMMS_API_KEY!,
});

export async function middleware(request: NextRequest) {
  // Only moderate JSON POST requests to the matched paths
  if (request.method !== 'POST') return NextResponse.next();
  if (!request.headers.get('content-type')?.includes('application/json')) {
    return NextResponse.next();
  }

  // Clone before reading so the original body stream stays intact
  let content: string | undefined;
  try {
    const body = await request.clone().json();
    content = body.content || body.message || body.text;
  } catch {
    return NextResponse.next(); // malformed JSON — let the route reject it
  }

  if (!content) return NextResponse.next();

  try {
    const result = await safecomms.moderateText({ content });

    if (!result.isClean) {
      return NextResponse.json(
        { error: 'Content blocked by moderation' },
        { status: 400 }
      );
    }
  } catch {
    // Fail open: allow request if moderation service is down
    console.warn('SafeComms moderation unavailable, allowing request');
  }

  return NextResponse.next();
}

export const config = {
  matcher: ['/api/comments/:path*', '/api/posts/:path*'],
};

Note: This middleware uses a fail-open strategy. If SafeComms is unreachable, the request proceeds. Adjust this behavior based on your risk tolerance. Also note that request-body access inside Edge Middleware has varied across Next.js releases — verify that request.json() works in middleware on your version before relying on this pattern.
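
For a fail-closed variant, swap the try/catch in middleware.ts so an outage blocks writes instead of allowing them — a 503 tells well-behaved clients to retry:

middleware.ts
  try {
    const result = await safecomms.moderateText({ content });
    if (!result.isClean) {
      return NextResponse.json(
        { error: 'Content blocked by moderation' },
        { status: 400 }
      );
    }
  } catch {
    // Fail closed: reject the write when moderation can't run
    return NextResponse.json(
      { error: 'Moderation temporarily unavailable, please retry' },
      { status: 503 }
    );
  }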

3. Client-Side Form Component

A React component that submits content to your moderation API route and shows real-time feedback.

components/CommentForm.tsx
// components/CommentForm.tsx
'use client';
import { useState, FormEvent } from 'react';

export default function CommentForm() {
  const [content, setContent] = useState('');
  const [status, setStatus] = useState<'idle' | 'loading' | 'success' | 'error'>('idle');
  const [errorMsg, setErrorMsg] = useState('');

  async function handleSubmit(e: FormEvent) {
    e.preventDefault();
    setStatus('loading');

    try {
      const res = await fetch('/api/moderate', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ content }),
      });

      if (!res.ok) {
        const data = await res.json();
        setStatus('error');
        setErrorMsg(data.error || 'Content was flagged');
        return;
      }

      setStatus('success');
      setContent('');
    } catch {
      // Network failure — the request never reached the server
      setStatus('error');
      setErrorMsg('Could not reach the server. Please try again.');
    }
  }

  return (
    <form onSubmit={handleSubmit}>
      <textarea
        value={content}
        onChange={(e) => setContent(e.target.value)}
        placeholder="Write your comment..."
      />
      <button type="submit" disabled={status === 'loading'}>
        {status === 'loading' ? 'Checking...' : 'Submit'}
      </button>
      {status === 'error' && <p className="error">{errorMsg}</p>}
      {status === 'success' && <p className="success">Comment posted!</p>}
    </form>
  );
}
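
Render the form from any page; the path and the @/ import alias below assume a default create-next-app setup:

app/page.tsx
// app/page.tsx — server component rendering the client form
import CommentForm from '@/components/CommentForm';

export default function Page() {
  return (
    <main>
      <h1>Leave a comment</h1>
      <CommentForm />
    </main>
  );
}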

Testing Your Integration

Use cURL or your favorite HTTP client to verify the moderation flow works correctly.

Terminal
# Test with clean content
curl -X POST http://localhost:3000/api/comments \
  -H "Content-Type: application/json" \
  -d '{"content": "Great article, thanks for sharing!"}'

# Test with flagged content
curl -X POST http://localhost:3000/api/comments \
  -H "Content-Type: application/json" \
  -d '{"content": "Some harmful or toxic content here"}'

Clean Content Response

{ "success": true, "message": "Comment posted" }

Flagged Content Response

{ "error": "Content flagged", "severity": "High" }

Next Steps

Your moderation layer is live. Here are some ways to extend it further:

  • Enable PII detection to automatically redact sensitive data
  • Configure custom moderation profiles for different content types
  • Set up webhooks to receive real-time moderation events
  • Use the dashboard analytics to monitor moderation patterns
Will Casey
Engineer at SafeComms

Will is an engineer at SafeComms specializing in developer tools and integration patterns. He builds the SDKs and writes the guides that help developers ship safer platforms.