Image Upload Validation
User-generated images carry high risk for platforms. By integrating SafeComms into your upload pipeline, you can block NSFW, gore, or hateful imagery before it ever reaches your storage bucket.
01. DEPENDENCIES
We'll use `multer` for handling multipart/form-data uploads:
npm install express multer @safecomms/sdk

02. IMPLEMENTATION
Use `memoryStorage` to avoid writing potentially harmful files to disk before validation:
const express = require('express');
const multer = require('multer');
const { SafeComms } = require('@safecomms/sdk');
const app = express();
const upload = multer({ storage: multer.memoryStorage() }); // Buffer in memory so unscanned files never touch disk
const safecomms = new SafeComms({ apiKey: process.env.SAFECOMMS_API_KEY });
// > UPLOAD_ENDPOINT
app.post('/upload/avatar', upload.single('avatar'), async (req, res) => {
  try {
    if (!req.file) return res.status(400).send('No file uploaded.');

    // 1. > SCAN_IMAGE_BUFFER
    // Analyze directly from the memory buffer without saving to disk
    const analysis = await safecomms.image.analyze({
      image: req.file.buffer,
      mimeType: req.file.mimetype
    });

    // 2. > CHECK_SAFETY_STATUS
    if (analysis.flagged) {
      console.log(`> UNSAFE_IMAGE_BLOCKED: ${analysis.primaryCategory}`);
      // 422 Unprocessable Content: the request was understood but rejected
      return res.status(422).json({
        error: 'Upload Rejected',
        message: 'Image violates content safety policy.',
        category: analysis.primaryCategory
      });
    }

    // 3. > SAVE_SAFE_IMAGE
    // Only persist to S3/Cloudinary/disk once the image passes the scan
    // await s3.upload(req.file);
    res.json({
      success: true,
      message: 'Avatar updated successfully'
    });
  } catch (error) {
    console.error('> UPLOAD_ERROR:', error);
    res.status(500).send('Internal Server Error');
  }
});
app.listen(3000, () => console.log('> IMAGE_SERVER_ON_PORT_3000'));

03. COST SAVING
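One caveat with `memoryStorage`: the entire file is buffered in RAM, so oversized uploads can exhaust memory. A minimal sketch of a size cap using multer's built-in `limits` option (the 5 MB figure is an assumption, not a SafeComms requirement — tune it to your avatar needs):

```javascript
// The 5 MB cap below is an illustrative assumption.
// With limits.fileSize set, multer aborts oversized uploads with a
// LIMIT_FILE_SIZE MulterError before your handler ever runs.
const AVATAR_MAX_BYTES = 5 * 1024 * 1024; // 5 MB

const uploadLimits = { fileSize: AVATAR_MAX_BYTES };

// Wiring it into the upload middleware from the example above:
// const upload = multer({ storage: multer.memoryStorage(), limits: uploadLimits });
```

Pair this with an Express error-handling middleware if you want to turn the `MulterError` into a friendly 413 response instead of a generic 500.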
Scanning before storage prevents you from paying hosting costs for content you'll eventually have to delete. It also protects your brand reputation by ensuring harmful images are never publicly accessible, even for a second.
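You can trim API spend further with a cheap local pre-check: reject files whose MIME type you would never accept before calling the moderation endpoint at all. A sketch — the allowlist, helper name, and 415 response are assumptions for illustration, not part of the SafeComms SDK:

```javascript
// Hypothetical helper: filter out obviously unacceptable uploads locally
// so you only pay for SafeComms analysis on plausible avatar images.
// Note: req.file.mimetype is client-supplied, so treat this as a cost
// filter, not a security control — the scan still runs on what passes.
const ALLOWED_MIME_TYPES = new Set(['image/jpeg', 'image/png', 'image/webp', 'image/gif']);

function isSupportedImage(file) {
  return Boolean(file && file.mimetype) && ALLOWED_MIME_TYPES.has(file.mimetype);
}

// In the route, before safecomms.image.analyze:
// if (!isSupportedImage(req.file)) {
//   return res.status(415).send('Unsupported media type.');
// }

console.log(isSupportedImage({ mimetype: 'image/png' })); // true
console.log(isSupportedImage({ mimetype: 'text/html' })); // false
```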