Why I Don’t Trust Body Parsers (And Neither Should You)
I woke up this morning to three Slack notifications and a missed call from my devops lead. Nothing good ever happens before my first coffee, and today was no exception. Another critical vulnerability dropped. Another scramble to patch.
If you’ve been writing Node.js for as long as I have, you know the drill. You run npm audit, you see a wall of red text, and you sigh. But sometimes, the red text isn’t just a prototype pollution warning in some obscure dev-dependency you don’t even use in production. Sometimes it’s a CVSS 9.2 in a core library that handles every single request hitting your server.
We’re looking at you, body parsers.
The “It’s Just a File Upload” Fallacy
Here’s the thing. We treat npm install like a shopping spree, grabbing packages for everything. Need to parse JSON? There’s a package. Need to handle multipart form data? Grab another one. It feels like magic until the magic turns into arbitrary file writes.
The recent mess with AdonisJS—specifically that nasty CVE-2026-21440 in @adonisjs/bodyparser—is a perfect reminder of why I’m paranoid. The vulnerability allowed attackers to write files anywhere on the server. Anywhere. That means they could overwrite your configuration files, drop a webshell in your public directory, or just replace your index.js with something that mines crypto.
And honestly? It’s terrifyingly easy to mess this up when you’re building a library.
How the Sausage Gets Made (And Exploited)
When a server receives a file upload, it doesn’t just “get” the file. It has to stream bytes, buffer them, and usually write them to a temporary location before your application logic even sees them. The library handling this has to decide where to put that file and what to name it.
That’s where the wheels fall off.
I wrote some code a few years back that looked something like this. I thought I was being clever.
const fs = require('fs');
const path = require('path');

// Don't do this. Seriously.
function handleUpload(req, filename) {
  const uploadDir = path.join(__dirname, 'uploads');
  // I assumed filename was safe because... reasons?
  const filePath = path.join(uploadDir, filename);
  const stream = fs.createWriteStream(filePath);
  req.pipe(stream);

  return new Promise((resolve, reject) => {
    stream.on('finish', () => resolve(filePath));
    stream.on('error', reject);
  });
}
See the problem? If an attacker sends a filename like ../../../../etc/passwd (okay, maybe not that simple due to permissions, but you get the idea), or ../../app/config.js, my code would happily traverse up the directory tree and overwrite whatever it found.
The AdonisJS flaw was more sophisticated than my rookie mistake, obviously. But the core issue remains: trusting input when generating file paths is a death sentence.
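If you absolutely must accept a client-supplied name (you shouldn't, and I'll get to the better option below), at least resolve the final path and refuse anything that escapes the upload directory. Here's a minimal sketch of that check; resolveInside is my own name for it, not anything from AdonisJS or any other library, and it's not how their patch works.

const path = require('path');

// Resolve the candidate path and make sure it stays under the upload dir.
function resolveInside(uploadDir, untrustedName) {
  const base = path.resolve(uploadDir);
  const candidate = path.resolve(base, untrustedName);
  // Appending path.sep stops a sibling like /srv/uploads-evil from
  // sneaking past a plain prefix check.
  if (!candidate.startsWith(base + path.sep)) {
    throw new Error(`Refusing to write outside ${base}`);
  }
  return candidate;
}

Absolute names, empty names, and anything full of ../ all fail the prefix check and throw instead of landing on your disk.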
Why NPM Makes This Harder
The NPM ecosystem is a house of cards. We love it because it’s fast. I love it because I can spin up an API in twenty minutes. But the depth of the dependency tree means you’re often running code you’ve never looked at, written by people you don’t know, maintained by… well, sometimes nobody.
I spent an hour this morning tracing the dependency chain of a project I inherited. It had three different body parsers. Three. One for Express, one for a legacy utility, and one buried inside a logging library for some god-forsaken reason. If any one of those has a flaw like the one we saw this week, the whole ship sinks.
It’s not just about “patching immediately.” It’s about visibility. Do you even know which packages are parsing user input in your stack?
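If you want a quick answer, npm can at least tell you whether a given parser is in your tree and which packages dragged it in. Swap in whatever you actually suspect; these three are just the usual culprits.

# Show every path to these packages in the dependency tree
npm ls @adonisjs/bodyparser body-parser multer --all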
My Paranoia Protocol
After getting burned a few times (and losing a weekend to a crypto-miner incident in 2024), I changed how I handle dependencies. It’s annoying, it slows me down, and my team hates me for it during code reviews. But we haven’t been breached lately.
1. Pin everything.
I don’t care what semver says. I pin exact versions. If a patch comes out, I want to upgrade it manually. I want to read the changelog. I want to know why the version number bumped. Automated updates are great until they pull in a compromised package.
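One line of .npmrc makes npm record exact versions instead of ^ ranges, and npm ci in your pipeline will then refuse anything that drifts from the lockfile.

# .npmrc
# Record exact versions on every install instead of ^ ranges
save-exact=true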
2. Sanitize filenames like your life depends on it.
Never use the filename provided by the client. Just don’t. Generate a UUID. Save the original filename in a database if you really need it for display. But on the disk? It’s a4e1-33b2-....jpg.
const crypto = require('crypto');
const path = require('path');

function getSafePath(extension) {
  // Generate a random name. The user's input never touches the disk path.
  const id = crypto.randomUUID();
  // Whitelist extensions if you can
  const safeExt = extension.replace(/[^a-z0-9]/gi, '');
  return path.join('/tmp/uploads', `${id}.${safeExt}`);
}
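Inside an async upload handler, the usage looks something like this. The db call is a stand-in for whatever you use to persist metadata; the point is that the client's name only ever reaches the database, never the filesystem.

const path = require('path');

// clientFilename comes from the upload, so it is untrusted.
// Only the extension is used, and getSafePath scrubs even that.
const ext = path.extname(clientFilename).slice(1); // "photo.JPG" -> "JPG"
const diskPath = getSafePath(ext.toLowerCase());

// The pretty name goes in the database for display, never on disk.
await db.files.insert({ displayName: clientFilename, storedAt: diskPath });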
3. Use npm audit but don’t trust it blindly.
Running the audit is the bare minimum. But half the time, the “critical” vulnerability is in a build tool that only runs on your laptop. You need to actually read the report. Is it a ReDoS (Regular Expression Denial of Service)? Is it prototype pollution? Or is it an arbitrary file write?
If it’s a file write or remote code execution (RCE), drop everything. That’s a “fix it now, ask questions later” situation.
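Two flags make that triage easier. The exact names have shifted between npm versions, so double-check npm help audit on your machine before wiring these into CI.

# Machine-readable report you can actually grep and triage
npm audit --json

# Skip dev-only dependencies and look at what actually ships (newer npm)
npm audit --omit=dev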
The Reality of 2026
We’re in 2026 and we’re still fighting the same battles we fought in 2016. Input validation. Path traversal. Buffer overflows. The languages get shinier, the frameworks get more opinionated, but the vulnerabilities stay the same.
The AdonisJS team patched this quickly, which is to their credit. But for every framework that patches fast, there are five unmaintained packages sitting in your node_modules folder, ticking away.
So, check your dependencies. Especially the ones that handle file uploads or body parsing. If you’re using @adonisjs/bodyparser, make sure you’re on the patched version (anything after the fix for CVE-2026-21440). If you’re using something else, maybe go take a look at their GitHub issues page. You might not like what you find, but at least you’ll know.
Now, if you’ll excuse me, I have some logs to comb through. Stay safe out there.
