I still remember the first time a production job stalled because a folder had 200,000 files. The code worked fine on my laptop, but in production it turned into a slow, brittle mess. The fix wasn’t a new library or a big refactor—it was using fs.readdir() correctly and with clear intent. Reading a directory sounds simple, but in real systems it touches security, performance, API design, and portability. If you build Node.js services that touch the filesystem—log processors, media pipelines, build tools, CLIs—this method shows up everywhere.
Here’s the mindset I want you to take: fs.readdir() is not just a way to list files. It’s an entry point into how your app interacts with the operating system. When you choose options like withFileTypes, when you decide to filter vs. stat, and when you set guardrails around errors, you’re shaping the reliability of the whole workflow. In this post I’ll show you how I use fs.readdir() in 2026-style Node.js projects, how I structure the data it returns, and how to avoid the common traps that show up only after deployment.
What fs.readdir() actually returns and why it matters
fs.readdir() asynchronously reads the contents of a directory and passes the result to a callback. At the simplest level, you get an array of file names. The function signature looks like this:
fs.readdir(path, options, callback)
The important thing is that the result is only names—no metadata, no file contents, no sizes, no type info unless you ask for it. That design choice is intentional. The filesystem can list a directory quickly, but fetching metadata for every entry can be expensive. So Node.js gives you a fast directory listing by default, and you can opt into richer data via withFileTypes.
In practice, I see three kinds of return payloads, depending on the options you pass:
- Default: an array of strings (file names). You’ll need to join with the original path to construct full paths.
- With encoding: the names can be returned as Buffers if you set encoding: 'buffer'.
- With file types: an array of fs.Dirent objects when withFileTypes: true.
I recommend thinking about the return type before you write any logic. If you’re going to separate files from directories, using withFileTypes will save you a second filesystem call for each entry. If you only need names and you’re filtering by extension, the default is faster and simpler.
One subtle but important detail: directory listings are not ordered by the OS in a guaranteed way. Most filesystems happen to return an order that looks stable, but you should treat the order as arbitrary. If your workflow depends on ordering, sort explicitly after reading.
The core parameters: path, options, callback
Let’s break down the three parameters, because how you use them is where reliability comes from.
path
The path parameter can be a string, a Buffer, or a URL. I mostly use string paths, but for modern tooling I often pass a URL created from import.meta.url in ESM contexts. This is especially helpful when you need a stable directory reference, like a project root or a package resource directory.
Two practical rules I follow:
- Prefer absolute paths inside services and long-running jobs. Relative paths can be brittle when the working directory changes.
- If your code runs in multiple environments (local dev, CI, production), resolve the path once at startup and reuse the resolved value instead of rebuilding it repeatedly.
options
You can pass a string encoding or an object. The object has two common fields:
- encoding (default: 'utf8'): Use this only if you truly need buffer names. Most of the time, UTF-8 is right.
- withFileTypes (default: false): Set to true if you want fs.Dirent objects.
The moment you use withFileTypes, the rest of your code can become cleaner. Dirent objects include methods like isFile(), isDirectory(), and isSymbolicLink() without requiring extra stat calls.
I also treat encoding as an explicit decision: if you’re dealing with rare non-UTF8 filename scenarios, you’ll know it. If you don’t know, stick with the default. Passing encoding: 'buffer' is a reasonable choice when you want raw bytes and plan to do normalization yourself.
callback
The callback signature is (err, files). The error-first style is the same as many Node.js APIs, and it’s still common in 2026 even with async/await everywhere. If you want a promise, you can use fs.promises.readdir() or wrap fs.readdir() manually.
I also recommend a “single exit” pattern inside callbacks: handle errors first, return early, then operate on the results. That keeps logic readable and prevents accidental double-handling of errors.
Example 1: Basic directory listing, with and without withFileTypes
I like to show the difference with the same directory so it’s obvious how the data changes. This is a full, runnable example that prints both views.
// list-basic.js
const fs = require('fs');

// Default usage: returns file names as strings
fs.readdir(__dirname, (err, files) => {
  if (err) {
    console.error('Read error:', err.message);
    return;
  }
  console.log('\nCurrent directory filenames:');
  files.forEach((file) => console.log(file));
});

// withFileTypes: returns Dirent objects
fs.readdir(__dirname, { withFileTypes: true }, (err, entries) => {
  if (err) {
    console.error('Read error:', err.message);
    return;
  }
  console.log('\nCurrent directory files:');
  entries.forEach((entry) => console.log(entry));
});
When you run that, you’ll get output like:
Current directory filenames:
index.js
package.json
textfilea.txt
textfileb.txt
Current directory files:
Dirent { name: 'index.js', [Symbol(type)]: 1 }
Dirent { name: 'package.json', [Symbol(type)]: 1 }
Dirent { name: 'textfilea.txt', [Symbol(type)]: 1 }
Dirent { name: 'textfileb.txt', [Symbol(type)]: 1 }
Notice how the Dirent object gives you type info immediately. You can check if it’s a file or directory without additional syscalls. That matters when you have thousands of entries.
One more practical tip: if you only need names and you don’t need sorting, use the default. If you need types, withFileTypes is usually cheaper than a full stat per entry, but it’s not “free.” You still have overhead per entry, so it’s a conscious choice.
Example 2: Filtering by extension without extra syscalls
If all you need is a list of files that match a known extension, I keep it simple: fetch names, then filter with path.extname. This is a good case where you do not need withFileTypes.
// list-txt.js
const fs = require('fs');
const path = require('path');

fs.readdir(__dirname, (err, files) => {
  if (err) {
    console.error('Read error:', err.message);
    return;
  }
  console.log('\nFilenames with the .txt extension:');
  files.forEach((file) => {
    if (path.extname(file) === '.txt') {
      console.log(file);
    }
  });
});
This approach is fast and clean. You avoid extra stat calls because the extension is enough for the filter you’re applying. I recommend this pattern for log processing, reports, or scripts that are only interested in a filename convention.
An extra step I sometimes add in production: normalize extension comparisons to be case-insensitive when you can’t control upstream filenames. That’s a common source of missing files when the filesystem is case-sensitive.
fs.readdir() vs. fs.promises.readdir() in modern Node.js
I use both, but I choose based on the code around it. If the surrounding code is already callback-based or performance-critical, I often keep it as fs.readdir() for minimal overhead. If the rest of the code is async/await, I use the promise version because it reads better and avoids callback nesting.
Here’s the same pattern using promises, which feels more natural in 2026 codebases that already use top-level await or async workflows:
// list-async.js
const fs = require('fs/promises');

async function listCurrentDirectory() {
  try {
    const files = await fs.readdir(__dirname, { withFileTypes: true });
    for (const entry of files) {
      console.log(entry.name, entry.isDirectory() ? '[dir]' : '[file]');
    }
  } catch (err) {
    console.error('Read error:', err.message);
  }
}

listCurrentDirectory();
If you’re building a CLI or a service that already has a promise-based flow, using fs.promises.readdir() avoids callback-style splits. But the underlying behavior is the same, and the error handling rules don’t change.
One more consideration: if you want to control concurrency and error propagation, promises compose better. It’s easier to integrate into Promise.all, Promise.any, or into a queue or task runner.
Error handling that actually survives real-world usage
I see a lot of examples that just print the error and move on. That’s okay for quick scripts, but for production workloads you should treat errors as signals. Here are the two most common errors I hit and how I handle them.
Invalid directory path
If you try to read a directory that doesn’t exist, Node returns an ENOENT error. You should surface it clearly and avoid cascading failures.
const fs = require('fs');

fs.readdir('/invalid/path', (error, fileList) => {
  if (error) {
    console.error('Error reading directory:', error.message);
    return;
  }
  console.log('Files:', fileList);
});
Permission issues
Permission errors are common in containerized environments or when running as a restricted user. You should handle them as a specific case and provide a clear message.
const fs = require('fs');

fs.readdir('/restricted/path', (error, fileList) => {
  if (error) {
    console.error('Permission error:', error.message);
    return;
  }
  console.log('Files:', fileList);
});
When I’m building robust tools, I’ll usually check error.code and branch, so the UI or log output is actionable. For example, EACCES can trigger a “missing permissions” message, while ENOENT can prompt a hint about creating a directory or checking configuration.
Here’s a slightly more robust pattern I use in CLI tools:
const fs = require('fs');

function readDirSafe(dir, cb) {
  fs.readdir(dir, (err, files) => {
    if (!err) return cb(null, files);
    switch (err.code) {
      case 'ENOENT':
        // { cause } keeps the original error attached for debugging
        return cb(new Error(`Directory not found: ${dir}`, { cause: err }));
      case 'EACCES':
        return cb(new Error(`No permission to read: ${dir}`, { cause: err }));
      case 'ENOTDIR':
        return cb(new Error(`Not a directory: ${dir}`, { cause: err }));
      default:
        return cb(err);
    }
  });
}
This is lightweight, doesn’t overcomplicate the logic, and produces clean error messages without losing the original error context.
withFileTypes and fs.Dirent: small option, big impact
The most practical feature of fs.readdir() is withFileTypes. If you want to filter for only directories or only files, this saves you repeated stat calls. It also makes the code more readable.
Here’s a pattern I use all the time when I need to separate folders from files:
const fs = require('fs');

fs.readdir(__dirname, { withFileTypes: true }, (err, entries) => {
  if (err) {
    console.error('Read error:', err.message);
    return;
  }
  const folders = [];
  const files = [];
  for (const entry of entries) {
    if (entry.isDirectory()) {
      folders.push(entry.name);
    } else if (entry.isFile()) {
      files.push(entry.name);
    }
  }
  console.log('Folders:', folders);
  console.log('Files:', files);
});
This approach is faster than calling fs.stat() for each entry, especially on large directories or networked filesystems. I recommend it whenever you need to differentiate entry types.
One important caveat: Dirent tells you the entry type as reported by the directory entry. It’s usually accurate, but it can be misleading with certain filesystems or when entries change between the time of listing and the time of use. If accuracy is critical and you’re about to perform a sensitive action (like deleting), follow up with lstat on the specific entry you’re operating on.
When not to use fs.readdir()
I’m very selective about when I reach for fs.readdir(). It’s great for listing, but it’s not the right tool for everything.
You should avoid fs.readdir() when:
- You need recursive traversal. For that, use fs.opendir() with an async iterator or a dedicated traversal utility.
- You need directory metadata like sizes or timestamps. That still requires stat or lstat.
- You're working with very large directories and only want specific files; in that case, streaming directory entries via fs.opendir() can be more memory-efficient.
Think of fs.readdir() as a fast snapshot of a folder’s immediate contents. It’s not a full filesystem crawler.
Performance considerations: small choices that add up
Performance isn’t just about speed. It’s about stable runtime characteristics. Here are the patterns I follow.
Avoid per-entry stat when possible
withFileTypes gives you cheap type checks. If you need more metadata, batch or cache calls. If you stat 200,000 entries, you’ll hit OS limits or runtime bottlenecks.
Be mindful of memory
fs.readdir() returns the entire list at once. For huge directories, that can take significant memory. In those cases, I switch to fs.opendir() and iterate entries instead of loading all at once.
Treat file system as slow I/O
Even on SSDs, directory reads can be slow under load or on network mounts. In production, I assume directory reads are moderately slow—often in the 10–50ms range for small directories, and much higher for large ones or remote filesystems. That’s why I keep the logic after readdir() simple and fast.
Avoid hot loops over giant arrays
If you read a giant directory and then do expensive work for each entry inside the same tick, you can block the event loop. One simple fix is to chunk processing or push work into a queue. That way your service stays responsive, even when the directory is huge.
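Chunking can be as small as this sketch; processInChunks is my own helper name, and the chunk size of 500 is an arbitrary starting point, not a tuned value:

```javascript
// Process a big listing in batches, yielding to the event loop between
// chunks so timers and I/O callbacks still get a turn.
async function processInChunks(items, handle, chunkSize = 500) {
  for (let i = 0; i < items.length; i += chunkSize) {
    for (const item of items.slice(i, i + chunkSize)) {
      await handle(item);
    }
    // Let other work run before the next batch
    await new Promise((resolve) => setImmediate(resolve));
  }
}
```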
Consider caching directory snapshots
If your application lists the same directory repeatedly, caching the list for a short window can reduce I/O load. I usually cache for a few seconds and invalidate on a timer. It’s a small trick, but it can cut file system load dramatically in busy services.
Handling cross-platform paths and URL inputs
One subtle issue is path handling across Windows, macOS, and Linux. If you build tools for multiple environments, you should normalize paths and avoid hardcoding separators. Use path.join() or path.resolve() when constructing file paths.
If you’re in ESM mode, a good pattern is to convert import.meta.url into a path and then use that for readdir().
import { readdir } from 'fs/promises';
import { fileURLToPath } from 'url';
import path from 'path';

const filename = fileURLToPath(import.meta.url);
const dirname = path.dirname(filename);

const entries = await readdir(dirname);
console.log(entries);
This pattern avoids brittle assumptions about the working directory and makes your code more portable.
Another cross-platform issue: Windows has a special device namespace and reserved names (like CON, AUX, PRN). If you’re generating filenames, validate them before use, especially if your tool targets Windows users. A directory listing might include these names, and operations can fail in surprising ways.
Real-world use case: building a log archiver
Here’s a realistic example. Imagine you’re building a log archiver that moves .log files into a date-based folder. You want a safe, readable flow that filters by extension, checks file types, and handles errors cleanly.
const fs = require('fs/promises');
const path = require('path');

async function archiveLogs(logDir, archiveDir) {
  try {
    const entries = await fs.readdir(logDir, { withFileTypes: true });
    // Only take regular files ending in .log
    const logFiles = entries
      .filter((entry) => entry.isFile() && entry.name.endsWith('.log'))
      .map((entry) => entry.name);
    if (logFiles.length === 0) {
      console.log('No log files to archive.');
      return;
    }
    const dateStamp = new Date().toISOString().slice(0, 10);
    const targetDir = path.join(archiveDir, dateStamp);
    await fs.mkdir(targetDir, { recursive: true });
    for (const fileName of logFiles) {
      const from = path.join(logDir, fileName);
      const to = path.join(targetDir, fileName);
      await fs.rename(from, to);
      console.log(`Archived ${fileName}`);
    }
  } catch (err) {
    console.error('Archive failed:', err.message);
  }
}

archiveLogs('./logs', './archives');
Notice what I’m doing:
- Using withFileTypes to avoid extra stat calls.
- Filtering by the .log extension in memory.
- Creating the archive folder only once.
- Handling errors at the top level so the job fails cleanly.
This is the type of workflow where fs.readdir() shines. It provides the list, then you control the rest with clear logic.
Common mistakes I see and how to avoid them
I review a lot of Node.js code, and the same mistakes show up repeatedly. Here’s how to avoid them.
Mistake 1: Assuming file names are full paths
fs.readdir() returns names only. If you pass those names into file operations without joining with the directory path, you’ll end up accessing the wrong files. Always use path.join(dir, name).
Mistake 2: Ignoring errors
If you just log errors and continue, you may hide real problems. At minimum, return after an error. For production flows, treat errors as signals that the operation didn’t complete.
Mistake 3: Using withFileTypes and expecting strings
When you set withFileTypes: true, you get Dirent objects, not strings. I’ve seen code that tries to path.extname(entry) and crashes. Use entry.name instead.
Mistake 4: Readdir on huge directories without limits
If you have tens of thousands of files, loading everything into memory can spike RAM. Use fs.opendir() and process entries incrementally if you expect large directories.
Mistake 5: Forgetting to handle hidden files
On Unix-like systems, files starting with . are hidden. If you want to ignore them, filter explicitly. Don’t assume your directory list only includes “visible” files.
Mistake 6: Relying on ordering
Directory entries may appear in different orders across runs or environments. If you need consistent ordering, sort the list explicitly with files.sort().
fs.readdir() vs fs.opendir() for large directory scanning
Here’s a quick comparison I use when deciding:
| Scenario | Best choice |
| --- | --- |
| Small directory, names only | fs.readdir() |
| Very large directory | fs.opendir() |
| Entry types without a stat per file | fs.readdir({ withFileTypes: true }) |
| Streaming Dirent entries one at a time | fs.opendir() |
| Recursive traversal | fs.opendir() or custom walker |
The cutoff isn’t exact. I’ve seen 50k entries work fine in memory, and I’ve seen 5k entries cause pressure in constrained containers. The key is to know your environment and choose the tool that fits it.
Deep dive: controlling data shape with withFileTypes
When you use withFileTypes: true, you get Dirent objects, which are lightweight representations of directory entries. I think of them as “just enough information to decide what to do next.” The methods include:
- isFile()
- isDirectory()
- isSymbolicLink()
- isFIFO()
- isSocket()
- isCharacterDevice()
- isBlockDevice()
Most apps only need isFile() and isDirectory(), but if you’re building tools that operate on system folders, those other types matter. I’ve run into sockets and FIFO files in /tmp and in container volumes. If you’re writing cleanup tools, be explicit about what you will delete.
Here’s a safer filter that only includes regular files and directories, ignoring everything else:
const fs = require('fs/promises');

async function listSafe(dir) {
  const entries = await fs.readdir(dir, { withFileTypes: true });
  return entries
    .filter((e) => e.isFile() || e.isDirectory())
    .map((e) => ({ name: e.name, type: e.isDirectory() ? 'dir' : 'file' }));
}

listSafe(process.cwd()).then(console.log).catch(console.error);
This “typed listing” is especially useful for CLI tools that need to show a human-friendly directory view without exposing odd system file types.
Edge cases that break naive code
I’ve been bitten by these enough times that I treat them as a checklist:
1) Directories changing during read: If files are being created or deleted while you read, you might get inconsistent results. Don’t assume a stable snapshot. If consistency matters, design your workflow to be tolerant of missing or new entries.
2) Symlinks: Dirent.isSymbolicLink() tells you a symlink exists, but it doesn’t tell you where it points. If you follow links, you should use lstat and realpath to avoid infinite loops or unexpected traversal outside the intended directory.
3) Non-UTF8 filenames: Rare but real. If you see weird characters or decoding errors, consider encoding: 'buffer' and handle the bytes directly.
4) Long paths: On some platforms, very long paths can fail if the prefix isn’t handled. Normalize paths and keep filenames reasonable if your app generates them.
5) Race conditions: A file can exist at readdir time and be gone by the time you open it. Always handle file operations with their own error handling; never assume the list is still valid.
Practical scenario: build a recursive walker safely
I said earlier that fs.readdir() is not for recursive traversal. But you can still build a recursive workflow if you control recursion carefully and you understand the limits. Here’s a pattern that uses readdir with withFileTypes to traverse one level at a time, with a concurrency limit to avoid overwhelming the system.
const fs = require('fs/promises');
const path = require('path');

async function walk(dir, onFile, maxDepth = 5, depth = 0) {
  if (depth > maxDepth) return;
  const entries = await fs.readdir(dir, { withFileTypes: true });
  for (const entry of entries) {
    const fullPath = path.join(dir, entry.name);
    if (entry.isDirectory()) {
      await walk(fullPath, onFile, maxDepth, depth + 1);
    } else if (entry.isFile()) {
      await onFile(fullPath);
    }
  }
}

async function demo() {
  await walk(process.cwd(), async (filePath) => {
    if (filePath.endsWith('.md')) {
      console.log('Found markdown:', filePath);
    }
  });
}

demo().catch((err) => console.error(err.message));
This is a straightforward version. In production, I add:
- A queue to limit concurrency.
- A guard against following symlinks.
- A timeout for each directory operation.
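The queue piece can be a small hand-rolled limiter. A sketch; makeLimiter is my own helper name, and libraries like p-limit handle more edge cases:

```javascript
// Run at most `max` async tasks at once; extra tasks wait in a queue.
function makeLimiter(max) {
  let active = 0;
  const queue = [];
  const next = () => {
    if (active >= max || queue.length === 0) return;
    active++;
    const { task, resolve, reject } = queue.shift();
    task()
      .then(resolve, reject)
      .finally(() => {
        active--;
        next();
      });
  };
  // Returns a promise that settles when the task eventually runs
  return (task) =>
    new Promise((resolve, reject) => {
      queue.push({ task, resolve, reject });
      next();
    });
}
```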
If you don’t want to implement this yourself, you should consider fs.opendir() and iterate lazily.
Practical scenario: content indexer for a media pipeline
Another real-world example: I had a media pipeline that needed a quick snapshot of incoming files, grouped by type. We only needed top-level information, so fs.readdir() was perfect. Here’s a trimmed version of that pattern:
const fs = require('fs/promises');
const path = require('path');

async function indexMedia(dir) {
  const entries = await fs.readdir(dir, { withFileTypes: true });
  const result = {
    images: [],
    videos: [],
    audio: [],
    other: [],
  };
  for (const entry of entries) {
    if (!entry.isFile()) continue;
    const ext = path.extname(entry.name).toLowerCase();
    if (['.jpg', '.jpeg', '.png', '.webp'].includes(ext)) {
      result.images.push(entry.name);
    } else if (['.mp4', '.mov', '.mkv'].includes(ext)) {
      result.videos.push(entry.name);
    } else if (['.mp3', '.wav', '.flac'].includes(ext)) {
      result.audio.push(entry.name);
    } else {
      result.other.push(entry.name);
    }
  }
  return result;
}

indexMedia('./incoming').then(console.log).catch(console.error);
I like this pattern because it keeps file type logic local and predictable. It also works well with caching: you can store the index and only refresh periodically.
Alternative approaches: fs.opendir() and async iteration
When directories are large or when you want to process entries incrementally, fs.opendir() is the better choice. It gives you a directory handle that you can iterate with for await and keep memory usage low.
Here’s a compact example that mirrors a readdir workflow but with streaming behavior:
const fs = require('fs/promises');
const path = require('path');

async function listLargeDir(dir) {
  const dirHandle = await fs.opendir(dir);
  for await (const dirent of dirHandle) {
    if (dirent.isFile()) {
      console.log('File:', path.join(dir, dirent.name));
    }
  }
}

listLargeDir('/some/huge/dir').catch(console.error);
If you’re dealing with very large datasets, this approach gives you finer control, especially when combined with a queue or a batch processor.
Security considerations when listing directories
Filesystem access is a security boundary. I treat directory reads as privileged operations and apply the same discipline as I do for database queries.
Here are the things I watch:
- Path traversal: If a user provides a directory path, validate or normalize it to prevent ../ traversal into sensitive folders.
- Symlink attacks: If you follow symlinks, a malicious or accidental symlink could point outside your intended directory. Use lstat and realpath to validate that the resolved path is within the allowed root.
- Least privilege: Run your service with the minimum permissions required. Then if readdir fails with EACCES, you know the system is doing its job.
- Logging: Avoid logging entire directory contents in production logs. It can leak sensitive filenames.
Here’s a tiny guard I use to keep paths within an allowed root:
const path = require('path');

function isInsideRoot(root, target) {
  const resolvedRoot = path.resolve(root);
  const resolvedTarget = path.resolve(target);
  return resolvedTarget.startsWith(resolvedRoot + path.sep);
}
This isn’t bulletproof by itself, but it’s a solid baseline. If you’re building a multi-tenant system, build in stricter checks.
Observability: logging and metrics for directory reads
In production systems, “it works” is not enough. I want to know how often directory reads happen and how long they take. That helps me spot slowdowns and scale problems before they become outages.
At minimum, I log a concise message on errors and count them. If I can, I also track duration for readdir operations. This is straightforward in promise-based code:
const fs = require('fs/promises');

async function timedRead(dir) {
  const start = Date.now();
  try {
    const files = await fs.readdir(dir);
    const ms = Date.now() - start;
    console.log(`readdir ${dir} took ${ms}ms`);
    return files;
  } catch (err) {
    const ms = Date.now() - start;
    console.error(`readdir ${dir} failed after ${ms}ms: ${err.message}`);
    throw err;
  }
}
For high-traffic services, I’d send those durations to a metrics system, not to logs. But even simple logging will help you catch unexpected slowdowns.
Testing strategies for fs.readdir() workflows
Filesystem code is easy to test if you keep it modular. I usually split logic into:
- A small function that reads the directory.
- A pure function that filters or maps entries.
That way, you can unit-test the pure logic without touching the filesystem. For integration tests, I create a temporary folder and create files within it.
Here’s a simple pattern (no test framework required to understand the idea):
function filterLogs(entries) {
  return entries.filter((name) => name.endsWith('.log'));
}

// Unit test the filter
console.log(filterLogs(['a.log', 'b.txt', 'c.log']));
Then your integration test can verify that readdir returns entries correctly, while the business logic remains deterministic.
Comparison: traditional vs modern usage patterns
Here’s a quick comparison of common approaches and how I think about them:
| Approach | Good for |
| --- | --- |
| fs.readdir() (callback) | Legacy code, minimal overhead |
| fs.promises.readdir() | Async/await codebases, clarity |
| readdir + withFileTypes | Quick type filtering |
| opendir streaming | Massive directories |
I treat fs.readdir() as the default and switch only when I have a specific reason.
AI-assisted workflows and filesystem utilities
In 2026, I often use AI-assisted tooling to generate or review file utilities, but I never let it “decide” the operational model. I’ll use assistance to:
- Generate a baseline readdir flow.
- Add guardrails (error codes, path validation).
- Suggest concurrency limits.
But I always make the final call on performance and safety constraints. A model can’t know your directory sizes, your deployment storage, or your failure tolerance. The good news is: fs.readdir() is simple enough that you can design it correctly with a few explicit decisions.
Production checklist for fs.readdir() usage
When I ship code that uses readdir, I run through this checklist:
- Do I need names only, or file types too?
- Is the directory size small enough for a full array?
- Do I need stable ordering?
- Am I correctly joining names to the directory path?
- Do I handle ENOENT, EACCES, and ENOTDIR clearly?
- Do I need to protect against symlinks or traversal?
- Should I log or measure the read time?
This sounds like a lot, but after a while it becomes second nature. The goal is to prevent a “small function” from becoming a production weak point.
Wrapping up: how I think about fs.readdir() now
The first time I used fs.readdir(), I treated it like a simple list call. Now I treat it like a boundary between my app and the operating system. That shift matters. It makes you deliberate about performance, error handling, and portability.
If you remember just a few principles, make it these:
- Use withFileTypes when you need types, and avoid extra stat calls.
- Don't assume order, and don't assume the list is stable.
- For huge directories, consider fs.opendir().
- Always join names to paths explicitly.
- Treat filesystem errors as real signals, not just logs.
Do those things, and fs.readdir() will be one of the most reliable tools in your Node.js toolkit, even when the directory has 200,000 files and your job is running at 3 a.m. in a production container.


