I still see teams ship subtle bugs because a value that "looked right" was actually wrong: a string typed with a dash instead of an underscore, a status number copied from an old spec, a role name spelled three different ways across services. In my experience, enums are the simplest antidote. They turn loose values into a small, named set that your whole codebase can agree on. That agreement is the real benefit: when I name something, I create a shared vocabulary that helps you scan a file, recognize intent, and spot mistakes before they turn into customer-facing errors.
JavaScript does not ship with a native enum keyword, so the patterns are on us. The good news is that modern JavaScript gives us several ways to build strong enum-like constructs that feel clean in 2026 codebases. I will walk through the core approaches, show how to keep them safe, and explain where they fit best. Along the way, I will include runnable examples, common mistakes, and guidance you can act on immediately.
The core idea: named constants that tell a story
Enums are a set of named constants. That is it, yet the effect is powerful. When I read OrderStatus.SHIPPED, I instantly know the meaning. When I see 3, I do not. Think of enums like color tabs in a paper folder: the tab does not change the document, but it makes the contents visible at a glance. Your goal is clarity and consistency.
In JavaScript, the three big questions I ask before picking a pattern are:
- Do you need immutability so no one can change values at runtime?
- Do you need behavior attached to the enum (methods, validation helpers)?
- Will the enum cross system boundaries (APIs, databases, analytics)?
If you answer those, the pattern selection becomes straightforward.
Plain objects: the baseline that keeps you sane
Plain objects are the easiest, and often the best, starting point. They are readable, familiar, and quick to wire into a codebase. I use them for most enums, especially when values are strings that cross network or storage boundaries.
const OrderStatus = {
  PENDING: 'pending',
  PAID: 'paid',
  SHIPPED: 'shipped',
  CANCELED: 'canceled'
};
function canFulfill(status) {
  return status === OrderStatus.PAID || status === OrderStatus.SHIPPED;
}
console.log(OrderStatus.PAID); // 'paid'
console.log(canFulfill(OrderStatus.PAID)); // true
This is plain and effective. You get named constants, a central place for changes, and values that survive JSON serialization. For most business enums, I recommend string values because they are self-describing in logs and do not break when you reorder keys.
If you prefer numeric enums (perhaps for a compact protocol), keep a separate mapping for readability:
const Priority = {
  LOW: 1,
  MEDIUM: 2,
  HIGH: 3
};
const PriorityLabel = {
  [Priority.LOW]: 'Low',
  [Priority.MEDIUM]: 'Medium',
  [Priority.HIGH]: 'High'
};
console.log(PriorityLabel[Priority.HIGH]); // 'High'
A quick rule I follow: if humans will see the value in logs or analytics, prefer strings. If machines will see it and size is critical, numeric values can be fine, but you should still keep a label mapping.
Immutability: Object.freeze and deep-freeze patterns
Enums should not change at runtime. If a constant mutates, your whole system loses a source of truth. That is why I add immutability for any enum shared across modules.
const PaymentMethod = Object.freeze({
  CARD: 'card',
  BANK_TRANSFER: 'bank_transfer',
  CASH: 'cash'
});
// Attempting to mutate will silently fail in non-strict mode
// (and throw a TypeError in strict mode)
PaymentMethod.CASH = 'coins';
console.log(PaymentMethod.CASH); // 'cash'
Object.freeze is shallow. If you store nested objects (rare for enums, but it happens), you need a deep-freeze helper:
function deepFreeze(obj) {
  Object.getOwnPropertyNames(obj).forEach((prop) => {
    const value = obj[prop];
    if (value && typeof value === 'object') {
      deepFreeze(value);
    }
  });
  return Object.freeze(obj);
}
const ErrorCodes = deepFreeze({
  AUTH: {
    INVALID_TOKEN: 'auth_invalid_token',
    EXPIRED_SESSION: 'auth_expired_session'
  },
  PAYMENT: {
    DECLINED: 'payment_declined',
    LIMIT: 'payment_limit'
  }
});
// Nested mutation is blocked
ErrorCodes.AUTH.INVALID_TOKEN = 'changed';
console.log(ErrorCodes.AUTH.INVALID_TOKEN); // 'auth_invalid_token'
I only use nested enums when I need a namespace, not because it is stylish. The more complex the enum structure, the harder it is to serialize and validate at the edges of your system.
Class-based enums: when you want behavior too
Sometimes you want methods and validation helpers baked in. A static class can do that cleanly while still keeping the enum values easy to reference.
class Role {
  static ADMIN = 'admin';
  static EDITOR = 'editor';
  static VIEWER = 'viewer';
  static values() {
    return [Role.ADMIN, Role.EDITOR, Role.VIEWER];
  }
  static isValid(role) {
    return Role.values().includes(role);
  }
}
console.log(Role.ADMIN); // 'admin'
console.log(Role.isValid('editor')); // true
console.log(Role.isValid('guest')); // false
I prefer class-based enums when I need a dedicated API: Role.values(), Role.isValid(), and perhaps Role.label(value) for UI. It reads cleanly at the call site and reduces repeated helper code.
Be careful not to instantiate the class. It is a static holder, not a type of object you create per user. If you want instances, you are closer to a domain model than an enum, and you should name it accordingly.
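To make that intent explicit, you can add a throwing constructor so accidental instantiation fails fast. This is a minimal sketch of the same static-holder pattern with that guard added; the guard itself is my addition, not a requirement of the pattern:

```javascript
// Role as a static enum holder; the constructor guard is optional.
class Role {
  static ADMIN = 'admin';
  static EDITOR = 'editor';
  static VIEWER = 'viewer';

  constructor() {
    // Fail fast if someone writes `new Role()` by mistake.
    throw new Error('Role is a static enum holder; do not instantiate it');
  }

  static isValid(role) {
    return [Role.ADMIN, Role.EDITOR, Role.VIEWER].includes(role);
  }
}
```

The guard costs nothing unless someone actually calls new Role(), and the error message then points directly at the mistake.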
Symbols and unique values: preventing accidental collisions
If you need enum values that cannot be spoofed, Symbol is a strong option. This is useful when values are only used in-memory and never serialized.
const CacheState = Object.freeze({
  WARM: Symbol('warm'),
  COLD: Symbol('cold'),
  STALE: Symbol('stale')
});
function isCold(state) {
  return state === CacheState.COLD;
}
console.log(isCold(CacheState.COLD)); // true
console.log(isCold(Symbol('cold'))); // false
Symbols prevent a string like 'cold' from pretending to be CacheState.COLD. That is strong safety. The tradeoff is that you cannot send a symbol through JSON or store it in a database. I use symbols for internal states, not for anything that crosses the network.
Enum modules: a small API that scales well
On larger teams, I often move beyond a single exported object and provide a tiny module around the enum. This keeps validation, labels, and lists in one place and prevents duplicate helper functions across the codebase.
const OrderStatus = Object.freeze({
  PENDING: 'pending',
  PAID: 'paid',
  SHIPPED: 'shipped',
  CANCELED: 'canceled'
});
const OrderStatusLabel = Object.freeze({
  [OrderStatus.PENDING]: 'Pending',
  [OrderStatus.PAID]: 'Paid',
  [OrderStatus.SHIPPED]: 'Shipped',
  [OrderStatus.CANCELED]: 'Canceled'
});
const OrderStatusValues = Object.freeze(Object.values(OrderStatus));
const OrderStatusSet = new Set(OrderStatusValues);
function isOrderStatus(value) {
  return OrderStatusSet.has(value);
}
function assertOrderStatus(value) {
  if (!isOrderStatus(value)) {
    throw new Error(`Invalid order status: ${value}`);
  }
  return value;
}
function orderStatusLabel(value) {
  return OrderStatusLabel[value] || 'Unknown';
}
export {
  OrderStatus,
  OrderStatusValues,
  isOrderStatus,
  assertOrderStatus,
  orderStatusLabel
};
I like this shape because it makes the enum the single source of truth and gives the rest of the app a consistent API. It also keeps expensive work (like building a Set) out of hot loops and avoids repeated Object.values calls.
Validation, typing, and modern tooling in 2026
Enums shine when you enforce them at boundaries. In 2026, most teams I work with use a mix of runtime validation and static typing. You can do both without ceremony.
Here is a runtime guard that works with plain objects:
const ShippingSpeed = Object.freeze({
  STANDARD: 'standard',
  EXPRESS: 'express',
  OVERNIGHT: 'overnight'
});
const ShippingSpeedSet = new Set(Object.values(ShippingSpeed));
function assertShippingSpeed(value) {
  if (!ShippingSpeedSet.has(value)) {
    throw new Error(`Invalid shipping speed: ${value}`);
  }
  return value;
}
const speed = assertShippingSpeed('express');
console.log(speed); // 'express'
And here is the TypeScript side using as const, which gives you a union of literal values:
const TicketStatus = {
  OPEN: 'open',
  IN_PROGRESS: 'in_progress',
  RESOLVED: 'resolved'
} as const;
type TicketStatus = typeof TicketStatus[keyof typeof TicketStatus];
function setStatus(status: TicketStatus) {
  return status;
}
This pattern keeps the runtime enum and the compile-time type in sync, without duplicating anything. If you are in plain JavaScript, you can approximate this with JSDoc types and ESLint rules, but TypeScript does it better with fewer surprises.
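For teams staying in plain JavaScript, here is a rough sketch of that JSDoc approximation. It is runnable JS as-is; the annotations only take effect in a TypeScript-aware editor or with checkJs enabled, and the names are illustrative:

```javascript
const TicketStatus = Object.freeze({
  OPEN: 'open',
  IN_PROGRESS: 'in_progress',
  RESOLVED: 'resolved'
});

/** @typedef {typeof TicketStatus[keyof typeof TicketStatus]} TicketStatusValue */

/**
 * @param {TicketStatusValue} status
 * @returns {TicketStatusValue}
 */
function setStatus(status) {
  return status;
}
```

The checks are weaker than real TypeScript, but the enum object still stays the single source of truth.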
AI-assisted workflows also help here: I often ask a code assistant to derive enum values from API docs, then I review for correctness. The key is still human ownership. If you do not own the list, you do not own the meaning.
Traditional vs modern enum workflows
- Traditional pattern: hand-coded strings in multiple files
- Modern pattern: single enum object + as const for types
- Modern validation: schema validation + enum object
The modern pattern reduces duplication. When I can, I make the enum object the source of truth and derive everything else from it.
Exhaustiveness checks: catching missing cases early
One of the best hidden benefits of enums is that they allow exhaustive checks. When you switch over an enum and cover all cases, you can detect missing branches the moment a new value is added.
In plain JavaScript, I add a small helper to flag unexpected values:
function assertNever(value) {
  throw new Error(`Unexpected value: ${value}`);
}
const ReviewState = Object.freeze({
  DRAFT: 'draft',
  IN_REVIEW: 'in_review',
  APPROVED: 'approved',
  REJECTED: 'rejected'
});
function nextReviewState(state, event) {
  switch (state) {
    case ReviewState.DRAFT:
      return event === 'submit' ? ReviewState.IN_REVIEW : state;
    case ReviewState.IN_REVIEW:
      return event === 'approve' ? ReviewState.APPROVED : ReviewState.REJECTED;
    case ReviewState.APPROVED:
    case ReviewState.REJECTED:
      return state;
    default:
      return assertNever(state);
  }
}
In TypeScript, the same pattern gives you a compile-time error if you forget a case. Even in plain JS, the default branch gives you a clear runtime signal rather than silent misbehavior.
Iteration, mapping, and data boundaries
Enums become even more useful when you need to iterate or map across values. The key is to stay explicit so your intent is visible to teammates.
const SubscriptionTier = Object.freeze({
  FREE: 'free',
  PRO: 'pro',
  TEAM: 'team'
});
const TierLabel = Object.freeze({
  [SubscriptionTier.FREE]: 'Free',
  [SubscriptionTier.PRO]: 'Pro',
  [SubscriptionTier.TEAM]: 'Team'
});
function listTierOptions() {
  return Object.values(SubscriptionTier).map((value) => ({
    value,
    label: TierLabel[value]
  }));
}
console.log(listTierOptions());
For data boundaries, be explicit about serialization. If you use numeric enums internally, convert to strings before sending to analytics or APIs so the meaning survives. If you accept an external value, validate it immediately and fail fast. I prefer to validate at the boundary, not in the business layer, so the rest of the code can trust the value.
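As a sketch of that boundary rule, here is a hypothetical adapter that keeps numeric priorities internal and converts them to self-describing strings right before serialization; the enum and function names are illustrative, not from the text above:

```javascript
// Internal numeric enum and its wire (string) representation.
const Priority = Object.freeze({ LOW: 1, MEDIUM: 2, HIGH: 3 });
const PriorityWire = Object.freeze({ 1: 'low', 2: 'medium', 3: 'high' });

// Called only at the boundary, so core logic never sees wire strings.
function priorityToWire(priority) {
  const wire = PriorityWire[priority];
  if (!wire) {
    throw new Error(`Unknown priority: ${priority}`);
  }
  return wire;
}
```

The same shape works in reverse for inbound payloads: validate and convert once, then hand the rest of the code a trusted internal value.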
If you need reverse lookup, build it once:
const HttpStatus = Object.freeze({
  OK: 200,
  NOT_FOUND: 404,
  SERVER_ERROR: 500
});
const HttpStatusName = Object.freeze({
  200: 'OK',
  404: 'NOT_FOUND',
  500: 'SERVER_ERROR'
});
function describeStatus(code) {
  return HttpStatusName[code] || 'UNKNOWN';
}
console.log(describeStatus(404)); // 'NOT_FOUND'
This avoids iteration every time you need a name, and it keeps the mapping readable.
Bit flags and permission enums: compact, fast, and tricky
Sometimes you need a compact representation of many boolean options. Bit flags can be a good fit, but they require discipline. I use them for permissions and feature switches, not for general business states.
const Permission = Object.freeze({
READ: 1 << 0, // 1
WRITE: 1 << 1, // 2
DELETE: 1 << 2 // 4
});
function hasPermission(mask, perm) {
return (mask & perm) === perm;
}
function addPermission(mask, perm) {
return mask | perm;
}
function removePermission(mask, perm) {
return mask & ~perm;
}
let userPerms = 0;
userPerms = addPermission(userPerms, Permission.READ);
userPerms = addPermission(userPerms, Permission.WRITE);
console.log(hasPermission(userPerms, Permission.READ)); // true
console.log(hasPermission(userPerms, Permission.DELETE)); // false
This is efficient and easy to store, but it is easy to misuse. My rule is: only use flags when you truly need combinations of many values, and document them heavily. For most UI and API states, plain string enums are clearer and safer.
Enums across APIs, databases, and analytics
Enums become part of your system contract. That means you need to think about migrations and compatibility, not just code ergonomics.
I usually follow a few principles:
- Prefer string values at boundaries. They are self-describing and survive reordering.
- Avoid renames without migration. If you must rename, accept old values for a period and map them to the new ones.
- Write adapters at boundaries, not in core logic. Keep the domain model clean and translate at the edges.
Here is a safe pattern for deprecating a value:
const Plan = Object.freeze({
  FREE: 'free',
  BASIC: 'basic',
  PRO: 'pro'
});
const DeprecatedPlan = Object.freeze({
  STARTER: 'starter'
});
function normalizePlan(value) {
  if (value === DeprecatedPlan.STARTER) return Plan.BASIC;
  return value;
}
function assertPlan(value) {
  const normalized = normalizePlan(value);
  if (!Object.values(Plan).includes(normalized)) {
    throw new Error(`Invalid plan: ${value}`);
  }
  return normalized;
}
This lets you keep backward compatibility without letting old values leak into the rest of the system. I also recommend adding logging or analytics for deprecated values so you can see when it is safe to remove the compatibility path.
UI labels, localization, and display-safe enums
A common mistake is to use the enum value directly in the UI. That works for internal tools but breaks the moment you localize or want human-friendly labels.
Instead, keep a label mapping that is independent of the enum value:
const InvoiceStatus = Object.freeze({
  DUE: 'due',
  PAID: 'paid',
  OVERDUE: 'overdue'
});
const InvoiceStatusLabel = Object.freeze({
  [InvoiceStatus.DUE]: 'Due',
  [InvoiceStatus.PAID]: 'Paid',
  [InvoiceStatus.OVERDUE]: 'Overdue'
});
function getStatusLabel(value) {
  return InvoiceStatusLabel[value] || 'Unknown';
}
If you localize, you can replace the label map with a function that pulls from your i18n layer. The enum stays stable, and the presentation layer adapts.
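A sketch of that swap, assuming a hypothetical t(key) function standing in for your i18n library and an illustrative key scheme:

```javascript
const InvoiceStatus = Object.freeze({
  DUE: 'due',
  PAID: 'paid',
  OVERDUE: 'overdue'
});

// Stand-in for a real i18n lookup; replace with your library's function.
function t(key) {
  const messages = {
    'invoice.status.due': 'Due',
    'invoice.status.paid': 'Paid',
    'invoice.status.overdue': 'Overdue'
  };
  return messages[key] || key;
}

// The enum value becomes part of the translation key, never the display text.
function getStatusLabel(value) {
  return t(`invoice.status.${value}`);
}
```

Because the key is derived from the stable enum value, adding a locale means adding messages, not touching the enum.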
Enum factories: reducing boilerplate
When you have many enums, you can build a small factory that standardizes behavior. I keep it minimal so it stays obvious and debuggable.
function createEnum(definition) {
  const values = Object.values(definition);
  const set = new Set(values);
  const frozen = Object.freeze({ ...definition });
  return Object.freeze({
    ...frozen,
    values: () => values.slice(),
    has: (value) => set.has(value),
    assert: (value) => {
      if (!set.has(value)) throw new Error(`Invalid enum value: ${value}`);
      return value;
    }
  });
}
const Channel = createEnum({
  EMAIL: 'email',
  SMS: 'sms',
  PUSH: 'push'
});
console.log(Channel.EMAIL); // 'email'
console.log(Channel.has('sms')); // true
console.log(Channel.values()); // ['email', 'sms', 'push']
This pattern keeps the enum and its helpers together without pushing you into class syntax. It also makes it easy to standardize how enums are validated across the app.
Common mistakes and how I avoid them
Here are the mistakes I see most, along with how I counter them:
- Mixing casing and naming. Decide on one style and stick to it. I use UPPER_SNAKE for keys and lower_snake for values when the values are used in APIs.
- Using numbers without labels. If you cannot explain the number at a glance, add a label map or switch to strings.
- Duplicating the list. Keep one enum object and import it everywhere. If you need different shapes (labels, UI options), derive from it.
- Skipping validation at boundaries. Validate immediately when you parse user input or external payloads. It keeps downstream code simpler and safer.
- Overusing enums. If you cannot list all possible values on a single screen, it might not be an enum.
Performance concerns are usually minor. Enum lookups are property reads, which are fast. The slow part is often validation in tight loops. If you validate thousands of items, do it in batches or at ingestion. In most app workloads, the cost is small, typically in low milliseconds for batch validations, while the readability gain is huge.
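For the batch case, here is a rough sketch of validating once at ingestion rather than per item deep in business logic; the record shape and function name are assumptions for illustration:

```javascript
const Status = Object.freeze({ NEW: 'new', READY: 'ready', DONE: 'done' });
const StatusSet = new Set(Object.values(Status));

// Reject the whole batch up front so downstream code can trust every item.
function ingestBatch(items) {
  const invalid = items.filter((item) => !StatusSet.has(item.status));
  if (invalid.length > 0) {
    throw new Error(`Rejected batch: ${invalid.length} items with unknown status`);
  }
  return items;
}
```

After ingestion succeeds, the rest of the pipeline can skip per-item checks entirely.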
Edge cases that bite in production
A few edge cases show up repeatedly in real systems. I try to anticipate them early.
Case sensitivity and whitespace
Inputs from browsers, integrations, and CSVs often bring weird casing or extra spaces. I normalize at the boundary, not inside core logic.
const Region = Object.freeze({
  US: 'us',
  EU: 'eu',
  APAC: 'apac'
});
function normalizeRegion(input) {
  return String(input).trim().toLowerCase();
}
function assertRegion(input) {
  const value = normalizeRegion(input);
  if (!Object.values(Region).includes(value)) {
    throw new Error(`Invalid region: ${input}`);
  }
  return value;
}
Drift between services
One service uses in_progress, another uses inProgress. This is the classic enum drift problem. The fix is boring but effective: publish a shared enum module or generate enums from a single schema that both services import.
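When a shared module is not feasible yet, an alias map at the boundary can at least absorb the known spellings. This is a sketch; the enum name and alias list are illustrative:

```javascript
const TaskState = Object.freeze({ IN_PROGRESS: 'in_progress' });

// Known spellings from other services, mapped to the canonical value.
const TaskStateAliases = Object.freeze({
  inProgress: TaskState.IN_PROGRESS,
  'in-progress': TaskState.IN_PROGRESS
});

function normalizeTaskState(value) {
  return TaskStateAliases[value] || value;
}
```

Log whenever an alias is hit so you know when the other service has migrated and the map can be deleted.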
Database constraints
If you use a database enum type, keep it synchronized. When I cannot change the DB enum easily, I keep the DB values as strings and add a check constraint or validation layer in the application. That keeps migrations safer.
Feature flags and experiments
Do not represent experiment names as enums unless the experiment set is stable. Experiments change weekly; enums are for stable value sets.
Performance considerations that matter in practice
I rarely optimize enum usage, but I do watch for two hot paths:
- Validation in loops. If you validate thousands of items, use a Set and avoid calling Object.values per iteration.
- Serialization cost. Symbols cannot be serialized, so avoid them at boundaries to prevent hidden errors.
Here is a simple micro-optimization I use in hot paths:
const Status = Object.freeze({
  NEW: 'new',
  READY: 'ready',
  DONE: 'done'
});
const StatusSet = new Set(Object.values(Status));
function isStatus(value) {
  return StatusSet.has(value);
}
This is not about shaving microseconds; it is about keeping the intent clear and avoiding repeated work. If you never hit a loop, skip the Set and keep it simple.
Testing enums: tiny tests, big confidence
Enums are small, but they are foundational. A tiny test suite can prevent surprising regressions.
Here is what I usually cover:
import { OrderStatus, isOrderStatus } from './order-status.js';
test('order status is stable', () => {
  expect(Object.values(OrderStatus)).toEqual([
    'pending',
    'paid',
    'shipped',
    'canceled'
  ]);
});
test('order status validation', () => {
  expect(isOrderStatus('paid')).toBe(true);
  expect(isOrderStatus('unknown')).toBe(false);
});
These tests look trivial, but they catch accidental renames and ensure that validation helpers do not drift from the enum definition. When a breaking change happens, tests fail immediately instead of letting bugs slip into production.
Where enums shine: states, roles, events, and errors
I recommend enums when you have a fixed list that should be known in advance. Here are the sweet spots:
- State management: OrderStatus, UploadState, SyncState.
- User roles: ADMIN, EDITOR, VIEWER.
- Event types: CLICK, SUBMIT, FOCUS.
- Error codes: stable identifiers for error handling and analytics.
Here is a simple, complete state machine stub using enums:
const UploadState = Object.freeze({
  IDLE: 'idle',
  UPLOADING: 'uploading',
  SUCCESS: 'success',
  ERROR: 'error'
});
function nextState(current, event) {
  switch (current) {
    case UploadState.IDLE:
      return event === 'start' ? UploadState.UPLOADING : current;
    case UploadState.UPLOADING:
      return event === 'success' ? UploadState.SUCCESS : UploadState.ERROR;
    default:
      return current;
  }
}
console.log(nextState(UploadState.IDLE, 'start')); // 'uploading'
Even if you later migrate to a full state machine library, this pattern keeps the values explicit and easy to test.
When enums are the wrong tool
Enums are not for everything. I avoid them when:
- The set of values changes frequently or is user-defined (for example, tags, labels, categories).
- The list is too large to maintain by hand (for example, product IDs in a catalog).
- Values are experimental or tied to configuration files that vary per deployment.
In those cases, a configuration file or database table is the better source of truth. If the value set is dynamic, forcing it into code becomes a bottleneck. A helpful analogy: enums are like street signs; they should be stable and consistent. If the sign changes every week, it is not a sign anymore, it is noise.
Alternative approaches and tradeoffs
Enums are only one way to build consistency. A few alternatives are worth knowing so you can choose intentionally.
Sets for validation only
If you just need a list to validate against, a Set can be enough. The tradeoff is you lose names and readability.
const ValidStates = new Set(['draft', 'published', 'archived']);
function isValidState(value) {
  return ValidStates.has(value);
}
Config-driven values
If operations teams need to edit the values without a deploy, a config file or database table is a better fit. The tradeoff is weaker compile-time checks and more runtime validation.
Schema-first enums
If your system is schema-driven, you can define enums in a schema and generate code from it. The upside is consistency across services. The downside is build complexity and a dependency on code generation.
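As a tiny sketch of the schema-first idea, here is a generator that derives a frozen enum object from a schema-style value list. The schema shape here is an assumption for illustration, not any particular standard:

```javascript
// Turn a schema definition into a frozen enum object.
function enumFromSchema(def) {
  const entries = def.values.map((value) => [value.toUpperCase(), value]);
  return Object.freeze(Object.fromEntries(entries));
}

const OrderStatus = enumFromSchema({
  name: 'OrderStatus',
  values: ['pending', 'paid', 'shipped']
});
```

In a real setup the schema would live in a shared file that every service generates from, which is what keeps the values from drifting.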
I do not think these alternatives replace enums. They are complements. Use the right tool for the job.
A practical checklist I use before shipping an enum
I keep a short checklist in my head:
- Is the list stable for months, not days?
- Will a human benefit from seeing the value in logs?
- Is the value going over the wire or into storage?
- Do I need helper methods or is a frozen object enough?
If I answer yes to the first three, I build a string enum. If I only need internal safety, I consider symbols. If I need helpers, I add a class wrapper or a tiny module that exports helpers.
A complete, real-world pattern I use in production
Here is a fuller example that shows how I structure an enum module for a production feature. This includes values, labels, validation, and integration helpers without adding too much complexity.
// billing-status.js
const BillingStatus = Object.freeze({
  TRIAL: 'trial',
  ACTIVE: 'active',
  PAST_DUE: 'past_due',
  CANCELED: 'canceled'
});
const BillingStatusLabel = Object.freeze({
  [BillingStatus.TRIAL]: 'Trial',
  [BillingStatus.ACTIVE]: 'Active',
  [BillingStatus.PAST_DUE]: 'Past due',
  [BillingStatus.CANCELED]: 'Canceled'
});
const BillingStatusValues = Object.freeze(Object.values(BillingStatus));
const BillingStatusSet = new Set(BillingStatusValues);
function isBillingStatus(value) {
  return BillingStatusSet.has(value);
}
function assertBillingStatus(value) {
  if (!isBillingStatus(value)) {
    throw new Error(`Invalid billing status: ${value}`);
  }
  return value;
}
function billingStatusLabel(value) {
  return BillingStatusLabel[value] || 'Unknown';
}
function billingStatusOptions() {
  return BillingStatusValues.map((value) => ({
    value,
    label: billingStatusLabel(value)
  }));
}
export {
  BillingStatus,
  BillingStatusValues,
  isBillingStatus,
  assertBillingStatus,
  billingStatusLabel,
  billingStatusOptions
};
It is not fancy, but it scales across backend validation, UI dropdowns, and logs. The value set stays centralized, and every other representation derives from it.
A final word on ergonomics and culture
Enums are small, but they shape the feel of a codebase. They nudge your project toward shared meaning instead of scattered strings. When you choose your enum pattern intentionally, you are choosing to make your future debugging sessions easier and your teammates more confident. That is the real payoff. I recommend you start with plain objects and freeze them by default, then grow into class-based or symbol-based variants only when the requirements force you there. The result is code that reads like a clear sentence rather than a riddle.
Take a quick pass through one module this week and replace a handful of magic strings with a named enum. Add a small validation helper at the boundary that uses a Set or Object.values. Watch how quickly the code becomes easier to scan and how quickly tests start telling you exactly what went wrong. This is one of those small habits that pays back repeatedly, especially as teams and systems scale.


