Last quarter I was wiring a checkout form that collected line items as a JavaScript array. The API accepted JSON text, not live objects. I passed the array straight into fetch and saw a 400 error. The array printed as [34, 24, 31, 48], so it looked like JSON. It wasn't. JSON is a string, and without stringifying the payload and setting the header, the server read an empty body. Once I fixed that boundary, the bug disappeared.
Since then I treat "convert an array to JSON" as two tasks: serialize the array into JSON text for transport or storage, and reshape the array into an object when the receiving side needs named keys. I'll show both, with runnable code, clear guidance on when to choose each path, and the edge cases that can turn simple conversions into production bugs. I'll also share the 2026-era workflow I use with TypeScript and schema validators so you can confirm that what you send is exactly what your API expects.
JSON is a string, not your array
JSON is a text format with a strict grammar: double-quoted keys, double-quoted strings, numbers, booleans, null, arrays, and objects. A JavaScript array lives in memory with methods, prototypes, and values that JSON does not understand, like functions or symbols. When people say "convert array to JSON", they often mean "turn this live array into JSON text." That distinction matters because a JavaScript array and JSON text behave differently in transport and storage.
In JavaScript, the value [1, 2, 3] is an Array object. The JSON representation is the string '[1,2,3]' (including the quotes) or a sequence of bytes on the wire. If you send the live array through fetch without stringifying, the request body is empty in many runtimes because the body expects a string, Blob, ArrayBuffer, or stream. Some libraries coerce arrays with .toString(), which produces '1,2,3' — that is not JSON and will fail a JSON parser.
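The difference is easy to check in a console. Both of the following produce strings, but only one is JSON:

```javascript
const nums = [1, 2, 3]

// Default string coercion: comma-separated values, not JSON
console.log(String(nums))           // '1,2,3'

// Proper serialization: valid JSON text
console.log(JSON.stringify(nums))   // '[1,2,3]'

// Only the second form survives a JSON parser.
// JSON.parse('1,2,3') would throw a SyntaxError.
console.log(JSON.parse(JSON.stringify(nums)))
```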
JSON also forbids trailing commas and comments. It has no representation for undefined, functions, or NaN. A JavaScript array can contain all of those, and your code will still run. The moment you cross the boundary, the rules change. I prefer JSON.stringify because it enforces the grammar for me and makes invalid values obvious.
On the way back you call JSON.parse to rebuild a JavaScript value. That round trip drops any methods and custom prototypes, and it forces everything into plain arrays and objects. I treat JSON as a shipping label: it describes the contents but it is not the box. I only attach the label at the boundary — network, localStorage, logs, or cross-process messaging — and I keep the array as a real array inside the app.
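The flattening is easy to see with a Date inside an array: it crosses the boundary as an ISO string and does not come back as a Date.

```javascript
const events = [new Date(0)]

const text = JSON.stringify(events)   // '["1970-01-01T00:00:00.000Z"]'
const back = JSON.parse(text)

console.log(back[0] instanceof Date)  // false (it is now a plain string)
console.log(typeof back[0])           // 'string'
```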
With that mindset, conversion becomes a boundary concern, not an internal data step. It also explains why there are two common outcomes: a JSON string when you need to transmit an array as-is, and a JSON object when you need to re-key the array for a contract that expects named properties.
The everyday choice: JSON.stringify on arrays
JSON.stringify is the canonical way to turn a JavaScript value into JSON text. It works on arrays, objects, and primitives, and it gives you predictable output for transport. The most important habit is to call it as close to the boundary as possible. I don't stringify in the middle of business logic; I stringify when I store or send.
Here is the simplest version:
const items = [34, 24, 31, 48]
const payload = JSON.stringify(items)
// payload is '[34,24,31,48]'
That string is valid JSON and can be sent over the network, saved to a file, or persisted in localStorage.
Basic conversion with nested arrays
JSON.stringify handles nested arrays and arrays of objects as long as the values are JSON-friendly:
const lineItems = [
  { sku: 'A12', qty: 2, price: 19.99 },
  { sku: 'B55', qty: 1, price: 5.5 }
]
const payload = JSON.stringify(lineItems)
The output is a JSON array of objects. When you parse it later, you get a plain JavaScript array of plain objects.
Pretty printing for logs and debugging
When I'm logging or storing files for humans to read, I use the spacing parameters:
const payload = JSON.stringify(lineItems, null, 2)
The second argument is a replacer, and the third is spacing. The result is larger, so I only use it for logs, fixtures, or documentation.
Custom replacers for safe serialization
If the array contains values that you need to transform (like Date objects), you can use a replacer function:
const payload = JSON.stringify(lineItems, (key, value) => {
  if (value instanceof Date) return value.toISOString()
  return value
})
The replacer runs for every value. It lets you shape the JSON without mutating the original array.
Using toJSON for domain objects
Another pattern is to define a toJSON method on your objects. JSON.stringify will call it automatically:
class Item {
  constructor(sku, qty) {
    this.sku = sku
    this.qty = qty
    this.internalId = Math.random()
  }
  toJSON() {
    return { sku: this.sku, qty: this.qty }
  }
}
const payload = JSON.stringify([new Item('A12', 2)])
This gives you control over what fields cross the boundary.
Sending arrays over the network (fetch, axios, and Node)
Most array-to-JSON bugs I see in the wild happen at the network boundary. The array itself is fine, but the request body is wrong or the header is missing.
fetch in the browser
When you use fetch, always stringify and set Content-Type to application/json:
const res = await fetch('/api/checkout', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify(lineItems)
})
If you skip the header, some servers will ignore the body or parse it as form data. If you skip JSON.stringify, many runtimes convert the array to '1,2,3' or send nothing at all.
axios and other clients
Axios will stringify objects for you, but it treats arrays differently depending on config. I still stringify explicitly when sending a raw array:
await axios.post('/api/checkout', JSON.stringify(lineItems), {
  headers: { 'Content-Type': 'application/json' }
})
This keeps behavior consistent across environments.
Node and server-to-server calls
In Node, you might be using fetch, undici, or a custom HTTP client. The same boundary rules apply. One extra gotcha: some clients set Content-Length based on string size, so if you pass an array directly, the length might be zero and the server will reject it.
Double stringify and escaped JSON
I see a subtle bug when arrays are stringified twice. The server then receives a JSON string that contains JSON, not the array. The payload looks like a quoted array with extra quotes around it. If the server parses once, it gets a string, not an array. If you see extra quotes in logs, you're double stringifying.
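The bug is easy to reproduce, and the escaped quotes are the tell:

```javascript
const items = [1, 2]

const once = JSON.stringify(items)    // '[1,2]' (correct)
const twice = JSON.stringify(once)    // '"[1,2]"' (JSON containing JSON)

// A single parse on the receiving side now yields a string, not an array
const received = JSON.parse(twice)
console.log(typeof received)          // 'string'
console.log(Array.isArray(received))  // false
```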
Storing arrays as JSON text
Not every conversion is about HTTP. JSON is a common choice for storage because it's portable and readable.
localStorage and sessionStorage
These browser APIs only store strings. That means any array must be stringified:
localStorage.setItem('cart', JSON.stringify(lineItems))
When you load it back:
const raw = localStorage.getItem('cart')
const cart = raw ? JSON.parse(raw) : []
I always default to an empty array in case the key is missing or corrupted.
IndexedDB and modern storage
IndexedDB can store structured data without stringifying, but I still use JSON for portability across layers. It also makes debugging simpler because I can inspect the raw text in dev tools or logs.
Files, logs, and exports
When exporting data to a file, JSON arrays are straightforward:
const exportText = JSON.stringify(lineItems, null, 2)
This creates a JSON file that can be opened anywhere. For logs, I often keep one JSON per line (NDJSON) because it makes streaming and grep-friendly workflows easier.
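A minimal NDJSON export needs no library: one JSON.stringify per item, joined with newlines.

```javascript
const lineItems = [
  { sku: 'A12', qty: 2 },
  { sku: 'B55', qty: 1 }
]

// One complete JSON value per line, easy to stream and grep
const ndjson = lineItems.map(item => JSON.stringify(item)).join('\n')
// '{"sku":"A12","qty":2}\n{"sku":"B55","qty":1}'

// Reading it back: split on newlines, parse each line independently
const restored = ndjson.split('\n').map(line => JSON.parse(line))
```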
Converting arrays into JSON objects (re-keying)
Sometimes the API doesn't want an array. It wants an object with named keys. That is still JSON, but the shape is different.
Convert positional arrays to named fields
If you have arrays that represent positions (like [x, y, z]), you can map them to keys:
const vector = [12, 5, 9]
const payload = JSON.stringify({ x: vector[0], y: vector[1], z: vector[2] })
This makes your payload self-describing and easier to validate.
Convert arrays to keyed objects
If you have pairs or tuples, Object.fromEntries is concise:
const pairs = [['sku', 'A12'], ['qty', 2], ['price', 19.99]]
const payload = JSON.stringify(Object.fromEntries(pairs))
Reduce for richer shapes
I use reduce when the mapping is more complex:
const lineItems = [
  { sku: 'A12', qty: 2 },
  { sku: 'B55', qty: 1 }
]
const bySku = lineItems.reduce((acc, item) => {
  acc[item.sku] = { qty: item.qty }
  return acc
}, {})
const payload = JSON.stringify(bySku)
This is useful when an API expects a dictionary keyed by id.
Arrays of objects: when to normalize
If your array is full of objects, you face a design choice: keep it as an array or normalize it to an object map.
Keep arrays when order matters
Arrays preserve order. If the order has meaning, keep it. For checkout items, order may not matter, but for a timeline or playlist it absolutely does.
Normalize when lookups are frequent
If you're going to look up by id or sku often, a keyed object reduces repeated scans. The typical pattern is to send both:
const payload = JSON.stringify({
  order: lineItems.map(item => item.sku),
  items: Object.fromEntries(lineItems.map(item => [item.sku, item]))
})
This mirrors the way many APIs and caches work.
Be mindful of duplicate keys
Normalization assumes unique keys. If the array can contain duplicates, you need to decide whether to merge, override, or group into arrays. I prefer explicit grouping:
const grouped = lineItems.reduce((acc, item) => {
  acc[item.sku] = acc[item.sku] || []
  acc[item.sku].push(item)
  return acc
}, {})
Edge cases that break JSON conversion
JSON.stringify is strict, but JavaScript is not. Here are the edge cases I test for.
undefined, functions, and symbols
In arrays, undefined, functions, and symbols all become null (as object property values they are dropped instead):
JSON.stringify([undefined, () => {}, Symbol('x')])
// '[null,null,null]'
If those values matter, you need custom serialization or validation before stringify.
NaN and Infinity
JSON has no NaN or Infinity. They become null:
JSON.stringify([NaN, Infinity, -Infinity])
// '[null,null,null]'
If you care about them, convert to strings or a sentinel value.
BigInt
BigInt throws a TypeError in JSON.stringify. I usually convert to string:
JSON.stringify({ id: 9007199254740993n })
// TypeError
JSON.stringify({ id: 9007199254740993n.toString() })
Date objects
Date instances have a toJSON method that returns an ISO 8601 string, so JSON.stringify serializes them that way by default. That is usually what I want, but I still make it explicit when correctness matters.
Map and Set
Map and Set stringify as empty objects unless you convert them. If you ever see {} in output where you expected data, check for Map or Set.
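Converting before stringifying avoids the empty-object surprise:

```javascript
const quantities = new Map([['A12', 2], ['B55', 1]])
const skus = new Set(['A12', 'B55'])

console.log(JSON.stringify(quantities))  // '{}' (data silently lost)
console.log(JSON.stringify(skus))        // '{}'

// Convert first: Map to a plain object, Set to an array
console.log(JSON.stringify(Object.fromEntries(quantities)))
// '{"A12":2,"B55":1}'
console.log(JSON.stringify([...skus]))
// '["A12","B55"]'
```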
Circular references
JSON.stringify fails on circular structures with a clear error. This is the most common crash in complex apps. If your array can contain references back to itself, you need to de-cycle or replace cycles with ids.
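One common de-cycling approach is a replacer that tracks visited objects in a WeakSet and substitutes a marker for repeats. This is a sketch suited to logging: it also replaces legitimate shared references, so it is lossy by design.

```javascript
// Replace circular (and repeated) object references with a placeholder
function safeStringify(value) {
  const seen = new WeakSet()
  return JSON.stringify(value, (key, val) => {
    if (typeof val === 'object' && val !== null) {
      if (seen.has(val)) return '[Circular]'
      seen.add(val)
    }
    return val
  })
}

const node = { id: 1 }
node.self = node                    // a cycle that would crash JSON.stringify

console.log(safeStringify([node]))  // '[{"id":1,"self":"[Circular]"}]'
```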
Parsing JSON back into arrays safely
Converting to JSON is only half the story. The receiver must parse and validate.
Use try/catch with guardrails
I never call JSON.parse without a try/catch when input is external:
let items = []
try {
  items = JSON.parse(raw)
} catch (err) {
  items = []
}
Silent failure is bad, but a safe fallback beats a crash. In production, I log the error and the offending payload length.
Use a reviver for type repair
JSON.parse accepts a reviver function to adjust values:
const items = JSON.parse(raw, (key, value) => {
  if (key === 'createdAt' && typeof value === 'string') {
    return new Date(value)
  }
  return value
})
This keeps your array rich without storing non-JSON types.
Validate shape before trust
Parsing only guarantees syntax, not shape. I always validate the structure before using it, especially if the data comes from a user or another service.
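Even without a schema library, a small hand-rolled guard catches most shape problems. Here is a sketch for the line-item arrays used above (isLineItemArray is my own helper name, not a standard API):

```javascript
// Check that a parsed value really is an array of { sku, qty } objects
function isLineItemArray(value) {
  return Array.isArray(value) && value.every(item =>
    item !== null &&
    typeof item === 'object' &&
    typeof item.sku === 'string' &&
    typeof item.qty === 'number'
  )
}

console.log(isLineItemArray(JSON.parse('[{"sku":"A12","qty":2}]')))  // true
console.log(isLineItemArray(JSON.parse('[{"sku":"A12"}]')))          // false (qty missing)
console.log(isLineItemArray(JSON.parse('{"sku":"A12"}')))            // false (not an array)
```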
The 2026 workflow: TypeScript + schema validators
This is the part that keeps me sane in production. I model the array as a type, then I validate it at runtime.
Define a schema once
With Zod, Valibot, or TypeBox, I define the expected shape:
import { z } from 'zod'
const LineItem = z.object({
  sku: z.string(),
  qty: z.number().int().min(1),
  price: z.number().nonnegative()
})
const LineItems = z.array(LineItem)
Validate before stringify
When sending, I run validation so I don't ship garbage:
const safeItems = LineItems.parse(lineItems)
const payload = JSON.stringify(safeItems)
If validation fails, I surface an error before the network call.
Validate after parse on the server
On the server, I parse JSON and validate again:
const body = await req.json()
const items = LineItems.parse(body)
This protects against malformed clients and keeps internal logic consistent.
JSON schema for cross-language contracts
If I need a shared contract across services, I generate JSON Schema with TypeBox or Zod-to-JSON tools, then use AJV on the server. That gives me runtime validation plus a contract that other languages can consume.
Why I still validate even with TypeScript
TypeScript only exists at compile time. JSON payloads are runtime data. Validation is the bridge.
Performance considerations for large arrays
JSON is simple, but stringifying huge arrays can be expensive.
Time and memory costs
JSON.stringify runs in O(n) time relative to array size and creates a new string in memory. For arrays in the tens or hundreds of thousands of items, memory pressure and GC churn become visible. In my experience, the cost can jump from negligible to noticeable as soon as you cross a few megabytes.
Chunking and streaming
For very large arrays, I either chunk into multiple requests or switch to NDJSON, which streams items line by line. NDJSON is still JSON, but you don't need to hold the entire array in memory.
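A chunking helper is a few lines; each slice can then be stringified and sent as its own request. The batch size here is arbitrary and depends on your payload limits:

```javascript
// Split a large array into fixed-size slices for separate requests
function chunk(array, size) {
  const chunks = []
  for (let i = 0; i < array.length; i += size) {
    chunks.push(array.slice(i, i + size))
  }
  return chunks
}

const big = Array.from({ length: 10 }, (_, i) => i)
const batches = chunk(big, 4)   // [[0,1,2,3], [4,5,6,7], [8,9]]
// batches.map(b => JSON.stringify(b)) gives one payload per request
```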
Compression
If the payload is large, I enable gzip or brotli at the transport layer. This reduces bandwidth with minimal code changes.
Beware of stringifying twice
If you stringify and then wrap it in another JSON object, you're adding overhead and confusion. If you need metadata plus the array, send a proper object:
const payload = JSON.stringify({ items: lineItems, version: 1 })
Common pitfalls I still see
This is my short list of mistakes that show up in bug reports:
- Forgetting Content-Type: application/json
- Passing arrays directly to fetch or axios without stringify
- Double stringifying and receiving a JSON string instead of an array
- Expecting JSON.parse to restore class instances or methods
- Assuming undefined survives the boundary
- Using array.toString which produces comma-separated values, not JSON
- Storing arrays in localStorage without parse on retrieval
- Relying on object key order after converting arrays to objects
- Losing numeric precision for very large integers
- Logging JSON with circular references
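The precision pitfall in that list is easy to demonstrate. Integers above Number.MAX_SAFE_INTEGER round silently during parse, which is why big ids should travel as strings:

```javascript
// 9007199254740993 is just above Number.MAX_SAFE_INTEGER (9007199254740991)
const parsed = JSON.parse('{"id":9007199254740993}')
console.log(parsed.id)   // 9007199254740992 (rounded, off by one)

// Safer: keep the id as a string in the payload
const safe = JSON.parse('{"id":"9007199254740993"}')
console.log(safe.id)     // '9007199254740993' (exact)
```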
Alternative approaches when JSON is not enough
JSON is ubiquitous, but it is not always the best format.
FormData for mixed payloads
If you need to upload files and arrays together, FormData makes sense. You can append the JSON string:
const form = new FormData()
form.append('items', JSON.stringify(lineItems))
form.append('file', fileInput.files[0])
This is a common compromise for uploads.
CSV or TSV for spreadsheets
If the output is for Excel or bulk imports, a CSV export might be easier. JSON arrays are still valid, but CSV makes certain workflows simpler.
Binary formats for performance
For high-throughput systems, MessagePack, Avro, or Protocol Buffers can reduce size and parsing time. These formats require extra tooling but can be worth it at scale.
Structured clone for same-process boundaries
Within a browser context, structured clone (postMessage, IndexedDB) lets you pass arrays without JSON at all. I only stringify when I need plain text or cross-language compatibility.
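structuredClone, available in modern browsers and Node 17+, copies arrays without a JSON round trip and preserves types that JSON would flatten:

```javascript
const state = [new Date(0), new Map([['A12', 2]])]

// Deep copy without serializing to text
const copy = structuredClone(state)

console.log(copy[0] instanceof Date)  // true (JSON would give a string)
console.log(copy[1] instanceof Map)   // true (JSON would give '{}')
console.log(copy[0] !== state[0])     // true (a real copy, not a reference)
```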
When I do not convert arrays to JSON
Conversion is not a default action. I skip it when:
- The array stays inside my app and never crosses a boundary
- I can use structured cloning between workers
- I only need to map or filter, not serialize
- I already have a binary protocol in place
- The receiving system expects form data, not JSON
Keeping arrays as arrays avoids unnecessary work and preserves types.
A practical checklist before you ship
Here is the mental checklist I run through:
- Is this a boundary? If yes, stringify.
- Does the receiver expect an array or an object map?
- Do I need custom serialization for dates, big integers, or special types?
- Are there any non-JSON values in the array?
- Did I set Content-Type to application/json?
- Did I validate shape with a schema before sending?
- Can I parse and validate safely on the receiving side?
If I can answer these quickly, I usually avoid surprises.
Closing thoughts
Converting an array to JSON sounds trivial, but in real systems it is a boundary contract. JSON.stringify is the tool, but you still need to care about shape, validation, and edge cases. I treat JSON as a transport format, not a data model, and that mindset saves me from subtle bugs.
When in doubt, stringify at the edge, validate on both sides, and keep arrays as arrays inside your application. That combination is boring in the best possible way, and it is the exact kind of boring you want in production.


