Introduction
Node.js caps the V8 heap by default (historically around 1.5 GB on 64-bit systems; newer Node.js versions derive a larger limit from available memory). When parsing large JSON files with JSON.parse(), the input string and the resulting object tree must both fit in the heap at once, which can exceed this limit and cause "FATAL ERROR: Ineffective mark-compacts near heap limit Allocation failed" and process termination.
This is common when processing API responses, log files, or data exports that are hundreds of megabytes in size.
Symptoms
- Node.js crashes with "FATAL ERROR: Ineffective mark-compacts near heap limit Allocation failed"
- Process exits with code 134 (SIGABRT) or logs "JavaScript heap out of memory"
- Error occurs during JSON.parse() of a large string or response body
Common Causes
- JSON file or response body is larger than available heap space
- JSON.parse() creates a full object tree in memory, roughly 2-3x the string size
- Default V8 heap size is too small for the data being processed
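The 2-3x overhead is easy to observe. The sketch below builds a multi-megabyte JSON string in memory and measures heap growth across a JSON.parse() call; exact numbers vary by V8 version and data shape, so treat it as an illustration rather than a benchmark:

```javascript
// Illustration of JSON.parse memory overhead: the heap must hold the
// source array, the serialized string, and the parsed object tree at once.
const before = process.memoryUsage().heapUsed;

const items = [];
for (let i = 0; i < 200000; i++) {
  items.push({ id: i, name: `user-${i}`, active: i % 2 === 0 });
}
const json = JSON.stringify({ items });
const stringMB = json.length / 1024 / 1024;

const parsed = JSON.parse(json);

const after = process.memoryUsage().heapUsed;
console.log(`JSON string: ${stringMB.toFixed(1)} MB`);
console.log(`Heap growth: ${((after - before) / 1024 / 1024).toFixed(1)} MB`);
```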
Step-by-Step Fix
1. Increase V8 heap size: Allow Node.js to use more memory.
```bash
# Increase max old space size (in MB):
node --max-old-space-size=4096 app.js

# For npm scripts:
NODE_OPTIONS="--max-old-space-size=4096" npm start

# Or in package.json:
# { "scripts": { "start": "node --max-old-space-size=4096 server.js" } }
```
2. Use a streaming JSON parser for large files: Parse incrementally without loading the full file.
```javascript
const fs = require('fs');
const JSONStream = require('JSONStream');
const es = require('event-stream');

// Stream and parse a large JSON array without loading it all:
const stream = fs.createReadStream('large-data.json', { encoding: 'utf8' });

stream
  .pipe(JSONStream.parse('items.*')) // Parse each item in the array
  .pipe(es.mapSync((item) => {
    processItem(item); // Process one item at a time
  }))
  .on('error', (err) => console.error('Parse error:', err))
  .on('end', () => console.log('Done processing'));
```
3. Use ndjson (newline-delimited JSON) format: One JSON object per line for streaming.
```javascript
const fs = require('fs');
const ndjson = require('ndjson');

// Instead of one giant JSON array, use one object per line:
// data.ndjson:
// {"id":1,"name":"Alice"}
// {"id":2,"name":"Bob"}
// {"id":3,"name":"Charlie"}

fs.createReadStream('data.ndjson')
  .pipe(ndjson.parse())
  .on('data', (obj) => {
    // Each object is parsed independently
    processItem(obj);
  })
  .on('end', () => console.log('All records processed'));
```
Prevention
- Use streaming JSON parsers (JSONStream, clarinet) for files larger than 100MB
- Prefer newline-delimited JSON (ndjson) for large datasets
- Monitor heap usage with process.memoryUsage() and set alerts
- Design APIs to support pagination instead of returning all data in one response