Introduction

PHP's memory_limit directive caps the amount of memory a single script can consume. When processing large CSV files (hundreds of MB or GB), loading the entire file into memory with file() or file_get_contents() exceeds this limit. Even fgetcsv() can accumulate memory if rows are stored in an array rather than processed immediately.
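The difference is easy to demonstrate with a small, self-contained sketch (the row count, column widths, and temp-file path below are purely illustrative): streaming with fgetcsv() keeps memory roughly flat no matter how many rows the file holds, because each row is discarded before the next is read.

```php
// Illustrative demo: build a synthetic CSV, then stream it with fgetcsv().
$path = tempnam(sys_get_temp_dir(), 'csv');
$fh = fopen($path, 'w');
for ($i = 0; $i < 20000; $i++) {
    fputcsv($fh, [$i, str_repeat('x', 100)]); // ~2 MB file
}
fclose($fh);

$before = memory_get_usage(true);
$fh = fopen($path, 'r');
$rows = 0;
while (($row = fgetcsv($fh)) !== false) {
    $rows++; // each $row is overwritten on the next iteration
}
fclose($fh);
$extra = memory_get_usage(true) - $before;

printf("rows=%d, extra memory=%d bytes\n", $rows, $extra);
unlink($path);
```

Running this with file() instead would allocate an array entry per line, so its memory cost grows linearly with the file.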

Symptoms

  • Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 67108864 bytes)
  • Script dies during CSV import with no meaningful error message
  • Works for small files (under 10MB) but fails for large files
  • memory_get_peak_usage() shows usage approaching the memory_limit
  • str_getcsv() on large strings consuming excessive memory

```
PHP Fatal error: Allowed memory size of 268435456 bytes exhausted (tried to allocate 134217728 bytes) in /app/import.php on line 42

# Line 42: $data = file('large-export.csv'); // Loads ENTIRE file into memory array
```

Common Causes

  • file() or file_get_contents() loading entire CSV into memory
  • Storing all parsed rows in an array before processing
  • str_getcsv() on large file contents
  • Accumulating results in memory instead of streaming to output
  • PHP memory_limit too low for the file size
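Several of these causes reduce to the same anti-pattern: collecting results in an array before doing anything with them. A minimal sketch of the streaming alternative — transform each row and write it straight to the output file instead of accumulating it (the transformRow callback and file paths here are illustrative, not from the original article):

```php
// Stream a CSV transformation: each row is read, transformed, and written
// immediately, so memory stays constant regardless of file size.
function transformCsv(string $in, string $out, callable $transformRow): int {
    $src = fopen($in, 'r');
    $dst = fopen($out, 'w');
    $count = 0;
    while (($row = fgetcsv($src)) !== false) {
        fputcsv($dst, $transformRow($row)); // written now, never stored
        $count++;
    }
    fclose($src);
    fclose($dst);
    return $count;
}
```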

Step-by-Step Fix

  1. Stream CSV with fgetcsv() instead of loading all at once:

```php
// WRONG - loads entire file into memory
$lines = file('large-export.csv');
foreach ($lines as $line) {
    $data = str_getcsv($line);
    processRow($data);
}

// CORRECT - reads one line at a time
$handle = fopen('large-export.csv', 'r');
if ($handle === false) {
    throw new RuntimeException('Cannot open CSV file');
}

// Skip header
fgetcsv($handle);

// Process row by row - constant memory usage
while (($row = fgetcsv($handle, 0, ',')) !== false) {
    processRow($row); // Memory freed after each iteration
}
fclose($handle);
```

  2. Use generators for memory-efficient processing:

```php
function readCsvRows(string $filepath): Generator {
    $handle = fopen($filepath, 'r');
    if ($handle === false) {
        throw new RuntimeException("Cannot open: $filepath");
    }

    $headers = fgetcsv($handle);
    if ($headers === false) {
        fclose($handle);
        return;
    }

    $lineNum = 1;
    while (($row = fgetcsv($handle, 0, ',')) !== false) {
        $lineNum++;
        yield $lineNum => array_combine($headers, $row);
    }

    fclose($handle);
}

// Usage - constant memory regardless of file size
foreach (readCsvRows('large-export.csv') as $lineNum => $row) {
    insertIntoDatabase($row);

    // Periodically free up memory
    if ($lineNum % 1000 === 0) {
        gc_collect_cycles();
        echo "Processed $lineNum rows\n";
    }
}
```

  3. Process in database-friendly chunks:

```php
function importCsvInChunks(string $filepath, int $chunkSize = 500): void {
    $handle = fopen($filepath, 'r');
    $headers = fgetcsv($handle);
    $chunk = [];
    $totalProcessed = 0;

    while (($row = fgetcsv($handle, 0, ',')) !== false) {
        $chunk[] = array_combine($headers, $row);

        if (count($chunk) >= $chunkSize) {
            bulkInsert($chunk);
            $chunk = []; // Free memory
            $totalProcessed += $chunkSize;
            gc_collect_cycles();
        }
    }

    // Process remaining rows
    if (!empty($chunk)) {
        bulkInsert($chunk);
        $totalProcessed += count($chunk);
    }

    fclose($handle);
    echo "Imported $totalProcessed rows\n";
}
```
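The bulkInsert() helper used above is left undefined. One plausible shape, sketched with PDO and one transaction per chunk — the `imports` table, its columns, and the connection held in `$GLOBALS['pdo']` are assumptions for illustration, not part of the original article:

```php
// Hypothetical bulkInsert(): inserts a chunk of rows inside one transaction.
// Assumes a configured PDO connection in $GLOBALS['pdo'] and a table
// `imports (name, email)` whose columns match the CSV headers (illustrative).
function bulkInsert(array $rows): void {
    /** @var PDO $pdo */
    $pdo = $GLOBALS['pdo'];

    $pdo->beginTransaction();
    $stmt = $pdo->prepare(
        'INSERT INTO imports (name, email) VALUES (:name, :email)'
    );
    foreach ($rows as $row) {
        $stmt->execute([':name' => $row['name'], ':email' => $row['email']]);
    }
    $pdo->commit(); // one short transaction per chunk
}
```

A transaction per chunk is far faster than autocommitting each row, while still keeping any single transaction small.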

  4. Increase the memory limit for specific scripts only:

```php
// At the top of the import script
ini_set('memory_limit', '512M');

// Or unlimited (use with caution)
ini_set('memory_limit', '-1');

// Better: combine with set_time_limit() for long-running scripts
ini_set('memory_limit', '512M');
set_time_limit(0); // No time limit
```
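To act before the limit is actually hit, the shorthand that ini_get('memory_limit') returns ('512M', '1G', '-1') can be converted to bytes. memoryLimitBytes() below is a hypothetical helper, not a PHP built-in:

```php
// Convert memory_limit shorthand ("512M", "1G", "-1") to bytes so a script
// can warn before approaching the ceiling. Hypothetical helper for illustration.
function memoryLimitBytes(): int {
    $limit = ini_get('memory_limit');
    if ($limit === '-1') {
        return PHP_INT_MAX; // unlimited
    }
    $unit = strtoupper(substr($limit, -1));
    $value = (int) $limit; // leading digits only
    return match ($unit) {
        'G' => $value * 1024 ** 3,
        'M' => $value * 1024 ** 2,
        'K' => $value * 1024,
        default => $value, // plain byte count
    };
}

// Example: warn when usage passes 80% of the limit
if (memory_get_usage(true) > 0.8 * memoryLimitBytes()) {
    error_log('Nearing memory_limit - switch to streaming or raise the limit');
}
```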

  5. Monitor memory usage during processing:

```php
function processWithMonitoring(string $filepath): void {
    $handle = fopen($filepath, 'r');
    $lineNum = 0;

    while (($row = fgetcsv($handle, 0, ',')) !== false) {
        $lineNum++;
        processRow($row);

        // Log memory usage every 1000 rows
        if ($lineNum % 1000 === 0) {
            $usage = memory_get_usage(true) / 1024 / 1024;
            $peak = memory_get_peak_usage(true) / 1024 / 1024;
            error_log("Row $lineNum: current={$usage}MB, peak={$peak}MB");
        }
    }
    fclose($handle);
}
```

Prevention

  • Always use fgetcsv() with fopen() for CSV files over 10MB
  • Never use file() or file_get_contents() for large data files
  • Process rows immediately rather than storing in arrays
  • Use generators for memory-efficient iteration patterns
  • Set appropriate memory_limit per-script rather than globally in php.ini
  • Add gc_collect_cycles() calls during long-running batch processes
  • Consider using SplFileObject for object-oriented CSV reading:

```php
$file = new SplFileObject('large.csv');
$file->setFlags(SplFileObject::READ_CSV);
foreach ($file as $row) {
    processRow($row);
}
```
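One caveat with SplFileObject: with READ_CSV alone, blank lines in the file can surface as [null] rows. Combining it with the companion flags avoids that — the temp file and sample rows below are just for illustration:

```php
// SplFileObject with the usual companion flags: SKIP_EMPTY (with READ_AHEAD)
// drops blank lines, DROP_NEW_LINE strips trailing newlines.
$path = tempnam(sys_get_temp_dir(), 'csv');
file_put_contents($path, "a,b\n\nc,d\n"); // note the blank line

$file = new SplFileObject($path);
$file->setFlags(
    SplFileObject::READ_CSV
    | SplFileObject::READ_AHEAD
    | SplFileObject::SKIP_EMPTY
    | SplFileObject::DROP_NEW_LINE
);

$rows = [];
foreach ($file as $row) {
    $rows[] = $row; // only the two real rows arrive here
}
unlink($path);
```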