Introduction
PHP's memory_limit directive restricts how much memory a single script can consume. When processing large CSV uploads, loading the entire file into memory with file() or file_get_contents() (often followed by str_getcsv() on the resulting string) can quickly exceed this limit, causing a fatal error that terminates the script.
This is a common issue in data import features, where users upload CSV files with hundreds of thousands of rows.
Symptoms
- Script terminates with "Fatal error: Allowed memory size of X bytes exhausted"
- Error occurs at a specific row during CSV processing, not at the start
- Increasing memory_limit temporarily fixes the issue but fails on larger files
Common Causes
- Loading entire CSV file into memory with file() or file_get_contents()
- Storing all parsed rows in an array before processing
- PHP's memory_limit set too low for the expected file size
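To see why the first two causes bite, the sketch below (file name and row count are illustrative) generates a small CSV, then compares the memory cost of file(), which holds every line in an array at once, against streaming with fgetcsv(), which holds only one row at a time.

```php
<?php
// Illustrative comparison: file() vs. fgetcsv() peak memory on the
// same file. The temp file and 50,000-row size are arbitrary choices.
$path = tempnam(sys_get_temp_dir(), 'csv');
$fh = fopen($path, 'w');
fputcsv($fh, ['id', 'name', 'email']);
for ($i = 0; $i < 50000; $i++) {
    fputcsv($fh, [$i, "user$i", "user$i@example.com"]);
}
fclose($fh);

// Anti-pattern: the whole file becomes an in-memory array of strings.
$before = memory_get_usage();
$lines = file($path);           // every row held at once
$allAtOnce = memory_get_usage() - $before;
unset($lines);

// Streaming: only the current row is in memory at any time.
$before = memory_get_usage();
$fh = fopen($path, 'r');
while (($row = fgetcsv($fh)) !== false) {
    // process $row; it is discarded on the next iteration
}
fclose($fh);
$streamed = memory_get_usage() - $before;

printf("file(): %d bytes, fgetcsv(): %d bytes\n", $allAtOnce, $streamed);
unlink($path);
```

The file() figure grows linearly with the file; the fgetcsv() figure stays roughly constant regardless of row count.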
Step-by-Step Fix
1. Process CSV line by line with fopen/fgetcsv: Stream the file instead of loading it all at once.

```php
<?php
$handle = fopen($_FILES['upload']['tmp_name'], 'r');
if ($handle === false) {
    throw new RuntimeException('Cannot open uploaded file');
}

// Read header row
$headers = fgetcsv($handle);

// Process one row at a time - minimal memory usage
$rowCount = 0;
while (($row = fgetcsv($handle)) !== false) {
    $data = array_combine($headers, $row);
    processRow($data); // Process and discard each row
    $rowCount++;
    if ($rowCount % 1000 === 0) {
        echo "Processed $rowCount rows\n";
    }
}
fclose($handle);
```
2. Use generators for memory-efficient CSV parsing: Yield rows one at a time.

```php
<?php
function csvGenerator(string $filepath): Generator {
    $handle = fopen($filepath, 'r');
    if ($handle === false) {
        throw new RuntimeException("Cannot open $filepath");
    }
    $headers = fgetcsv($handle);
    while (($row = fgetcsv($handle)) !== false) {
        yield array_combine($headers, $row);
    }
    fclose($handle);
}

// Usage - processes one row at a time:
foreach (csvGenerator($_FILES['upload']['tmp_name']) as $row) {
    importToDatabase($row);
}
```
3. Increase memory_limit for specific scripts only: Set a higher limit for the import script.

```php
<?php
// Set higher limit for this script only (not global)
ini_set('memory_limit', '512M');

// Or for very large files:
ini_set('memory_limit', '-1'); // Unlimited (use with caution)
```
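Before raising the limit blindly, it can help to read the current value: ini_get('memory_limit') returns shorthand such as "512M", which must be converted to bytes before comparing against an expected file size. A minimal sketch, assuming a helper name of our own choosing:

```php
<?php
// Illustrative helper (the name iniBytes is ours): convert php.ini
// shorthand such as "512M" or "1G" into bytes. "-1" means unlimited.
function iniBytes(string $value): int {
    $value = trim($value);
    if ($value === '-1') {
        return -1;
    }
    $unit = strtoupper(substr($value, -1));
    $number = (int) $value;
    switch ($unit) {
        case 'G': return $number * 1024 ** 3;
        case 'M': return $number * 1024 ** 2;
        case 'K': return $number * 1024;
        default:  return $number;
    }
}

// Usage: compare the active limit against the upload before importing.
$limit = iniBytes((string) ini_get('memory_limit'));
```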
Prevention
- Always use fopen/fgetcsv for CSV processing, never file() or file_get_contents()
- Use generators to stream data through processing pipelines
- Set memory_limit per-script rather than globally in php.ini
- Implement chunked database inserts (e.g., 1000 rows per transaction)
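The last prevention point (chunked inserts) can be sketched as follows; the table and column names here are illustrative, and the function pairs naturally with the csvGenerator approach above since it accepts any iterable of rows.

```php
<?php
// Sketch of chunked inserts: buffer rows and flush every $chunkSize
// rows inside one transaction, so neither PHP nor the database has to
// hold the whole import at once. Table/columns are hypothetical.
function importInChunks(PDO $pdo, iterable $rows, int $chunkSize = 1000): void {
    $stmt = $pdo->prepare('INSERT INTO contacts (name, email) VALUES (?, ?)');
    $buffer = [];
    foreach ($rows as $row) {
        $buffer[] = $row;
        if (count($buffer) >= $chunkSize) {
            flushChunk($pdo, $stmt, $buffer);
            $buffer = [];
        }
    }
    if ($buffer !== []) {
        flushChunk($pdo, $stmt, $buffer); // remaining partial chunk
    }
}

function flushChunk(PDO $pdo, PDOStatement $stmt, array $chunk): void {
    $pdo->beginTransaction();
    foreach ($chunk as [$name, $email]) {
        $stmt->execute([$name, $email]);
    }
    $pdo->commit();
}
```

Batching commits this way also speeds up the import considerably, since the database fsyncs once per chunk instead of once per row.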