What's Actually Happening

Your PHP script crashes with a fatal error indicating that the allowed memory size has been exhausted. This occurs when your script attempts to allocate more memory than the configured limit. Common causes include loading large datasets into memory, processing big files, inefficient algorithms that accumulate data, memory leaks from circular references, handling large images or media files, or recursive functions that grow memory usage.

The error terminates your script immediately, potentially leaving incomplete operations, corrupted data, or a poor user experience. In development environments, you see the error message directly. In production, it may appear as a blank page, 500 error, or silently logged while breaking functionality. Memory exhaustion often happens gradually, making it harder to debug than immediate crashes.

The Error You'll See

```
Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 20480 bytes) in /var/www/html/process.php on line 234

Fatal error: Allowed memory size of 268435456 bytes exhausted (tried to allocate 7168 bytes) in /var/www/app/services/DataProcessor.php:156

Fatal error: Allowed memory size of 536870912 bytes exhausted (tried to allocate 32768 bytes) in /home/user/public_html/import.php on line 89

PHP Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 36 bytes) in /var/www/html/wp-includes/class-wp-image-editor.php on line 456

Fatal error: Allowed memory size of 67108864 bytes exhausted (tried to allocate 12288 bytes) in Unknown on line 0

[08-Apr-2026 16:15:23 UTC] PHP Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 32768 bytes) in /var/www/html/includes/process_data.php on line 178
Stack trace:
#0 /var/www/html/process.php(178): processLargeDataset()
#1 /var/www/html/index.php(45): handleRequest()
#2 {main}
  thrown

Fatal error: Out of memory (allocated 5242880) (tried to allocate 4194304 bytes) in /var/www/html/image_resize.php on line 67
```

Memory values converted:

  • 134217728 bytes = 128 MB (default on many systems)
  • 268435456 bytes = 256 MB
  • 536870912 bytes = 512 MB
  • 67108864 bytes = 64 MB
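The conversion is simply bytes ÷ 1024 ÷ 1024. A minimal sketch you can run to sanity-check the values above:

```php
<?php
// Convert the raw byte counts from the error messages to megabytes
foreach ([134217728, 268435456, 536870912, 67108864] as $bytes) {
    printf("%d bytes = %d MB\n", $bytes, $bytes / 1024 / 1024);
    // e.g. 134217728 bytes = 128 MB
}
```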

Why This Happens

  1. Large dataset loading: Loading entire database tables, huge CSV files, or complete API responses into arrays without chunking or streaming. Processing 100,000+ records in memory exceeds typical limits.
  2. Image and media processing: Working with high-resolution images, video frames, or audio files. A single 4K image can consume 50+ MB during processing operations.
  3. Inefficient data structures: Using nested arrays instead of objects, storing redundant copies of data, or accumulating results in memory instead of writing incrementally.
  4. Memory leaks: Circular object references that prevent garbage collection, static variables accumulating data, closures capturing large contexts, or unbounded caches growing indefinitely.
  5. Recursive operations: Deep recursion that adds a stack frame per call, or recursive functions that accumulate results without clearing intermediate data.
  6. Third-party library overhead: Libraries like PDF generators, spreadsheet processors, or ORM frameworks that cache extensive metadata or load entire documents into memory.
  7. Default limits too low: PHP's default memory limit (128 MB on most systems) may be insufficient for legitimate heavy operations, especially on modern datasets.
  8. Multiple concurrent requests: Shared hosting or containers with overall memory limits that get exhausted by multiple PHP processes running simultaneously.
  9. Unoptimized queries: Loading entire result sets with all columns when only a subset is needed, or fetching related data eagerly that could be loaded lazily.
  10. String concatenation in loops: Building large strings through repeated concatenation creates multiple copies in memory before the final result.
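Several of these causes are easy to confirm empirically with `memory_get_usage()`. A minimal sketch contrasting cause 1 (accumulating everything in an array) with a generator-based alternative; the 100,000-element count is illustrative, and exact byte counts will vary by PHP version:

```php
<?php
// Accumulating results in an array holds every element in memory at once
$before = memory_get_usage();
$all = [];
for ($i = 0; $i < 100000; $i++) {
    $all[] = $i;
}
$arrayCost = memory_get_usage() - $before;

// A generator yields one element at a time, so memory stays flat
function numbers($n) {
    for ($i = 0; $i < $n; $i++) {
        yield $i;
    }
}

$before = memory_get_usage();
$sum = 0;
foreach (numbers(100000) as $i) {
    $sum += $i;
}
$generatorCost = memory_get_usage() - $before;

printf("Array: %d KB, Generator: %d KB\n", $arrayCost / 1024, $generatorCost / 1024);
```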

Step 1: Identify Current Memory Configuration

Check your current memory settings and usage:

```bash
# Check PHP memory limit
php -i | grep "memory_limit"
php -r "echo ini_get('memory_limit');"

# For web context, create a test file
cat > /var/www/html/memory_test.php << 'EOF'
<?php
header('Content-Type: text/plain');
echo "memory_limit: " . ini_get('memory_limit') . "\n";
echo "Current usage: " . round(memory_get_usage() / 1024 / 1024, 2) . " MB\n";
echo "Peak usage: " . round(memory_get_peak_usage() / 1024 / 1024, 2) . " MB\n";
echo "Real usage: " . round(memory_get_usage(true) / 1024 / 1024, 2) . " MB (allocated)\n";
echo "PHP Version: " . PHP_VERSION . "\n";
echo "SAPI: " . php_sapi_name() . "\n";
EOF

curl "http://localhost/memory_test.php"

# Check php.ini location and configuration
php --ini
grep "memory_limit" /etc/php/*/fpm/php.ini
grep "memory_limit" /etc/php/*/apache2/php.ini

# Check PHP-FPM pool settings (escape the brackets so grep treats them literally)
grep -r "php_admin_value\[memory_limit\]" /etc/php/*/fpm/pool.d/

# Check system memory availability
free -m
grep -E "MemTotal|MemAvailable" /proc/meminfo
```

Monitor memory during script execution:

```php
<?php
// memory_monitor.php - Track memory usage throughout execution

class MemoryMonitor {
    private $log = [];
    private $startTime;
    private $startMemory;

    public function __construct() {
        $this->startTime = microtime(true);
        $this->startMemory = memory_get_usage();
    }

    public function checkpoint($label) {
        $current = memory_get_usage();
        $peak = memory_get_peak_usage();
        $elapsed = microtime(true) - $this->startTime;
        $previous = $this->log
            ? $this->log[count($this->log) - 1]['current']
            : $this->startMemory;

        $this->log[] = [
            'label' => $label,
            'current' => $current,
            'current_mb' => round($current / 1024 / 1024, 2),
            'peak' => $peak,
            'peak_mb' => round($peak / 1024 / 1024, 2),
            'delta' => $current - $previous,
            'delta_mb' => round(($current - $previous) / 1024 / 1024, 2),
            'elapsed' => round($elapsed, 3)
        ];

        return $this;
    }

    public function getLog() {
        return $this->log;
    }

    public function printLog() {
        echo "Memory Usage Log:\n";
        echo str_repeat("-", 80) . "\n";
        printf("%-20s | %-10s | %-10s | %-10s | %-10s\n", "Label", "Current", "Peak", "Delta", "Time");
        echo str_repeat("-", 80) . "\n";

        foreach ($this->log as $entry) {
            printf("%-20s | %8s MB | %8s MB | %8s MB | %6s s\n",
                $entry['label'],
                $entry['current_mb'],
                $entry['peak_mb'],
                $entry['delta_mb'],
                $entry['elapsed']
            );
        }

        echo str_repeat("-", 80) . "\n";
        echo "Memory Limit: " . ini_get('memory_limit') . "\n";
        echo "Final Usage: " . round(memory_get_usage() / 1024 / 1024, 2) . " MB\n";
        echo "Peak Usage: " . round(memory_get_peak_usage() / 1024 / 1024, 2) . " MB\n";
    }

    public function getBottlenecks($thresholdMb = 1) {
        $bottlenecks = [];

        foreach ($this->log as $entry) {
            if ($entry['delta_mb'] > $thresholdMb) {
                $bottlenecks[] = $entry;
            }
        }

        return $bottlenecks;
    }
}

// Usage
$monitor = new MemoryMonitor();
$monitor->checkpoint('script_start');

// Your operations
$data = loadData();
$monitor->checkpoint('after_load_data');

$processed = processData($data);
$monitor->checkpoint('after_process_data');

$results = generateReport($processed);
$monitor->checkpoint('after_generate_report');

$monitor->printLog();

// Find memory bottlenecks
$bottlenecks = $monitor->getBottlenecks(5); // More than 5 MB increase
if ($bottlenecks) {
    echo "\nMemory Bottlenecks (>5 MB):\n";
    foreach ($bottlenecks as $b) {
        echo "  {$b['label']}: +{$b['delta_mb']} MB\n";
    }
}
?>
```

Step 2: Increase Memory Limit Configuration

Adjust the memory limit in php.ini or at runtime:

```bash
# Edit php.ini
sudo nano /etc/php/8.2/fpm/php.ini

# Find and modify:
# memory_limit = 256M    ; Increase from 128M to 256M
# memory_limit = 512M    ; For heavy operations
# memory_limit = 1024M   ; For very large datasets (use with caution)

# Restart PHP-FPM
sudo systemctl restart php8.2-fpm

# Verify change
php -i | grep memory_limit

# For Apache mod_php
sudo systemctl restart apache2

# For specific PHP-FPM pool
sudo nano /etc/php/8.2/fpm/pool.d/www.conf

# Add or modify:
# php_admin_value[memory_limit] = 512M

# Restart pool
sudo systemctl restart php8.2-fpm
```

Increase at runtime in your script:

```php
<?php
// Increase memory limit at runtime (pick one)
// Note: values set via php_admin_value in FPM pool config cannot be
// overridden with ini_set()
ini_set('memory_limit', '512M');  // Set to 512 MB
ini_set('memory_limit', '1024M'); // Set to 1 GB
ini_set('memory_limit', '-1');    // No limit (dangerous, use with caution)

// Check if the increase worked
$limit = ini_get('memory_limit');
echo "Memory limit: $limit\n";

// For specific operations that need more memory
function processLargeFile($file) {
    $originalLimit = ini_get('memory_limit');
    ini_set('memory_limit', '512M');

    try {
        $result = processFileInternal($file);
        return $result;
    } finally {
        // Restore original limit
        ini_set('memory_limit', $originalLimit);
    }
}

// Increase memory and execution time together
ini_set('memory_limit', '512M');
ini_set('max_execution_time', '300');

// For WordPress (in wp-config.php)
define('WP_MEMORY_LIMIT', '256M');
define('WP_MAX_MEMORY_LIMIT', '512M');

// For Laravel CLI work, raise the limit per invocation instead:
//   php -d memory_limit=512M artisan queue:work
?>
```

Use .htaccess for Apache-hosted sites:

```apache
# .htaccess - Increase memory limit (mod_php only; ignored under PHP-FPM)
php_value memory_limit 256M
php_value max_execution_time 300

# For specific directories, put a <Directory> block in the vhost/server
# configuration instead -- <Directory> is not allowed inside .htaccess:
# <Directory /var/www/html/admin>
#     php_value memory_limit 512M
# </Directory>
```

Step 3: Optimize Data Loading with Chunking

Process large datasets in chunks instead of loading everything:

```php
<?php
// BAD: Load all records at once
$allRecords = $db->query("SELECT * FROM large_table")->fetchAll();
foreach ($allRecords as $record) {
    processRecord($record);
}

// GOOD: Process in chunks
function processInChunks($db, $table, $chunkSize = 1000) {
    $offset = 0;

    do {
        $records = $db->query(
            "SELECT * FROM {$table} LIMIT {$chunkSize} OFFSET {$offset}"
        )->fetchAll();

        foreach ($records as $record) {
            processRecord($record);
        }

        $offset += $chunkSize;
        $fetched = count($records); // Capture count before freeing

        // Free memory after each chunk
        unset($records);
        gc_collect_cycles();

    } while ($fetched === $chunkSize);
}

// GOOD: Use unbuffered queries (MySQL)
function processWithUnbufferedQuery($dsn, $user, $pass) {
    $pdo = new PDO($dsn, $user, $pass, [
        PDO::MYSQL_ATTR_USE_BUFFERED_QUERY => false
    ]);

    $stmt = $pdo->query("SELECT * FROM million_row_table");

    while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
        processRecord($row);
    }

    // No memory buildup because rows aren't buffered
}

// GOOD: Process CSV file line by line
function processLargeCsv($file) {
    $handle = fopen($file, 'r');

    while (($line = fgetcsv($handle)) !== false) {
        processCsvLine($line);
    }

    fclose($handle);
}

// GOOD: Batch processing with progress tracking
function batchProcess($items, $batchSize, $callback) {
    $batches = array_chunk($items, $batchSize);
    $processed = 0;
    $total = count($items);

    foreach ($batches as $batch) {
        foreach ($batch as $item) {
            $callback($item);
            $processed++;
        }

        // Log progress
        if ($processed % 1000 === 0) {
            echo "Processed $processed / $total\n";
        }

        // Clear batch memory
        unset($batch);

        // Optional: collect garbage
        if ($processed % 5000 === 0) {
            gc_collect_cycles();
        }
    }
}

// GOOD: Generator-based processing (memory efficient)
// Any function containing yield is a generator in PHP
function generateRecords($db, $table, $chunkSize = 1000) {
    $offset = 0;

    do {
        $records = $db->query(
            "SELECT * FROM {$table} LIMIT {$chunkSize} OFFSET {$offset}"
        )->fetchAll();

        foreach ($records as $record) {
            yield $record; // Yield one at a time
        }

        $offset += $chunkSize;
        $fetched = count($records);
        unset($records);

    } while ($fetched === $chunkSize);
}

// Usage with generator
foreach (generateRecords($db, 'large_table', 100) as $record) {
    processRecord($record);
    // Memory stays constant because the generator yields one item at a time
}
?>
```

Step 4: Fix String Concatenation Memory Issues

Avoid inefficient string building patterns:

```php
<?php
// BAD: Concatenation in loop creates many copies
$html = '';
foreach ($items as $item) {
    $html .= '<div>' . $item['name'] . '</div>'; // May reallocate the growing string each iteration
}

// BAD: Building huge HTML this way
$content = '';
for ($i = 0; $i < 10000; $i++) {
    $content .= "Line $i\n"; // The string keeps growing and being reallocated
}

// GOOD: Use array and implode
$parts = [];
foreach ($items as $item) {
    $parts[] = '<div>' . $item['name'] . '</div>';
}
$html = implode('', $parts);

// GOOD: Write directly to output or file
foreach ($items as $item) {
    echo '<div>' . $item['name'] . '</div>';
}

// GOOD: Use output buffering for building content
ob_start();
foreach ($items as $item) {
    echo '<div>' . $item['name'] . '</div>';
}
$html = ob_get_clean();

// GOOD: Write large content to file incrementally
$fp = fopen('output.txt', 'w');
for ($i = 0; $i < 100000; $i++) {
    fwrite($fp, "Line $i\n");
}
fclose($fp);

// GOOD: Use a string builder pattern
class StringBuilder {
    private $parts = [];

    public function append($string) {
        $this->parts[] = $string;
        return $this;
    }

    public function toString() {
        return implode('', $this->parts);
    }

    public function clear() {
        $this->parts = [];
    }
}

$builder = new StringBuilder();
foreach ($items as $item) {
    $builder->append('<div>')
            ->append($item['name'])
            ->append('</div>');
}
$html = $builder->toString();

// GOOD: Use HEREDOC/NOWDOC for large static strings
$template = <<<HTML
<div class="container">
    <h1>{$title}</h1>
    <p>{$content}</p>
</div>
HTML;
?>
```

Step 5: Optimize Array and Object Usage

Reduce memory overhead of data structures:

```php
<?php
// In PHP 7+, objects with declared properties are more compact than
// associative arrays: each array element carries hashtable bucket
// overhead (roughly 36 bytes) on top of the value, while a declared
// object property needs only a zval slot (about 16 bytes).

// BAD: Large nested arrays
$users = [
    ['id' => 1, 'name' => 'John', 'email' => 'john@test.com', 'data' => [...]],
    ['id' => 2, 'name' => 'Jane', 'email' => 'jane@test.com', 'data' => [...]],
    // ... 10,000 entries carry megabytes of structural overhead alone
];

// GOOD: Use objects with typed properties (PHP 7.4+)
class User {
    public int $id;
    public string $name;
    public string $email;
    public ?array $data;
}

$users = [];
foreach ($rawData as $row) {
    $user = new User();
    $user->id = $row['id'];
    $user->name = $row['name'];
    $user->email = $row['email'];
    $users[] = $user;
}

// GOOD: Use stdClass for simple objects
$users = [];
foreach ($rawData as $row) {
    $user = new stdClass();
    $user->id = $row['id'];
    $user->name = $row['name'];
    $user->email = $row['email'];
    $users[] = $user;
}

// GOOD: Avoid unnecessary nested structures
// Instead of:
$config = [
    'database' => [
        'mysql' => [
            'host' => 'localhost',
            'port' => 3306,
            'name' => 'mydb'
        ]
    ]
];

// Use a flat structure when possible:
$config = [
    'db_host' => 'localhost',
    'db_port' => 3306,
    'db_name' => 'mydb'
];

// GOOD: Don't store unnecessary data
// Only select columns you need
$users = $db->query("SELECT id, name FROM users")->fetchAll();
// Instead of:
$users = $db->query("SELECT * FROM users")->fetchAll(); // Loads all columns

// GOOD: Unset variables when done
$data = loadData();
$result = processData($data);
unset($data); // Free memory immediately
saveResult($result);

// GOOD: Use references for large data passing (when appropriate)
function processLargeData(&$data) { // Pass by reference
    // Modify data in place instead of creating a copy
    foreach ($data as &$item) {
        $item['processed'] = true;
    }
}

// Note on copies: PHP arrays are copy-on-write, so assignment is cheap
// until one side is modified -- only then is a full copy made
$original = loadData();
$copy = $original;          // No copy yet (copy-on-write)
$copy['key'] = 'changed';   // Now the full copy happens
$reference = &$original;    // A reference never copies; it aliases the variable
?>
```

Step 6: Handle Image Processing Memory

Optimize memory usage for image operations:

```php
<?php
// Image processing is memory-intensive.
// Rough formula: width × height × channels × bytes_per_channel
// Example: 4000 × 3000 × 3 (RGB) × 1 byte = ~36 MB per image
// (GD's internal representation typically needs even more per pixel)

// BAD: Loading multiple large images at once
$images = [];
foreach ($files as $file) {
    $images[] = imagecreatefromjpeg($file); // Each consumes tens of MB
}

// GOOD: Process one at a time
foreach ($files as $file) {
    $img = imagecreatefromjpeg($file);
    processImage($img);
    imagedestroy($img); // Free immediately after processing

    gc_collect_cycles(); // Force garbage collection
}

// GOOD: Check image size before loading
function processImageSafely($file) {
    $info = getimagesize($file);
    $width = $info[0];
    $height = $info[1];
    $channels = $info['channels'] ?? 3;

    // Estimate memory needed (width × height × channels × 1.5 for overhead)
    $estimatedMemory = $width * $height * $channels * 1.5;
    $currentLimit = ini_get('memory_limit');

    // Convert limit to bytes
    $limitBytes = return_bytes($currentLimit);
    $currentUsage = memory_get_usage(true);
    $availableMemory = $limitBytes - $currentUsage;

    if ($estimatedMemory > $availableMemory * 0.8) {
        throw new RuntimeException(
            "Image too large. Estimated: " . round($estimatedMemory / 1024 / 1024, 2) .
            " MB, Available: " . round($availableMemory / 1024 / 1024, 2) . " MB"
        );
    }

    $img = imagecreatefromjpeg($file);
    // Process image...
    imagedestroy($img);
}

function return_bytes($val) {
    $val = trim($val);
    $last = strtolower($val[strlen($val) - 1]);
    $val = (int)$val;
    switch ($last) {
        // Intentional fall-through: 'g' multiplies by 1024 three times
        case 'g': $val *= 1024;
        case 'm': $val *= 1024;
        case 'k': $val *= 1024;
    }
    return $val;
}

// GOOD: Use Imagick, which handles large images better
try {
    $imagick = new Imagick($file);
    $imagick->resizeImage(800, 600, Imagick::FILTER_LANCZOS, 1);
    $imagick->writeImage($outputFile);
    $imagick->clear(); // Free memory
    $imagick->destroy();
} catch (ImagickException $e) {
    // Handle error
}

// GOOD: Process in tiles for very large images
function processLargeImageInTiles($file, $outputFile, $tileSize = 512) {
    $img = imagecreatefromjpeg($file);
    $width = imagesx($img);
    $height = imagesy($img);

    $output = imagecreatetruecolor($width, $height);

    for ($y = 0; $y < $height; $y += $tileSize) {
        for ($x = 0; $x < $width; $x += $tileSize) {
            $tileWidth = min($tileSize, $width - $x);
            $tileHeight = min($tileSize, $height - $y);

            $tile = imagecreatetruecolor($tileWidth, $tileHeight);
            imagecopy($tile, $img, 0, 0, $x, $y, $tileWidth, $tileHeight);

            // Process tile
            processTile($tile);

            // Copy back to output
            imagecopy($output, $tile, $x, $y, 0, 0, $tileWidth, $tileHeight);
            imagedestroy($tile);
        }
    }

    imagedestroy($img);
    imagejpeg($output, $outputFile);
    imagedestroy($output);
}

// GOOD: Use command-line tools for very large operations
// Let the external tool handle the memory; always escape shell arguments
function resizeImageWithExternalTool($input, $output, $width, $height) {
    $in = escapeshellarg($input);
    $out = escapeshellarg($output);
    $size = escapeshellarg("{$width}x{$height}");
    exec("convert {$in} -resize {$size} {$out}");
    // Or use ImageMagick 7's command line:
    // exec("magick {$in} -resize {$size} {$out}");
    // Or GraphicsMagick:
    // exec("gm convert {$in} -resize {$size} {$out}");
}
?>
```

Step 7: Fix Memory Leaks and Circular References

Identify and fix memory leaks:

```php
<?php
// Memory leak causes: circular references, static accumulation, closures

// BAD: Circular reference delays garbage collection
class ParentClass {
    private $child;

    public function setChild($child) {
        $this->child = $child;
        $child->setParent($this); // Circular reference!
    }
}

class ChildClass {
    private $parent;

    public function setParent($parent) {
        $this->parent = $parent;
    }
}

// GOOD: Break the cycle explicitly when you are done with the pair
// (a destructor cannot do this reliably -- it only runs once the cycle
// has already been collected)
class SafeParentClass {
    private $child;

    public function setChild($child) {
        $this->child = $child;
        $child->setParent($this);
    }

    public function release() {
        // Break the circular reference
        if ($this->child) {
            $this->child->setParent(null);
        }
        $this->child = null;
    }
}

// GOOD: Use WeakReference (PHP 7.4+) so the back-reference never
// keeps the parent alive
class WeakChildClass {
    private ?WeakReference $parent = null;

    public function setParent($parent) {
        $this->parent = $parent ? WeakReference::create($parent) : null;
    }

    public function getParent() {
        return $this->parent?->get(); // Nullsafe operator requires PHP 8.0+
    }
}

// BAD: Static array accumulating data
class LeakyCache {
    private static $data = [];

    public static function add($key, $value) {
        self::$data[$key] = $value; // Never cleared, grows forever
    }
}

// GOOD: Limit static cache size
class BoundedCache {
    private static $data = [];
    private static $maxSize = 1000;

    public static function add($key, $value) {
        if (count(self::$data) >= self::$maxSize) {
            // Remove the oldest tenth of the entries
            $removeCount = (int)(self::$maxSize / 10);
            self::$data = array_slice(self::$data, $removeCount, null, true);
        }
        self::$data[$key] = $value;
    }

    public static function clear() {
        self::$data = [];
    }
}

// BAD: Closure capturing large context
$largeData = loadData(); // 100 MB
$callback = function() use ($largeData) {
    return $largeData[0]; // Closure keeps the entire $largeData in memory
};

// GOOD: Capture only what's needed
$largeData = loadData();
$neededItem = $largeData[0];
unset($largeData); // Free memory

$callback = function() use ($neededItem) {
    return $neededItem; // Only captures the small item
};

// GOOD: Explicit garbage collection
function processWithGarbageCollection($iterations) {
    for ($i = 0; $i < $iterations; $i++) {
        $data = loadDataChunk($i);
        processChunk($data);
        unset($data);

        // Collect garbage every 100 iterations
        if ($i % 100 === 0) {
            gc_collect_cycles();
        }
    }
}

// Enable garbage collection (on by default)
gc_enable();

// Force collection
gc_collect_cycles();

// Check garbage collection stats (PHP 7.3+)
$gcStats = gc_status();
echo "Runs: " . $gcStats['runs'] . "\n";
echo "Collected: " . $gcStats['collected'] . "\n";
echo "Threshold: " . $gcStats['threshold'] . "\n";

// Detect memory leaks with a before/after comparison
function detectMemoryLeak($callback, $iterations = 100) {
    $startMemory = memory_get_usage();

    for ($i = 0; $i < $iterations; $i++) {
        $callback();
    }

    $endMemory = memory_get_usage();
    $delta = $endMemory - $startMemory;

    if ($delta > 1024 * 1024) { // More than 1 MB growth
        echo "Potential memory leak detected: $delta bytes growth\n";
        return true;
    }

    return false;
}
?>
```

Step 8: Optimize Database Operations

Reduce memory footprint of database queries:

```php
<?php
// BAD: Loading the entire result set into memory
$pdo->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, true);
$stmt = $pdo->query("SELECT * FROM users");
$allUsers = $stmt->fetchAll(); // All rows in memory

// GOOD: Unbuffered query for large result sets
$pdo->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);
$stmt = $pdo->query("SELECT * FROM users");

while ($user = $stmt->fetch()) {
    processUser($user);
}

// GOOD: Select only needed columns
$stmt = $pdo->query("SELECT id, name FROM users WHERE active = 1");

// GOOD: Use LIMIT and OFFSET for pagination
$stmt = $pdo->query("SELECT * FROM users LIMIT 100 OFFSET 0");

// GOOD: Process in batches
function processUsersBatch($pdo, $batchSize = 500) {
    $offset = 0;

    do {
        $stmt = $pdo->prepare("SELECT * FROM users LIMIT :limit OFFSET :offset");
        $stmt->bindValue(':limit', $batchSize, PDO::PARAM_INT);
        $stmt->bindValue(':offset', $offset, PDO::PARAM_INT);
        $stmt->execute();

        $batch = $stmt->fetchAll();

        foreach ($batch as $user) {
            processUser($user);
        }

        $offset += $batchSize;
        $fetched = count($batch); // Capture count before freeing

        // Clear memory
        $stmt->closeCursor();
        unset($batch);

    } while ($fetched === $batchSize);
}

// GOOD: Use PDO::FETCH_LAZY for minimal memory
$stmt = $pdo->query("SELECT * FROM users");
$stmt->setFetchMode(PDO::FETCH_LAZY);

while ($user = $stmt->fetch()) {
    // $user is a lazy PDORow; properties are materialized on access
    echo $user->name;
}

// GOOD: Count without fetching
// Instead of:
$stmt = $pdo->query("SELECT * FROM users");
$count = count($stmt->fetchAll()); // Fetches all rows!

// Use:
$stmt = $pdo->query("SELECT COUNT(*) FROM users");
$count = $stmt->fetchColumn();

// GOOD: Aggregate in SQL, not PHP
// Instead of:
$stmt = $pdo->query("SELECT amount FROM orders");
$amounts = $stmt->fetchAll(PDO::FETCH_COLUMN);
$total = array_sum($amounts); // Memory for every amount

// Use:
$stmt = $pdo->query("SELECT SUM(amount) FROM orders");
$total = $stmt->fetchColumn(); // Single value

// GOOD: Group in SQL
$stmt = $pdo->query("
    SELECT category, COUNT(*) as count, SUM(amount) as total
    FROM products
    GROUP BY category
");
$stats = $stmt->fetchAll(); // Much smaller result set

// BAD: Eager loading all relationships (Eloquent ORM)
$users = User::with('posts', 'comments', 'profile')->get(); // Loads everything

// GOOD: Selective eager loading, or lazy loading per record
$users = User::with('posts')->get(); // Only posts
// Or load per user as needed
$users = User::all();
foreach ($users as $user) {
    if ($needsPosts) {
        $user->load('posts'); // Load only when needed
    }
}
?>
```

Step 9: Use Streaming and File Operations

Stream large data instead of loading into memory:

```php
<?php
// Stream file reading
function readLargeFile($file) {
    $handle = fopen($file, 'rb');

    while (!feof($handle)) {
        $chunk = fread($handle, 8192); // 8 KB chunks
        processChunk($chunk);
    }

    fclose($handle);
}

// Stream file writing
function writeLargeFile($data, $file) {
    $handle = fopen($file, 'wb');

    foreach ($data as $item) {
        fwrite($handle, serialize($item) . "\n");
    }

    fclose($handle);
}

// Stream CSV processing
function streamCsv($inputFile, $outputFile, $processor) {
    $input = fopen($inputFile, 'r');
    $output = fopen($outputFile, 'w');

    // Copy the header row through unchanged
    $header = fgetcsv($input);
    fputcsv($output, $header);

    while (($row = fgetcsv($input)) !== false) {
        $processed = $processor($row);
        fputcsv($output, $processed);
    }

    fclose($input);
    fclose($output);
}

// Stream JSON from a file (naive: only handles flat, non-nested objects)
function streamLargeJson($file) {
    $handle = fopen($file, 'r');
    $buffer = '';

    while (!feof($handle)) {
        $chunk = fread($handle, 4096);
        $buffer .= $chunk;

        // Parse complete JSON objects from the buffer
        while (preg_match('/\{[^{}]*\}/', $buffer, $match, PREG_OFFSET_CAPTURE)) {
            $json = $match[0][0];
            $pos = $match[0][1];

            $data = json_decode($json, true);
            if ($data !== null) {
                processJsonObject($data);
            }

            // Remove the processed object from the buffer
            $buffer = substr($buffer, $pos + strlen($json));
        }

        // Prevent the buffer from growing too large
        if (strlen($buffer) > 100000) {
            throw new RuntimeException('Buffer overflow - malformed JSON');
        }
    }

    fclose($handle);
}

// Better: use a memory-efficient JSON streaming library
// Example with salsify/json-streaming-parser
function parseLargeJsonFile($file) {
    $parser = new JsonStreamingParser\Parser(fopen($file, 'r'), new MyListener());
    $parser->parse();
}

// Stream an HTTP response (requires allow_url_fopen)
function streamHttpResponse($url, $processor) {
    $handle = fopen($url, 'rb');

    while (!feof($handle)) {
        $chunk = fread($handle, 8192);
        $processor($chunk);
    }

    fclose($handle);
}

// Memory-efficient XML parsing
function parseLargeXml($file) {
    $xml = new XMLReader();
    $xml->open($file);

    // Advance to the first <item> element
    while ($xml->read() && $xml->name !== 'item');

    // Jump from <item> to <item> without loading the whole document
    while ($xml->name === 'item') {
        $item = simplexml_load_string($xml->readOuterXml());
        processItem($item);

        $xml->next('item');
    }

    $xml->close();
}

// Stream output to the client
function streamLargeResponse($data) {
    header('Content-Type: application/json');

    echo '[';
    $first = true;

    foreach ($data as $item) {
        if (!$first) {
            echo ',';
        }
        echo json_encode($item);
        $first = false;

        // Flush to client
        if (ob_get_level() > 0) {
            ob_flush();
        }
        flush();
    }

    echo ']';
}
?>
```

Step 10: Implement Memory Monitoring and Alerts

Add memory monitoring to your application:

```php
<?php
// Memory threshold monitoring
class MemoryGuard {
    private $warningThreshold;
    private $criticalThreshold;
    private $limitBytes;

    public function __construct($warningPercent = 70, $criticalPercent = 85) {
        $this->limitBytes = $this->parseMemoryLimit(ini_get('memory_limit'));
        $this->warningThreshold = $this->limitBytes * ($warningPercent / 100);
        $this->criticalThreshold = $this->limitBytes * ($criticalPercent / 100);
    }

    public function check() {
        $usage = memory_get_usage(true);
        $percent = ($usage / $this->limitBytes) * 100;

        if ($usage > $this->criticalThreshold) {
            $this->handleCritical($usage, $percent);
            return 'critical';
        }

        if ($usage > $this->warningThreshold) {
            $this->handleWarning($usage, $percent);
            return 'warning';
        }

        return 'ok';
    }

    public function getStats() {
        return [
            'limit' => $this->limitBytes,
            'limit_mb' => round($this->limitBytes / 1024 / 1024, 2),
            'usage' => memory_get_usage(true),
            'usage_mb' => round(memory_get_usage(true) / 1024 / 1024, 2),
            'percent' => round((memory_get_usage(true) / $this->limitBytes) * 100, 2),
            'peak' => memory_get_peak_usage(true),
            'peak_mb' => round(memory_get_peak_usage(true) / 1024 / 1024, 2),
            'available' => $this->limitBytes - memory_get_usage(true),
            'available_mb' => round(($this->limitBytes - memory_get_usage(true)) / 1024 / 1024, 2)
        ];
    }

    private function handleWarning($usage, $percent) {
        error_log(sprintf(
            "Memory warning: %.2f MB used (%.1f%% of %.2f MB limit)",
            $usage / 1024 / 1024,
            $percent,
            $this->limitBytes / 1024 / 1024
        ));
    }

    private function handleCritical($usage, $percent) {
        error_log(sprintf(
            "Memory CRITICAL: %.2f MB used (%.1f%% of %.2f MB limit)",
            $usage / 1024 / 1024,
            $percent,
            $this->limitBytes / 1024 / 1024
        ));

        // Try to free memory
        gc_collect_cycles();
    }

    private function parseMemoryLimit($limit) {
        $limit = trim($limit);
        if ($limit === '-1') {
            return PHP_INT_MAX;
        }
        if (is_numeric($limit)) {
            return (int)$limit; // Plain byte value with no unit suffix
        }

        $unit = strtolower(substr($limit, -1));
        $value = (int)substr($limit, 0, -1);

        switch ($unit) {
            case 'g': $value *= 1024 * 1024 * 1024; break;
            case 'm': $value *= 1024 * 1024; break;
            case 'k': $value *= 1024; break;
        }

        return $value;
    }
}

// Usage in a long-running script
$guard = new MemoryGuard(70, 85);

foreach ($data as $item) {
    processItem($item);

    $status = $guard->check();
    if ($status === 'critical') {
        // Stop processing, save progress
        saveProgress($processedCount);
        break;
    }

    if ($status === 'warning') {
        // Try to free memory
        gc_collect_cycles();
    }
}

// Register a shutdown function for memory tracking
register_shutdown_function(function() {
    $error = error_get_last();

    if ($error && strpos($error['message'], 'Allowed memory size') !== false) {
        // Log detailed info
        error_log(sprintf(
            "Memory exhausted. Peak: %.2f MB, Limit: %s",
            memory_get_peak_usage(true) / 1024 / 1024,
            ini_get('memory_limit')
        ));

        // Could save state for recovery
        saveStateForRecovery();
    }
});

// Per-request monitoring for web requests
class RequestMemoryMonitor {
    private static $stats = [];

    public static function start() {
        self::$stats['start'] = memory_get_usage();
        self::$stats['start_time'] = microtime(true);
    }

    public static function end() {
        self::$stats['end'] = memory_get_usage();
        self::$stats['peak'] = memory_get_peak_usage();
        self::$stats['end_time'] = microtime(true);

        $delta = self::$stats['end'] - self::$stats['start'];
        $elapsed = self::$stats['end_time'] - self::$stats['start_time'];

        // Log if memory usage was high
        if ($delta > 10 * 1024 * 1024) { // More than 10 MB
            error_log(sprintf(
                "Request used %.2f MB memory in %.3f seconds. Peak: %.2f MB. URL: %s",
                $delta / 1024 / 1024,
                $elapsed,
                self::$stats['peak'] / 1024 / 1024,
                $_SERVER['REQUEST_URI'] ?? 'CLI'
            ));
        }

        return self::$stats;
    }
}

// Usage in application bootstrap
RequestMemoryMonitor::start();

// ... application execution ...

$stats = RequestMemoryMonitor::end();
?>
```

Checklist

| Step | Action | Verified |
|------|--------|----------|
| 1 | Identified current memory configuration | |
| 2 | Increased memory limit appropriately | |
| 3 | Implemented chunked data processing | |
| 4 | Fixed string concatenation patterns | |
| 5 | Optimized arrays to objects | |
| 6 | Handled image processing memory | |
| 7 | Fixed circular references and leaks | |
| 8 | Optimized database query memory | |
| 9 | Implemented streaming for large data | |
| 10 | Added memory monitoring and alerts | |
| 11 | Tested with realistic data volumes | |
| 12 | Verified garbage collection is enabled | |

Verify the Fix

  1. Test memory limit configuration:

```bash
php -i | grep memory_limit
curl "http://localhost/memory_test.php"

# Create memory test
cat > /tmp/test_memory.php << 'EOF'
<?php
ini_set('memory_limit', '256M');
echo "Memory limit: " . ini_get('memory_limit') . "\n";

// Allocate and free
$data = str_repeat('x', 100 * 1024 * 1024); // 100 MB
echo "Allocated 100 MB. Current: " . round(memory_get_usage() / 1024 / 1024, 2) . " MB\n";

unset($data);
gc_collect_cycles();
echo "After free. Current: " . round(memory_get_usage() / 1024 / 1024, 2) . " MB\n";
EOF

php /tmp/test_memory.php
```

  2. Test chunked processing:

```php
<?php
// test_chunked.php
$monitor = new MemoryMonitor();
$monitor->checkpoint('start');

// Process 10,000 items in chunks
$items = range(1, 10000);
batchProcess($items, 100, function($item) {
    usleep(100);
});

$monitor->checkpoint('end');
$monitor->printLog();
?>
```

  3. Run memory leak detection:

```php
<?php
// detect_leaks.php
function potentiallyLeakyFunction() {
    static $accumulator = [];
    $accumulator[] = str_repeat('x', 10000);
}

$isLeak = detectMemoryLeak(function() {
    potentiallyLeakyFunction();
}, 1000);

if ($isLeak) {
    echo "Memory leak detected!\n";
}
?>
```

  4. Check garbage collection:

```bash
php -r "
echo 'GC enabled: ' . (gc_enabled() ? 'yes' : 'no') . PHP_EOL;
gc_enable();
\$stats = gc_status();
echo 'GC runs: ' . \$stats['runs'] . PHP_EOL;
echo 'GC collected: ' . \$stats['collected'] . PHP_EOL;
"
```

  5. Verify file streaming works:

```bash
# Create a large test file
dd if=/dev/zero bs=1M count=100 | tr '\0' 'x' > /tmp/large.txt

# Test streaming
php -r "
\$handle = fopen('/tmp/large.txt', 'r');
\$total = 0;
while (!feof(\$handle)) {
    \$chunk = fread(\$handle, 8192);
    \$total += strlen(\$chunk);
}
fclose(\$handle);
echo 'Read: ' . round(\$total / 1024 / 1024, 2) . ' MB' . PHP_EOL;
echo 'Peak memory: ' . round(memory_get_peak_usage() / 1024 / 1024, 2) . ' MB' . PHP_EOL;
"
```

  6. Monitor in production:

```bash
# Watch memory errors
tail -f /var/log/php-fpm/error.log | grep -i "memory"

# Check PHP-FPM memory usage
ps aux | grep php-fpm | awk '{sum += $6} END {print "Total PHP-FPM memory: " sum/1024 " MB"}'

# Count memory exhausted errors
grep -c "Allowed memory size" /var/log/php-fpm/error.log
```

Related Articles

  • [Fix PHP Composer Out of Memory](/articles/fix-php-composer-out-of-memory)
  • [Fix PHP Maximum Execution Time Exceeded](/articles/fix-php-maximum-execution-time-exceeded)
  • [Fix PHP FPM Pool Exhausted](/articles/fix-php-fpm-pool-exhausted)
  • [Fix MySQL Query Execution Timeout](/articles/fix-mysql-query-execution-timeout)
  • [Fix Laravel Memory Limit Exhausted](/articles/fix-laravel-memory-limit-exhausted)
  • [Fix WordPress Memory Exhausted Error](/articles/fix-wordpress-memory-exhausted-error)
  • [Fix PHP GD Library Image Processing](/articles/fix-php-gd-library-missing)
  • [Fix Redis Memory Usage High](/articles/fix-redis-memory-usage-high)
  • [Fix PHP Large File Upload Memory](/articles/fix-php-large-file-upload-memory)
  • [Fix Symfony Doctrine Memory Leak](/articles/fix-symfony-doctrine-memory-leak)