What's Actually Happening

Your PHP script stops executing mid-way through its operation, and you see a fatal error message indicating that the maximum execution time has been exceeded. This commonly occurs during file uploads, database operations, API calls to external services, bulk data processing, report generation, or any long-running task. The script may have worked fine in your development environment but fails in production where timeout limits are more restrictive.

The error typically appears in your browser as a blank page or a server error, while your PHP error logs contain the fatal error message. This issue affects web applications, command-line scripts run through web interfaces, cron jobs triggered via HTTP, and any PHP process that needs extended processing time.

The Error You'll See

```
Fatal error: Maximum execution time of 30 seconds exceeded in /var/www/html/includes/processor.php on line 156

Fatal error: Maximum execution time of 60 seconds exceeded in /home/user/public_html/import.php on line 42

Fatal error: Maximum execution time of 120 seconds exceeded in /var/www/app/services/DataExporter.php on line 289

PHP Fatal error: Maximum execution time of 30 seconds exceeded in /var/www/html/wp-content/plugins/bulk-importer.php on line 234

[08-Apr-2026 15:30:45 UTC] PHP Fatal error: Maximum execution time of 30 seconds exceeded in /var/www/html/api/process.php on line 89
```

In browser consoles or network responses:

```
GET http://example.com/process-data.php 500 (Internal Server Error)

Failed to load resource: the server responded with a status of 500 (Internal Server Error)

Error: Request timed out after 30000ms
```

In Nginx/Apache error logs:

```
2026/04/08 15:30:45 [error] 1234#1234: *56789 upstream prematurely closed connection while reading response header from upstream, client: 192.168.1.100, server: example.com, request: "POST /import.php HTTP/1.1", upstream: "fastcgi://unix:/run/php/php8.1-fpm.sock:", host: "example.com"

[Fri Apr 08 15:30:45.123456 2026] [php:error] [pid 1234] [client 192.168.1.100:54321] PHP Fatal error: Maximum execution time of 30 seconds exceeded in /var/www/html/long-process.php on line 156
```

Why This Happens

1. Default PHP configuration limits: PHP ships with a conservative default execution time limit (typically 30 seconds) that may be too restrictive for legitimate long-running processes like data imports, report generation, or batch processing.
2. Large file uploads or imports: Processing large CSV files, XML imports, image batch processing, or video transcoding naturally requires execution time beyond the standard limits.
3. Complex database operations: Heavy queries, bulk updates across millions of records, complex joins, or database migrations can easily exceed default timeouts.
4. External API calls: Slow third-party API responses, network latency, rate limiting from external services, or processing large API responses can push execution past the limit.
5. Recursive algorithms: Poorly optimized recursive functions, infinite loops, or algorithms with exponential time complexity can run indefinitely until they hit the execution limit.
6. Memory-intensive operations: Processing large datasets in memory can take longer due to garbage collection overhead and memory management.
7. Inefficient code: Nested loops with database queries (N+1 problems), redundant calculations, or missing caching can dramatically increase execution time.
8. Shared hosting restrictions: Many shared hosting providers enforce strict execution time limits that cannot be overridden, so code optimization is the only option.
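Several of these causes can be ruled in or out with simple arithmetic before touching any configuration: measure the cost of one item, then check whether the whole workload fits inside the limit. A minimal sketch (the helper name `fitsInTimeLimit` and the 80% safety margin are illustrative, not a standard API):

```php
<?php
// Hypothetical helper: will $itemCount items at $secondsPerItem each finish
// inside $limitSeconds? Leave a safety margin so the script can stop cleanly.
function fitsInTimeLimit(int $itemCount, float $secondsPerItem, int $limitSeconds, float $margin = 0.8): bool
{
    if ($limitSeconds <= 0) {
        return true; // 0 means "no limit" (the CLI default)
    }
    return ($itemCount * $secondsPerItem) <= ($limitSeconds * $margin);
}

// 10,000 rows at 5ms each = 50s of work: does not fit a 30s limit
var_dump(fitsInTimeLimit(10000, 0.005, 30)); // bool(false)
// 10,000 rows at 2ms each = 20s of work: fits the 24s budget
var_dump(fitsInTimeLimit(10000, 0.002, 30)); // bool(true)
```

If the workload does not fit even after optimization, that is the signal to raise the limit (Steps 2-5) or move the work out of the request entirely (Step 7).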

Step 1: Identify the Current Execution Time Limit

Check your current PHP configuration to understand what limit you're working with:

```bash
# Check PHP configuration from the command line
php -i | grep "max_execution_time"

# Check via a PHP script
# (remove this file when finished - it leaks configuration details)
cat > /var/www/html/phpinfo.php << 'EOF'
<?php
echo "max_execution_time: " . ini_get('max_execution_time') . " seconds\n";
echo "memory_limit: " . ini_get('memory_limit') . "\n";
echo "PHP Version: " . PHP_VERSION . "\n";
echo "SAPI: " . php_sapi_name() . "\n";
phpinfo(INFO_CONFIGURATION);
?>
EOF

# Check the loaded php.ini file location
php -i | grep "Loaded Configuration File"

# For a web-based check
curl -s "http://localhost/phpinfo.php" | grep -i "max_execution_time"

# Check PHP-FPM pool configuration
grep -r "max_execution_time" /etc/php/*/fpm/pool.d/
grep -r "request_terminate_timeout" /etc/php/*/fpm/pool.d/
```

Understanding your current limit helps determine if you need to increase it or optimize your code:

```php
<?php
// debug_timeout.php - Place at the start of your script
error_log("Script started at: " . date('Y-m-d H:i:s'));
error_log("Current max_execution_time: " . ini_get('max_execution_time'));
error_log("Current memory_limit: " . ini_get('memory_limit'));
error_log("PHP_SAPI: " . php_sapi_name());

// Check if we can modify it at runtime
if (function_exists('set_time_limit')) {
    error_log("set_time_limit is available");
} else {
    error_log("set_time_limit is disabled");
}

// Check safe mode (removed in PHP 5.4, but may affect very old installs)
if (version_compare(PHP_VERSION, '5.4.0', '<')) {
    error_log("Safe mode: " . (ini_get('safe_mode') ? 'ON' : 'OFF'));
}
```

Step 2: Increase Execution Time via php.ini

The most reliable method is to modify the php.ini configuration file:

```bash
# Find your php.ini file
php --ini

# Common locations for php.ini:
#   Ubuntu/Debian with Apache:  /etc/php/8.x/apache2/php.ini
#   Ubuntu/Debian with PHP-FPM: /etc/php/8.x/fpm/php.ini
#   CentOS/RHEL:                /etc/php.ini
#   XAMPP:                      C:\xampp\php\php.ini
#   MAMP:                       /Applications/MAMP/bin/php/php8.x/conf/php.ini

# Backup the original file
sudo cp /etc/php/8.2/fpm/php.ini /etc/php/8.2/fpm/php.ini.backup

# Edit php.ini
sudo nano /etc/php/8.2/fpm/php.ini

# Find and modify these settings:
#   max_execution_time = 300   ; Change from 30 to 300 seconds (5 minutes)
#   max_input_time = 300       ; Maximum time to parse input data
#   memory_limit = 256M        ; Increase memory if needed

# For very long-running processes:
#   max_execution_time = 600    ; 10 minutes
#   max_execution_time = 1800   ; 30 minutes
#   max_execution_time = 3600   ; 1 hour (use with caution)

# After modifying, restart PHP-FPM
sudo systemctl restart php8.2-fpm

# For Apache mod_php
sudo systemctl restart apache2

# Verify the change took effect
# (note: php -i reads the CLI php.ini; check the web SAPI via phpinfo.php)
php -i | grep "max_execution_time"
```

For different PHP versions and setups:

```bash
# Check all PHP configurations on a multi-version system
for version in 7.4 8.0 8.1 8.2 8.3; do
    if [ -f "/etc/php/$version/fpm/php.ini" ]; then
        echo "PHP $version FPM:"
        grep "max_execution_time" "/etc/php/$version/fpm/php.ini" | grep -v "^;"
    fi
done

# Apply changes to all PHP-FPM pools
sudo systemctl restart php*-fpm

# For Nginx with PHP-FPM
sudo systemctl restart nginx php8.2-fpm
```

Step 3: Configure PHP-FPM Pool Settings

When using PHP-FPM, you need to configure pool settings as well:

```bash
# Edit the PHP-FPM pool configuration
sudo nano /etc/php/8.2/fpm/pool.d/www.conf

# Find or add these settings:
#   ; request_terminate_timeout is similar to max_execution_time
#   ; but enforced by PHP-FPM instead of PHP itself
#   request_terminate_timeout = 300
#
#   ; slowlog helps identify which scripts are slow
#   slowlog = /var/log/php-fpm-slow.log
#   request_slowlog_timeout = 10s

# Create a dedicated pool for long-running scripts
sudo tee /etc/php/8.2/fpm/pool.d/long-running.conf > /dev/null << 'EOF'
[long-running]
user = www-data
group = www-data
listen = /run/php/php8.2-fpm-long.sock
listen.owner = www-data
listen.group = www-data
pm = dynamic
pm.max_children = 5
pm.start_servers = 2
pm.min_spare_servers = 1
pm.max_spare_servers = 3

; Extended timeout for long-running processes
php_admin_value[max_execution_time] = 1800
php_admin_value[max_input_time] = 1800
php_admin_value[memory_limit] = 512M

; Pool-level timeout
request_terminate_timeout = 1800

; Enable slowlog for debugging
slowlog = /var/log/php-fpm-long-slow.log
request_slowlog_timeout = 30s
EOF

# Restart PHP-FPM
sudo systemctl restart php8.2-fpm

# Verify the pool is running
sudo systemctl status php8.2-fpm
ls -la /run/php/
```

Configure Nginx to use the long-running pool for specific endpoints:

```nginx
# /etc/nginx/sites-available/example.com
server {
    # ... existing configuration ...

    # Regular PHP handling
    location ~ \.php$ {
        fastcgi_pass unix:/run/php/php8.2-fpm.sock;
        # ... other fastcgi params ...
    }

    # Long-running endpoints use the dedicated pool
    location ~ ^/(import|export|process|batch)\.php$ {
        fastcgi_pass unix:/run/php/php8.2-fpm-long.sock;
        fastcgi_read_timeout 1800s;
        fastcgi_send_timeout 1800s;
        fastcgi_connect_timeout 60s;
        include fastcgi_params;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
    }

    # Or match a specific script
    location = /wp-content/plugins/batch-importer/process.php {
        fastcgi_pass unix:/run/php/php8.2-fpm-long.sock;
        fastcgi_read_timeout 1800s;
        include fastcgi_params;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
    }
}

# Test and reload Nginx
sudo nginx -t
sudo systemctl reload nginx
```

Step 4: Increase Execution Time at Runtime in PHP

For specific scripts, increase the timeout within your PHP code:

```php
<?php
// Method 1: set_time_limit() - restarts the timeout counter
set_time_limit(300); // 5 minutes from this point

// Method 2: ini_set()
ini_set('max_execution_time', '300');

// Method 3: Remove the time limit entirely (use with caution)
set_time_limit(0); // No time limit

// Combined with a memory limit for heavy operations
ini_set('max_execution_time', '600');
ini_set('memory_limit', '512M');

// For specific operations that need more time
function processLargeFile($filename)
{
    // Increase limits for this specific operation
    $originalLimit = (int) ini_get('max_execution_time');
    set_time_limit(300);
    ini_set('memory_limit', '512M');

    try {
        $result = [];
        $handle = fopen($filename, 'r');

        while (($line = fgetcsv($handle)) !== false) {
            // Process each line
            $result[] = processLine($line);

            // Restart the timer periodically for very long operations:
            // each call grants 30 more seconds from this point
            set_time_limit(30);
        }

        fclose($handle);
        return $result;
    } finally {
        // Restore the original limit
        set_time_limit($originalLimit);
    }
}

// Safe approach with progress tracking
function processWithTimeout($items, $callback, $timeoutSeconds = 300)
{
    $startTime = time();
    $processed = 0;

    foreach ($items as $key => $item) {
        // Stop 10 seconds before the deadline
        if (time() - $startTime > $timeoutSeconds - 10) {
            // Save progress and exit gracefully
            return [
                'processed' => $processed,
                'remaining' => count($items) - $processed,
                'last_key'  => $key,
                'timeout'   => true,
            ];
        }

        $callback($item, $key);
        $processed++;
    }

    return ['processed' => $processed, 'remaining' => 0, 'timeout' => false];
}
```
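One subtlety worth knowing when relying on `set_time_limit()`: on Linux and macOS, `max_execution_time` counts CPU time only. Time spent in `sleep()`, database queries, or network I/O does not count toward the limit (on Windows the limit is wall-clock time). A minimal demonstration of this behavior:

```php
<?php
// On Linux/macOS the execution timer counts CPU time, not wall-clock time,
// so three seconds of sleep() survives a two-second limit. (On Windows the
// limit is wall-clock time and this script would be terminated instead.)
set_time_limit(2);
$start = microtime(true);
sleep(3); // no CPU is consumed while sleeping
$elapsed = microtime(true) - $start;
echo "Slept " . round($elapsed, 1) . "s under a 2s limit without a fatal error\n";
```

This is why a script that mostly waits on a slow API can run far longer than `max_execution_time` suggests, and why the web-server timeouts in Step 5 (`fastcgi_read_timeout`, `request_terminate_timeout`) still matter even when the PHP limit looks generous.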

Step 5: Handle Execution Time in Web Server Configuration

Configure your web server to allow longer request processing:

```nginx
# Nginx configuration - /etc/nginx/sites-available/example.com
server {
    listen 80;
    server_name example.com;
    root /var/www/html;

    # Increase client body timeout for uploads
    client_body_timeout 300s;

    # Increase general timeouts
    send_timeout 300s;
    keepalive_timeout 300s;

    location ~ \.php$ {
        fastcgi_pass unix:/run/php/php8.2-fpm.sock;
        fastcgi_index index.php;
        include fastcgi_params;

        # Critical: FastCGI read timeout
        fastcgi_read_timeout 300s;
        fastcgi_send_timeout 300s;
        fastcgi_connect_timeout 60s;

        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
    }

    # For specific long-running scripts
    location = /import.php {
        fastcgi_pass unix:/run/php/php8.2-fpm.sock;
        include fastcgi_params;

        # Extended timeout for import operations
        fastcgi_read_timeout 1800s;
        fastcgi_send_timeout 1800s;

        # Increase buffer sizes for large responses
        fastcgi_buffer_size 128k;
        fastcgi_buffers 4 256k;
        fastcgi_busy_buffers_size 256k;

        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
    }
}

# Apply changes
sudo nginx -t && sudo systemctl reload nginx
```

For Apache:

```apache
# /etc/apache2/sites-available/example.com.conf
<VirtualHost *:80>
    ServerName example.com
    DocumentRoot /var/www/html

    # Increase timeout directives
    Timeout 300
    ProxyTimeout 300

    <Directory /var/www/html>
        # PHP settings (php_value works with mod_php only, not PHP-FPM)
        php_value max_execution_time 300
        php_value max_input_time 300
        php_value memory_limit 256M

        # For specific files
        <FilesMatch "^(import|export|process)\.php$">
            php_value max_execution_time 1800
            php_value max_input_time 1800
            php_value memory_limit 512M
        </FilesMatch>
    </Directory>

    # For mod_proxy_fcgi (PHP-FPM)
    <Proxy "unix:/run/php/php8.2-fpm.sock|fcgi://localhost">
        ProxySet timeout=300
    </Proxy>

    <FilesMatch \.php$>
        SetHandler "proxy:fcgi://localhost"
    </FilesMatch>
</VirtualHost>

# Enable the required modules
sudo a2enmod proxy proxy_fcgi headers
sudo systemctl reload apache2
```

Step 6: Optimize Your PHP Code to Reduce Execution Time

Often the best solution is to optimize your code to run faster:

```php
<?php
// BAD: N+1 query problem - causes timeouts
function getPostsWithCommentsBad($postIds)
{
    $posts = [];
    foreach ($postIds as $id) {
        // This runs two separate queries for EACH post!
        $posts[$id] = $db->query("SELECT * FROM posts WHERE id = ?", [$id]);
        $posts[$id]['comments'] = $db->query("SELECT * FROM comments WHERE post_id = ?", [$id]);
    }
    return $posts;
}

// GOOD: A single query with a join
function getPostsWithCommentsGood($postIds)
{
    $placeholders = implode(',', array_fill(0, count($postIds), '?'));
    return $db->query("
        SELECT p.*, c.*
        FROM posts p
        LEFT JOIN comments c ON p.id = c.post_id
        WHERE p.id IN ($placeholders)
    ", $postIds);
}

// BAD: Loading all data into memory
function processAllRecordsBad()
{
    $allRecords = $db->query("SELECT * FROM large_table")->fetchAll();
    foreach ($allRecords as $record) {
        processRecord($record);
    }
}

// GOOD: Process in chunks
function processAllRecordsGood()
{
    $chunkSize = 100;
    $offset = 0;

    do {
        $records = $db->query(
            "SELECT * FROM large_table LIMIT ? OFFSET ?",
            [$chunkSize, $offset]
        )->fetchAll();

        foreach ($records as $record) {
            processRecord($record);
        }

        $offset += $chunkSize;

        // Restart the execution timer and free memory
        set_time_limit(30);
        gc_collect_cycles();

    } while (count($records) === $chunkSize);
}

// GOOD: Use unbuffered queries for large datasets
function processLargeDataset()
{
    $pdo = new PDO($dsn, $user, $pass, [
        PDO::MYSQL_ATTR_USE_BUFFERED_QUERY => false,
    ]);

    $stmt = $pdo->query("SELECT * FROM million_row_table");

    while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
        processRecord($row);
    }
}

// GOOD: Batch processing with progress
function batchProcess($items, $batchSize = 100)
{
    $totalItems = count($items);
    $batches = array_chunk($items, $batchSize);

    foreach ($batches as $index => $batch) {
        processBatch($batch);

        // Log progress (cap at the total for the final, partial batch)
        $processed = min(($index + 1) * $batchSize, $totalItems);
        $percent = round(($processed / $totalItems) * 100, 1);
        error_log("Processed $processed of $totalItems items ($percent%)");

        // Restart the timer for the next batch
        set_time_limit(60);
    }
}
```

Step 7: Use Background Processing for Long Operations

For truly long operations, move them to background processes:

```php
<?php
// Option 1: Use a job queue (Laravel-style)
// In your web controller
public function startImport(Request $request)
{
    // Validate and queue the job
    ProcessImportJob::dispatch($request->file('import_file'))
        ->onQueue('imports');

    // Track the job yourself (e.g. a database row) if clients need an ID to poll
    return response()->json(['message' => 'Import started']);
}

// Option 2: Simple background execution
function runInBackground($command)
{
    $id = uniqid();
    $outputFile = sys_get_temp_dir() . "/process_$id.log";
    $pidFile = sys_get_temp_dir() . "/process_$id.pid";

    if (PHP_OS_FAMILY === 'Windows') {
        pclose(popen("start /B php $command > $outputFile 2>&1", "r"));
    } else {
        exec("nohup php $command > $outputFile 2>&1 & echo $! > $pidFile");
    }

    return ['output_file' => $outputFile, 'pid_file' => $pidFile];
}

// Option 3: Use the Symfony Process component
use Symfony\Component\Process\Process;
use Symfony\Component\Process\PhpExecutableFinder;

function startLongRunningProcess($script, $args = [])
{
    $phpBinary = (new PhpExecutableFinder())->find();
    $scriptPath = base_path($script);

    $process = new Process(array_merge([$phpBinary, $scriptPath], $args));

    $process->setTimeout(null); // No timeout
    $process->setWorkingDirectory(base_path());
    $process->start();

    // Store the process ID for tracking
    $pid = $process->getPid();
    cache()->put("process_$pid", [
        'status' => 'running',
        'started_at' => now(),
        'script' => $script,
    ], now()->addHours(24));

    return $process;
}

// Check process status
function checkProcessStatus($pid)
{
    $processInfo = cache()->get("process_$pid");

    if (!$processInfo) {
        return ['status' => 'not_found'];
    }

    // Check whether the process is still running
    if (PHP_OS_FAMILY === 'Windows') {
        exec("tasklist /FI \"PID eq $pid\" 2>NUL", $output);
        $isRunning = count($output) > 1;
    } else {
        $isRunning = file_exists("/proc/$pid");
    }

    return [
        'status' => $isRunning ? 'running' : 'completed',
        'started_at' => $processInfo['started_at'],
    ];
}
```

Create a dedicated CLI script for long-running operations:

```php
<?php
// long_process.php - Run from the command line
// Usage: php long_process.php --file=data.csv --batch-size=100

if (php_sapi_name() !== 'cli') {
    die('This script must be run from the command line');
}

$options = getopt('', ['file:', 'batch-size::', 'resume::']);

$file = $options['file'] ?? null;
$batchSize = (int)($options['batch-size'] ?? 100);
$resumeFrom = (int)($options['resume'] ?? 0);

if (!$file || !file_exists($file)) {
    die("Error: File not found. Usage: php long_process.php --file=path/to/file.csv\n");
}

// CLI scripts have no timeout by default, but set it explicitly
set_time_limit(0);
ini_set('memory_limit', '512M');

$handle = fopen($file, 'r');
$lineNumber = 0;
$processed = 0;
$progressFile = $file . '.progress';

// Resume from the last saved byte position if a progress file exists
if (file_exists($progressFile)) {
    $savedPosition = (int) file_get_contents($progressFile);
    fseek($handle, $savedPosition);
    $resumeFrom = 0; // the byte position already accounts for processed lines
    echo "Resuming from position: $savedPosition\n";
}

while (($line = fgetcsv($handle)) !== false) {
    $lineNumber++;

    // Skip to an explicit --resume line number
    if ($lineNumber < $resumeFrom) {
        continue;
    }

    try {
        processLine($line);
        $processed++;

        // Save progress every batch
        if ($processed % $batchSize === 0) {
            file_put_contents($progressFile, ftell($handle));
            echo "Processed $processed rows...\n";
        }
    } catch (Exception $e) {
        error_log("Error on line $lineNumber: " . $e->getMessage());
    }
}

fclose($handle);
if (file_exists($progressFile)) {
    unlink($progressFile);
}

echo "Completed. Processed $processed rows.\n";

function processLine($line)
{
    // Your processing logic here
    usleep(1000); // Simulate 1ms of work per line
}
```

Step 8: Implement Chunked Processing with Progress Tracking

For operations that must run in a web context, implement chunked processing:

```php
<?php
// chunked_process.php - Process in chunks via AJAX calls
session_start();

class ChunkedProcessor
{
    private $totalItems;
    private $chunkSize;
    private $dataFile;
    private $progressKey;

    public function __construct($items, $chunkSize = 50)
    {
        $this->totalItems = count($items);
        $this->chunkSize = $chunkSize;
        $this->progressKey = 'process_' . md5(serialize($items));

        // Store the data in a temp file (for large datasets)
        $this->dataFile = sys_get_temp_dir() . '/' . $this->progressKey . '.data';
        file_put_contents($this->dataFile, serialize($items));

        // Initialize progress
        $_SESSION[$this->progressKey] = [
            'total' => $this->totalItems,
            'processed' => 0,
            'last_chunk' => 0,
            'status' => 'pending',
        ];
    }

    public function getProgressKey()
    {
        return $this->progressKey;
    }

    public function processChunk($chunkNumber)
    {
        $items = unserialize(file_get_contents($this->dataFile));
        $offset = $chunkNumber * $this->chunkSize;
        $chunk = array_slice($items, $offset, $this->chunkSize);

        set_time_limit(60); // Each chunk gets 60 seconds

        $results = [];
        foreach ($chunk as $item) {
            $results[] = $this->processItem($item);
        }

        // Update progress
        $processedSoFar = min($offset + count($chunk), $this->totalItems);
        $_SESSION[$this->progressKey]['processed'] = $processedSoFar;
        $_SESSION[$this->progressKey]['last_chunk'] = $chunkNumber;
        $_SESSION[$this->progressKey]['status'] =
            ($processedSoFar >= $this->totalItems) ? 'completed' : 'processing';

        return [
            'chunk' => $chunkNumber,
            'processed_in_chunk' => count($chunk),
            'total_processed' => $processedSoFar,
            'total' => $this->totalItems,
            'percent' => round(($processedSoFar / $this->totalItems) * 100, 1),
            'status' => $_SESSION[$this->progressKey]['status'],
            'next_chunk' => ($processedSoFar < $this->totalItems) ? $chunkNumber + 1 : null,
        ];
    }

    public function getProgress()
    {
        return $_SESSION[$this->progressKey] ?? null;
    }

    private function processItem($item)
    {
        // Your processing logic
        usleep(10000); // 10ms of simulated work
        return ['item' => $item, 'processed' => true];
    }
}

// API endpoint usage (skeleton - persist the processor state between requests)
if (isset($_GET['action'])) {
    header('Content-Type: application/json');

    switch ($_GET['action']) {
        case 'init':
            $items = json_decode(file_get_contents('php://input'), true);
            $processor = new ChunkedProcessor($items, 50);
            echo json_encode([
                'success' => true,
                'progress_key' => $processor->getProgressKey(),
                'total' => count($items),
                'chunk_size' => 50,
            ]);
            break;

        case 'process':
            $chunk = (int) $_GET['chunk'];
            // Rebuild the processor from session/temp-file state, then:
            // $result = $processor->processChunk($chunk);
            echo json_encode($result);
            break;

        case 'progress':
            echo json_encode($processor->getProgress());
            break;
    }
}
```

Frontend JavaScript for chunked processing:

```javascript
// chunked-processor.js
class ChunkedProcessor {
    constructor(apiEndpoint, items, options = {}) {
        this.apiEndpoint = apiEndpoint;
        this.items = items;
        this.chunkSize = options.chunkSize || 50;
        this.onProgress = options.onProgress || (() => {});
        this.onComplete = options.onComplete || (() => {});
        this.onError = options.onError || (() => {});
        this.currentChunk = 0;
        this.processed = 0;
    }

    async start() {
        try {
            // Initialize processing
            const initResponse = await fetch(`${this.apiEndpoint}?action=init`, {
                method: 'POST',
                headers: { 'Content-Type': 'application/json' },
                body: JSON.stringify(this.items)
            });

            const initData = await initResponse.json();
            this.progressKey = initData.progress_key;
            this.totalChunks = Math.ceil(initData.total / this.chunkSize);

            // Process chunks sequentially
            await this.processNextChunk();
        } catch (error) {
            this.onError(error);
        }
    }

    async processNextChunk() {
        if (this.currentChunk >= this.totalChunks) {
            this.onComplete({ processed: this.processed });
            return;
        }

        try {
            const response = await fetch(
                `${this.apiEndpoint}?action=process&chunk=${this.currentChunk}`
            );
            const data = await response.json();

            this.processed = data.total_processed;
            this.onProgress({
                processed: this.processed,
                total: data.total,
                percent: data.percent,
                chunk: this.currentChunk,
                totalChunks: this.totalChunks
            });

            if (data.status === 'completed') {
                this.onComplete({ processed: this.processed });
            } else {
                this.currentChunk++;
                await this.processNextChunk();
            }
        } catch (error) {
            this.onError(error, this.currentChunk);
        }
    }

    async getProgress() {
        const response = await fetch(`${this.apiEndpoint}?action=progress`);
        return await response.json();
    }
}

// Usage
const processor = new ChunkedProcessor('/api/process.php', items, {
    chunkSize: 50,
    onProgress: (progress) => {
        console.log(`Processed ${progress.processed} of ${progress.total} (${progress.percent}%)`);
        document.getElementById('progress-bar').style.width = `${progress.percent}%`;
        document.getElementById('progress-text').textContent =
            `${progress.processed} / ${progress.total} (${progress.percent}%)`;
    },
    onComplete: (result) => {
        console.log('Processing complete!', result);
        alert('Processing completed successfully!');
    },
    onError: (error, chunk) => {
        console.error(`Error on chunk ${chunk}:`, error);
    }
});

processor.start();
```

Step 9: Handle Timeouts Gracefully

Implement proper timeout handling and user feedback:

```php
<?php
// timeout_handler.php
class TimeoutHandler
{
    private $startTime;
    private $timeoutSeconds;
    private $gracePeriod = 10; // Seconds before the actual timeout to stop gracefully

    public function __construct($timeoutSeconds = 30)
    {
        $this->startTime = time();
        $this->timeoutSeconds = $timeoutSeconds;
        set_time_limit($timeoutSeconds + 5);
    }

    public function shouldStop()
    {
        return (time() - $this->startTime) > ($this->timeoutSeconds - $this->gracePeriod);
    }

    public function getRemainingTime()
    {
        return max(0, $this->timeoutSeconds - (time() - $this->startTime));
    }

    public function checkAndSave($data, $file)
    {
        if ($this->shouldStop()) {
            // Save the current state
            file_put_contents($file, json_encode([
                'data' => $data,
                'timestamp' => time(),
                'remaining' => $this->getRemainingTime(),
            ]));
            return true;
        }
        return false;
    }
}

// Usage in a long-running script
function processLargeDataset($dataFile, $outputFile)
{
    $timeout = new TimeoutHandler(55); // 55-second budget, stops at ~45

    $input = fopen($dataFile, 'r');
    $output = fopen($outputFile, 'a');
    $stateFile = $outputFile . '.state';

    // Load the previous state if it exists
    $resumeFrom = 0;
    if (file_exists($stateFile)) {
        $state = json_decode(file_get_contents($stateFile), true);
        $resumeFrom = $state['data']['last_line'] ?? 0;
        fseek($input, $state['data']['file_position'] ?? 0);
    }

    $lineNumber = 0;
    $results = [];

    while (($line = fgets($input)) !== false) {
        $lineNumber++;

        if ($lineNumber < $resumeFrom) {
            continue;
        }

        // Process the line
        $results[] = processLine($line);

        // Check for timeout
        if ($timeout->checkAndSave([
            'last_line' => $lineNumber,
            'file_position' => ftell($input),
            'results_count' => count($results),
        ], $stateFile)) {
            // Write the results so far
            foreach ($results as $r) {
                fwrite($output, json_encode($r) . "\n");
            }

            // Return with an indication to continue
            return [
                'status' => 'timeout',
                'processed' => $lineNumber,
                'resume_from' => $lineNumber + 1,
                'message' => 'Timeout reached. Call again to continue processing.',
            ];
        }
    }

    // Cleanup
    fclose($input);
    fclose($output);
    if (file_exists($stateFile)) {
        unlink($stateFile);
    }

    return ['status' => 'completed', 'processed' => $lineNumber, 'results' => $results];
}

// Client-side timeout handling for outbound HTTP requests
function fetchWithTimeout($url, $timeout = 25, $options = [])
{
    $context = stream_context_create([
        'http' => [
            'timeout' => $timeout,
            'method' => $options['method'] ?? 'GET',
            'header' => $options['headers'] ?? [],
            'content' => $options['body'] ?? null,
        ],
    ]);

    $result = @file_get_contents($url, false, $context);

    if ($result === false) {
        $error = error_get_last();
        if (strpos($error['message'] ?? '', 'timeout') !== false) {
            return ['success' => false, 'error' => 'timeout', 'message' => 'Request timed out'];
        }
        return [
            'success' => false,
            'error' => 'request_failed',
            'message' => $error['message'] ?? 'Unknown error',
        ];
    }

    return ['success' => true, 'data' => json_decode($result, true)];
}
```

Step 10: Monitor and Debug Timeout Issues

Implement comprehensive logging and monitoring:

```php
<?php
// timeout_monitor.php
class TimeoutMonitor
{
    private $logFile;
    private $startTime;
    private $checkpoints = [];

    public function __construct($logFile = null)
    {
        $this->logFile = $logFile ?? sys_get_temp_dir() . '/php_timeout_' . date('Y-m-d') . '.log';
        $this->startTime = microtime(true);
    }

    public function checkpoint($name)
    {
        $elapsed = microtime(true) - $this->startTime;
        $memory = memory_get_usage(true) / 1024 / 1024; // MB

        $this->checkpoints[] = ['name' => $name, 'time' => $elapsed, 'memory' => $memory];

        $this->log(sprintf(
            "[%s] Checkpoint: %s - Time: %.3fs - Memory: %.2fMB",
            date('Y-m-d H:i:s'),
            $name,
            $elapsed,
            $memory
        ));

        return $elapsed;
    }

    public function log($message)
    {
        file_put_contents($this->logFile, $message . "\n", FILE_APPEND);
    }

    public function getReport()
    {
        return [
            'total_time' => microtime(true) - $this->startTime,
            'peak_memory' => memory_get_peak_usage(true) / 1024 / 1024,
            'checkpoints' => $this->checkpoints,
            'php_config' => [
                'max_execution_time' => ini_get('max_execution_time'),
                'memory_limit' => ini_get('memory_limit'),
                'sapi' => php_sapi_name(),
            ],
        ];
    }

    public function analyzeBottlenecks()
    {
        $bottlenecks = [];

        for ($i = 1; $i < count($this->checkpoints); $i++) {
            $duration = $this->checkpoints[$i]['time'] - $this->checkpoints[$i - 1]['time'];

            if ($duration > 1.0) { // More than 1 second between checkpoints
                $bottlenecks[] = [
                    'between' => [
                        $this->checkpoints[$i - 1]['name'],
                        $this->checkpoints[$i]['name'],
                    ],
                    'duration' => $duration,
                    'memory_increase' => $this->checkpoints[$i]['memory']
                        - $this->checkpoints[$i - 1]['memory'],
                ];
            }
        }

        return $bottlenecks;
    }
}

// Usage
$monitor = new TimeoutMonitor();

$monitor->checkpoint('start');

// Database query
$data = $db->query("SELECT * FROM large_table");
$monitor->checkpoint('after_db_query');

// Process data
foreach ($data as $item) {
    processItem($item);
}
$monitor->checkpoint('after_processing');

// Generate output
$output = generateReport($data);
$monitor->checkpoint('after_report_generation');

// Get the analysis
$report = $monitor->getReport();
$bottlenecks = $monitor->analyzeBottlenecks();

print_r($bottlenecks);
```

Pair this with a shutdown function that records every timeout as it happens:

```php
<?php
// Log a warning whenever a script dies from the execution time limit
register_shutdown_function(function () {
    $error = error_get_last();
    if ($error && $error['type'] === E_ERROR
        && strpos($error['message'], 'Maximum execution time') !== false) {
        $logEntry = sprintf(
            "[%s] TIMEOUT: %s in %s on line %d\nScript: %s\nMemory: %d bytes\n",
            date('Y-m-d H:i:s'),
            $error['message'],
            $error['file'],
            $error['line'],
            $_SERVER['SCRIPT_NAME'] ?? $_SERVER['PHP_SELF'] ?? 'CLI',
            memory_get_peak_usage(true)
        );
        file_put_contents('/var/log/php/timeouts.log', $logEntry, FILE_APPEND);
    }
});

// Proactive timeout warning
function checkExecutionTime($threshold = 0.8)
{
    $maxTime = (int) ini_get('max_execution_time');
    if ($maxTime <= 0) {
        return; // No limit
    }

    $elapsed = microtime(true) - $_SERVER['REQUEST_TIME_FLOAT'];
    $ratio = $elapsed / $maxTime;

    if ($ratio > $threshold) {
        trigger_error(
            "Execution time at " . round($ratio * 100) . "% of limit ($elapsed / $maxTime seconds)",
            E_USER_WARNING
        );
    }
}
```

Checklist

| Step | Action | Verified |
|------|--------|----------|
| 1 | Identified current execution time limit | ☐ |
| 2 | Modified php.ini max_execution_time | ☐ |
| 3 | Configured PHP-FPM pool settings | ☐ |
| 4 | Updated web server timeout settings (Nginx/Apache) | ☐ |
| 5 | Implemented runtime time limit increases in code | ☐ |
| 6 | Optimized database queries and code | ☐ |
| 7 | Implemented chunked or batch processing | ☐ |
| 8 | Set up background processing for long operations | ☐ |
| 9 | Added graceful timeout handling | ☐ |
| 10 | Implemented monitoring and logging | ☐ |
| 11 | Tested with realistic data volumes | ☐ |
| 12 | Documented timeout requirements for the script | ☐ |

Verify the Fix

1. Test the execution time limit:

```bash
# Create a test script
cat > /var/www/html/test_timeout.php << 'EOF'
<?php
echo "Current max_execution_time: " . ini_get('max_execution_time') . " seconds\n";

// Force a low limit, then burn CPU. If the limit is enforced, PHP kills the
// script with the fatal error after roughly 10 seconds of CPU time.
// Note: on Linux the timer counts CPU time, so sleep() and I/O waits do not
// trigger it - a busy loop does.
set_time_limit(10);
$start = microtime(true);

while (true) {
    if (microtime(true) - $start > 15) {
        echo "WARNING: ran past the 10-second limit without being stopped\n";
        break;
    }
}
?>
EOF

php /var/www/html/test_timeout.php
# Expected: "Fatal error: Maximum execution time of 10 seconds exceeded"
```

2. Verify the php.ini changes:

```bash
# Check PHP configuration (CLI)
php -i | grep max_execution_time
php -r "echo ini_get('max_execution_time');"

# For the web context
curl -s "http://localhost/test_timeout.php"

# Check PHP-FPM status
sudo systemctl status php8.2-fpm
php-fpm8.2 -tt 2>&1 | grep max_execution_time
```

3. Test a long-running operation:

```php
<?php
// test_long_process.php
set_time_limit(120);
ini_set('memory_limit', '512M');

echo "Starting long process at " . date('H:i:s') . "\n";
echo "Max execution time: " . ini_get('max_execution_time') . " seconds\n";
echo "Memory limit: " . ini_get('memory_limit') . "\n\n";

$startTime = microtime(true);

// Simulate 30 seconds of work
for ($i = 1; $i <= 30; $i++) {
    sleep(1);
    $elapsed = microtime(true) - $startTime;
    echo "Second $i - Total elapsed: " . round($elapsed, 2) . "s\n";

    // Periodically restart the timer for very long operations
    if ($i % 20 === 0) {
        set_time_limit(30);
        echo "Timer reset to 30 seconds\n";
    }
}

echo "\nCompleted successfully at " . date('H:i:s') . "\n";
echo "Total time: " . round(microtime(true) - $startTime, 2) . " seconds\n";
```

4. Verify the web server timeout configuration:

```bash
# Nginx configuration test
sudo nginx -t
curl -I "http://localhost/" | grep -i timeout

# Check Nginx fastcgi settings
grep -r "fastcgi_read_timeout" /etc/nginx/

# Apache configuration test
sudo apachectl configtest
sudo apache2ctl -S | grep -i timeout
```

5. Monitor production logs:

```bash
# Watch for timeout errors
tail -f /var/log/php-fpm/error.log | grep -i timeout
tail -f /var/log/nginx/error.log | grep -i timeout
tail -f /var/log/apache2/error.log | grep -i timeout

# Check the PHP-FPM slowlog
tail -f /var/log/php-fpm-slow.log

# Monitor PHP processes in real time
watch -n 1 'ps aux | grep php | grep -v grep | head -20'
```

6. Test the chunked processing endpoint:

```javascript
// Test via browser console or Node.js
fetch('/api/chunked-process.php?action=init', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(Array.from({length: 1000}, (_, i) => i))
})
.then(r => r.json())
.then(data => {
    console.log('Init:', data);
    // Process the chunks
    const totalChunks = Math.ceil(data.total / data.chunk_size);
    for (let i = 0; i < totalChunks; i++) {
        fetch(`/api/chunked-process.php?action=process&chunk=${i}`)
            .then(r => r.json())
            .then(chunk => console.log('Chunk', i, chunk));
    }
});
```
Related Articles

  • [Fix PHP Composer Out of Memory Error](/articles/fix-php-composer-out-of-memory)
  • [Fix PHP FPM Pool Exhausted](/articles/fix-php-fpm-pool-exhausted)
  • [Fix PHP Memory Limit Exhausted](/articles/fix-php-memory-limit-exhausted-on-wordpress-hosting-account)
  • [Fix Nginx Reverse Proxy 502 Bad Gateway](/articles/fix-nginx-reverse-proxy-502-bad-gateway)
  • [Fix PHP Session Files Filling TMP Directory](/articles/fix-php-session-files-filling-tmp-directory-on-shared-hosting)
  • [Fix Nginx Upstream Timed Out](/articles/fix-nginx-upstream-timed-out)
  • [Fix Apache Server Timeout Error](/articles/fix-apache-server-timeout)
  • [Fix MySQL Query Execution Timeout](/articles/fix-mysql-query-execution-timeout)
  • [Fix PHP File Upload Size Limit Error](/articles/fix-php-file-upload-size-limit)
  • [Fix Laravel Queue Job Timeout](/articles/fix-laravel-queue-job-timeout)