# Fix Nginx Gzip Compression Not Working Despite Configuration

You have added gzip configuration to Nginx, verified it, and reloaded -- but responses are still uncompressed. Browser developer tools show Content-Encoding is missing, and page load times are 3x slower than expected.

## Checking Current Configuration

First, verify gzip is actually enabled:

```bash
nginx -T 2>/dev/null | grep -E "gzip|gzip_"
```

You should see:

```nginx
gzip on;
gzip_vary on;
gzip_proxied any;
gzip_comp_level 6;
gzip_types text/plain text/css application/json application/javascript text/xml application/xml application/xml+rss text/javascript;
gzip_min_length 256;
gzip_buffers 16 8k;
```

## Common Mistake: Missing gzip_types

The most common reason gzip "does not work" is that gzip_types does not include the content type your application serves. By default, Nginx only gzips text/html. All other types must be listed explicitly.

Check what content type your application returns:

```bash
curl -sI https://example.com/api/data | grep Content-Type
# Content-Type: application/json
```

If application/json is not in gzip_types, the response will not be compressed. Add it:

```nginx
gzip_types text/plain text/css application/json application/javascript text/xml application/xml application/xml+rss text/javascript image/svg+xml;
```

## Common Mistake: Response Too Small

Nginx will not compress responses smaller than gzip_min_length (the built-in default is 20 bytes, but the configuration above raises it to 256). For typical JSON API responses this is rarely the problem, but for very short responses such as health checks, verify the size:

```bash
curl -sI https://example.com/api/health | grep Content-Length
```

If the response is smaller than gzip_min_length, Nginx skips compression. Lower the threshold:

```nginx
gzip_min_length 20;
```
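The threshold exists because gzip framing adds a fixed cost (a ~10-byte header plus an 8-byte CRC/length trailer), so compressing a tiny payload can actually make it larger. A quick local sketch using the gzip CLI (not Nginx itself, but the same underlying format):

```shell
# gzip adds fixed header/trailer overhead, so a tiny payload grows.
tiny_raw=$(printf 'ok' | wc -c)
tiny_gz=$(printf 'ok' | gzip -c | wc -c)
echo "tiny payload: $tiny_raw bytes raw, $tiny_gz bytes gzipped"
```

The gzipped output is an order of magnitude larger than the 2-byte input, which is why lowering gzip_min_length much below ~20 bytes buys nothing.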

## Common Mistake: Proxy Response Already Compressed

If your upstream application (Node.js, Python, etc.) already compresses responses, Nginx will not double-compress them. Check the upstream response:

```bash
curl -sI http://127.0.0.1:3000/api/data | grep -i encoding
```

If you see Content-Encoding: gzip from the upstream, Nginx correctly passes it through. If you want Nginx to handle compression instead, disable compression in the application:

```javascript
// Express - disable compression, let Nginx handle it
const compression = require('compression');
// Remove or do NOT use: app.use(compression());
```

## Testing Gzip Is Working

```bash
curl -sI -H "Accept-Encoding: gzip" https://example.com/ | grep -E "Content-Encoding|Content-Length|Transfer-Encoding"
```

You should see Content-Encoding: gzip in the response headers.

For a more thorough test, compare compressed and uncompressed sizes:

```bash
# Uncompressed size
curl -s -H "Accept-Encoding: identity" https://example.com/ | wc -c
# Output: 145230

# Compressed size
curl -s -H "Accept-Encoding: gzip" https://example.com/ | wc -c
# Output: 32456
```

If both sizes are similar, gzip is not working.
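To turn the two byte counts into a percentage saved, a small helper works; the sizes passed in below are the illustrative numbers from the example above, not live output:

```shell
# Percentage saved, given uncompressed and compressed byte counts.
savings() {
  raw=$1
  gz=$2
  echo $(( (raw - gz) * 100 / raw ))
}

# Using the example sizes from the curl checks above:
echo "saved: $(savings 145230 32456)%"
```

Anything in the 60-80% range is typical for HTML and JSON; a figure near 0% means compression is not happening.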

## Gzip with HTTPS

Gzip works identically over HTTP and HTTPS. However, some intermediate proxies and CDNs strip or rewrite the Accept-Encoding request header. Verify the negotiation end to end:

```bash
curl -v -H "Accept-Encoding: gzip, deflate, br" https://example.com/ 2>&1 | grep -E "Accept-Encoding|Content-Encoding"
```

If the response includes Content-Encoding: br (Brotli), that means a higher-priority compression algorithm was negotiated. Brotli is generally better than gzip, so this is not a problem -- but it explains why you do not see gzip.

## Gzip Configuration for API Servers

API servers benefit from a different gzip configuration than static file servers:

```nginx
gzip on;
gzip_vary on;
gzip_proxied any;
gzip_comp_level 4;
gzip_min_length 100;
gzip_types application/json application/xml text/plain;
gzip_disable "msie6";

# Do not gzip images or already-compressed binary data
# gzip is only effective for text-based content types
```
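The comment about skipping images is easy to verify locally with the gzip CLI, using random bytes as a stand-in for already-compressed formats like JPEG, PNG, or zip:

```shell
# Repetitive text (like JSON or HTML) compresses dramatically...
text_gz=$(head -c 4096 /dev/zero | tr '\0' 'a' | gzip -c | wc -c)

# ...while random bytes (a stand-in for already-compressed payloads)
# do not shrink, and typically grow slightly from gzip's own framing.
rand_gz=$(head -c 4096 /dev/urandom | gzip -c | wc -c)

echo "text: 4096 -> $text_gz bytes; binary-like: 4096 -> $rand_gz bytes"
```

Gzipping incompressible data wastes CPU on every response and can even increase the transfer size, which is why image and archive types are left out of gzip_types.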

A lower compression level (4 instead of 6-9) reduces CPU usage while still achieving roughly 60-70% size reduction for typical JSON responses. Higher levels (7-9) offer diminishing returns for the extra CPU cost.