Introduction

When Redis reaches its configured maxmemory limit and has an eviction policy that prevents key removal (like noeviction), it returns OOM errors for write commands. This protects existing data but blocks new writes, causing application failures.
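Applications can treat an OOM response as "cache temporarily unavailable" rather than a hard failure. A minimal sketch, assuming a redis-py-style client whose `set` raises an exception carrying the OOM message (the helper and stub class below are hypothetical, for illustration only):

```python
def cache_set(client, key, value):
    """Try to cache a value; return False (and carry on) if Redis is out of memory."""
    try:
        client.set(key, value)
        return True
    except Exception as exc:
        if "OOM command not allowed" in str(exc):
            return False  # cache full: skip caching, serve from the primary store
        raise  # any other error is unexpected


# Stand-in client that behaves like a full Redis (no server needed for the demo)
class FullRedisStub:
    def set(self, key, value):
        raise RuntimeError("OOM command not allowed when used memory > 'maxmemory'.")


print(cache_set(FullRedisStub(), "cache:user:1", "data"))  # False
```

This keeps reads and the primary data store working while the memory issue is fixed, instead of surfacing every failed cache write as an application error.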

Symptoms

  • Redis client errors: OOM command not allowed when used memory > 'maxmemory'.
  • redis-cli INFO memory shows used_memory near maxmemory setting
  • Applications fail to write new cache entries
  • used_memory_human approaches configured limit
  • Error responses for SET, HSET, LPUSH, and other write commands
  • Monitoring alerts on memory usage percentage
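The ratio behind the last two symptoms can be computed directly from `INFO memory` output. A small sketch (the parsing helper and sample snapshot are made up; `used_memory` and `maxmemory` are real INFO fields):

```python
def memory_ratio(info_text):
    """Parse `redis-cli INFO memory` text and return used_memory / maxmemory."""
    fields = {}
    for line in info_text.splitlines():
        if ":" in line and not line.startswith("#"):
            key, _, value = line.partition(":")
            fields[key.strip()] = value.strip()
    used = int(fields["used_memory"])
    limit = int(fields["maxmemory"])
    return used / limit if limit else 0.0


# Made-up snapshot for demonstration
sample = """# Memory
used_memory:1020054732
maxmemory:1073741824
maxmemory_policy:noeviction"""

print(f"{memory_ratio(sample):.1%}")  # 95.0% -- near the limit, expect OOM errors soon
```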

Common Causes

  • maxmemory set too low for workload
  • Eviction policy set to noeviction (the Redis default)
  • Memory leak from unexpired keys accumulating
  • Large data structures (hashes, lists) consuming excessive memory
  • No TTL set on volatile cache keys
  • Redis used as database instead of cache without eviction
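The difference the eviction policy makes can be shown with a toy in-memory model (an illustration only, not Redis's actual implementation — real Redis approximates LRU by sampling keys rather than tracking exact order):

```python
from collections import OrderedDict


class TinyCache:
    """Toy cache: noeviction rejects writes when full; allkeys-lru evicts instead."""

    def __init__(self, max_keys, policy="noeviction"):
        self.max_keys = max_keys
        self.policy = policy
        self.data = OrderedDict()

    def set(self, key, value):
        if key not in self.data and len(self.data) >= self.max_keys:
            if self.policy == "noeviction":
                raise MemoryError("OOM command not allowed")  # what the client sees
            self.data.popitem(last=False)  # evict the least recently used key
        self.data[key] = value
        self.data.move_to_end(key)  # mark as most recently used


strict = TinyCache(2, "noeviction")
strict.set("a", 1)
strict.set("b", 2)
try:
    strict.set("c", 3)
except MemoryError as err:
    print("noeviction:", err)

lru = TinyCache(2, "allkeys-lru")
lru.set("a", 1)
lru.set("b", 2)
lru.set("c", 3)
print("allkeys-lru keys:", list(lru.data))  # ['b', 'c'] -- 'a' was evicted
```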

Step-by-Step Fix

  1. Check current memory usage and settings:

     ```bash
     redis-cli INFO memory
     # Look for:
     # used_memory: 1073741824
     # maxmemory: 1073741824
     # maxmemory_policy: noeviction
     ```

  2. Check memory usage by keys:

     ```bash
     # Sample random keys to find memory hogs
     redis-cli --bigkeys

     # Or use rdb-tools for detailed analysis
     rdb -c memory /var/lib/redis/dump.rdb --largest 100 > memory.csv
     ```

  3. Immediately increase maxmemory (temporary fix):

     ```bash
     redis-cli CONFIG SET maxmemory 2gb
     ```

  4. Change the eviction policy to allow key removal:

     ```bash
     # allkeys-lru: evict the least recently used keys
     # volatile-lru: evict LRU keys that have a TTL
     # allkeys-lfu: evict the least frequently used keys
     # volatile-lfu: evict LFU keys that have a TTL
     redis-cli CONFIG SET maxmemory-policy allkeys-lru
     ```

  5. Make the settings persistent in redis.conf:

     ```
     maxmemory 2gb
     maxmemory-policy allkeys-lru
     maxmemory-samples 10
     ```

  6. Force immediate eviction by triggering a write:

     ```bash
     redis-cli SET temp:trigger evict
     # This write triggers eviction of old keys
     ```

  7. Analyze and optimize large keys:

     ```bash
     # Check the size of a specific key
     redis-cli MEMORY USAGE my:large:hash

     # If a hash has many fields, shard it:
     # instead of user:1000 (all fields),
     # use user:1000:profile, user:1000:settings, etc.
     ```
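The hash-sharding idea above can be sketched as a key-routing helper. The section names and the field-to-section mapping are illustrative, not a fixed convention:

```python
# Hypothetical field -> section mapping; adapt to your own schema
FIELD_SECTIONS = {
    "name": "profile",
    "email": "profile",
    "theme": "settings",
    "locale": "settings",
}


def sharded_key(user_id, field):
    """Return the smaller hash a field should live in, e.g. user:1000:profile."""
    section = FIELD_SECTIONS.get(field, "misc")
    return f"user:{user_id}:{section}"


print(sharded_key(1000, "email"))  # user:1000:profile
print(sharded_key(1000, "theme"))  # user:1000:settings
```

Smaller hashes can then be expired or evicted independently, so one cold section no longer pins the whole user record in memory.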

  8. Set a TTL on cache entries that should expire:

     ```bash
     # Add a TTL to existing keys without expiration
     redis-cli --scan --pattern "cache:*" | xargs -I {} redis-cli EXPIRE {} 3600
     ```

  9. Raise the active-expiration effort (Redis 6+) for faster cleanup of expired keys:

     ```bash
     redis-cli CONFIG SET active-expire-effort 10
     ```

  10. Monitor memory after the fix:

      ```bash
      # Watch memory in real time
      watch -n 1 'redis-cli INFO memory | grep -E "used_memory_human|maxmemory_human|evicted_keys"'
      ```
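Once an eviction policy is active, the `evicted_keys` counter should rise under memory pressure. A sketch that diffs two INFO snapshots to confirm the fix is working (the snapshot text is made up; `evicted_keys` is a real INFO stat):

```python
def stat(info_text, name):
    """Pull one integer counter out of `redis-cli INFO` output."""
    for line in info_text.splitlines():
        if line.startswith(name + ":"):
            return int(line.split(":", 1)[1])
    raise KeyError(name)


# Made-up snapshots taken a minute apart
before = "evicted_keys:1500"
after = "evicted_keys:1742"

delta = stat(after, "evicted_keys") - stat(before, "evicted_keys")
print(f"{delta} keys evicted since last check")  # a rising count means the policy is working
```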

Prevention

  • Always set appropriate maxmemory based on workload analysis
  • Use allkeys-lru or volatile-lru for cache workloads
  • Set TTL on all volatile cache keys
  • Monitor memory usage with used_memory / maxmemory ratio
  • Use MEMORY USAGE to audit large keys regularly
  • Implement cache warming strategies instead of storing unlimited data
  • Consider Redis Cluster for horizontal scaling when a single node's memory is insufficient
  • Enable maxmemory-policy monitoring alerts
  • Regular redis-cli --bigkeys analysis in CI/CD
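A bigkeys-style audit can also run offline against collected MEMORY USAGE numbers. A sketch of such a check (the size threshold, top-N cutoff, and key names are assumptions):

```python
def largest_keys(sizes, top_n=3, min_bytes=1_000_000):
    """Return up to top_n keys at or above min_bytes, largest first."""
    big = [(key, size) for key, size in sizes.items() if size >= min_bytes]
    return sorted(big, key=lambda kv: kv[1], reverse=True)[:top_n]


# Hypothetical key sizes in bytes, as MEMORY USAGE would report them
sizes = {
    "session:abc": 2_048,
    "feed:global": 48_000_000,
    "user:1000:profile": 512,
    "leaderboard:all": 9_500_000,
}

for key, size in largest_keys(sizes):
    print(f"{key}: {size / 1_000_000:.1f} MB")
```

Failing a CI job when any key crosses the threshold catches runaway structures before they exhaust maxmemory in production.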

Advanced Memory Tuning

  • Use 32-bit Redis for small datasets (<4GB) to save memory
  • Enable activedefrag for memory defragmentation
  • Consider using Redis Streams instead of large Lists
  • Use RedisJSON module for efficient JSON storage
  • Implement client-side caching (Redis 6+) to reduce server memory