## Introduction

Exit code 137 means the container's main process was terminated by SIGKILL (137 = 128 + signal 9). The most common sender is the kernel OOM killer: when a container exceeds its memory limit, the kernel kills it and Docker records it as OOM-killed. This is one of the most common causes of unexpected container restarts.
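The 128 + 9 arithmetic can be reproduced without Docker: any process that dies from SIGKILL reports exit status 137 to its parent shell. A minimal bash sketch:

```shell
# Start a background process, kill it with SIGKILL, and read its exit status.
sleep 30 &
pid=$!
kill -9 "$pid"
wait "$pid"
echo $?    # prints 137 (128 + 9)
```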
## Symptoms

- `docker ps` shows the container with "Exited (137)" status
- `docker inspect` shows `"OOMKilled": true`
- `dmesg` shows a line like "Out of memory: Killed process <pid> (<process-name>)", where the named process is usually your application, not the container runtime
- Container restarts in a loop when `RestartPolicy` is `always`
- Application logs cut off abruptly before the crash
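To check these symptoms across many containers at once, the `OOMKilled` flag and exit code can be piped through a small helper. The `check_oom` function below is a hypothetical name, not a Docker command; it just classifies the two fields printed by `docker inspect`:

```shell
# Hypothetical helper: classify a container from its OOMKilled flag and exit code.
# Feed it: docker inspect --format='{{.State.OOMKilled}} {{.State.ExitCode}}' <name>
check_oom() {
  read -r oomkilled exitcode
  if [ "$oomkilled" = "true" ] || [ "$exitcode" -eq 137 ]; then
    echo "likely OOM-killed (exit $exitcode)"
  else
    echo "not OOM-killed (exit $exitcode)"
  fi
}

echo "true 137" | check_oom    # prints: likely OOM-killed (exit 137)
```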
## Common Causes

- Docker memory limit (`--memory`) too low for the workload
- Memory leak in application code
- Large file uploads loading the entire file into memory
- JVM heap not configured to respect container limits
- Node.js V8 default heap size exceeding the container's memory limit
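To tell a genuine leak apart from a limit that is simply too low, it helps to sample the process's resident memory over time before changing anything. A minimal Linux-only sketch reading `/proc` (the `rss_kb` helper is ours, not a standard tool):

```shell
# Print a process's resident memory (VmRSS) in kB from /proc (Linux only).
rss_kb() { awk '/^VmRSS/ {print $2}' "/proc/$1/status"; }

# Sample a PID once a second; a number that rises without plateauing
# under steady load suggests a leak rather than an undersized limit.
for i in 1 2 3; do
  rss_kb $$    # $$ = this shell; substitute the PID you are watching
  sleep 1
done
```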
## Step-by-Step Fix

1. **Verify OOMKilled status**:

   ```bash
   docker inspect --format='{{.State.OOMKilled}}' <container-name>
   docker inspect --format='{{.State.ExitCode}}' <container-name>
   ```
2. **Check current memory limit**:

   ```bash
   docker inspect --format='{{.HostConfig.Memory}}' <container-name>
   # Convert from bytes to MiB:
   echo $(( $(docker inspect --format='{{.HostConfig.Memory}}' <container-name>) / 1024 / 1024 ))
   ```

3. **Increase memory limit**:

   ```bash
   docker run --memory=2g --memory-swap=2g --memory-reservation=1g my-app
   ```

   Or update with Docker Compose:

   ```yaml
   deploy:
     resources:
       limits:
         memory: 2G
       reservations:
         memory: 1G
   ```

4. **For Java containers, set heap limits**:

   ```bash
   docker run -e JAVA_OPTS="-XX:MaxRAMPercentage=75.0 -XX:InitialRAMPercentage=50.0" my-java-app
   ```

   `MaxRAMPercentage=75.0` ensures the JVM heap uses at most 75% of the container's memory, leaving headroom for metaspace, thread stacks, and other native allocations.

5. **For Node.js, set heap size**:

   ```bash
   docker run --memory=512m -e NODE_OPTIONS="--max-old-space-size=384" my-node-app
   ```

   `--max-old-space-size` is in megabytes; keep it below the container limit so V8's other memory regions still fit.
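The heap figures above follow a simple sizing rule of thumb (an assumption of this guide, not an official Docker guideline): give the runtime roughly 75% of the container limit and leave the rest for non-heap memory. In shell arithmetic:

```shell
# Sketch: derive a heap size as ~75% of the container memory limit.
# A 512 MiB limit yields a 384 MiB heap, matching the Node.js example above.
limit_mb=512
heap_mb=$(( limit_mb * 75 / 100 ))
echo "$heap_mb"    # prints 384
```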