## Introduction
GitHub Actions allows workflows to upload build artifacts for later download or for use in downstream jobs. Each artifact has a maximum size of 5GB. When a build produces larger artifacts, such as Docker image tarballs, compiled binaries, or test data, the upload fails with an HTTP 400 error. This blocks artifact-dependent workflows and prevents downstream jobs from accessing build outputs.
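A minimal step that can trigger this failure looks like the following (the `build-output` name and path are illustrative):

```yaml
- name: Upload build output
  uses: actions/upload-artifact@v4
  with:
    name: build-output
    path: ./build-output/   # fails if the contents exceed the 5GB artifact limit
```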
## Symptoms

- Workflow step fails during `actions/upload-artifact` with a size error
- Console output shows `Artifact upload failed: file too large`
- Upload progresses partially, then fails at the 5GB threshold
- Downstream jobs fail because the expected artifact is not available
- Error message: `Error: Failed to upload artifact: HTTP 400: Artifact size exceeds the maximum allowed size of 5368709120 bytes`
## Common Causes

- Build output includes unnecessary large files (`node_modules`, build caches)
- Docker image saved as tar for artifact upload exceeds 5GB
- Test data or database dump included in the artifact
- Multiple files combined into a single artifact archive exceeding the limit
- Debug builds producing larger binaries than release builds
## Step-by-Step Fix

1. **Check the artifact size before upload.** Identify large files.

   ```yaml
   - name: Check artifact sizes
     run: |
       du -sh ./build-output/* | sort -rh | head -10
       du -sh ./build-output/ | tail -1
   ```

2. **Compress the artifact to reduce size.** Use efficient compression.

   ```yaml
   - name: Compress artifact
     run: |
       tar -czf build-output.tar.gz -C ./build-output .
       du -sh build-output.tar.gz

   - name: Upload compressed artifact
     uses: actions/upload-artifact@v4
     with:
       name: build-output
       path: build-output.tar.gz
   ```

3. **Exclude unnecessary files from the artifact.** Only upload what is needed.

   ```yaml
   - name: Upload artifact
     uses: actions/upload-artifact@v4
     with:
       name: build-output
       path: |
         ./build-output/bin/
         ./build-output/lib/
         !./build-output/**/*.pdb
         !./build-output/**/*.map
   ```

4. **Split large artifacts into multiple smaller ones.** Keep each part under the limit.

   ```yaml
   - name: Upload artifact part 1
     uses: actions/upload-artifact@v4
     with:
       name: build-output-part1
       path: ./build-output/bin/

   - name: Upload artifact part 2
     uses: actions/upload-artifact@v4
     with:
       name: build-output-part2
       path: ./build-output/lib/
   ```

5. **Use external storage for very large artifacts.** Bypass GitHub's limit entirely.

   ```yaml
   - name: Upload to S3
     run: |
       aws s3 cp ./build-output/large-file.tar.gz s3://my-artifacts/${{ github.run_id }}/
     env:
       AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
       AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
   ```
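If you split the artifact into parts as shown above, downstream jobs can reassemble them in one step. This sketch uses the `pattern` and `merge-multiple` inputs of `actions/download-artifact@v4` to fetch all matching artifacts into a single directory (the names mirror the upload example):

```yaml
- name: Download all artifact parts
  uses: actions/download-artifact@v4
  with:
    pattern: build-output-part*   # matches build-output-part1 and -part2
    merge-multiple: true          # merge all parts into one directory
    path: ./build-output
```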
## Prevention
- Set artifact size limits in CI/CD guidelines (e.g., max 1GB per artifact)
- Compress artifacts before upload using gzip or zstd
- Exclude debug symbols, source maps, and test data from release artifacts
- Use external artifact storage (S3, GCS, Azure Blob) for files over 500MB
- Monitor artifact sizes in workflow runs and alert on growth trends
- Implement artifact cleanup policies to remove old artifacts and stay within storage quotas
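The first prevention point can be enforced directly in CI with a small guard script that fails the build before an oversized upload is attempted. This is a minimal sketch: the 1GB budget, the `./build-output` path, and the stand-in payload are all illustrative assumptions.

```shell
#!/bin/sh
# Fail fast when the would-be artifact exceeds a team-set size budget.
# MAX_BYTES (1GB) and ./build-output are illustrative assumptions.
set -eu
MAX_BYTES=$((1024 * 1024 * 1024))

mkdir -p ./build-output
printf 'demo payload' > ./build-output/app.bin   # stand-in build output

# Total size in bytes (GNU du; on BSD/macOS use `du -sk` and multiply by 1024).
SIZE=$(du -sb ./build-output | cut -f1)

if [ "$SIZE" -gt "$MAX_BYTES" ]; then
  echo "Artifact budget exceeded: ${SIZE} bytes > ${MAX_BYTES} bytes"
  exit 1
fi
echo "Artifact size OK: ${SIZE} bytes"
```

Running this as the step immediately before `actions/upload-artifact` surfaces size regressions at the step that caused them, rather than as an opaque HTTP 400 at upload time.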