## Introduction

CI/CD artifact upload failures block the entire pipeline. When build outputs cannot be stored for downstream jobs or deployment, the pipeline cannot proceed.
## Symptoms

- Error: "Failed to upload artifact: file size exceeds limit"
- Error: "Upload timeout after 300s"
- Error: "Access denied to artifact storage"
- Downstream jobs cannot find expected artifacts
- Artifact storage quota exceeded
## Common Causes

- Artifact exceeds the maximum size limit
- Storage backend quota exceeded
- Network timeout during a large artifact upload
- Artifact retention policy deleting artifacts before downstream jobs use them
- Expired storage backend credentials
## Step-by-Step Fix

1. **Check artifact size**:

   ```bash
   du -sh artifact/
   # Compare with pipeline limits:
   # GitHub Actions: 5GB per artifact
   # GitLab CI: 1GB per artifact (free tier)
   ```
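Rather than comparing sizes by eye, the check above can be enforced at the start of the job so an oversized artifact fails fast with a clear message instead of a late upload error. A minimal POSIX-sh sketch (the function name and the 5 GB default are assumptions for illustration, not part of any CI platform's API):

```shell
# Guard: fail the job early if an artifact directory exceeds the upload limit.
# The 5 GB default assumes the GitHub Actions per-artifact limit quoted above;
# pass a different limit (in kilobytes) as the second argument for other CIs.
check_artifact_size() {
  dir="$1"
  limit_kb="${2:-$((5 * 1024 * 1024))}"   # default: 5 GB in kilobytes
  size_kb=$(du -sk "$dir" | cut -f1)      # total size in KB (GNU and BSD du)
  if [ "$size_kb" -gt "$limit_kb" ]; then
    echo "ERROR: $dir is ${size_kb} KB, over the ${limit_kb} KB limit" >&2
    return 1
  fi
  echo "OK: $dir is ${size_kb} KB"
}
```

Running `check_artifact_size artifact/` as an early step turns a timeout or rejected upload into an immediate, clearly labelled failure.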
2. **Compress artifacts before upload**:

   ```yaml
   # GitHub Actions
   - name: Compress
     run: tar -czf build.tar.gz dist/
   - name: Upload
     uses: actions/upload-artifact@v4
     with:
       name: build
       path: build.tar.gz
   ```

3. **Use external storage for large artifacts**:

   ```bash
   aws s3 cp build/ s3://my-artifacts/$CI_COMMIT_SHA/ --recursive
   ```
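When large artifacts go to external storage as in step 3, the downstream job has to fetch them explicitly, and it is worth verifying that the copy actually produced files: a recursive copy over a prefix that matches nothing can finish without an error. A sketch assuming the same hypothetical `my-artifacts` bucket and commit-keyed layout:

```shell
# Verify that a download landed: an empty directory usually means the
# storage prefix matched nothing, even though the copy command finished.
assert_nonempty_dir() {
  [ -n "$(ls -A "$1" 2>/dev/null)" ]
}

# Downstream job (bucket and key layout mirror the upload in step 3):
#   aws s3 cp "s3://my-artifacts/$CI_COMMIT_SHA/" build/ --recursive
#   assert_nonempty_dir build/ || { echo "No artifact for $CI_COMMIT_SHA" >&2; exit 1; }
```

Failing here, with the commit SHA in the message, is much easier to debug than a later job crashing on missing files.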