Introduction

GitHub Actions allows workflows to upload build artifacts for later download or use in downstream jobs. Each artifact has a maximum size of 5GB. When a build produces large artifacts -- such as Docker image tarballs, compiled binaries, or test data -- the upload fails with an error. This blocks artifact-dependent workflows and prevents downstream jobs from accessing build outputs.

Symptoms

  • Workflow step fails during actions/upload-artifact with a size error
  • Console output shows Artifact upload failed: file too large
  • Upload progresses partially then fails at the 5GB threshold
  • Downstream jobs fail because the expected artifact is not available
  • Error message: Error: Failed to upload artifact: HTTP 400: Artifact size exceeds the maximum allowed size of 5368709120 bytes

Common Causes

  • Build output includes unnecessary large files (node_modules, build caches)
  • Docker image saved as tar for artifact upload exceeds 5GB
  • Test data or database dump included in the artifact
  • Multiple files combined into a single artifact archive exceeding the limit
  • Debug builds producing larger binaries than release builds

Step-by-Step Fix

  1. Check the artifact size before upload: identify large files.

```yaml
- name: Check artifact sizes
  run: |
    du -sh ./build-output/* | sort -rh | head -10
    du -sh ./build-output/ | tail -1
```
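The size check can also be turned into a guard that fails the job before an oversized upload is even attempted. A minimal sketch, assuming GNU coreutils `du` (as on ubuntu runners); the `check_artifact_size` helper name is hypothetical:

```shell
#!/bin/sh
# Hypothetical guard helper: fail when a directory's total size exceeds
# GitHub's 5 GiB per-artifact limit.
LIMIT=5368709120   # 5 GiB in bytes, matching the error message above

check_artifact_size() {
  # du -sb: total apparent size in bytes (GNU coreutils)
  size=$(du -sb "$1" | cut -f1)
  if [ "$size" -gt "$LIMIT" ]; then
    echo "artifact dir $1 is ${size} bytes, over the ${LIMIT}-byte limit" >&2
    return 1
  fi
  echo "ok: ${size} bytes"
}
```

Called as `check_artifact_size ./build-output` inside a run step, the non-zero return fails the job before the upload step runs.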
  2. Compress the artifact to reduce size: use efficient compression.

```yaml
- name: Compress artifact
  run: |
    tar -czf build-output.tar.gz -C ./build-output .
    du -sh build-output.tar.gz

- name: Upload compressed artifact
  uses: actions/upload-artifact@v4
  with:
    name: build-output
    path: build-output.tar.gz
```
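A downstream job then needs to unpack the archive after downloading it. A sketch, assuming the artifact name used above:

```yaml
- name: Download compressed artifact
  uses: actions/download-artifact@v4
  with:
    name: build-output

- name: Extract artifact
  run: |
    mkdir -p ./build-output
    tar -xzf build-output.tar.gz -C ./build-output
```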
  3. Exclude unnecessary files from the artifact: only upload what is needed.

```yaml
- name: Upload artifact
  uses: actions/upload-artifact@v4
  with:
    name: build-output
    path: |
      ./build-output/bin/
      ./build-output/lib/
      !./build-output/**/*.pdb
      !./build-output/**/*.map
```
  4. Split large artifacts into multiple smaller ones: divide the output so each part stays under the limit.

```yaml
- name: Upload artifact part 1
  uses: actions/upload-artifact@v4
  with:
    name: build-output-part1
    path: ./build-output/bin/

- name: Upload artifact part 2
  uses: actions/upload-artifact@v4
  with:
    name: build-output-part2
    path: ./build-output/lib/
```
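On the download side, actions/download-artifact@v4 can fetch all parts in one step via its `pattern` and `merge-multiple` inputs. A sketch; note that with `merge-multiple: true` the contents of every matched artifact are extracted into the same directory, so part layouts should not collide:

```yaml
- name: Download all artifact parts
  uses: actions/download-artifact@v4
  with:
    pattern: build-output-part*
    merge-multiple: true
    path: ./build-output
```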
  5. Use external storage for very large artifacts: bypass GitHub's limit entirely.

```yaml
- name: Upload to S3
  run: |
    aws s3 cp ./build-output/large-file.tar.gz s3://my-artifacts/${{ github.run_id }}/
  env:
    AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
    AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
```
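A consuming job can fetch the file back with the matching key. A sketch assuming the same bucket layout; `github.run_id` only resolves to the producing run when both steps run in the same workflow run, so a separate workflow would need that run's id passed in (for example via a workflow_run trigger):

```yaml
- name: Download from S3
  run: |
    aws s3 cp s3://my-artifacts/${{ github.run_id }}/large-file.tar.gz ./build-output/
  env:
    AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
    AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
```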

Prevention

  • Set artifact size limits in CI/CD guidelines (e.g., max 1GB per artifact)
  • Compress artifacts before upload using gzip or zstd
  • Exclude debug symbols, source maps, and test data from release artifacts
  • Use external artifact storage (S3, GCS, Azure Blob) for files over 500MB
  • Monitor artifact sizes in workflow runs and alert on growth trends
  • Implement artifact cleanup policies to remove old artifacts and stay within storage quotas
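
For the monitoring point above, one lightweight option is printing artifact sizes into the job summary so growth is visible on every run. A sketch using the built-in $GITHUB_STEP_SUMMARY file (the ./build-output path is an assumption carried over from the examples above):

```yaml
- name: Report artifact sizes
  run: |
    {
      echo "### Artifact sizes"
      du -sh ./build-output/* | sort -rh
    } >> "$GITHUB_STEP_SUMMARY"
```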