# Codefresh Build Failed: Complete Troubleshooting Guide

Codefresh is a Kubernetes-native CI/CD platform that builds and deploys Docker containers. When Codefresh builds fail, the issues typically stem from pipeline YAML configuration, Docker build errors, Kubernetes integration problems, or authentication failures.

Let me walk through the most common Codefresh build failures and how to fix each one.

## Understanding Codefresh Pipeline Structure

Codefresh pipelines use YAML with these main sections:

- `steps` - Individual build tasks
- `stages` - Grouped steps for parallel execution
- `triggers` - Events that start pipelines
- `environments` - Variables and secrets
- `runtime` - Pipeline execution settings
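Putting these sections together, a minimal `codefresh.yml` might look like this (names are illustrative; field names follow the examples used throughout this guide):

```yaml
version: "1.0"
stages:
  - build
  - test
steps:
  clone:
    type: git-clone
    repo: org/repo
    revision: main
  build_image:
    stage: build
    type: build
    imageName: my-org/my-image
    dockerfilePath: ./Dockerfile
  run_tests:
    stage: test
    type: freestyle
    image: node:20
    commands:
      - npm test
```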

## Fix 1: Pipeline YAML Syntax Errors

Your `codefresh.yml` has validation errors.

Symptoms:

- "Invalid pipeline YAML"
- Pipeline doesn't load
- "Syntax error at line X"

Diagnosis:

```bash
# Validate YAML locally
yamllint codefresh.yml

# Or use the Codefresh CLI
codefresh validate codefresh.yml
```

Solution A: Fix common syntax issues:

```yaml
# WRONG - step fields not indented under the step name
steps:
  build:
  type: build          # wrong indentation
  imageName: my-image  # wrong indentation

# CORRECT - proper indentation
steps:
  build:
    type: build
    imageName: my-image
    dockerfilePath: Dockerfile
```

Solution B: Validate step types:

```yaml
# Common step types
steps:
  build_step:
    type: build
    imageType: docker
    dockerfilePath: ./Dockerfile
    dockerfileName: Dockerfile
    context: .
    imageName: my-org/my-image

  push_step:
    type: push
    candidate: ${{build_step}}
    registry: dockerhub
    credentials:
      registry: dockerhub
      username: ${{DOCKERHUB_USER}}
      password: ${{DOCKERHUB_PASSWORD}}

  run_step:
    type: freestyle
    image: alpine
    commands:
      - echo "Hello"

  git_step:
    type: git-clone
    repo: https://github.com/org/repo
    revision: main
```

Solution C: Use stages correctly:

```yaml
# Stages for parallel execution
stages:
  - name: build
    steps:
      - build_app
      - build_tests
  - name: test
    steps:
      - run_tests
  - name: deploy
    steps:
      - deploy_app

# Parallel execution within a stage
steps:
  unit_tests:
    stage: test
    type: freestyle
    image: node:20
    commands:
      - npm test

  integration_tests:
    stage: test  # Same stage = parallel
    type: freestyle
    image: node:20
    commands:
      - npm run integration
```

## Fix 2: Docker Build Failures

Docker image builds fail.

Symptoms:

- "Build failed"
- "Dockerfile syntax error"
- "Image pull failed during build"

Diagnosis:

```bash
# Check build logs in the Codefresh UI:
# Pipelines → [Build] → [Failed Build] → Build Step

# Test the Dockerfile locally
docker build -t test-image .
```

Solution A: Fix Dockerfile:

```dockerfile
# Common issues:

# Base image not found
FROM node:20.11.0  # Use a specific, existing tag
# Not: FROM node:nonexistent-tag

# Invalid commands
RUN npm install  # Correct
# Not: npm install (missing RUN)

# Path issues
COPY package.json ./
COPY package-lock.json ./
RUN npm ci
# Not: COPY . . with the wrong context
```

Solution B: Configure build context:

```yaml
steps:
  build:
    type: build
    imageName: my-org/my-image
    dockerfilePath: ./Dockerfile
    context: .  # Build context directory
```

For multi-Dockerfile projects:

```yaml
steps:
  build_app:
    type: build
    imageName: my-org/app
    dockerfilePath: ./app/Dockerfile
    context: ./app

  build_worker:
    type: build
    imageName: my-org/worker
    dockerfilePath: ./worker/Dockerfile
    context: ./worker
```

Solution C: Handle base image pull errors:

```yaml
steps:
  build:
    type: build
    imageName: my-org/my-image
    noCache: false  # Use cache for faster builds
    build_arguments:
      - BASE_IMAGE=node:20-alpine
```

For private base images:

```yaml
steps:
  build:
    type: build
    imageName: my-org/my-image
    registry: my-private-registry
    credentials:
      registry: my-private-registry
      username: ${{REGISTRY_USER}}
      password: ${{REGISTRY_PASSWORD}}
```

## Fix 3: Image Push Failures

Pushing images to registries fails.

Symptoms:

- "Push failed"
- "Unauthorized"
- "Registry not found"

Solution A: Configure Docker registry:

```yaml
steps:
  push:
    type: push
    candidate: ${{build_step}}
    registry: dockerhub
    credentials:
      registry: dockerhub
      username: ${{DOCKERHUB_USER}}
      password: ${{DOCKERHUB_PASSWORD}}
    tags:
      - latest
      - ${{CF_BRANCH_TAG_NORMALIZED}}
```

For other registries:

```yaml
# AWS ECR
steps:
  push_ecr:
    type: push
    candidate: ${{build_step}}
    registry: ecr
    credentials:
      registry: 123456789012.dkr.ecr.us-east-1.amazonaws.com
      username: AWS
      password: ${{AWS_ECR_PASSWORD}}
    tags:
      - latest

# GCR
steps:
  push_gcr:
    type: push
    candidate: ${{build_step}}
    registry: gcr
    credentials:
      registry: gcr.io/my-project
      username: _json_key
      password: ${{GCR_JSON_KEY}}
```

Solution B: Use integrated registries:

1. Go to Account Settings → Integrations → Docker Registries
2. Add the registry with its credentials
3. Reference it by name:
```yaml
steps:
  push:
    type: push
    candidate: ${{build_step}}
    registry: my-integrated-registry  # From integrations
```

Solution C: Test registry access:

```bash
# Test the Docker login
docker login -u user -p password registry.company.com

# Test a push
docker tag my-image registry.company.com/my-image:latest
docker push registry.company.com/my-image:latest
```

## Fix 4: Kubernetes Deployment Failures

Deployments to Kubernetes fail.

Symptoms:

- "Deployment failed"
- "Context not found"
- "Namespace doesn't exist"

Diagnosis:

```bash
# Check the Kubernetes context
kubectl config get-contexts

# Check namespaces
kubectl get namespaces

# Check deployment status
kubectl get deployments -n target-namespace
```

Solution A: Configure Kubernetes context:

```yaml
steps:
  deploy:
    type: deployment
    context: my-kubernetes-cluster  # Must exist in Codefresh integrations
    namespace: my-namespace
    yamlPath: ./k8s/deployment.yaml
```

Add Kubernetes integration:

1. Go to Account Settings → Integrations → Kubernetes
2. Add the cluster by kubeconfig or cloud provider

Solution B: Create namespace:

```yaml
steps:
  create_namespace:
    type: freestyle
    image: bitnami/kubectl
    commands:
      - kubectl create namespace my-namespace --dry-run=client -o yaml | kubectl apply -f -

  deploy:
    type: deployment
    context: my-kubernetes-cluster
    namespace: my-namespace
    yamlPath: ./k8s/deployment.yaml
```

Solution C: Fix Helm deployment:

```yaml
steps:
  helm_deploy:
    type: helm
    context: my-kubernetes-cluster
    namespace: my-namespace
    chartPath: ./charts/my-chart
    releaseName: my-release
    valuesFiles:
      - ./charts/my-chart/values.yaml
      - ./charts/my-chart/values-prod.yaml
```

Solution D: Check RBAC permissions:

```bash
# Check Codefresh's Kubernetes permissions
kubectl auth can-i create deployments \
  --as=system:serviceaccount:codefresh:codefresh-sa -n my-namespace

# If insufficient, add permissions
kubectl create rolebinding codefresh-admin \
  --clusterrole=admin \
  --serviceaccount=codefresh:codefresh-sa \
  -n my-namespace
```

## Fix 5: Git Clone Failures

Repository cloning fails.

Symptoms:

- "Clone failed"
- "Authentication failed"
- "Repository not found"

Solution A: Fix Git clone step:

```yaml
steps:
  clone:
    type: git-clone
    repo: https://github.com/org/repo
    revision: ${{CF_BRANCH}}
    credentials:
      username: ${{GITHUB_USER}}
      password: ${{GITHUB_TOKEN}}
```

Solution B: Use SSH for Git:

```yaml
steps:
  clone:
    type: git-clone
    repo: git@github.com:org/repo.git
    revision: main
    credentials:
      sshPrivateKey: ${{GITHUB_SSH_KEY}}
```

Solution C: Configure Git integration:

1. Go to Account Settings → Integrations → Git Providers
2. Add a GitHub/GitLab/Bitbucket integration
3. The repository will then authenticate automatically:
```yaml
steps:
  clone:
    type: git-clone
    repo: org/repo  # Simplified if using an integration
    revision: main
```

## Fix 6: Trigger Failures

Pipelines don't trigger on Git events.

Symptoms:

- Build not starting on push
- Webhook not received
- PR events ignored

Solution A: Configure triggers:

```yaml
# Manual trigger (default)
mode: manual

# Git push trigger
mode: parallel
trigger_branch: main

# Or a full trigger configuration
triggers:
  - name: git-push
    type: git
    repo: org/repo
    events:
      - push.heads
      - push.tags
    branchRegex: '/main|develop/i'

# Pull request trigger
triggers:
  - name: pr-trigger
    type: git
    repo: org/repo
    events:
      - pull_request.opened
      - pull_request.synchronize
```

Solution B: Verify webhook:

In your Git provider:

- GitHub: Settings → Webhooks
- GitLab: Settings → Integrations
- Bitbucket: Settings → Webhooks

Check webhook delivery history for errors.
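For GitHub you can also inspect webhooks from the command line via the REST API. A sketch, where `org`, `repo`, the hook id, and `GITHUB_TOKEN` are placeholders you must supply (the token needs admin access to the repository):

```shell
# Build the GitHub REST API URL for this repo's webhooks
OWNER="org"
REPO="repo"
HOOKS_URL="https://api.github.com/repos/${OWNER}/${REPO}/hooks"
echo "$HOOKS_URL"

# List configured webhooks (uncomment with a valid token):
# curl -s -H "Authorization: Bearer ${GITHUB_TOKEN}" "$HOOKS_URL"

# Inspect recent delivery attempts for one hook to see failed payloads:
# curl -s -H "Authorization: Bearer ${GITHUB_TOKEN}" "$HOOKS_URL/<hook-id>/deliveries"
```

A missing hook here means the Codefresh trigger never registered its webhook; failed deliveries point to network or payload problems on the Codefresh side.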

Solution C: Test trigger manually:

```bash
# Trigger via the API
curl -X POST \
  -H "Authorization: $CODEFRESH_API_KEY" \
  "https://g.codefresh.io/api/pipelines/run/pipeline-id" \
  -d '{"branch":"main"}'
```

## Fix 7: Environment Variable Issues

Variables not accessible.

Symptoms:

- `${{...}}` returns empty
- Variable undefined
- Wrong value substituted

Solution A: Use Codefresh built-in variables:

```yaml
steps:
  run:
    type: freestyle
    image: alpine
    commands:
      - echo "Branch: ${{CF_BRANCH}}"
      - echo "Build ID: ${{CF_BUILD_ID}}"
      - echo "Commit: ${{CF_REVISION}}"
      - echo "Repo: ${{CF_REPO_OWNER}}/${{CF_REPO_NAME}}"
```

Built-in variables:

- `CF_BRANCH` - Branch name
- `CF_BUILD_ID` - Build number
- `CF_REVISION` - Git commit SHA
- `CF_REPO_OWNER` - Repository owner
- `CF_REPO_NAME` - Repository name
- `CF_BRANCH_TAG_NORMALIZED` - Docker-safe branch tag
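`CF_BRANCH_TAG_NORMALIZED` exists because raw branch names such as `feature/My_New-API` are not valid Docker tags. A rough sketch of that kind of normalization (an assumption for illustration: lowercase everything, then replace characters outside `[a-z0-9._-]`; Codefresh's exact rules may differ):

```shell
BRANCH="feature/My_New-API"
# Lowercase, then replace anything outside [a-z0-9._-] with '-'
TAG=$(echo "$BRANCH" | tr '[:upper:]' '[:lower:]' | sed 's/[^a-z0-9._-]/-/g')
echo "$TAG"   # → feature-my_new-api
```

This is why pushing with `tags: [${{CF_BRANCH_TAG_NORMALIZED}}]` succeeds where the raw `CF_BRANCH` value would be rejected by the registry.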

Solution B: Set custom variables:

```yaml
environment:
  - APP_ENV=production
  - VERSION=1.0.0

steps:
  run:
    type: freestyle
    image: alpine
    commands:
      - echo "Environment: ${{APP_ENV}}"
      - echo "Version: ${{VERSION}}"
```

Solution C: Use secrets:

```yaml
# Define secrets in the Codefresh UI:
# Account Settings → Secrets

steps:
  run:
    type: freestyle
    image: alpine
    commands:
      - curl -H "Authorization: Bearer ${{API_TOKEN}}" https://api.example.com
    environment:
      - API_TOKEN=${{secrets.API_TOKEN}}
```

## Fix 8: Composite Step Errors

Composite steps (reusable templates) fail.

Symptoms:

- "Composite not found"
- Template step errors
- Invalid composite configuration

Solution A: Create composite correctly:

```yaml
# Composite definition (saved as a template)
steps:
  - name: my-composite
    type: composite
    description: "My reusable step"
    inputs:
      image_name: string
      registry: string
    steps:
      - name: build
        type: build
        imageType: docker
        imageName: ${{inputs.image_name}}
      - name: push
        type: push
        candidate: ${{build}}
        registry: ${{inputs.registry}}
```

Solution B: Use composite in pipeline:

```yaml
steps:
  my_step:
    type: composite
    compositeName: my-composite
    inputs:
      image_name: my-org/my-image
      registry: dockerhub
```

Solution C: Fix composite references:

```bash
# List composites
codefresh get composites

# Validate a composite
codefresh validate composite.yaml
```

## Fix 9: Approval Gate Issues

Approval gates don't work.

Symptoms:

- Pipeline stuck waiting
- Approval not received
- Can't approve/reject

Solution A: Configure approval:

```yaml
steps:
  approval:
    type: approval
    description: "Approve deployment to production"
    when:
      branch:
        - main

  deploy:
    type: deployment
    context: my-cluster
    namespace: prod
    yamlPath: ./k8s/prod.yaml
    when:
      steps:
        - approval
```

Solution B: Set approval notifications:

```yaml
steps:
  approval:
    type: approval
    description: "Approve production deployment"
    environment:
      - SLACK_WEBHOOK=${{secrets.SLACK_WEBHOOK}}
    onApprove:
      notifications:
        - type: slack
          webhook: ${{SLACK_WEBHOOK}}
          message: "Approved! Deploying to production"
```

## Fix 10: Parallel Execution Issues

Parallel steps cause resource conflicts.

Symptoms:

- Steps interfere with each other
- Resource exhaustion
- Race conditions

Solution A: Isolate parallel work in stages:

```yaml
stages:
  - name: build
    parallel: true
    steps:
      - build_app
      - build_worker

  - name: test
    parallel: true
    steps:
      - unit_tests
      - integration_tests

  - name: deploy
    steps:
      - deploy  # Single step after tests
```

Solution B: Share artifacts:

```yaml
steps:
  build:
    type: build
    imageName: my-org/my-image
    metadata:
      set:
        - buildArtifact: true

  push:
    type: push
    candidate: ${{build}}  # Reference the previous step's output
```

Solution C: Use workflow volume:

```yaml
runtime:
  workflowVolume:
    name: shared-data
    claimName: shared-pvc

steps:
  generate_data:
    type: freestyle
    image: alpine
    commands:
      - echo "data" > /mnt/shared/data.txt
    volumeMounts:
      - name: shared-data
        mountPath: /mnt/shared

  use_data:
    type: freestyle
    image: alpine
    commands:
      - cat /mnt/shared/data.txt
    volumeMounts:
      - name: shared-data
        mountPath: /mnt/shared
```

## Quick Reference: Codefresh Errors

| Error | Cause | Solution |
|---|---|---|
| YAML invalid | Syntax error | Use yamllint, check indentation |
| Build failed | Dockerfile issue | Test locally, fix context |
| Push failed | Registry auth | Configure credentials |
| Deployment failed | K8s context | Add integration, check namespace |
| Clone failed | Git auth | Use integration or credentials |
| Trigger not firing | Webhook issue | Configure trigger, verify webhook |
| Variable empty | Wrong syntax | Use `${{...}}` |
| Composite error | Template issue | Validate composite definition |
| Approval stuck | Configuration | Set notifications, approvers |

## Debugging Commands

```bash
# Validate the pipeline
yamllint codefresh.yml
codefresh validate codefresh.yml

# Run a pipeline manually
codefresh run pipeline-name -b branch

# Get build logs
codefresh logs build-id

# Check build status
codefresh get builds

# List pipelines
codefresh get pipelines

# Test the Docker build
docker build -t test-image .

# Test a registry push
docker login registry.company.com
docker push registry.company.com/image:tag

# Check Kubernetes
kubectl get namespaces
kubectl get deployments -n namespace
```