GitLab CI/CD Complete Guide 2026: From Basics to Advanced Pipelines
Master GitLab CI/CD with our comprehensive 2026 guide. Learn pipelines, runners, deployment strategies, Docker integration, and advanced optimization techniques.
HostScout Team
8 min read
GitLab CI/CD has evolved into one of the most comprehensive DevOps platforms available, offering everything from version control to continuous integration and deployment in a single application. In 2026, GitLab continues to lead the industry with powerful features, intuitive configuration, and enterprise-grade capabilities. This complete guide will take you from CI/CD basics to advanced pipeline optimization.

## What is GitLab CI/CD?

GitLab CI/CD is an integrated continuous integration and continuous deployment tool built directly into GitLab. Unlike standalone CI/CD tools, GitLab provides a complete DevOps platform where you can manage code, track issues, review merge requests, and automate deployments—all in one place.

The system uses a YAML-based configuration file (`.gitlab-ci.yml`) to define pipelines that automatically build, test, and deploy your applications whenever you push code to your repository.

## Core Concepts

### Pipelines

A pipeline is the top-level component of continuous integration, delivery, and deployment. Pipelines consist of:

- Jobs: Individual tasks like building, testing, or deploying
- Stages: Groups of jobs that run in sequence
- Runners: Agents that execute jobs

### Jobs

Jobs are the fundamental building blocks of a pipeline. Each job defines:

- What commands to execute
- Which stage it belongs to
- Dependencies and artifacts
- When it should run

### Stages

Stages define the sequence in which jobs run. Common stages include:

1. Build
2. Test
3. Deploy

Jobs in the same stage run in parallel, while stages execute sequentially.

### Runners

Runners are isolated virtual machines or containers that execute jobs. GitLab provides:

- Shared runners: Available to all projects
- Group runners: Available to all projects in a group
- Specific runners: Dedicated to specific projects

## Getting Started: Your First Pipeline

### Basic Pipeline Structure

Create a `.gitlab-ci.yml` file in your repository root:

```yaml
# .gitlab-ci.yml
stages:
  - build
  - test
  - deploy

build-job:
  stage: build
  script:
    - echo "Compiling the code..."
    - npm install
    - npm run build
  artifacts:
    paths:
      - dist/
    expire_in: 1 hour

test-job:
  stage: test
  script:
    - echo "Running tests..."
    - npm run test
  dependencies:
    - build-job

deploy-job:
  stage: deploy
  script:
    - echo "Deploying application..."
    - npm run deploy
  environment:
    name: production
    url: https://example.com
  only:
    - main
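# Hypothetical extra job (not part of the example above): jobs can be
# routed to particular runners with `tags`. This sketch assumes a runner
# registered with the "docker" tag and a "lint" script in package.json.
lint-job:
  stage: test
  tags:
    - docker
  script:
    - npm run lint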
```

This pipeline:

1. Builds the application and saves artifacts
2. Runs tests using build artifacts
3. Deploys to production (only from the main branch)

### Understanding the Workflow

When you push code to GitLab:

1. GitLab detects the `.gitlab-ci.yml` file
2. Creates a pipeline with the defined stages
3. Assigns jobs to available runners
4. Executes jobs according to stage order
5. Reports results in the pipeline view

## Advanced Pipeline Configuration

### Variables and Environment Configuration

Define variables at different scopes:

```yaml
# Global variables
variables:
  DEPLOY_SITE: "https://example.com"
  NODE_VERSION: "20"

# Job-specific variables
build-job:
  variables:
    BUILD_ENV: "production"
  script:
    - echo "Building for $BUILD_ENV"
    - echo "Node version $NODE_VERSION"
```

Use predefined CI/CD variables:

```yaml
deploy-job:
  script:
    - echo "Deploying commit $CI_COMMIT_SHA"
    - echo "To branch $CI_COMMIT_REF_NAME"
    - echo "By $CI_COMMIT_AUTHOR"
```

Store secrets in GitLab Settings > CI/CD > Variables:

```yaml
deploy-job:
  script:
    - echo "Deploying with key $DEPLOY_KEY"
    # $DEPLOY_KEY is defined in project settings
```

### Conditional Execution

Control when jobs run with `rules`:

```yaml
test-job:
  script:
    - npm run test
  rules:
    - if: '$CI_PIPELINE_SOURCE == "merge_request_event"'
    - if: '$CI_COMMIT_BRANCH == "main"'

deploy-staging:
  script:
    - deploy-to-staging.sh
  rules:
    - if: '$CI_COMMIT_BRANCH == "develop"'
      when: always
    - when: manual

deploy-production:
  script:
    - deploy-to-production.sh
  rules:
    - if: '$CI_COMMIT_TAG =~ /^v\d+\.\d+\.\d+$/'
      when: manual
```

### Parallel and Matrix Jobs

Run jobs in parallel:

```yaml
test:
  parallel: 5
  script:
    - npm run test -- --shard=$CI_NODE_INDEX/$CI_NODE_TOTAL
```

Use a matrix to test multiple configurations:

```yaml
test:
  parallel:
    matrix:
      - NODE_VERSION: ['18', '20', '22']
        OS: ['ubuntu-latest', 'windows-latest']
  image: node:${NODE_VERSION}
  script:
    - echo "Testing on Node $NODE_VERSION with $OS"
    - npm run test
```

### Artifacts and Dependencies

Manage build artifacts:

```yaml
build:
  script:
    - npm run build
  artifacts:
    paths:
      - dist/
      - coverage/
    reports:
      coverage_report:
        coverage_format: cobertura
        path: coverage/cobertura-coverage.xml
    expire_in: 1 week

test:
  script:
    - npm run test
  dependencies:
    - build
  coverage: '/Statements\s*:\s*(\d+\.?\d*)%/'

deploy:
  script:
    - deploy.sh dist/
  dependencies:
    - build
  needs:
    - job: build
      artifacts: true
```

### Caching for Performance

Optimize pipeline speed with caching:

```yaml
# Global cache configuration
cache:
  key: ${CI_COMMIT_REF_SLUG}
  paths:
    - node_modules/
    - .npm/

# Job-specific cache
build:
  cache:
    key:
      files:
        - package-lock.json
    paths:
      - node_modules/
  before_script:
    - npm ci --cache .npm --prefer-offline
  script:
    - npm run build
```

Cache policies:

```yaml
build:
  cache:
    key: build-cache
    paths:
      - node_modules/
    policy: pull-push  # Download and update the cache

test:
  cache:
    key: build-cache
    paths:
      - node_modules/
    policy: pull  # Only download, don't update
```

## Docker Integration

### Using Docker Images

Specify Docker images for jobs:

```yaml
test-node:
  image: node:20-alpine
  script:
    - npm run test

test-python:
  image: python:3.12
  script:
    - pip install -r requirements.txt
    - pytest

test-go:
  image: golang:1.22
  script:
    - go test ./...
```

### Building Docker Images

Build and push Docker images:

```yaml
build-docker:
  image: docker:24
  services:
    - docker:24-dind
  variables:
    DOCKER_TLS_CERTDIR: "/certs"
  before_script:
    - docker login -u $CI_REGISTRY_USER -p $CI_REGISTRY_PASSWORD $CI_REGISTRY
  script:
    - docker build -t $CI_REGISTRY_IMAGE:$CI_COMMIT_REF_SLUG .
    - docker push $CI_REGISTRY_IMAGE:$CI_COMMIT_REF_SLUG
    - docker tag $CI_REGISTRY_IMAGE:$CI_COMMIT_REF_SLUG $CI_REGISTRY_IMAGE:latest
    - docker push $CI_REGISTRY_IMAGE:latest
```

Multi-stage Docker builds:

```dockerfile
# Dockerfile
FROM node:20-alpine AS builder
WORKDIR /app
COPY package*.json ./
RUN npm ci
COPY . .
RUN npm run build

FROM nginx:alpine
COPY --from=builder /app/dist /usr/share/nginx/html
EXPOSE 80
CMD ["nginx", "-g", "daemon off;"]
```

## Deployment Strategies

### Basic Deployment

```yaml
deploy-production:
  stage: deploy
  script:
    - apt-get update -qq && apt-get install -y -qq rsync
    - rsync -avz --delete dist/ user@server:/var/www/html/
  environment:
    name: production
    url: https://example.com
  only:
    - main
```

### Kubernetes Deployment

```yaml
deploy-k8s:
  stage: deploy
  image: bitnami/kubectl:latest
  script:
    - kubectl config set-cluster k8s --server="$K8S_SERVER" --insecure-skip-tls-verify=true
    - kubectl config set-credentials admin --token="$K8S_TOKEN"
    - kubectl config set-context default --cluster=k8s --user=admin
    - kubectl config use-context default
    - kubectl apply -f k8s/deployment.yaml
    - kubectl rollout status deployment/myapp
  environment:
    name: production
    url: https://example.com
```

Kubernetes manifest:

```yaml
# k8s/deployment.yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: myapp
spec:
  replicas: 3
  selector:
    matchLabels:
      app: myapp
  template:
    metadata:
      labels:
        app: myapp
    spec:
      containers:
        - name: myapp
          image: registry.gitlab.com/username/project:latest
          ports:
            - containerPort: 80
```

### AWS Deployment

Deploy to AWS S3 and CloudFront:

```yaml
deploy-aws:
  stage: deploy
  image: python:3.12
  before_script:
    - pip install awscli
  script:
    - aws s3 sync dist/ s3://$S3_BUCKET --delete
    - aws cloudfront create-invalidation --distribution-id $CLOUDFRONT_ID --paths "/*"
  environment:
    name: production
    url: https://example.com
  only:
    - main
```

### Blue-Green Deployment

```yaml
deploy-blue:
  stage: deploy
  script:
    - deploy-to-blue-environment.sh
  environment:
    name: production-blue
    url: https://blue.example.com

test-blue:
  stage: verify
  script:
    - run-smoke-tests.sh https://blue.example.com
  needs:
    - deploy-blue

switch-traffic:
  stage: switch
  script:
    - switch-load-balancer-to-blue.sh
  when: manual
  environment:
    name: production
    url: https://example.com
  needs:
    - test-blue
```

## Setting Up GitLab Runners

### Installing a Runner

On Ubuntu/Debian:

```bash
# Download the binary
curl -LJO "https://gitlab-runner-downloads.s3.amazonaws.com/latest/deb/gitlab-runner_amd64.deb"

# Install the package
sudo dpkg -i gitlab-runner_amd64.deb

# Register the runner
sudo gitlab-runner register
```

During registration, provide:

- GitLab instance URL
- Registration token (from Settings > CI/CD > Runners)
- Runner description
- Tags
- Executor type (shell, docker, kubernetes, etc.)

### Docker Executor Configuration

```toml
# /etc/gitlab-runner/config.toml
concurrent = 4

[[runners]]
  name = "docker-runner"
  url = "https://gitlab.com/"
  token = "YOUR_TOKEN"
  executor = "docker"
  [runners.docker]
    tls_verify = false
    image = "node:20"
    privileged = false
    disable_cache = false
    volumes = ["/cache", "/var/run/docker.sock:/var/run/docker.sock"]
    shm_size = 0
  [runners.cache]
    Type = "s3"
    Shared = true
    [runners.cache.s3]
      ServerAddress = "s3.amazonaws.com"
      BucketName = "runner-cache"
      BucketLocation = "us-east-1"
```

### Kubernetes Executor

```toml
[[runners]]
  name = "kubernetes-runner"
  url = "https://gitlab.com/"
  token = "YOUR_TOKEN"
  executor = "kubernetes"
  [runners.kubernetes]
    host = ""
    namespace = "gitlab-runner"
    privileged = true
    cpu_request = "1"
    memory_request = "1Gi"
    service_cpu_request = "200m"
    service_memory_request = "256Mi"
    helper_cpu_request = "100m"
    helper_memory_request = "128Mi"
```

## Advanced Techniques

### Pipeline Includes and Templates

Reuse pipeline configurations:

```yaml
# .gitlab-ci.yml
include:
  - local: '/templates/build.yml'
  - local: '/templates/test.yml'
  - remote: 'https://example.com/ci-templates/deploy.yml'
  - project: 'group/shared-templates'
    ref: main
    file: '/templates/security.yml'

stages:
  - build
  - test
  - security
  - deploy
```

```yaml
# templates/build.yml
.build-template:
  image: node:20
  before_script:
    - npm ci
  cache:
    key: ${CI_COMMIT_REF_SLUG}
    paths:
      - node_modules/

build-app:
  extends: .build-template
  script:
    - npm run build
```

### Dynamic Child Pipelines

Trigger child pipelines dynamically:

```yaml
generate-config:
  stage: generate
  script:
    - python generate-pipeline.py > generated-config.yml
  artifacts:
    paths:
      - generated-config.yml

trigger-child-pipeline:
  stage: deploy
  trigger:
    include:
      - artifact: generated-config.yml
        job: generate-config
```

### Merge Request Pipelines

Configure pipelines for merge requests:

```yaml
workflow:
  rules:
    - if: '$CI_PIPELINE_SOURCE == "merge_request_event"'
    - if: '$CI_COMMIT_BRANCH && $CI_OPEN_MERGE_REQUESTS'
      when: never
    - if: '$CI_COMMIT_BRANCH'

unit-tests:
  script:
    - npm run test:unit
  rules:
    - if: '$CI_PIPELINE_SOURCE == "merge_request_event"'

integration-tests:
  script:
    - npm run test:integration
  rules:
    - if: '$CI_COMMIT_BRANCH == "main"'
```

### Performance Optimization

Optimize pipeline execution:

```yaml
# Use needs to create a DAG (directed acyclic graph)
build:
  stage: build
  script:
    - npm run build

lint:
  stage: test
  script:
    - npm run lint
  needs: []  # Run immediately, don't wait for build

unit-test:
  stage: test
  script:
    - npm run test:unit
  needs: ["build"]  # Only wait for build

deploy:
  stage: deploy
  script:
    - deploy.sh
  needs: ["build", "unit-test"]  # Don't wait for lint
```

## Monitoring and Debugging

### Pipeline Visualization

GitLab provides several views:

- **Pipeline graph:** Visual representation of job dependencies
- **Job logs:** Detailed output from each job
- **Pipeline analytics:** Success rates, duration trends

### Debugging Failed Jobs

Enable debug logging:

```yaml
job-name:
  variables:
    CI_DEBUG_TRACE: "true"
  script:
    - commands
```

Interactive debugging with CI/CD debug mode:

1. Go to the pipeline job
2. Click the "Debug" button
3. Access the interactive terminal

### Performance Metrics

Track pipeline performance:

```yaml
after_script:
  - echo "Job took $CI_JOB_DURATION seconds"
  - echo "Pipeline has been running for $CI_PIPELINE_DURATION seconds"
```

## Security Best Practices

### Protected Variables

Mark sensitive variables as protected (Settings > CI/CD > Variables):

- Only available in protected branches/tags
- Masked in job logs

### Security Scanning

Integrate security tools:

```yaml
include:
  - template: Security/SAST.gitlab-ci.yml
  - template: Security/Dependency-Scanning.gitlab-ci.yml
  - template: Security/Container-Scanning.gitlab-ci.yml

sast:
  stage: test

dependency_scanning:
  stage: test

container_scanning:
  stage: test
  variables:
    CI_APPLICATION_REPOSITORY: $CI_REGISTRY_IMAGE
    CI_APPLICATION_TAG: $CI_COMMIT_REF_SLUG
```

### Secrets Management

Use external secrets management:

```yaml
deploy:
  image: vault:latest
  script:
    - export VAULT_TOKEN=$(vault write -field=token auth/jwt/login role=myapp-deploy jwt=$CI_JOB_JWT)
    - export DB_PASSWORD=$(vault kv get -field=password secret/myapp/db)
    - deploy.sh
```

## Frequently Asked Questions

### How do I run jobs only on specific branches?

```yaml
job-name:
  script:
    - commands
  only:
    - main
    - develop

# OR using rules
job-name:
  script:
    - commands
  rules:
    - if: '$CI_COMMIT_BRANCH == "main" || $CI_COMMIT_BRANCH == "develop"'
```

### Can I retry failed jobs automatically?

```yaml
job-name:
  script:
    - flaky-command
  retry:
    max: 2
    when:
      - runner_system_failure
      - stuck_or_timeout_failure
```

### How do I pass artifacts between pipelines?

Use project artifacts:

```yaml
# In the first pipeline
build:
  script:
    - make build
  artifacts:
    paths:
      - build/

# In the second (triggered) pipeline
deploy:
  script:
    - deploy build/
  dependencies:
    - build
```

### How do I cancel redundant pipelines?

```yaml
workflow:
  auto_cancel:
    on_new_commit: interruptible

job-name:
  interruptible: true
  script:
    - long-running-command
```

### Can I use GitLab CI/CD with other Git providers?

Yes, through CI/CD for external repositories.
Connect GitHub, Bitbucket, or other Git repositories to use GitLab CI/CD.

## Conclusion

GitLab CI/CD has matured into a comprehensive DevOps platform that rivals or exceeds specialized CI/CD tools. Its integrated approach—combining version control, CI/CD, security scanning, and deployment in one platform—creates a streamlined developer experience.

The power of GitLab CI/CD lies in its flexibility. Whether you're deploying a simple static site or orchestrating complex microservices deployments across multiple cloud providers, GitLab provides the tools and features needed to automate your workflow effectively.

**Key Takeaways:**

- Use YAML-based configuration for version-controlled, reproducible pipelines
- Use caching and parallel execution for optimal performance
- Implement security scanning and secrets management from the start
- Take advantage of templates and includes for DRY configurations
- Monitor and optimize pipeline performance continuously

As DevOps practices continue to evolve in 2026, GitLab CI/CD remains at the forefront, providing the automation, security, and scalability modern development teams need to ship software faster and more reliably.
Cite This Article
HostScout Team. (2026, January 10). GitLab CI/CD Complete Guide 2026: From Basics to Advanced Pipelines. HostScout. https://hostscout.online/gitlab-ci-cd-guide/