
Managing Monorepo Deployments with GitLab Dynamic Child Pipelines


Managing CI/CD pipelines across multiple monorepos in an organization with many developer teams can be a tall order. Teams often need some flexibility, but too much customisation can lead to inconsistency and maintenance headaches.

A solution I've found is to combine GitLab's dynamic child pipelines with a centrally managed pipeline template pulled in via GitLab's include feature. This lets your platform team maintain a standard set of pipelines while still allowing teams to customise their workflows as needed through environment variables or options.

In this example, I focus on teams deploying web applications, but I believe this workflow can work for any technology stack.

Why Dynamic Child Pipelines?

Dynamic child pipelines allow you to generate a downstream pipeline's configuration in a job, save it as an artifact, and trigger it from the parent pipeline. This approach is especially useful in monorepos, because you can create jobs only for the projects that require them.

Example Monorepo Structure

Imagine a monorepo that contains a web app and a package. In the monorepo root package.json, you can include metadata describing the stack and project paths.

{
  "project": {
    "stack_type": "monorepo",
    "projectPaths": [
      "path/to/app",
      "path/to/package"
    ]
  }
}

Each project also has its own package.json with a project field indicating its stack type. In our case that's either a Next.js web app ("nextjs") or a package ("package").

{
  "project": {
    "stack_type": "web" // or "package", etc.
    // ...other options
  }
}

Setting Up Dynamic Child Pipelines

You can trigger a child pipeline using a YAML file that is dynamically generated during a job, rather than relying on a static file in your repository. This approach is especially useful for creating pipelines that respond to changes in specific parts of your project.

stages:
  - create-pipeline
  - trigger-pipeline

# Generate the child pipeline config and publish it as an artifact
create-pipeline:
  stage: create-pipeline
  script: node create-pipeline.js
  artifacts:
    paths:
      - child-pipeline.yml
  when: always

# Run the generated config as a child pipeline
trigger-pipeline:
  stage: trigger-pipeline
  needs:
    - job: create-pipeline
  trigger:
    include:
      - artifact: child-pipeline.yml
        job: create-pipeline
    strategy: depend

Generating the Dynamic Job Config

Let's generate custom pipelines based on the monorepo's configuration. The script reads the root package.json for the list of project paths, then inspects each project's own package.json to determine its stack type. Using this information, it writes the appropriate pipeline configuration for each project.

const fs = require('fs');

// Deploy job for a Next.js app; only runs when files under projPath change
const createNextJob = (stackType, projPath) => `
deploy:${stackType}:
  stage: deploy
  rules:
    - changes:
        paths:
          - ${projPath}/**/*
  script:
    - echo 'deploying ${projPath}!'
    - yarn deploy
`;

// Publish job for a package; only runs when files under projPath change
const createPackageJob = (stackType, projPath) => `
publish:${stackType}:
  stage: deploy
  rules:
    - changes:
        paths:
          - ${projPath}/**/*
  script:
    - npm publish
`;

const createDynamicPipeline = () => {
  // Read root package.json for monorepo project paths
  const rootPkg = JSON.parse(fs.readFileSync('package.json', 'utf8'));
  const projectPaths = rootPkg.project?.projectPaths || [];

  let jobs = '';
  projectPaths.forEach((projPath) => {
    const projectPkg = JSON.parse(fs.readFileSync(`${projPath}/package.json`, 'utf8'));
    const stackType = projectPkg.project?.stackType;

    if (stackType === "nextjs") {
      jobs += createNextJob(stackType, projPath);
    }
    if (stackType === "package") {
      jobs += createPackageJob(stackType, projPath);
    }
  });

  // The file name must match the artifact path declared in the create-pipeline job
  fs.writeFileSync('child-pipeline.yml', jobs);
};

createDynamicPipeline();

This is a simple example, but it highlights the flexibility of this approach. Notice how we pass in the projPath and use it in the job's rules to ensure the job only runs when changes are made within that specific project. You can easily extend this pattern to generate jobs based on different project types, add more options via package.json, or support additional environments as your monorepo evolves.
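For example, one way to extend the generator is to read an optional list of deploy environments from each project's package.json and emit a deploy job per environment. The deployEnvironments field and the --env flag below are hypothetical, just to sketch the shape such an extension could take:

// Hypothetical extension: emit one deploy job per environment listed in the
// project's package.json (falls back to a single production job).
const createNextJobs = (stackType, projPath, projectPkg) => {
  const environments = projectPkg.project?.deployEnvironments || ['production'];

  return environments
    .map(
      (env) => `
deploy:${stackType}:${env}:
  stage: deploy
  environment: ${env}
  rules:
    - changes:
        paths:
          - ${projPath}/**/*
  script:
    - echo 'deploying ${projPath} to ${env}!'
    - yarn deploy --env ${env}
`
    )
    .join('');
};

In createDynamicPipeline, you would then call createNextJobs(stackType, projPath, projectPkg) in place of the single-job factory.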

Sharing the Monorepo Template

Once your dynamic pipeline setup is working, the real power comes from sharing these templates across teams. By leveraging GitLab's include feature, you can centrally manage your monorepo pipeline logic in a single repository. This means every team can benefit from improvements, bug fixes, and best practices—without duplicating code or reinventing the wheel.

For example, you can version your pipeline templates and have each project reference the latest (or a specific) version:

include:
  - project: 'your-group/platform-team-repo'
    ref: $VERSION
    file: 'templates/.monorepo-ci-template.yml'

This approach not only keeps your CI/CD configuration DRY, but also makes it easy to roll out updates and maintain consistency across all your projects. Teams still have the flexibility to override or extend the shared template as needed.
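For instance, a team's own .gitlab-ci.yml might include the shared template and then redefine just the parts it needs. The snippet below is a sketch that assumes the template defines the trigger-pipeline job from earlier; the local job definition merges with the included one, so only the rules change:

include:
  - project: 'your-group/platform-team-repo'
    ref: $VERSION
    file: 'templates/.monorepo-ci-template.yml'

# Override the trigger job from the shared template, e.g. to skip
# child pipelines on tag pipelines while keeping everything else intact.
trigger-pipeline:
  rules:
    - if: $CI_COMMIT_TAG
      when: never
    - when: always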

Summary

We looked at how you can leverage GitLab's dynamic child pipelines to streamline monorepo deployments. By generating pipeline config for each project and centralising your pipeline templates with GitLab's include feature, you get the best of both worlds: a consistent, maintainable setup that's still flexible enough for teams to customise as needed.

Thanks for reading!

For more details, see the GitLab documentation on dynamic child pipelines.