Boosting C++ Memory Safety with Parallel Builds and Shared Configurations

· 6 min read
Christopher McArthur
C++ DevOps Expert

C++ grants developers immense power, but with it comes the greater responsibility of managing memory. Memory leaks and access violations can bring down even the most robust applications. To combat these issues, sanitizers like AddressSanitizer (ASan) and LeakSanitizer (LSan) are invaluable tools. When it comes to addressing security, this is only the tip of the iceberg; you are expected to do more, but sanitizers are an approachable starting point.

Despite these obvious upsides, 50% of developers don't leverage these tools; integrating them into the build process often raises concerns about increased build times. This blog post explores a strategy to leverage modern build tools and parallelization to achieve exceptional memory safety without sacrificing CI speed.

The Power of Many: Building More Configurations

Traditionally, projects might ship with a single build configuration optimized for production - built exclusively for the platforms they support. But what if we built more configurations specifically designed for memory safety checks? We can introduce configurations that enable sanitizers like ASan and LSan during the build process. These configurations would catch memory issues early on, preventing them from reaching production.


This is also possible cross-platform, though it might be more effective to target just one platform depending on your infrastructure.

Foundation: Simple CI Pipeline

As previously covered, CMake Presets are an excellent way to organize build configurations. If you are unsure about them, check out the old school way or this far more restrictive implementation; there's an elegance to the simplicity presets can bring. Let's take the release preset and wrap that in a pipeline to get started.
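To make the rest of this post concrete, here is a minimal sketch of what such a `CMakePresets.json` might look like; the preset names match the pipeline below, but the binary directories and sanitizer flags are illustrative and assume GCC or Clang:

```json
{
  "version": 6,
  "configurePresets": [
    {
      "name": "release",
      "binaryDir": "build/release",
      "cacheVariables": { "CMAKE_BUILD_TYPE": "Release" }
    },
    {
      "name": "asan",
      "binaryDir": "build/asan",
      "cacheVariables": {
        "CMAKE_BUILD_TYPE": "RelWithDebInfo",
        "CMAKE_CXX_FLAGS": "-fsanitize=address -fno-omit-frame-pointer"
      }
    },
    {
      "name": "lsan",
      "binaryDir": "build/lsan",
      "cacheVariables": {
        "CMAKE_BUILD_TYPE": "RelWithDebInfo",
        "CMAKE_CXX_FLAGS": "-fsanitize=leak"
      }
    }
  ],
  "buildPresets": [
    { "name": "release", "configurePreset": "release" },
    { "name": "asan", "configurePreset": "asan" },
    { "name": "lsan", "configurePreset": "lsan" }
  ]
}
```

Each sanitizer gets its own binary directory, so configurations never stomp on each other's caches.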


  • Compiler Toolchain: The CI environment should have the necessary compiler toolchain (e.g., GCC, Clang) pre-installed.
  • Dependencies: Any project dependencies should be readily available through package managers or pre-downloaded artifacts.
  • Test Coverage: Good test coverage either through unit or integration tests where the code can be exercised; working examples are a good substitute.

Pipeline Stages:

  1. Checkout: Fetch the latest code from the version control system (e.g., Git).
  2. Install Dependencies: If dependencies aren't pre-installed, use package managers to install them during this stage.
  3. Configure:
    • Select the desired build configurations using CMake. For example one for production and one for each sanitizer (ASan, LSan, etc.).
  4. Build:
    • Execute the build commands defined in the CMake configurations from stage 3.
  5. Test:
    • Run unit tests and other automated tests for each successfully built configuration.
  6. Artifacts:
    • Archive and store build artifacts like executables and test reports for further analysis or future reference.

Check the examples for GitHub Actions from the presets post putting these principles into practice.

The Key: Parallel Builds for Speed

The challenge? Building with multiple sanitizers can significantly impact build times, especially if they are already problematic. Here's where the magic of parallel builds comes in. By leveraging tools like CMake, we can define separate build configurations for each sanitizer. More importantly, we can configure these builds to run in parallel, a well established technique to reduce build times.

Implementing this on GitHub Actions

name: CI with Parallel Sanitizer Builds

on:
  push:
    branches: [ main ]

jobs:
  build:
    strategy:
      matrix:
        preset: [release, asan, lsan]
    runs-on: ubuntu-latest # Make sure to have a proper version-controlled build image
    steps:
      - uses: actions/checkout@v4

      - name: Install dependencies
        uses: ./.github/actions/install_deps # Setup caching and call favorite package manager

      - name: Configure with preset
        run: cmake --preset ${{ matrix.preset }}

      - name: Build
        run: cmake --build --preset ${{ matrix.preset }}

      - name: Run tests
        run: cmake --build --preset ${{ matrix.preset }} --target unit_tests_run

      - if: matrix.preset == 'release'
        name: Upload artifacts
        uses: actions/upload-artifact@v4
        with:
          path: build/release

Caching Third-Party Dependencies for Effective Sanitizer Builds

When using sanitizers, it's crucial to ensure that the entire build graph, including third-party dependencies, is built with sanitizers enabled. This guarantees that memory issues are detected across the entire codebase, not just within your own code, as the boundaries are where mistakes are most likely.

As we've established, building with sanitizers can significantly impact build times; to address this challenge, caching mechanisms play a vital role in optimizing the CI workflow. Third-party dependencies change less often than your own code but need to be treated with the same care; caching them between builds offers an effective return on the investment of implementing more complex workflows.

Why Caching Matters

Caching third-party dependencies can significantly improve build times by avoiding redundant downloads and builds. This is particularly beneficial when using sanitizers, as the entire build graph needs to be re-evaluated with sanitizers enabled for each configuration.

Here's why caching is crucial:

  • Reduced Build Times: Caching eliminates the need to repeatedly download and build dependencies, especially for frequently used libraries. This can drastically reduce build times, especially on CI pipelines.
  • Consistent Results: Caching ensures that the same dependencies are used across different builds, leading to more consistent and reliable results.
  • Improved Developer Experience: Faster builds translate to a smoother development experience for the team, allowing them to iterate and test code changes more efficiently.

Strategies for Caching Third-Party Dependencies

Several strategies can be employed to effectively cache third-party dependencies when using sanitizers in C++ projects:

  • Per-Preset Caches: Create separate caches for each build configuration (e.g., release, asan, lsan). This ensures that dependencies are only downloaded and built once for each configuration, reducing redundancy.
  • Conan Packages: Utilize package managers like Conan to manage third-party dependencies. Conan allows for caching packages remotely (and locally) and reusing them across different builds.

Adding onto the pipeline example, this looks like the following:
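A minimal sketch of a caching step that slots in before the configure stage of the pipeline above, using `actions/cache`; the cache path assumes Conan 2.x, and the key scheme is illustrative:

```yaml
      - name: Cache Conan packages
        uses: actions/cache@v4
        with:
          path: ~/.conan2  # Conan 2.x local cache; adjust for your package manager
          key: conan-${{ runner.os }}-${{ matrix.preset }}-${{ hashFiles('conanfile.*') }}
          restore-keys: |
            conan-${{ runner.os }}-${{ matrix.preset }}-
```

Keying on the preset gives each sanitizer configuration its own cache, so an ASan-instrumented dependency is never reused by the release build.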

Putting it All Together

  1. Define Build Configurations: Use CMake to create individual build configurations for each sanitizer and a standard production configuration.
  2. Enable Parallel Builds: Leverage multiple cores or machines for parallel execution of these configurations.
  3. Implement Caching:
    • Utilize per-preset caches and/or Conan packages to efficiently manage third-party dependencies, ensuring consistent results and faster builds.


  • Improved Memory Safety: Sanitizers catch memory issues early on, preventing them from reaching production.
  • Fast CI Builds: Parallel builds mitigate the impact of additional configurations on CI pipelines.
  • Consistent Development Workflow: Shared CMake presets streamline the development process by providing a familiar environment with memory safety checks readily available.

By adopting this strategy, you can elevate your C++ project's memory safety without sacrificing development speed or introducing workflow friction. Remember, memory safety is not just a CI concern; it's an essential part of the entire development lifecycle. So, build more configurations, leverage parallelization, and empower your developers to write robust, memory-safe code!

Packages: The Building Blocks of C++ Development

· 6 min read
Christopher McArthur
C++ DevOps Expert

In the ever-evolving world of C++, managing code effectively is paramount. Packages, a fundamental concept in software distribution, provide a structured approach to organizing and distributing reusable components. This blog post delves into the core elements of C++ packages, their essential properties, and how they streamline the development process and proposes a set of core concepts that should be captured by any specification.

The reason this is so important from a CI design point of view is creating an effective caching solution to help improve build times. Avoiding the upload of unnecessary files is an imperative requirement for this strategy.

Unpacking the Essentials: What Makes a C++ Package

A C++ package encapsulates a collection of core artifacts, the lifeblood of your codebase. This does not extend to delivering the product, for example as on-premise software, though the artifacts might be bundled for a software development kit. These artifacts serve as building blocks that are ingested by build systems to execute the compilation, archiving, and linking processes. Here's a breakdown of the key components:

  • Core Artifacts:
    • Headers (.hpp files): These files contain declarations (functions, classes, variables, etc.) that serve as blueprints for your code. Other source files can #include headers to access the declared entities.
    • Libraries (Static and Shared): These are compiled archives (.a or .lib for static, .so or .dll for shared) that bundle object code representing compiled functions and variables. Libraries provide reusable functionality that can be integrated into your programs.
    • Binary Modules (C++20 Modules): C++20 introduces modules (.ixx files) as a more granular unit of code organization and compilation compared to traditional headers. They enable stricter dependency management and improve build times.
  • Supporting Artifacts:
    • License Information (.txt or .md files): It's crucial to include licensing details to comply with copyright and distribution requirements.

There's often a temptation to include Executables (.exe or .out); these can be utilities used during the build process, such as code generators or testing frameworks. However, I'd argue there's likely an extension of the idea of a "package", called a "tool", which could be specialized and more adept at covering this use case.

The Package Imperative: What Makes a Valid Package?

To guarantee robust development, C++ packages must adhere to specific criteria:

  • One Definition Rule (ODR) Compliance: Package contents must strictly follow the ODR, ensuring there's only one definition for a given entity across all included files. This prevents ambiguities and potential compilation errors.
  • Well-Formed Programs: A valid package must yield functional, well-formed programs when all its core artifacts are consumed. Errors arising from improper interaction between different components disqualify a package.

The primary goal is to enable downstream consumption; capturing the inputs for the compiler is the practical application of this concept. Packages should be readily consumable by other build systems for downstream projects. A clear and well-defined interface is essential for smooth integration.

Beyond the Bare Essentials: Configuration Considerations

While core artifacts form the foundation of a package, there is more information necessary for compiling and linking, which plays a vital role in specifying details like:

  • Compiler Flags: These flags control the behavior of the compiler, influencing optimization levels, warning modes, and other compilation settings.
  • Include Paths: Build systems need to know where to locate headers during the compilation process. Package specifications should explicitly define include paths to avoid errors.
  • Linker Flags: Linker flags instruct the linker on how to combine object files (libraries) into executable programs. These can be specified within package configurations.
  • Library and Library Path: Packages may depend on external libraries. Explicitly declaring required libraries and their paths facilitates linking during the build.
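As a sketch, a CMake package config file might express these details through an imported target; the target name, paths, and flags here are illustrative, not a real library:

```cmake
# mylib-config.cmake (illustrative)
add_library(mylib::mylib STATIC IMPORTED)
set_target_properties(mylib::mylib PROPERTIES
  # Library and library path
  IMPORTED_LOCATION "${CMAKE_CURRENT_LIST_DIR}/../../lib/libmylib.a"
  # Include paths consumers need to compile against the headers
  INTERFACE_INCLUDE_DIRECTORIES "${CMAKE_CURRENT_LIST_DIR}/../../include"
  # Compiler flags propagated to consumers
  INTERFACE_COMPILE_DEFINITIONS "MYLIB_STATIC"
  # Linker dependencies pulled in automatically
  INTERFACE_LINK_LIBRARIES "Threads::Threads"
)
```

The `INTERFACE_` properties are exactly the "more information" above: they travel with the package so every consumer compiles and links consistently.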

Avoid Binary Compatibility

C++ packages benefit from avoiding binary compatibility information within the package itself. This simplifies the package and leaves compatibility to be tooling specific; teams often have contradictory requirements (save it or break it), and the tooling- or vendor-specific implementation should not be decided here.

The Build System Should not Care:

  • Build systems like CMake or Make are adept at handling platform-specific configurations. They take compiler flags, target architectures, and other factors into account to ensure correct compilation and linking.

Package Managers, Right Tool for the Job:

  • Package managers like Conan or vcpkg offer a powerful solution for dependency management.
  • They maintain a database of packages with specific versions and configurations (triplets) catering to different environments.
  • When a new project depends on a package, the package manager can:
    1. Rebuild from source if the package has no compatibility information, ensuring the build process leverages the specific project's configuration.
    2. Strict Compatibility Checks (for packages with well-defined compatibility):
      • The package manager verifies if the existing package version aligns with the project's requirements.
      • If so, the package can be reused, saving build time.
      • If not, the manager triggers a rebuild from source.

The Case for "No Compatibility" Definition:

  • By leaving binary compatibility out of the package metadata, the package becomes more adaptable.
  • Build systems and package managers can then apply their knowledge and strategies for optimal builds.
  • This reduces duplication of effort and streamlines the development process.

Trade-offs and Best Practices:

  • While avoiding binary compatibility information generally simplifies package management, there are situations where pre-built binaries might be beneficial (e.g., for performance optimization on a specific system).
  • In such cases, package managers often offer mechanisms for providing pre-built binaries with clear versioning and compatibility guidelines.

However, as a general rule, keeping packages free of binary compatibility information promotes flexibility and maintainability for C++ projects in diverse environments. This approach fosters collaboration between packages, build systems, and package managers, resulting in more robust and adaptable C++ development ecosystems.

Conclusion: Packages - The Powerhouse of C++ Development

Packages empower C++ developers with a structured and efficient way to organize code, promote reusability, and ensure project maintainability. By adhering to the essential properties outlined in this guide, you can craft robust and portable packages that act as the cornerstones of successful C++ development endeavors.

Ready to Dive Deeper?

For further exploration, consider delving into specific build systems like CMake or Make to understand how they handle package creation and configuration. Explore popular package managers like Conan or vcpkg that simplify dependency management and facilitate the sharing of C++ packages across diverse projects.

Automated Testing for Seamless CMake Config File Integration

· 13 min read
Christopher McArthur
C++ DevOps Expert

As a C++ developer, ensuring your library integrates flawlessly with other projects is crucial for driving adoption. CMake, being the de facto standard, plays a vital role in this process by providing installed configuration files, guiding consumers on how to find and utilize your library using find_package. But how do you guarantee these config files are installed correctly and provide all the necessary information? Enter automated testing!

This blog post explores an approach for testing CMake config files inspired by Behavior-Driven Development practices and showcases a powerful implementation on GitHub Actions featuring 14+ test cases.

Why Test CMake Config Files?

Imagine creating a fantastic C++ library, only to have users encounter missing headers or library paths when they attempt to integrate it within their builds. This very real headache is why many open-source developers have opted for header-only libraries. "Just copying the headers" eventually became the norm. However, this trend has culminated in ballooned build times, as the preprocessing stage can become a bottleneck.
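The simplest form of such a test is a tiny consumer project that exercises the installed config files the same way a real user would; the `mylib` name here is a stand-in for your library:

```cmake
# Illustrative consumer-side smoke test: does find_package work end to end?
cmake_minimum_required(VERSION 3.23)
project(integration_test CXX)

# Fails the test immediately if the config files are missing or broken
find_package(mylib CONFIG REQUIRED)

add_executable(smoke main.cpp)
# Fails at build time if headers, flags, or link libraries don't propagate
target_link_libraries(smoke PRIVATE mylib::mylib)
```

Configuring and building this project in CI, against a fresh install of your library, turns "it works on my machine" into an automated check.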

Breaking down the 2024 Survey Results

· 13 min read
Christopher McArthur
C++ DevOps Expert

It's that time of year once again! The ISO Committee published the summary of the results for the C++ Developer Survey "Lite". This has been running for several years and it's probably the first time we can start to see some trends... hopefully!

The survey drew fewer than 1,300 developers, compared to 1,700 last year; the drop is only partially explained by third-party restrictions, as noted in the blog post sharing the results. Regardless, a wider sample would be ideal. The dominance of CMake with an 83% market share is striking. Could this 4% growth be linked to the lower burden for managing build scripts? Despite these limitations, the survey offers valuable insights into C++ ecosystem trends.

Since this blog is all about building and shipping C++ software, I'll be focusing on the tooling and ecosystem questions and results. There's a natural bias here, as I'm particularly interested in how these trends affect developers like us. But fear not, there's plenty for everyone! In fact, I'm curious what aspects other bloggers will delve into. Let's jump right in as there are some fascinating statistical correlations to explore!

Package Management vs. Reproducible Builds... Or Complementary Approaches?

· 9 min read
Christopher McArthur
C++ DevOps Expert

Let's face it, in the land of C++ development, package management and reproducible builds can feel like oil and water. Package managers promise lightning-fast builds with pre-built libraries, while reproducible builds preach control and consistency by rebuilding everything. But here's the thing: they're not sworn enemies.

Think of it this way.

Let's start with the basic build system. Imagine you're spending hours compiling your code. You throw more cores at the problem, and the build time shrinks - but there's a limit. Eventually, adding more cores won't magically make it compile any faster. Now imagine you don't build at all. Poof! Your build time is divided by zero, because it's not happening at all; it's just not a factor anymore. The most reproducible builds are the ones you don't have to repeat endlessly. That's where package management comes in, saving you from endless build marathons. Yet there are even more benefits for reproducibility as well.

This post will explore how these two seemingly opposing forces can actually work together to create a streamlined and efficient development workflow. Despite being a new idea, there is evidence of this already being used, along with potential new opportunities for future development in this space.

Const Correctness for C++ Builds

· 6 min read
Christopher McArthur
C++ DevOps Expert

In the ever-evolving world of software development, ensuring code quality and maintainability is paramount. Two seemingly unrelated concepts, const correctness in C++ and ephemeral build environments from DevOps, share a surprising connection, both aiming to build a strong foundation for reliable software.

Const Correctness: Enforcing Immutability in Code

Const correctness is a programming paradigm in C++ that emphasizes the use of the const keyword to explicitly declare variables and objects that shouldn't be modified. This enforces a form of immutability within your code. Just like an immutable object in other languages, a const variable cannot have its value changed after initialization.

CPS: A Streamlined Future for C++ or Overly Specific?

· 6 min read
Christopher McArthur
C++ DevOps Expert

The most relevant problems for C++ developers are package management, setting up CI/CD pipelines, and maintaining build scripts. Talking to developers and build teams, the cause of that frustration is the lack of interoperability between build systems.

The Common Package Specification (CPS) aims to revolutionize C++ development by standardizing how dependencies are described. While the core concept holds promise, specific aspects raise questions about its practicality within the C++ ecosystem.

Conquer C++ Dependency Challenges: A Comprehensive Guide

· 2 min read

Streamline your C++ development workflow and ensure long-term project sustainability with effective dependency management. This comprehensive guide delves deeper than just libraries, providing a holistic perspective on the entire toolchain ecosystem and equipping you with practical strategies for success.

Mastering the Art of Dependency Management

By following these industry-proven best practices, you can establish a robust and sustainable approach to dependency management in your C++ projects:

  • Embrace Version Control and Lockfiles: Ensure reproducible builds and prevent unexpected behavior by explicitly specifying exact versions of dependencies in your project configuration files and utilizing tools that generate lockfiles to record downloaded versions.
  • Minimize and Evaluate Dependencies: Carefully consider alternative approaches or existing libraries before introducing new dependencies to reduce complexity and potential issues. Regularly evaluate your existing dependencies and maintain an up-to-date list with documented justifications for each one.
  • Strike a Balance with Updates: Stay informed about updates to your dependencies and address critical security vulnerabilities promptly. However, prioritize stability and avoid frequent updates unless strictly necessary. Focus on updating the fewest number of dependencies possible, prioritizing direct dependencies during development and limiting updates for transitive dependencies to patches addressing vulnerabilities.
  • Implement Thorough Testing and Documentation: Proactively identify and mitigate issues arising from dependency changes by incorporating comprehensive testing strategies that cover functionalities impacted by updates or conflicts. Maintain clear documentation listing all project dependencies and their purpose to aid understanding and future maintenance efforts.
  • Leverage Artifact Management: Facilitate version control, sharing, and retrieval of build artifacts (compiled binaries, libraries, packages) across different environments and teams by utilizing a central artifact repository. Implement versioning and tagging schemes within your repository to track changes, identify specific builds, and ensure consistent deployments. Automate artifact publishing and promotion to streamline the workflow and reduce manual intervention.


Effective dependency management is an ongoing journey, but by embracing the insights and strategies outlined in this comprehensive guide, you can conquer the C++ dependency landscape with confidence. Build resilient and maintainable projects, streamline your development process, and empower yourself to deliver exceptional C++ applications.

Layer by Layer: Navigating C++ Dependencies with Precision

· 6 min read
Christopher McArthur
C++ DevOps Expert

The ever-evolving world of C++ development unlocks incredible possibilities, but one persistent challenge haunts programmers: dependency management. While libraries often steal the spotlight, the true scope of dependencies extends far beyond. It encompasses the entire toolchain ecosystem, from the compilers and operating systems used to craft your code to the operating systems where it ultimately executes.

Imagine building a spacecraft. While the engine is undeniably crucial, neglecting the guidance system, navigation tools, and communication equipment would be disastrous. Similarly, focusing solely on libraries paints an incomplete picture. Every software tool you utilize, from the ground up, contributes to the final product's functionality, stability, and maintainability.

This guide delves into the intricate tooling and dependencies, exploring the various categories, offering effective management strategies, and practical examples across diverse project types. By venturing beyond the surface level of libraries, we equip you with the knowledge and tools to navigate the dependency landscape with confidence and efficiency, ensuring your C++ projects reach their full potential.

DevOps Is Software Engineering for Your Builds

· 6 min read
Christopher McArthur
C++ DevOps Expert

C++ developers often approach DevOps with a reasonable degree of skepticism, and it's not without reason. Their top five challenges revolve around build scripts, dependency management, and setting up CI pipelines - areas traditionally associated with DevOps responsibilities. Michael Xymitoulias articulated this sentiment well in a recent LinkedIn post, writing:

Right now, it feels that C++ developers have to deal with way more than just writing business logic code. [...] Allowing devs to focus more on coding rather than trying to solve problems of the ecosystem would probably be liberating

Xymitoulias suggests that allowing developers to focus more on coding, rather than grappling with ecosystem problems, would be liberating. He's not alone: Bill Hoffman, the CTO behind CMake, supports this ideal of simplifying developer workflows, something we've seen put in place with recent improvements to the popular build system.

While building in C++ certainly comes with its challenges, streamlining this process is our collective goal. So, let's delve into the core focus as it relates to C++ and dispel some common misconceptions.