
A Brief History of Automated Builds

Almost every programming book starts with an example of a small program that can be compiled from the command line with a simple call to the compiler, perhaps with a few flags. Known as "Hello World", the few lines of code needed to print the text are almost universal.
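In C, for example, the whole program and its build step fit in a handful of lines (the file name and compiler flags here are illustrative):

    /* hello.c */
    #include <stdio.h>

    int main(void)
    {
        printf("Hello, World!\n");
        return 0;
    }

Compiled and run from the command line:

    cc -Wall -o hello hello.c
    ./hello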

What the reader takes away from the example is how to call the compiler to produce a build. However, as they progress in learning the language, it soon becomes clear that multiple source files are needed and that compiling them manually is tedious and error-prone. The solution is either to write a script that compiles the modules or to use a build tool.

The Unix operating system (and its derivatives and clones, such as FreeBSD and Linux) includes a tool called Make, which automates builds based on a configuration file (a Makefile) describing which source files are needed by which components of the build. Thanks to its inclusion in Unix and Unix-like operating systems, Make has become fairly universal. Different versions exist, including one from Microsoft for its Windows operating system. Make isn't the only build tool, however; the Apache Ant build system, for example, is very popular for Java.
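As a sketch, a minimal Makefile for a two-file C program might look like the following (the file and target names are illustrative, and in a real Makefile each recipe line must begin with a tab):

    # Link the final binary from the two object files
    hello: main.o greet.o
            cc -o hello main.o greet.o

    # Each object file depends on its source file and a shared header
    main.o: main.c greet.h
            cc -c main.c

    greet.o: greet.c greet.h
            cc -c greet.c

Running make compares file timestamps and rebuilds only the targets whose prerequisites have changed since the last build.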

The problem with tools like Make, in their standard configuration, is that compilation is performed sequentially: one file is compiled, then the next, and so on. When there are thousands of files containing millions of lines of code, this can be very slow. With the advent of cheap multi-core processors it became practical to compile source files in parallel. Make offers a flag (-j, for jobs) which tells it to run multiple compiles at the same time. On a 32-core machine with solid state disks (SSDs), parallel building can reduce the compile time of the Linux kernel from hours to under a minute.
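In practice the flag takes a job count; typical invocations look like this (nproc is a standard Linux utility that reports the number of available CPU cores):

    # Run one compile job per available CPU core
    make -j$(nproc)

    # Or cap the number of simultaneous jobs explicitly
    make -j8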

However, the problem with parallel building is resolving the dependencies. If the Makefile doesn't precisely define the dependencies of each module, either build times remain high (a high level of cross-dependencies forces the build to stay largely sequential rather than parallel) or the build breaks because modules are compiled and linked in the wrong order, as sketched below.
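A classic way this bites under make -j is an undeclared dependency on a generated file; the names here are illustrative:

    # Broken: main.o needs config.h, which another rule generates,
    # but the dependency is not declared, so "make -j" may try to
    # compile main.c before config.h exists.
    main.o: main.c
            cc -c main.c

    config.h:
            ./generate-config > config.h

    # Fixed: declaring the dependency forces the correct order.
    main.o: main.c config.h
            cc -c main.c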

Tools like ElectricAccelerator can analyze a build system and create a dependency map. One way ElectricAccelerator does this is to monitor each file and detect when the build process uses it. Such a map ensures that the build is consistent and doesn't break due to out-of-order compiles.

ElectricAccelerator also uses caching to reuse the output of previous compilations and thereby avoid unnecessary recompiles. A tool called ccache, which can be used together with Make, performs a similar function.
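As a sketch of how ccache slots in (assuming ccache is installed and the Makefile uses the conventional CC variable for the compiler):

    # Prefix the compiler with ccache; repeat builds hit the cache
    make CC="ccache cc"

    # Inspect cache hit/miss statistics
    ccache -s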

Conclusion

On large projects, build times significantly influence productivity. More sophisticated builds that exploit parallelism and caching produce results faster and allow downstream activities (such as testing) to proceed without hold-ups.
