
Improving CI Build/Test Execution Times Through Improved Build Specifications

Written by SPK Blog Post
Published on June 18, 2015

It may come as no surprise that Google has a very large code base. With over 15,000 developers sending new code changes to their continuous integration (CI) system every second, Google is constantly seeking ways to optimize both the size of their code base and the execution time of their builds.

In a recently published research paper, Google identifies a problem they call “underutilized targets” and details their automated solution for resolving it. While the tools Google developed to fix this problem are internal, their algorithm is public and has practical applications for the rest of us – especially those developing with open source libraries or for mobile devices.

Simply put, an “underutilized target” is a target in which a subset of its files is not needed by some of its dependents. Release engineers want fast builds. When you pull in and test unnecessary files, you waste time and the resulting artifacts are larger than necessary.
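To make this concrete, consider a hypothetical Bazel-style BUILD fragment. The paper works with Google’s internal build system, and the target and file names below are invented purely for illustration. Here the utils target bundles string helpers and I/O helpers, but the client library uses only the string helpers, so every build of the client also compiles the I/O code and pulls in its extra dependency:

# lib/BUILD: "utils" is an underutilized target, because Client.java uses only StringUtils.
java_library(
    name = "utils",
    srcs = [
        "StringUtils.java",
        "IoUtils.java",
    ],
    deps = ["//third_party:commons_io"],  # needed only by IoUtils.java
)

# app/BUILD
java_library(
    name = "client",
    srcs = ["Client.java"],  # imports only StringUtils
    deps = ["//lib:utils"],
)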

Decomposing a target is the act of splitting the target apart into smaller units. By doing so, we can avoid compiling and testing irrelevant code. This not only decreases our build/test iteration time, but also increases the modularity of our code base and shrinks the size of the artifacts produced. One could decompose a set of dependencies manually; however, this proves to be time-consuming and error-prone. Because manual decomposition is difficult, Google wanted an automated approach that could handle the large volume of code they needed to cover. They developed a pair of internal tools, Decomposer and Refiner, to respectively identify beneficial target decompositions and restructure the files accordingly.
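Continuing the hypothetical example above, a decomposition along the lines the paper describes would split the utils target into two smaller targets and repoint each dependent at only the piece it actually uses. This is a sketch of the end result, not the output of Google’s internal Decomposer and Refiner tools:

# lib/BUILD (after decomposition)
java_library(
    name = "string_utils",
    srcs = ["StringUtils.java"],
)

java_library(
    name = "io_utils",
    srcs = ["IoUtils.java"],
    deps = ["//third_party:commons_io"],
)

# app/BUILD: the client now depends only on the code it actually uses
java_library(
    name = "client",
    srcs = ["Client.java"],
    deps = ["//lib:string_utils"],
)

With this structure, a change to IoUtils.java or to its commons_io dependency no longer triggers a rebuild or retest of the client, which is where the iteration-time savings come from.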

After testing the effectiveness of these tools on a random set of Google’s internal Java libraries, the researchers found that, on average, the test execution time for a single target decreased by 12%. When a set of 1010 targets was decomposed and restructured, the test execution time for each target decreased by 50% or more. These are significantly faster builds!

Since most of us don’t work for Google, you may be asking how this research applies to your organization. Essentially, it demonstrates that we can decrease the iteration time of our build/test cycle by improving the quality of our build specifications. While the specific tools Google developed to do this are internal, the research article outlines the underlying algorithm, so it can be adapted to work elsewhere. Furthermore, the paper demonstrates that the algorithm is neither language- nor build-system-dependent. Developers in the mobile or embedded space may find the outcome of this study particularly relevant for decreasing the footprint of their deployable artifacts, while those making heavy use of open source libraries could see improved build times if they review their dependencies and strip out unnecessary code.

David Hubbell
Software Engineer
SPK and Associates
