How Containerized Toolchains Improve Embedded Software Development

Embedded software development is hard. Connected devices are becoming increasingly complex, and users are demanding more sophisticated experiences and greater product functionality.

While embedded software is used in systems across industries, from consumer and health care to security, automotive and beyond, developing embedded systems remains difficult due to a variety of challenges, including:

  • Increasingly large, complex and often distributed systems.
  • Limited computing resources.
  • Creating an optimal user experience (UX).
  • Building or configuring your own operating system.
  • Complicated cross-development tools and SDKs.

As the Internet of Things becomes increasingly ubiquitous and our world becomes more connected than ever, it is important that developers understand how to improve their development environments by leveraging containers. This post walks you through how to use containers to create consistent environments for embedded software toolchains.

What Are Containers?

The result of two computer science trends meeting, containers are a form of operating system virtualization that packages up all the necessary executables, binary code, libraries and configuration files so applications run quickly and reliably from one computing environment to another.

First, the isolation of computer processes has steadily increased. Early computers and mainframes ran a single piece of software with access to everything, but as the limits of that approach became clear, operating systems began separating kernel space from user space, giving each process its own contained memory space and giving system users different permissions based on their roles.

Isolating users and processes is ideal for increasing security. Isolation limits how far a vulnerability can spread, since an attacker can’t reach anything outside the compromised process or user.

The other trend concerns what it takes to provide a fully separate environment. Separate environments once required separate hardware, and given the size of the mainframes of that era, it was impractical to have more than one. As technology progressed, the industry moved from mainframes to smaller minicomputers, to even smaller personal computers, to lightweight virtual machines (VMs). Whether the implementations involve specialized hardware, software or a combination of the two, VMs provide the functionality needed to execute fully isolated, platform-independent operating systems.

The trends toward isolation and increasingly lightweight machines meet in the middle with containers. Containers solve the problem of portability by splitting entire systems into component parts so that they can be easily moved — whether from platform to platform, platform to cloud or cloud to cloud. Plus, containers do not include their own operating system kernel, instead running on the kernel provided by the host, which makes them more lightweight and portable, with significantly less overhead. By separating application code and all its dependencies from the infrastructure on which it runs, containers give software developers a consistent environment across many development platforms.

The Most Common Container Use Cases

A great way to build, test, deploy and redeploy applications across multiple environments, containers require fewer system resources than traditional hardware or virtual machine environments and provide increased portability, more consistent operation, greater efficiency and improved application development.

Microservices

Today, containers are mostly used for deploying microservices. Microservices are an architectural style and organizational approach to software development that structures software as a collection of highly maintainable and testable, loosely coupled, independently deployable services. Distributed applications and microservices can be more easily isolated, deployed and scaled using containers.

Modernizing Existing Applications

Some organizations use containers to migrate existing applications to the modern cloud. Containerizing an application this way offers some of the fundamental benefits of OS virtualization, but the full payoff comes from refactoring it into a modular, container-based application architecture.

In Combination With Orchestration

Containers can also be combined with orchestration to manage multiple containers deployed across multiple host machines. Orchestration tools such as Docker Swarm and Kubernetes can help achieve failover and uptime, load balancing, scaling containers up or down, moving containers from one host to another, and more.

Embedded Software Toolchains

Fortunately, containers are also useful for replicable ephemeral environments, which brings us to the thrust of this blog post: Why use containers for embedded software toolchains?

Check out this blog about embedded software engineering services for answers to frequently asked questions.

The Problem With Embedded Software Toolchains

Generally speaking, embedded software toolchains suck — especially compared to the typical environments application-level engineers are used to. Embedded software toolchains are outdated, finicky, and not well isolated or contained. They often require old libraries, are hard to maintain, even harder to upgrade and quite difficult to set up, and most developers would rather do real, thoughtful work than manage development tools.

To set up an embedded software toolchain, many dependencies need to be installed, in the correct versions and without conflicting with one another, which has historically been difficult and can often lead to the aptly named dependency hell. Usually, one developer on a team figures it out and gets the toolchain to work in their development environment, which is configured just the way they like it. But when another embedded software engineer pulls the code to integrate, it won’t build on their machine. Consequently, the embedded team will often make the developer for whom the toolchain does build do all of the development to avoid the inevitable “it won’t build for me” scenarios.

Containers, Consistent Environments and Embedded Software Toolchains

These scenarios are where containers can make a significant difference and improve the way we develop embedded software. By insulating development tools from the rest of the system, containers give us the option to get a toolchain set up correctly once — self-contained and without conflict — so that everyone on the team uses exactly the same tools and build environment. A consistent build environment helps ensure that every developer gets the same result from the same source.
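
As a minimal sketch of what that looks like in practice (the image name my-team/arm-toolchain is hypothetical, and "make" stands in for whatever the project's build command is), building inside a shared toolchain container can be as simple as mounting the project source into the container and running the build there:

  # Mount the project source into the container and build with the shared toolchain.
  # The image name is hypothetical; the source tree stays on the host.
  docker run --rm -v "$PWD":/work -w /work my-team/arm-toolchain:1.0 make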

By placing different variants of a toolchain in separate containers, teams can test applications against multiple containerized toolchains without worrying about whether they coexist or are even installed correctly. This opens the door for experimentation that might otherwise contaminate the development environment. Plus, with containerized toolchains, developers can patch older versions of tools even after the development team has moved on to another release. Storing toolchain images in a container registry keeps the toolchain identical for every team member, gets new members started faster and makes propagating bug fixes significantly easier: once a problem is fixed, the updated image is available to the entire team.
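
To share that environment, the image is pushed to and pulled from a registry. A sketch, assuming a hypothetical registry address, repository and tag:

  # Publish the toolchain image once (registry, repository and tag are hypothetical)
  docker push registry.example.com/embedded/arm-toolchain:1.0

  # Any teammate or CI runner pulls the identical environment and builds with it
  docker pull registry.example.com/embedded/arm-toolchain:1.0
  docker run --rm -v "$PWD":/work -w /work \
      registry.example.com/embedded/arm-toolchain:1.0 make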

Additional benefits of using containers to create consistent environments for embedded software toolchains include:

  • A single setup that works on every computer.
  • Better use of resources, since toolchain containers can build upon other containers and run in the cloud as part of the CI/CD (continuous integration and continuous delivery/deployment) pipeline.
  • No need to manually build on the desktop.
  • Easy versioning to snapshot the entire runtime environment for branches, forks or releases.
  • The ability to layer the development environment on top of the production environment.
  • Simplification of the building of complex distributed systems.

At Cardinal Peak, we work on a wide range of projects, so containerizing embedded software toolchains for many of the different chips we use means we are ready to go immediately when a new project calls for a specific chip. By storing our prebuilt toolchain images on a container registry like Docker Hub, that library is ready the moment we need it, eliminating the days or weeks traditionally spent setting up a toolchain and reducing development time. With containerized toolchains, the results of a build no longer depend on a local setup — they are consistent.

How To Set Up a Containerized Toolchain

Much like hacking the environment together on your own computer, the steps involved in using containers to create consistent environments for embedded software toolchains are as follows (a minimal command-line sketch follows the list):

  1. Start with the base image. Choose a Linux distribution and version that the embedded toolchain explicitly supports.
  2. Interactively install any necessary components to get the build up and running.
  3. Export or commit to save the container as an image.
    • At this point, anyone can use that new base image to create their own containers and run that code.
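
Here is a sketch of those three steps with the docker command line, assuming a Debian-based base image; the package list and image name are illustrative, not a specific vendor's requirements:

  # 1. Start an interactive container from a base image the toolchain supports
  docker run -it --name toolchain-setup ubuntu:22.04 bash

  # 2. Inside the container, install whatever the build needs (illustrative packages)
  apt-get update && apt-get install -y build-essential cmake gcc-arm-none-eabi

  # 3. Back on the host, save the configured container as a reusable image
  docker commit toolchain-setup my-team/arm-toolchain:1.0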

A more reproducible approach, and one that is easier to adjust as requirements change, is to create a Dockerfile. A Dockerfile is a script that contains all the commands and defines the steps to create the image. While creating a Dockerfile and ensuring it works requires more effort upfront, you end up with a recipe anyone can build from, even if the registry disappears. Plus, any changes made going forward are tracked in source control, and the steps are explicit, so any other developer can understand why the toolchain is built the way it is.
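
As a sketch of what such a Dockerfile might look like for a bare-metal ARM project (the base image, package list and image names are illustrative assumptions, not a particular chip vendor's requirements):

  # Pin the base distribution and version so rebuilds stay reproducible
  FROM ubuntu:22.04

  # Install the cross-compiler and build tools the project needs (illustrative set)
  RUN apt-get update && apt-get install -y --no-install-recommends \
          build-essential cmake git \
          gcc-arm-none-eabi binutils-arm-none-eabi libnewlib-arm-none-eabi \
      && rm -rf /var/lib/apt/lists/*

  # Project source is mounted here when the container runs
  WORKDIR /work
  CMD ["make"]

Building the image with "docker build -t my-team/arm-toolchain:1.1 ." then gives the whole team the same recipe, and the Dockerfile itself lives in source control next to the project.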

Example: Cross-Compiling a Linux Kernel for Raspberry Pi

Embedded software developers always need a cross-compile toolchain, and managing this toolchain, its version, configuration and environment is vital to any embedded software project.

Natively compiling the kernel on a Raspberry Pi can take several hours, but by setting up a container to cross-compile the kernel, it took us less than 15 minutes to make that kernel available to everyone on our team. It also laid the groundwork for reusing the same approach to build the kernel for other configurations and architectures.
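
As a hedged sketch of how that might look, assuming a toolchain image built from a Dockerfile like the one above but with the kernel build dependencies installed instead (for example git, bc, bison, flex, libssl-dev and crossbuild-essential-arm64 on Ubuntu), and targeting a 64-bit Raspberry Pi 4:

  # Fetch the Raspberry Pi kernel source on the host (a shallow clone keeps it small)
  git clone --depth=1 https://github.com/raspberrypi/linux

  # Configure and cross-compile inside the container; the image name is hypothetical,
  # and bcm2711_defconfig targets the Raspberry Pi 4 family
  docker run --rm -v "$PWD/linux":/src -w /src my-team/rpi-kernel-xc:1.0 sh -c '
      make ARCH=arm64 CROSS_COMPILE=aarch64-linux-gnu- bcm2711_defconfig &&
      make -j"$(nproc)" ARCH=arm64 CROSS_COMPILE=aarch64-linux-gnu- Image modules dtbs'

Because the source tree is bind-mounted rather than copied into the container, the resulting kernel image and device tree blobs land under linux/arch/arm64/boot on the host.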

Conclusion

While there is a learning curve, slight performance overhead and some perceived maintenance overhead involved in using containerized toolchains versus doing everything locally on your own machine, the benefits far outweigh those disadvantages. By creating a reproducible environment that works on every engineer’s computer and in your cloud CI/CD pipelines, containerized toolchains save each engineer the time and effort of getting the setup running and debugging issues individually.

Whether moving projects between environments, avoiding conflicting dependencies, experimenting with new software, adding developers to a project or maximizing resources through CI/CD, containers are ideal for embedded engineering. By offering a straightforward approach to managing software development, testing, builds and deployment, containers reduce the pain associated with redeploying embedded software on new boards.

If you still need help understanding why containers are the future of software development — or if your team requires assistance to bring your embedded system to life — connect with our engineering experts today!