This article is the first of a five-part series covering how to set up a unit test harness on an embedded software project.
For the purposes of example, I’ll use the CppUTest harness, building within Silicon Labs’ Simplicity Studio (a YACE – Yet Another Customized Eclipse). This setup will be used to unit test components for Silicon Labs’ Thunderboard Blue Gecko SoC (ARM) projects. The unit tests are executed on-host, not on-target.
The steps and process are readily adaptable to alternative tools: CppUnit, Unity, Google Test, Atollic TrueSTUDIO, CodeWarrior, 8051, ATmega, TravisCI, Bitbucket Pipelines, and many more.
The five parts in the series are:
- Software Confucius: The case for unit testing in embedded software development.
- x86 Unit Test Build: Creating a GCC x86 build of the CppUTest harness and tests in Simplicity Studio.
- Running & Debugging: Running and debugging the x86 build within Simplicity Studio.
- Code Coverage: Coverage measurement using LCOV & Gcov.
- Continuous Integration: Building and running CppUTest unit tests in CircleCI.
I won’t go into detail on how to write unit test cases. There are plenty of great resources for that. Instead, I’m interested in getting people over the setup hurdle. Once you have a harness running, you’re all out of excuses for not writing tests.
Embedded software is late to every party. Agile, Scrum, continuous integration, unit testing, Test Driven Development (TDD), everything.
In my experience, these practices are frequently absent in embedded software. Embedded engineers aren’t “normal” software engineers. We like soldering irons, we like hardware, we like physical things. It’s no mean feat to convince an embedded engineer that testing can have value when it’s not physical.
Setting up a unit test harness is, pound for pound, a little harder for embedded software projects than other software. Most embedded IDEs don’t support it out of the box, as though they’ve never even heard of it. There’s a non-trivial effort required to get started, and it gets used as an excuse to never get started.
On-Target Or Off-Target
Even when I stumble upon an embedded engineer who believes in unit testing, it’s very often the case that they want to unit test on-target. This makes hardly any sense to me, apart from being able to use one toolchain for everything. I think there are many advantages to off-target (on-host) unit testing:
- a faster development micro-cycle that minimizes the number of times code must be deployed to target.
- the ability to make significant progress even when target hardware is unavailable.
- tests that can be developed, executed and debugged using more powerful tools.
- tests that can run on a continuous integration server, and don’t require any other hardware.
- code that, by definition, is more portable: it’s built by at least two toolchains and executed on at least two systems.
I Don’t Like That Type Of Testing
Embedded guys tell me that. They also say they don’t have time.
Here’s why you should like this kind of testing:
- You can test scenarios that you can’t realistically create on-target. Especially panic/assert/insane scenarios.
- It’s easy to create many more input vectors than you can create by testing manually and physically.
- You can take control of the timebase to speed up test execution dramatically, i.e. fast forward, rewind, pause, and jump around.
- You can easily test boundary conditions, including timeouts, at (X-epsilon), X, and (X+epsilon), something that is often not achievable, and certainly not cost-effective, to test manually and physically.
- You will have fewer errors reach the target system.
- We’ve all inherited code from someone else, and all experienced the pain of refactoring, maintaining and extending it. Can you imagine how easy it would be if that code came with a comprehensive unit test suite that characterized its behaviour? You should create such a suite for your successor. And your employer should demand it of you if they care about risk management.
My Own Heresy
As much as I think the value of unit testing is underestimated by many embedded engineers, I personally think the value of Test Driven Development (TDD) is overestimated. At least the very dogmatic tests-must-be-written-before-production-code strain of TDD. To me, it doesn’t matter whether unit test cases are developed minutes or hours before or after the production code. What matters is to achieve good depth and quality of testing before the code is mainlined, and to have cheap regression testing going forward.
Not convinced? I wasn’t either. Maybe do what I did, and try James Grenning: https://wingman-sw.com/renaissance
He was one of the authors of the Agile Manifesto and one of the few who has tried to convert the embedded recalcitrants. He’s also a CppUTest maintainer. Here’s his book: https://www.amazon.com/Driven-Development-Embedded-Pragmatic-Programmers/dp/193435662X
The book is good, and does the job of getting you started. In the end though, the only thing that will convince you is to suck it and see: write plenty of unit test cases.