Some among us may remember Earl Scheib who owned a chain of auto painting facilities; at least, that's what he called them. In actual fact, his shops were a national joke. In his TV commercials he would tell viewers, “I’ll paint any car for $99.95” and would promise one-day service. He did just that, but as the old saying goes, "You get what you pay for."
All Scheib really cared about was sales and he thought the way to increase them was by promising something cheaper and faster than the competition. The low cost and speed, however, came at the cost of quality.
What Scheib seemed to miss is that the concept of "value" has two sides. Not only does something have to come at the right price and in a timely manner, it also has to provide an acceptable level of quality; otherwise the value suffers. And whether it's having auto body work done, buying groceries or developing software, value is truly what the consumer seeks.
Of course, when it comes to painting cars or buying groceries, we determine value rather easily. Determining software value, however, is a far more complicated process, and one that goes well beyond how fast the code can be written.
Diligently Chasing Quality
While embedded software grows larger every year, the basic development tasks remain the same: editing, compiling and debugging. Magnus Unemyr of Atollic AB notes that one key to unlocking a more efficient software development process begins with creating a “well-thought-out design that is maintainable” and managing code complexity. He adds that developers should think of themselves not exclusively as “code writers,” but also as “software engineers,” focused on improving the efficiency and quality of the entire development value chain.
So let's look at advances in traditional development tools that can improve quality (and speed development in the process), and then move on to other factors involved in improving software development value.
- Editing Tools: Easier navigation in editing tools helps developers manage complexity, and well-managed complexity often translates to better code. Look for features such as color-coded syntax highlighting and expansion/collapse of code blocks, as well as smart editing with a configurable coding style.
- Compilation Solutions: New compilation solutions should feature advanced build systems and support application and library projects, which can be built in “managed” or “makefile” mode. Developers can also look for dual toolchains, one that targets the embedded microcontroller device and another that targets Windows-based PCs. This approach allows PC engineers to develop utilities that share configuration data with the embedded board, or log data from embedded boards, without the need to buy Microsoft Visual Studio.
- Debugging Tools: Developers should look for debugging solutions that include multi-processor debug capabilities and real-time tracing. They might also include support for a wide range of features, such as simple and conditional code and data breakpoints, full execution control functions, memory and CPU register views and call-stack views.
- Code Management: Robust version control ensures that as project requirements change and developers come and go, the reasons behind extending and modifying code are preserved, so that future developers can understand why the code evolved the way it did.
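The code-management point above can be made concrete with plain git. This is a minimal sketch, and the file name, values and commit messages are invented for illustration; the idea is that a commit message recording the *reason* for a change is what lets a future developer reconstruct intent from history alone.

```shell
# Sketch: capturing the "why" of a change in version control (git).
# File, values and messages below are hypothetical.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email "dev@example.com"
git config user.name "Dev"

echo "int debounce_ms = 5;" > switch.c
git add switch.c
git commit -q -m "Add switch driver with 5ms debounce"

# A requirement changes: record not just the edit but the reason behind it.
echo "int debounce_ms = 20;" > switch.c
git commit -q -am "Raise debounce to 20ms: field units showed contact bounce up to 18ms"

# A later developer recovers the rationale straight from the history:
git log --format="%s" -- switch.c
```

The log then answers "why 20ms?" without archaeology through email threads, which is exactly the continuity the bullet above is after.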
Code reviews require extra steps and time, but they are among the least expensive ways to improve software quality: the cost of finding and fixing a defect grows roughly tenfold with each later phase of the build in which it is caught, so an error caught in review is dramatically cheaper than the same error caught in system test or in the field.
Accelerating Quality through Communication
Another concept involved in maximizing software value is predictability. If you think about it, customers are less annoyed by the time it takes to complete a development project than they are by being told the project is late. Typically, this lack of predictability lies deep within the software development process and is usually the result of a chain reaction, not a single event. An activity that improves predictability at the beginning of the project will have a positive ripple effect on the entire development process.
One development methodology that comes to mind is DevOps, which promises to speed the development process but can also improve predictability.
The big advantage of DevOps, in my view, is that it tightens the communication loop between developers and operations, allowing developers to make changes more quickly. This collaboration enables faster changes, better updating and improved scalability, and it lets companies fix their code on the fly rather than letting issues linger into later stages of the build, where they can take ten times longer to fix. Combined with careful quality-control processes, such as having multiple engineers review code before it goes online, DevOps can contribute to improved predictability, quality and value.
While enhanced communication is nice, DevOps only works if issues with the software being built are made visible. That is why one of the most significant "value add" tasks in a software build is ongoing structural analysis and measurement.
Automated analysis and measurement solutions apply advanced diagnostics to identify and quantify structural flaws, arming developers with the information they need to make fixes. Although these tools add a bit of time to the development process, they are far more efficient and far less time-consuming than manual structural analysis.
When coupled with a well-thought-out development plan that builds in predictability, and used in concert with advanced editing, debugging and other tools, automated analysis and measurement solutions provide visibility into the structural quality of an application. Ultimately, optimized software is achieved faster, which greatly increases the long-term value of the application. And spending a little time getting it right adds far less time than fixing malfunctions, restoring outages or mitigating data losses caused by breaches of vulnerable software.
So the next time you hear someone talk about how fast and cheap they will deliver their software, remember Earl Scheib and tell them you'd prefer it done right.