Complexity in embedded software development is driving the cost of test and verification to as much as 70% of overall development costs. Industrial automation, automotive, and aerospace engineers conduct exhaustive design and code reviews and build increasingly complex test systems to confirm that the software in embedded processors meets high-integrity standards and design requirements. And as verification consumes more time, engineers have fewer opportunities to innovate and create product differentiation through design optimisation.
Many organisations are finding that most errors they uncover in test and integration were introduced at the beginning of the design process while interpreting system requirements. Engineers now face the challenge of checking for errors and "cleaning them out" closer to the beginning of the development process, when they are cheaper to fix.
Finally, as development teams grow and become geographically dispersed, they are seeking better ways to collaborate. Embedded software errors can be cut substantially by doing more complete design verification at or near the beginning of the design process via systematic system simulation, or virtual testing.
“Virtual” in this case denotes simulation, but with no hardware involved – just software and simulation engines. “Systematic” means building tests based on requirements and then executing those tests against the system design.
Two critical concepts drive virtual testing process improvements:
- An executable system specification
- Requirements-based tests
The executable system specification is a model you can simulate. It includes your system design as well as environment models: models of the important elements of the physical world your embedded system interacts with. The system model needs to cover subsystem domains such as controls, mechanical components, electrical components, and hydraulics.
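To make this concrete, the sketch below is a hypothetical, drastically simplified executable specification written in Python rather than in a graphical modelling tool: a first-order thermal chamber stands in for the environment model, and a bang-bang thermostat stands in for the embedded software design. All names and parameter values are assumptions chosen purely for illustration.

```python
# Illustrative executable system specification (hypothetical example).
# The environment model is a first-order thermal plant; the embedded
# software is a simple on/off controller with hysteresis.

def plant_step(temperature, heater_on, dt=0.1):
    """Environment model: thermal response of a heated chamber (degrees C)."""
    ambient = 20.0      # ambient temperature
    heat_rate = 8.0     # heating rate with the heater on, degrees C/s
    loss_coeff = 0.1    # heat-loss coefficient, 1/s
    d_temp = (heat_rate if heater_on else 0.0) - loss_coeff * (temperature - ambient)
    return temperature + d_temp * dt

def controller_step(temperature, setpoint=75.0, hysteresis=1.0):
    """Embedded software design: bang-bang controller with hysteresis."""
    if temperature < setpoint - hysteresis:
        return True      # turn heater on
    if temperature > setpoint + hysteresis:
        return False     # turn heater off
    return None          # hold previous heater state

def simulate(duration=300.0, dt=0.1, initial_temp=20.0):
    """Run the closed-loop system model and return the temperature trace."""
    temperature, heater_on, trace = initial_temp, False, []
    for _ in range(int(duration / dt)):
        command = controller_step(temperature)
        if command is not None:
            heater_on = command
        temperature = plant_step(temperature, heater_on, dt)
        trace.append(temperature)
    return trace
```

Because the controller and the environment run in one simulation loop, the closed-loop behaviour of the design can be exercised long before any hardware exists.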
Requirements-based tests are formulated from functional requirements that can be expressed as tests; essentially, "Given a particular input, here is the output I expect." At a minimum, you need to have a simulation input signal or sequence for each test, as well as output captured from the simulation, to compare with the expected output. You should also build a complete set of tests that fully exercise the requirements.
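Continuing the hypothetical thermostat sketch above, a requirements-based test supplies the simulation input, captures the simulated output, and compares it with the expected behaviour stated in the requirement. The requirement "R1" and its thresholds below are invented for illustration; a complete test suite would add one such test for every functional requirement.

```python
# Illustrative requirements-based test against the system model above.
# Hypothetical requirement R1: "Starting from 20 degrees C, the chamber
# temperature shall reach 75 +/- 2 degrees C within 60 seconds and remain
# within that band thereafter."

def test_requirement_r1():
    dt = 0.1
    trace = simulate(duration=300.0, dt=dt, initial_temp=20.0)  # simulation input
    settled = trace[int(60.0 / dt):]                            # captured output after 60 s
    # Compare the captured output with the expected output from the requirement.
    assert all(73.0 <= t <= 77.0 for t in settled), \
        "R1 violated: temperature left the 75 +/- 2 degree band after 60 s"

if __name__ == "__main__":
    test_requirement_r1()
    print("R1 passed")
```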
To understand how this works, consider the classic design process V diagram. With virtual testing, you follow a second V early in the process, starting partway down the left leg and then moving back up to the right. Instead of continuing down the left leg to the original apex (the point of implementation) and then up the right leg (the hardware side) to physical testing, the design flow branches off to a virtual test.
Model-Based Design enables verification as a parallel activity that occurs throughout the development process.
As embedded software grows in functionality and complexity, engineering teams are moving beyond traditional code development techniques of using text editors and debuggers. They’re shifting to design centred on models, using modelling, simulation, and code generation tools on the desktop. This model-centric development approach is called Model-Based Design.
Embedded system developers in industrial control, automotive, and aerospace are adopting Model-Based Design to improve their development processes and manage costs, while maintaining quality. They use models and simulation to increase efficiency with complex system designs, and automatic code generation from models to streamline implementation.
While simulation alone will not find all errors, it is a huge step forward, and you can begin simulating almost as soon as you have a design model. Iterating in a modelling environment is fast and easy.
A whole system model allows the development group to check whether the requirements make sense and whether the design meets them. In a traditional process, they could not obtain such results until they reached the right side of the V, when physical components and systems are tested.
Modelling individual components is incredibly useful and may be necessary to complete a complex design, but it’s helpful to first model the system or environment your component will operate in. By modelling the whole system in a single environment, you can quickly see how the functionality of your component will interact with other components and how the integrated components will behave in the deployed system or environment.
You can also find missing requirements for your individual component or others. With a system model to return to as you iterate one component, you can assess how design iterations will affect system functionality.
By automatically generating code from the model, you can test design trade-offs and iterate faster than with hand coding.
Developing tests in parallel to the design process enables early detection of potential problems and significantly reduces the cost and time for fixing them. By thinking about testing while developing the model, you can design better for "testability," thereby ensuring the design can be fully tested.
Brett Murphy is Manager, Product Marketing for Verification, Validation, and Test, MathWorks.