
Chapter 5

Practical Value of Coverage Analysis

 

HDL Verification Problem

Testing proves the existence of bugs in a design; it does not prove their absence. The only way to ensure that a design is bug free is to test the whole design fully. This sounds simple enough, but in reality today's designs are so large and complex that ensuring the design is fully verified is far from easy. Over the last four years the verification portion of the design process has increased significantly, from about 30% - 40% to 50% - 80%, as shown in Figure 5-1. This makes verification the most significant part of the ASIC development process.

Figure 5-1

The problem is made worse by today's competitive environment. Project time scales have to be met to avoid being beaten to market by the competition.

Figure 5-2 shows how the rate at which bugs are found decreases as a typical verification project proceeds. Initially the rate is high; as the design matures it falls away. The problem is deciding when to ship the design.

Figure 5-2

Each fix introduces the risk of adding new bugs, and any as-yet-untested code could still contain bugs. The only way to be confident that it is safe to ship your design is to ensure it is fully tested.

Coverage Analysis and HDL Verification

During the behavioral and RTL design phases of a project, HDL coverage analysis tools such as Verification Navigator from TransEDA can be used to measure how much of the Verilog or VHDL code has been executed during simulation.

Using coverage analysis to direct your verification process gives you a significant advantage: it highlights the parts of your design that have not yet been executed and therefore may still contain bugs. With this information you can concentrate your verification effort on the untested parts of the design.

Figure 5-3 shows the typical effect of using coverage analysis on the rate of bug detection in the verification process.

Figure 5-3

With coverage-directed verification the rate at which bugs are found remains higher for longer. It then decreases rapidly as the amount of HDL code executed approaches 100%, and the bug rate drops to zero more quickly than it would without coverage analysis. This demonstrates the value of coverage analysis in improving the verification process.

Project Management With Coverage Analysis

Coverage analysis not only helps to detect bugs but can also be used to help with project planning. The detailed information that most coverage analysis tools produce shows exactly what proportion of the design is still to be verified. This information can then be used to make accurate predictions of when the verification process will be complete.

Quality control departments also benefit from the coverage analysis reports. These reports give documentary evidence of verification quality. Some companies already make coverage analysis a mandatory requirement in their design flow.

Functional Coverage

Coverage analysis does not prove that a design is functionally correct. It is still important and necessary to check the simulation results, and the best way to do this is with a self-checking test bench, which verifies that the outputs of the design are correct for specific inputs. There is a correlation between code coverage and functional coverage; however, its strength depends on the number of coverage measurements your tool provides and how many of them you use to test your design. Consider the statement below.

a <= b * c;

If this statement is tested only with b=0 and c=0 you will have covered the line, but you would get the same simulation result if the operator were multiply, add, logical-AND or logical-OR, so the functionality has not been covered (verified). If extra measurements are used to ensure that a range of values is passed through b and c, you can be confident that the functionality of this line has been fully covered.
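The point can be illustrated with a small sketch (in Python rather than HDL, and with operator names and test values chosen purely for illustration): with b=0 and c=0 every candidate operator produces the same result, so executing the line tells you nothing about which operator is actually there, whereas a varied input pair distinguishes them.

```python
# Illustrative sketch: why a single input pair can "cover" a line
# without verifying its function. Operator names and test values
# are assumptions for the demonstration, not from any tool.

def apply(op, b, c):
    """Apply one of several candidate operators to b and c."""
    ops = {
        "multiply": lambda x, y: x * y,
        "add": lambda x, y: x + y,
        "logical_and": lambda x, y: x & y,
        "logical_or": lambda x, y: x | y,
    }
    return ops[op](b, c)

OPERATORS = ("multiply", "add", "logical_and", "logical_or")

# With b=0 and c=0 every operator gives the same result, so a test
# using only these values cannot tell a wrong operator from the
# right one, even though the line was executed (covered).
results_zero = {op: apply(op, 0, 0) for op in OPERATORS}
print(results_zero)    # every value is 0

# With a varied pair such as b=2, c=3 the operators diverge, so a
# self-checking test bench can detect a wrong operator.
results_varied = {op: apply(op, 2, 3) for op in OPERATORS}
print(results_varied)  # multiply=6, add=5, logical_and=2, logical_or=3
```

This is exactly the gap between line coverage and functional coverage: the first dictionary shows a fully "covered" but unverified line, the second shows inputs that actually exercise the functionality.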

Regression Testing

Coverage analysis can also be used as a criterion for regression test suite selection. In most cases the test suite for a design will cover the same code over and over again; from a coverage analysis point of view some of these tests are redundant. This situation is very common when using pseudo-random test generation. The results of coverage analysis can be used to identify the tests that do not improve coverage, and these `redundant' tests can be omitted from short regression suites, for example nightly or weekly runs.

By sorting your test runs in this way you can ensure that the maximum amount of your HDL code is exercised by the regression tests, while keeping the suite short enough to be executed in the time available, for example overnight. The amount of redundancy in a regression suite varies greatly from company to company and design to design; reductions of 10:1 are possible and have been achieved.

The sorting of tests can be made more sophisticated by using different coverage criteria for different levels of regression testing. Nightly regression suites, for example, could be selected using the most basic coverage analysis measurements, weekly suites could be selected using the rest of the coverage measurements, and the remaining tests could be added for the monthly regression suite.
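One common way to sort tests by coverage contribution is a greedy selection: repeatedly pick the test that adds the most not-yet-covered points, and stop when every remaining test is redundant. The sketch below (illustrative only; the test names and coverage sets are hypothetical, and this is not necessarily the algorithm any particular tool uses) shows the idea.

```python
# Illustrative sketch: greedy coverage-based regression suite selection.
# Test names and coverage-point sets below are hypothetical.

def select_regression_suite(test_coverage):
    """Greedily pick tests until no remaining test adds new coverage.

    test_coverage maps a test name to the set of coverage points
    (e.g. statements or branches) that the test hits.
    Returns the selected tests and the total points they cover.
    """
    selected = []
    covered = set()
    remaining = dict(test_coverage)
    while remaining:
        # Pick the test contributing the most not-yet-covered points.
        best = max(remaining, key=lambda t: len(remaining[t] - covered))
        gain = remaining[best] - covered
        if not gain:
            break  # every remaining test is redundant
        selected.append(best)
        covered |= gain
        del remaining[best]
    return selected, covered

coverage = {
    "test_a": {1, 2, 3, 4},
    "test_b": {3, 4},        # redundant once test_a is selected
    "test_c": {4, 5, 6},
    "test_d": {2, 3},        # redundant once test_a is selected
}
suite, covered = select_regression_suite(coverage)
print(suite)    # ['test_a', 'test_c'] -- test_b and test_d add nothing
print(covered)  # {1, 2, 3, 4, 5, 6}
```

Here two of the four tests turn out to be redundant, so the nightly suite could run only test_a and test_c without losing any coverage, which is the kind of reduction the text describes.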

With most coverage analysis tools this regression suite sorting would have to be done manually or with user-written scripts; Verification Navigator from TransEDA, however, has this regression suite ordering capability built in.

Gate Level Testing

At gate level the design is purely structural, so the use of coverage analysis tools is limited. Only those coverage tools that provide toggle coverage, such as Verification Navigator from TransEDA, give any benefit at this stage. Toggle coverage checks whether the tests can apply both a 0 and a 1 value to every signal in the design.
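The check itself is simple to picture. A minimal sketch, with hypothetical signal names and value traces (real tools read this information from the simulator, not from hand-built dictionaries):

```python
# Illustrative sketch of the toggle coverage check: a signal counts as
# toggled only if it has been seen at both 0 and 1 during the tests.
# Signal names and traces below are invented for the example.

def toggle_coverage(traces):
    """traces maps each signal name to the sequence of values it took.

    Returns the fraction of signals that reached both 0 and 1, plus
    a list of the signals that never fully toggled.
    """
    untoggled = [sig for sig, vals in traces.items()
                 if not {0, 1} <= set(vals)]
    covered = len(traces) - len(untoggled)
    return covered / len(traces), untoggled

traces = {
    "clk":    [0, 1, 0, 1],
    "reset":  [1, 0, 0, 0],
    "enable": [0, 0, 0, 0],   # stuck at 0: never driven to 1
}
score, missing = toggle_coverage(traces)
print(score)    # 2 of 3 signals toggled
print(missing)  # ['enable'] -- the tests never drive enable high
```

A signal reported in the untoggled list is exactly a node with poor controllability, which is why toggle coverage feeds so directly into the fault simulation discussion below.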

At gate level, however, toggle coverage has another important use: it can serve as a precursor to gate level fault simulation, which is performed to generate tests that check the manufactured device.

Effective fault simulation relies on two circuit parameters: controllability and observability. Controllability tests whether a 0 and a 1 can be applied to each node in the circuit; observability checks whether the effect of a fault on each node can be observed at the output pins.

A high value for controllability is critical for good fault simulation results, and because there is a 100% correlation between toggle coverage and controllability, a high value for toggle coverage is necessary for good fault simulation results. Because toggle coverage is measured during the normal functional simulation, it is much faster than fault simulation, and so it is more efficient than running fault simulation from the outset.


Copyright (c) 2002
Teamwork International and TransEDA Limited
http://www.transeda.com
Voice: (408) 335-1300
Fax: (408) 335-1319
info@transeda.com