Why is There a Coverage Loss When "What-If" Analysis is Applied to Test Points?
Title
Why is There a Coverage Loss When "What-If" Analysis is Applied to Test Points?
Description
Question:
I am performing "What-If" analysis on my design and want to get an estimate of the test coverage improvement using the
test points calculated from the analyze_test_points command.
The following test coverage is reported with ATPG in Fast-Sequential mode using the set_atpg -capture_cycles 10
command:
-----------------------------------------------
total faults 83200
test coverage 92.70%
-----------------------------------------------
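For reference, a Fast-Sequential baseline of this kind can be produced by a command sequence along these lines (a minimal sketch assuming the design and fault model are already loaded; report_summaries is assumed to be the reporting command in your TetraMAX version):
-----------------------------------------------
set_atpg -capture_cycles 10   ;# enable Fast-Sequential ATPG, up to 10 capture cycles
run_atpg -auto
report_summaries              ;# reports total faults and test coverage
-----------------------------------------------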
In the same TetraMAX session, I ran test point analysis with the analyze_test_points command and estimated the resulting
test coverage, which is reported as follows:
-----------------------------------------------
total faults 83200
test coverage 91.14%
-----------------------------------------------
Similar reductions in test coverage are reported when using other algorithms, such as the analyze_test_points -target
testability command or the analyze_test_points -target pattern_reduction command.
I expected an increase in test coverage after performing "What-If" analysis, but instead there is a reduction. Why is this
occurring?
Answer:
The reduction in test coverage occurs because the run_atpg -observe_file command performs ATPG in Basic-Scan
mode during the analysis; it does not use Fast-Sequential ATPG when estimating the coverage. To get comparable test
coverage results, you can do either of the following:
1. Compare only Basic Scan ATPG results before and after "What-If" analysis using the following commands:
add_faults -all
# Enable only Basic-Scan ATPG (no Fast-Sequential capture cycles)
set_atpg -capture_cycles 0
run_atpg -auto
analyze_test_points -target testability \
  -test_points_file ./test_points_testability.lst
reset_state
update_faults -reset_au
run_atpg -auto -observe_file ./test_points_testability.lst
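To make the before/after comparison explicit, the Basic-Scan coverage can be recorded after each run_atpg step. This is a sketch only; report_summaries is assumed to be the coverage-reporting command available in your TetraMAX version:
-----------------------------------------------
run_atpg -auto
report_summaries   ;# Basic-Scan coverage before test points
...
run_atpg -auto -observe_file ./test_points_testability.lst
report_summaries   ;# Basic-Scan coverage with the proposed test points
-----------------------------------------------
Comparing these two Basic-Scan numbers avoids mixing a Fast-Sequential baseline with a Basic-Scan estimate.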
2. The test_points_testability.lst file created by the analyze_test_points command in the first option contains the
control and observe points, which are defined using set_test_point_element -type control/observe commands. To
insert these control and observe test points into the netlist, you need to read the test points list file (that is, the
test_points_testability.lst file generated by TetraMAX ATPG) by running the source test_points_testability.lst
command before the insert_dft command in DFT Compiler. Next, perform ATPG on this new netlist in Fast-
Sequential mode. With this flow, the test coverage is typically better than the virtual test coverage reported by
"What-If" analysis.
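As a sketch of this second option (assuming your existing DFT Compiler insertion script, with the file names from the example above):
-----------------------------------------------
# In DFT Compiler, read the test point definitions before DFT insertion:
source ./test_points_testability.lst
insert_dft
# After writing out the new netlist, in TetraMAX run Fast-Sequential ATPG:
set_atpg -capture_cycles 10
run_atpg -auto
-----------------------------------------------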
Workaround
Product
TestMAX ATPG (/s/detail/01t1U000003IY0ZQAW)
Additional Product(s)
Article Number
000017456
URL Name
Why-is-There-a-Coverage-Loss-When-What-If-Analysis-is-Applied-to-Test-Points-1576092495490
Recommended Articles
Performing what-if analysis for test coverage improvements with test points
Why is there coverage loss after multi-mode and Adaptive Scan insertion?
M170 Error Reported During 'What If' Analysis for Inserting Observe Points
Why Doesn't Test Coverage Improve After Reading an Observe File in run_atpg?