Testbench Organization and Design
Test Bench
In most hardware description languages, the circuit description and the test waveforms are described in different ways.
In VHDL, the language itself can be used to express the testing waveforms (called test benches).
Test bench: a VHDL model that generates waveforms with which to test a circuit (VHDL) model.
Test benches are used only in simulation; they are not synthesized.
Simulation Technology
Testbench Environment
Figure: the testbench surrounds the design under verification (DUV) and consists of clock generation, initialization & synchronization, input stimuli, response assessment, and verification utility components.
Testbench Environment: Example
Instantiation:
Testbench Environment: Example
Description of the design under verification and input stimuli:
− Need to apply a bit stream:
− Store the bits in an array and apply the array.
initial i = size_of_input;
Response assessment:
remainder = input_array % 8'b10000111;
if (remainder != {q7, q6, q5, q4, q3, q2, q1, q0})
    print_error();
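The "store the bits in an array and apply the array" step above can be sketched in Verilog as follows. This is a minimal illustration, not the lecture's actual code: the names `input_array`, `size_of_input`, `serial_in`, and `clk`, as well as the array width, are assumptions.

```verilog
// Minimal sketch: apply a stored bit stream to the DUV serially,
// one bit per clock cycle (all names and sizes are assumptions).
reg [63:0] input_array;      // stimulus bits stored in an array
integer    i;

initial begin
    i = size_of_input;       // start from the most significant stimulus bit
    while (i > 0) begin
        @(posedge clk);      // advance one clock per bit
        serial_in = input_array[i-1];
        i = i - 1;
    end
end
```

After the whole stream has been shifted in, the remainder check from the slide can be applied to the DUV's flip-flop outputs.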
Testbench Environment: Example
FF initialization:
initial begin
    DUV.Q0 = 1'b0;
    DUV.Q1 = 1'b0;
    …
    DUV.Q7 = 1'b0;
end
Clock generation:
always clk = #1 ~clk;
Testbench Environment: Example
Testbench-to-design interface:
− Access to the design signals through primary inputs/outputs and hierarchical paths.
Verification utility:
− Functions and modules shared by various parts of the testbench, e.g. print_error().
Test Cases
• Test Case:
Properties (scenarios) to be verified.
• Example:
ALU:
− TC1: Verifying integer operations,
− TC2: Verifying Boolean operations.
Test Cases
• Example:
TC1:
− Verify integers add/subtract
− Input vectors chosen to cause corner cases (e.g. overflow)
TC2:
− Verify Boolean operations:
− Input vectors: certain bit patterns (e.g. 101010101, 11111111)
• Reusability:
Use the same testbench for multiple test cases.
− To maximize portability, TCs must be separated from the testbench,
− e.g. read initial values from a file (that contains a TC).
Initialization
• Initialization:
Assign values to state elements (FFs, memories).
Although initialization is the task of circuitry (at power-on), it is often done in the testbench.
Reasons:
− The initialization circuit has not been designed at the time.
− The simulation is to start from a point long after power-on
− (e.g. simulating through the initialization stage would take too long).
− The simulation emulates an exception condition (which normal operation never reaches from the legal initial states).
Initialization
• Initialization at time zero:
Some simulators create a transition (event) from unknown X (uninitialized) to the initial value, and some others don't.
− Inconsistent results from simulator to simulator.
Initialize at a positive time instead.
Even safer:
− Initialize to X (or 'U') at time zero,
− then initialize to the real value at a positive time.
task init_later;
    input [N:0] value;
    begin
        design.usb.xmit.Q = 1'bx;     // avoid time-zero event
        …
        #1;
        design.usb.xmit.Q = value[0]; // now the real initialization
        …
    end
endtask
Clock Generation and Synchronization
Figure: a clock waveform with its period marked.
Clock Generation and Synchronization
• Toggle Method:
It is difficult to see the value of the clock at a given time.
− Use comments: falling/rising.
If left uninitialized, the clock doesn't toggle (it starts at x).
− A potential bug.
It is easy to change the phase or initial value.
− The other statements are kept intact.
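The toggle method with an explicit initialization (to avoid the starts-at-x bug mentioned above) can be sketched as follows; the half-period value is an illustrative assumption.

```verilog
// Sketch of the toggle method. Initializing clk explicitly prevents
// it from staying at x forever (~x is x, so it would never toggle).
reg clk;

initial clk = 1'b0;      // initial value sets the phase
always  #5 clk = ~clk;   // 10 time-unit period (half-period assumed)
```

Changing the phase or period only requires touching the initial value or the delay; the rest of the testbench is unaffected.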
Clock Generation and Synchronization
Clock Generation and Synchronization
Clock multiplier:
− Note: synchronized with the base clock.
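One way to sketch such a clock multiplier is below: a derived clock at twice the base frequency, realigned to the base clock every base period so the two stay synchronized. The 2x ratio and the use of a measured `period` variable are assumptions.

```verilog
// Sketch: derived_clock at twice the base-clock frequency,
// resynchronized on every base-clock edge (names/ratio assumed).
always @(posedge base_clock) begin
    derived_clock = 1'b1;              // realign with the base edge
    #(period/4) derived_clock = 1'b0;  // two full derived cycles
    #(period/4) derived_clock = 1'b1;  // fit in one base period
    #(period/4) derived_clock = 1'b0;
end
```

Because generation restarts at each base edge, accumulated drift is discarded once per base period.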
Clock Generation and Synchronization
• Multiple-Clock Systems with a Base Clock:
If the period of the base clock is not known, measure it:
// measure the first period of base_clock
initial begin
    derived_clock = 1'b0;  // assume starting at 0
    @(posedge base_clock) T1 = $realtime;
    @(posedge base_clock) T2 = $realtime;
    period = T2 - T1;
    T1 = T2;
    ->start;  // start generating derived_clock
end
Clock Generation and Synchronization
If the two periods are independent, don't generate one clock from the other.
Figure: right (two independent clock generators) vs. wrong (one clock derived from the other).
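A minimal sketch of the "right" approach, with each clock generated on its own (the periods chosen here are illustrative assumptions):

```verilog
// Sketch: two clocks with independent periods, generated separately
// so that neither depends on the other (periods are assumptions).
reg clk_a, clk_b;

initial clk_a = 1'b0;
always  #5 clk_a = ~clk_a;   // 10-unit period
initial clk_b = 1'b0;
always  #7 clk_b = ~clk_b;   // 14-unit period, not derived from clk_a
```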
Clock Generation and Synchronization
Clock Synchronization
Figure: a synchronizer samples signal w1 using synchronizing signal w2 to produce the synchronized signal w3.
Clock Synchronization
• Synchronizer:
A latch.
It uses one signal (the synchronizing signal) to trigger sampling of another, creating a dependency between them.
− This removes the uncertainty in their relative phase.
always @(fast_clock)
    clock_synchronized <= clock;
Stimuli Generation
• Synchronous Method:
Apply vectors to the primary inputs synchronously.
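The synchronous method can be sketched as below: a preloaded vector memory drives the primary inputs on each clock edge. The names `vectors`, `data_in`, and `clk`, and the vector count, are assumptions for illustration.

```verilog
// Sketch: apply one stored vector to the primary inputs per clock
// edge (memory name, widths, and depth are assumptions).
reg [7:0] vectors [0:255];   // preloaded test vectors
integer   n;

initial begin
    for (n = 0; n < 256; n = n + 1) begin
        @(posedge clk);
        data_in = vectors[n];  // drive primary inputs synchronously
    end
end
```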
Stimuli Generation
• Asynchronous Method:
Sometimes inputs must be applied asynchronously,
− e.g. handshaking.
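A handshake-driven stimulus can be sketched as follows: the testbench waits for the DUV's ready indication instead of a clock edge. The signal names (`ready`, `valid`, `data`) and the protocol details are assumptions, not the lecture's example.

```verilog
// Sketch of asynchronous (handshake-driven) stimulus: wait for the
// DUV to signal readiness, then present data (names assumed).
initial begin
    valid = 1'b0;
    wait (ready);        // DUV signals it can accept data
    data  = 8'hA5;
    valid = 1'b1;
    wait (!ready);       // DUV acknowledges by dropping ready
    valid = 1'b0;
end
```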
Response Assessment
Response Assessment
• Comparison Methods:
1. Offline (post-processing):
Node values are dumped out to a file during simulation;
the file is then processed after the simulation finishes.
2. On the fly:
Node values are gathered during simulation and compared with the expected values.
Response Assessment
Response Assessment:
Design State Dumping
• Measures for efficient bug locating:
Scope:
− Restrict dumping to certain areas where bugs are more likely to occur.
Interval:
− Turn on dumping only within a time window.
Depth:
− Dump out only to a specified depth (from the top module).
Sampling:
− Sample signals only when they change.
− Sampling signals on each clock is not recommended because:
1. Some signal values are not caught.
2. It is slow if some signals do not change over several clocks.
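The scope, depth, and interval restrictions above map onto the standard Verilog VCD dump tasks; a minimal sketch (the module path and times are assumptions):

```verilog
// Sketch: restrict dumping by scope, depth, and time interval using
// standard Verilog dump tasks (hierarchy names are assumptions).
initial begin
    $dumpfile("duv.vcd");
    $dumpvars(2, top.duv.block_a);  // scope: block_a, depth: 2 levels
    $dumpoff;                       // start with dumping disabled
    #1000 $dumpon;                  // interval: dump only in [1000, 2000]
    #1000 $dumpoff;
end
```

Note that VCD dumping records a signal only when it changes, which is exactly the recommended sampling style.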
Response Assessment:
Design State Dumping
• Scope:
The range of signals to be printed out.
1. HDL scope: function, module, block.
2. User-defined scope: group of functionally similar modules.
Response Assessment:
• Run-Time Restriction:
Dumping routines should have parameters to turn dumping on/off.
Figure: scope restricted to module A, dumping limited to a given depth.
Response Assessment:
• Golden Response:
Visual inspection of signal traces is suitable only
− for a small number of signals,
− when the user knows where to look for clues (i.e. the scope is narrow).
Response Assessment:
Golden Response
• Golden Response:
Can be generated directly,
or by a different model of the design,
− e.g. a non-synthesizable higher-level model or a C/C++ model.
If the responses differ:
− bugs in the design, or
− bugs in the golden response.
Response Assessment:
Golden Response
Response Assessment:
Golden Response
• Time Window:
A wider window means:
− more coverage,
− more simulation time,
− more disk space.
Response Assessment:
Golden Response
• Hard to update:
There may be thousands of golden files in large designs.
Golden files may need to change:
− if a bug is found in the supposedly correct design,
− if the design is changed to meet other constraints (e.g. pipelining),
− if the specifications are modified,
− if the printed variables (or formats) are changed.
All golden files may then need to change.
Golden files may be very large:
− gigabytes are commonplace.
Maintenance problems.
Response Assessment:
Self-Checking Codes
• Self-Checking:
Checking is moved into the testbench.
− Signals are monitored and compared constantly in the testbench.
Two parts:
1. Detection:
− compares monitored signals with expected values.
2. Alert:
− different severity levels trigger different actions.
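The detection/alert split above can be sketched as a small alert task with severity levels; the task name, the number of levels, and the actions are assumptions for illustration.

```verilog
// Sketch of an alert component with severity levels
// (task name and level encoding are assumptions).
task alert;
    input [1:0] severity;   // 0: note, 1: warning, 2: error (fatal)
    begin
        case (severity)
            2'd0: $display("NOTE at %t", $time);
            2'd1: $display("WARNING at %t", $time);
            2'd2: begin
                      $display("ERROR at %t, stopping", $time);
                      $finish;   // most severe action: end simulation
                  end
        endcase
    end
endtask
```

A detection block then simply calls `alert` with the appropriate level when a comparison fails.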
Response Assessment:
Self-Checking Codes
Response Assessment:
Self-Checking Codes
• On-the-fly Checking:
Example: multiplier
// compare results
if (expected != product)
    begin // alert component
        $display("ERROR: incorrect product, result = %d, …");
        $finish;
    end
Response Assessment:
Self-Checking Codes
• Good practice:
Separate the checking code from the design code.
− Checking code is not part of the design.
− The verification engineer should make it straightforward to remove.
− E.g. encapsulate it in a task/procedure along with other verification utility routines in a separate file.
Use C/C++ routines to derive the expected behavior.
− Use VHDL's VHPI or Verilog's PLI to construct a function callable from HDL.
Response Assessment:
• Example: Self-Checking Codes
// RTL code of a multiplier
multiplier inst(.in1(mult1), .in2(mult2), .prod(product));

// C routine called through the Verilog PLI (TF interface)
void multiplication()
{
    …
    m1 = tf_getp(1);    // get the first argument
    m2 = tf_getp(2);    // get the second argument
    ans = m1 * m2;
    tf_putp(3, ans);    // return the answer through the third argument
}
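On the Verilog side, a C routine like this is typically registered as a PLI user task and invoked as a system task. The `$multiplication` name and the surrounding code are assumptions (with PLI 1.0, the C function would be registered in the simulator's `veriusertfs` table):

```verilog
// Sketch: invoking the C routine as a PLI user task to derive the
// expected value on the fly ($multiplication name is an assumption).
always @(mult1 or mult2) begin
    $multiplication(mult1, mult2, expected);  // C computes expected
    #1 if (expected != product)
        $display("ERROR: product mismatch at %t", $time);
end
```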
Response Assessment:
Self-Checking Codes
Response Assessment:
Self-Checking Codes
• Off-line Checking:
When the expected behavior takes a long time to compute, on-the-fly checking is not suitable.
− Store the expected responses in a table or database.
− If small, the table can reside in the RTL code (faster);
otherwise, a PLI user task must be called for the RTL to access it (more portable).
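The RTL-resident table option can be sketched as below: precomputed expected responses are loaded into a memory and compared cycle by cycle. The file name, widths, and signal names are assumptions.

```verilog
// Sketch: expected responses preloaded into an RTL-resident table
// and checked each cycle (file name and sizes are assumptions).
reg [7:0] expected_tab [0:1023];
integer   cyc;

initial begin
    $readmemh("expected.hex", expected_tab);  // load precomputed values
    for (cyc = 0; cyc < 1024; cyc = cyc + 1) begin
        @(posedge clk);
        if (dout !== expected_tab[cyc])       // !== also catches x/z
            $display("ERROR: cycle %0d, got %h, expected %h",
                     cyc, dout, expected_tab[cyc]);
    end
end
```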
Response Assessment:
Temporal Specification
After functional correctness, timing correctness is checked.
• Two types:
Synchronous timing specification:
− timing requirements expressed in terms of a clock.
Asynchronous timing specification:
− timing requirements expressed as absolute time intervals.
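A synchronous timing specification (a requirement counted in clocks) can be checked with a sketch like the following; the signals `req`, `ack`, `clk`, and the bound `MAX_CYCLES` are assumptions for illustration.

```verilog
// Sketch of a synchronous timing check: ack must assert within
// MAX_CYCLES clocks of req rising (names and bound are assumptions).
integer count;

always @(posedge req) begin
    count = 0;
    while (!ack && count < MAX_CYCLES) begin
        @(posedge clk);
        count = count + 1;
    end
    if (!ack)
        $display("ERROR: ack not asserted within %0d cycles", MAX_CYCLES);
end
```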
Response Assessment:
Temporal Specification
• Example:
Response Assessment:
Temporal Specification
• For an interval [t1, t2]:
Checking can be done in two stages: