Verification Basics
Anoushka Tripathi
What is Verification?
Verification is not just about writing testbenches or running a bunch of tests. It’s a process that
ensures the design you’ve created works exactly as you intended.
• When you taste a dish while cooking, you’re verifying the flavor is what you want.
• When you match landmarks to a map, you’re verifying you’re heading in the right direction.
These notes also cover:
• How to make sure you’re checking for the right things in your design.
• How verification helps in reusing designs, and the challenges of reusing verification itself.
What is a Testbench?
A testbench is a tool we use to test a design by simulating how it works. It’s a piece of code that:
• Applies known inputs (stimulus) to the design.
• Knows what the correct outputs should look like if the design is working perfectly.
Testbenches are often written in SystemVerilog, but they can also use external files or even code written in C.
Imagine the testbench as a tiny universe for your design—it controls everything that happens. No inputs or outputs come from the outside world. It’s all contained within this closed system.
This process ensures that the design behaves as expected in every possible scenario.
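As a rough illustration (the adder module, its name, and its signals are invented for this sketch, not taken from any specific design), a minimal SystemVerilog testbench might look like this:

module adder (input logic [7:0] a, b, output logic [8:0] sum);
  assign sum = a + b;   // the design under test (kept trivial for the sketch)
endmodule

module adder_tb;
  // Signals the testbench drives into and observes from the design under test
  logic [7:0] a, b;
  logic [8:0] sum;

  // Instantiate the design under test
  adder dut (.a(a), .b(b), .sum(sum));

  initial begin
    // Apply predetermined inputs
    a = 8'd10;
    b = 8'd20;
    #1;
    // Compare the output against the value we already know is correct
    if (sum !== 9'd30)
      $error("sum is %0d, expected 30", sum);
    $finish;
  end
endmodule

Everything the design sees comes from inside this closed little universe: the testbench supplies the inputs and already knows what the correct outputs should be.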
In modern hardware design, verification is one of the most critical and time-consuming
activities. It ensures that the design works as intended, and without it, the risk of errors, delays,
or failures increases dramatically. Here’s a breakdown of why verification is essential and how it
can be made more efficient:
• Dedicated Teams:
To manage this complexity, many design teams have more verification engineers than
RTL designers—sometimes twice as many. These engineers focus entirely on ensuring
that the design meets its specifications.
• Verification often lies on the project's "critical path," meaning it determines the overall
timeline. Several factors contribute to this:
o Verification sometimes starts late, only after the design is complete. This delay
can cause project schedules to slip further.
• To address this, new tools and methodologies aim to speed up verification by enabling
parallel work, using higher-level abstractions, and automating repetitive tasks.
1. Through Parallelism:
o For example, digging a hole can be sped up by having multiple workers with
shovels. Similarly, multiple testbenches can be written and debugged in parallel
with the design's implementation.
2. Through Abstraction:
o Working at a higher level of abstraction (for example, describing stimulus as transactions such as “write this word to that address” instead of toggling individual signals) lets testbenches be developed and debugged faster (a small sketch follows this list).
o Caution: Higher abstraction levels reduce control over details, so it’s important
to switch between abstraction levels when needed.
3. Through Automation:
o While full automation is not possible due to the variety of designs and scenarios,
domain-specific automation tools can significantly reduce manual effort.
o Example: A pool vacuum randomly moves along the bottom, covering most
areas without manual guidance. Similarly, constrained random testing can
explore edge cases in a design, freeing up engineers for more critical tasks (a second sketch follows this list).
• Testbenches should be flexible enough to switch between these abstraction levels during execution, for instance dropping from transaction-level stimulus down to driving individual signals when a particular interface needs detailed debugging.
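To make the abstraction point concrete, here is a small sketch (the bus signals and the write_word task are hypothetical) of how a testbench can hide signal-level detail behind a transaction-level call:

module bus_tb;
  logic        clk = 0;
  logic [7:0]  bus_addr;
  logic [31:0] bus_wdata;
  logic        bus_write = 0;

  always #5 clk = ~clk;  // free-running clock for the sketch

  // Transaction-level view: the test simply says what should happen
  task write_word(input logic [7:0] addr, input logic [31:0] data);
    // Signal-level detail: how the bus pins actually toggle
    @(posedge clk);
    bus_addr  <= addr;
    bus_wdata <= data;
    bus_write <= 1'b1;
    @(posedge clk);
    bus_write <= 1'b0;
  endtask

  initial begin
    // Two readable transactions instead of a dozen signal assignments
    write_word(8'h10, 32'hDEAD_BEEF);
    write_word(8'h14, 32'h0000_0001);
    $finish;
  end
endmodule

When more detail is needed (the caution above), the task can be bypassed and the signals driven directly.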
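Similarly, for the automation point, constrained random stimulus in SystemVerilog typically looks something like this minimal sketch (the packet class and its fields are invented for illustration):

class packet;
  rand bit [7:0]  addr;
  rand bit [31:0] data;
  // Keep the random values inside a legal address range
  constraint legal_addr { addr inside {[8'h00:8'h3F]}; }
endclass

module random_test;
  initial begin
    packet p = new();
    repeat (100) begin
      // Every call to randomize() picks a new legal combination,
      // wandering over the stimulus space much like the pool vacuum
      if (!p.randomize())
        $error("randomization failed");
      $display("addr=%0h data=%0h", p.addr, p.data);
    end
  end
endmodule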
Without proper verification, designs are prone to errors, delays, and costly rework. Verification
ensures that the final product meets expectations, avoids failures, and reaches the market on
time. By using parallelism, abstraction, and automation effectively, verification efforts can be
optimized, making the entire process more efficient and reliable.
This holistic approach makes verification not just a task but the backbone of successful
hardware development.
The Reconvergence Model
The reconvergence model is a way to visualize and understand the verification process in a
structured manner. It focuses on ensuring that any transformation applied to a design produces
the expected outcome by comparing the result with the original intent.
The model answers the critical question: "What are you verifying?" Verification is not just
about finding errors; it is about confirming that the output of a process or transformation
matches the intended design or specification. Without this understanding, the verification
process lacks direction and purpose.
1. Input Specification:
o The original statement of intent that serves as the starting point, such as an architectural or implementation specification.
2. Transformation:
o Any process that turns that specification into a more detailed form.
o Examples include: writing RTL code from a specification, or synthesizing RTL into a gate-level netlist.
3. Verification:
o This step compares the output of the transformation against the input
specification.
o The process ensures that the transformation does not deviate from the intended
design.
4. Common Origin:
o Both the transformation path and the verification path must start from the same point; otherwise they end up verifying different things.
The model shows two paths starting from the common origin:
1. Transformation Path:
The original specification undergoes a transformation to produce the output (e.g., RTL
coding or synthesis).
2. Verification Path:
A separate process is used to check whether the output matches the intent of the
original specification.
These two paths reconverge at the original specification, ensuring alignment between what was
intended and what was produced.
• Synthesis Verification:
Confirms that the gate-level netlist produced by synthesizing RTL code retains the
intended functionality.
The Human Factor introduces variability and potential errors in the verification process when
human interpretation is required to perform transformations, such as converting specifications
into RTL (Register Transfer Level) code. While verification strives to ensure correctness, human
involvement often creates challenges that need to be addressed through careful practices and
complementary mechanisms.
1. Subjectivity in Interpretation:
o When the same person performs both design and verification, the process may
confirm their interpretation rather than the specification itself.
o Errors arising from this variability can propagate through the process unless
adequately addressed.
Three complementary strategies can be applied to reduce the impact of human errors:
1. Automation:
o Definition: Take the human out of the loop by letting tools perform the transformation or the checking.
o Benefits: Removes errors caused by interpretation, fatigue, and repetition from tasks that can be precisely specified.
o Limitations: Because designs and scenarios vary so widely, not every verification task can be automated (as noted earlier for verification in general).
2. Poka-Yoke (Mistake-Proofing):
o Definition: Design systems and processes to make human errors less likely or
inconsequential.
o Implementation: Standardize steps, interfaces, and templates so that the incorrect way of doing something becomes difficult or impossible.
o Challenges: It is impossible to anticipate, and guard against, every mistake a human might make.
3. Redundancy:
o Approaches: Have a different person (or an independent model) interpret the specification and check the design, instead of the person who created it; in effect, the interpretation is duplicated.
o Applications: The usual reason verification is performed by a separate team, and standard practice for ultra-reliable systems.
o Cost Considerations: Duplicating interpretation and effort makes redundancy the most expensive of the three strategies.
o Process without redundancy:
▪ The same person interprets the specification and creates the design.
▪ Verification checks the design against this interpretation, not the original
specification.
▪ Problem: an error in that interpretation is simply confirmed rather than caught.
o Process with redundancy:
▪ A second, independent interpretation of the specification is used to check the design.
▪ The outcome is validated against the original intent rather than a single
interpretation.
▪ Benefits: a misinterpretation by any one person is far more likely to be exposed.
What Is Being Verified?
The process of verification focuses on determining whether a design meets its intended goals,
but the specific transformation being verified depends on the origin and reconvergence
points of the verification process. Different tools and techniques focus on verifying different
aspects, such as equivalence, properties, or functional intent.
• Verification tools like formal verification, property checking, and functional verification
rely on these points to determine their focus.
Understanding these points is critical to know what exactly is being verified, as they influence
whether the design conforms to its specification, or merely to an interpretation of it.
1. Formal Verification
Formal verification uses mathematical methods to prove that specific properties of a design
hold true. It does not eliminate the need for writing testbenches and is applied in two main
categories:
a. Equivalence Checking
• Definition:
Compares two models (e.g., RTL to gate-level netlists) to ensure the transformation
preserves functionality.
• Key Applications: Verifying synthesis results, confirming that manual netlist edits or scan-chain insertion did not change behavior, and comparing two versions of the same design.
• Advantages: It is a mathematical proof, so it is exhaustive and does not depend on choosing the right test vectors.
• Reconvergence Model:
Verifies that the output matches the logical intent of the input transformation.
b. Property Checking
• Definition:
Proves specific assertions about the design's behavior based on defined properties (a small sketch of such a property follows below).
• Key Applications: Proving interface and protocol rules, and showing that undesirable conditions (for example, two bus grants asserted at once) can never occur.
• Challenges: The hard part is expressing the design's intent as a complete set of meaningful properties; assertions that merely restate the code add little value.
• Reconvergence Model:
Focuses on verifying specific properties rather than general design correctness.
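As a hypothetical illustration of the kind of property such a tool can prove (the signal names are made up), consider a simple request/grant rule written as a SystemVerilog assertion:

module arbiter_props (input logic clk, input logic req, input logic gnt);
  // Intent: every request must be granted within 1 to 3 clock cycles
  property req_gets_gnt;
    @(posedge clk) req |-> ##[1:3] gnt;
  endproperty

  assert_req_gets_gnt: assert property (req_gets_gnt)
    else $error("request was not granted within 3 cycles");
endmodule

A property checker attempts to prove this holds for every possible input sequence, whereas a simulator can only check it for the sequences that happen to be exercised.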
2. Functional Verification
• Definition:
Functional verification ensures that a design aligns with its intended functionality as
per its specification.
• Key Insights:
o Verifies the design's intent rather than just its logical transformations.
• Limitations:
o While it can prove the presence of bugs, it cannot prove their absence.
• Reconvergence Model:
Maps the design's behavior back to the specification to ensure it meets its intent.
• Assertions:
o Require careful crafting to avoid trivialities that merely restate design behavior (see the sketch after this list).
• Testbenches:
o Apply stimulus to the design and compare its observed behavior against expectations derived from the specification.
• Absence of Errors:
o You can prove the presence of a bug with a single example of failure, but no amount of passing tests can prove that no bugs remain.
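As a hedged example of the point about assertions (the signals and the imagined design code are hypothetical, and reset handling is omitted for brevity), compare a trivial assertion with one that captures real intent:

module dut_checks (input logic clk, input logic d, q, push, full);
  // Trivial: this only restates the flip-flop the designer presumably wrote
  // (always_ff @(posedge clk) q <= d;), so it can only ever agree with the code.
  trivial_check: assert property (@(posedge clk) q == $past(d));

  // Meaningful: this expresses an intent taken from the specification,
  // independent of how the design happens to be coded.
  no_push_when_full: assert property (@(posedge clk) full |-> !push)
    else $error("push accepted while the FIFO reported full");
endmodule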
Automation is a cornerstone of effective verification because it reduces the time and effort
required to detect and resolve bugs. Some tools automate routine checks, allowing engineers to
focus on more complex issues. For example, simulators are crucial for functional verification
since they run simulations of the design to check if it behaves as expected. However, tools like
linting and code coverage go further, automating processes that would otherwise consume
significant time and effort.
Verification engineers must choose the right technologies to ensure that no significant bugs are
missed during the verification process. The goal is to improve confidence in the product's
functional correctness by using tools and technologies that highlight issues early in the design
process.
Project managers, on the other hand, must balance delivering a working product on time and
within budget while equipping their engineers with the right tools to ensure confidence in the
verification process. One of their most critical responsibilities is deciding when to stop testing,
weighing the cost of finding additional bugs against the value of increased correctness.
The chapter introduces various technologies that are used in different EDA (Electronic Design
Automation) tools. A single tool might incorporate multiple technologies to optimize the
verification process. For example, some tools perform “super linting”, which combines
traditional linting with formal verification, while hybrid tools may combine simulation and
formal analysis.
Synopsys Tools
As the author was a Synopsys employee at the time of writing, many tools discussed are from
Synopsys. However, the tools mentioned could also have counterparts from other EDA
companies.
Linting Technology
Linting is one of the verification tools that identify common coding mistakes early on without
running simulations. The term "lint" originated from a UNIX utility for C programming that would
identify questionable or erroneous code constructs. This allowed programmers to find and fix
mistakes efficiently without waiting for runtime errors.
Advantages of Linting
1. Low Cost: Linting runs directly on the source code and requires little time or effort compared to writing and running simulations.
2. No Simulation Needed: Mistakes are reported from the code itself, without test vectors or waiting for runtime errors.
3. Early Detection: Linting helps identify problems during the development process rather
than during testing or debugging, which saves time.
Limitations of Linting
1. Static Analysis Only: Linting can only catch certain types of errors based on the
structure of the code. It cannot detect logical issues or problems related to algorithmic
behavior. For example, in Sample 2-3, it cannot determine that an uninitialized variable
might cause unpredictable results (a hypothetical sketch of this kind of issue follows this list).
2. False Positives and Negatives: Linting often reports many false positives, leading to
"alert fatigue" where developers may get frustrated with non-existent issues. On the flip
side, it may miss genuine logical issues that cannot be detected through static analysis.
3. Limited Scope: Linting cannot catch deeper issues, such as race conditions in
concurrent processes, or functional bugs related to data flow or logic errors. These
issues often require simulation or formal methods.
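Sample 2-3 itself is not reproduced in these notes; as a purely hypothetical sketch of the kind of problem meant, the following is structurally clean as far as a lint tool can tell, yet whether the variable is ever initialized depends on run-time behavior:

module avg_example;
  logic [7:0] total;     // only assigned on one branch below
  logic [7:0] average;
  logic       valid;

  initial begin
    valid = $urandom_range(0, 1);
    if (valid)
      total = 8'd42;
    // If 'valid' happened to be 0, 'total' is still unknown ('x') here.
    // Whether that path can occur is a question about behavior, not structure,
    // so static analysis alone cannot decide it.
    average = total / 2;
    $display("average = %0d", average);
  end
endmodule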
Guidelines for Using Linting
• Filter Error Messages: You can reduce frustration and clutter by filtering out known
false positives and focusing on genuine problems. This minimizes the chance of missing
critical errors amidst false alarms.
• Run Linting Continuously: Linting should be performed regularly while code is being
written to catch issues early and reduce the risk of overwhelming error reports after a
large amount of code is developed.
Linting is especially useful for SystemVerilog, where it catches errors that are syntactically
correct but might lead to functional problems, such as the counter example shown in Sample 2-5.
The code looks correct and compiles without errors, but the use of a byte type (which is a
signed 8-bit value) causes issues with the condition counter < 255, as the counter will never
reach 255. Linting can immediately flag this problem, allowing for a quick fix without running any
simulations.
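Sample 2-5 is not reproduced verbatim here; a minimal sketch of the kind of code being described might look like this:

module counter_example (input logic clk, input logic rst);
  byte counter;   // 'byte' is a signed 8-bit type, so its maximum value is 127

  always_ff @(posedge clk) begin
    if (rst)
      counter <= 0;
    // Intended to saturate at 255, but a signed byte can never hold 255:
    // it wraps from 127 to -128, so this condition is always true and the
    // counter never stops. Linting can flag the comparison without any simulation.
    else if (counter < 255)
      counter <= counter + 1;
  end
endmodule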
Modern linting tools may integrate formal verification techniques to perform more advanced
static checks, such as identifying unreachable states in an FSM or unexecuted code paths.
These advanced linting tools are capable of detecting more subtle issues that go beyond basic
syntax or structural analysis.
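As a hypothetical illustration of what such a check can find, the DONE state below is declared but no transition ever leads into it:

module tiny_fsm (input logic clk, input logic rst, input logic start);
  typedef enum logic [1:0] {IDLE, RUN, DONE} state_t;
  state_t state;

  always_ff @(posedge clk or posedge rst) begin
    if (rst)
      state <= IDLE;
    else begin
      case (state)
        IDLE:    if (start) state <= RUN;
        RUN:     state <= IDLE;   // suppose the intent was to reach DONE first
        DONE:    state <= IDLE;   // unreachable: nothing ever assigns DONE
        default: state <= IDLE;
      endcase
    end
  end
endmodule

A formal-style static check can conclude that DONE is unreachable from reset, something ordinary structural linting cannot see.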
Simulation
1. A Means, Not an End
o Simulation is not the end product. The ultimate aim is to create a functional,
physical design.
2. Approximation of Reality
o Simulations mimic reality but simplify many aspects. They do not capture all
physical characteristics, such as continuous voltage variations or asynchronous
events, and focus on a manageable subset for testing.
o The simulation's accuracy depends on how well the model reflects the actual
design.
3. Practical Challenges
o Performance Bottlenecks: simulating a large design is slow, and simulation time is usually what limits how much verification can be done.
o Trade-Offs in Accuracy: higher simulation speed is generally bought by giving up modeling detail.
Simulation Techniques
1. Event-Driven Simulation
o The simulator re-evaluates parts of the design only when a signal changes value (an "event") and propagates the change to whatever depends on it.
o If multiple inputs change but result in no output change, the simulator might still
execute those events to maintain logical consistency.
2. Cycle-Based Simulation
o The design is evaluated once per clock cycle instead of on every signal change, which is faster but ignores timing within the cycle, so it suits synchronous logic only.
• Detecting Flaws:
o Continuous signals in the real world are represented in discrete forms (e.g., 0, 1,
unknown, high-impedance).
o Such approximations may overlook subtle issues that arise in actual hardware.
The Verification Plan
A verification plan is a critical document in the hardware design process, ensuring that the
design is thoroughly tested and meets all requirements before manufacturing. Here's a
breakdown of its role and importance:
• Historically, verification was ad hoc:
o Each designer tested their work as they saw fit, often leaving flaws
undiscovered.
o Flexible but expensive solutions, like FPGAs, were often used to address design
flaws found later.
• Metrics like code coverage, functional coverage, and bug discovery rates help track
progress but don't define the entire process.
• A clear plan is required to determine when verification is complete and to ensure all
critical aspects are tested.
• When testing is done: A schedule for completing tests to a defined level of confidence.
• The verification plan begins with the architectural specification and evolves as the
implementation specification is completed.
• The specification document is the authoritative source for both design and verification.
• Just like the design has a specification, the verification effort also needs its own plan.
• Verification is often as labor-intensive as the design itself (or even more so), making a
structured plan essential.
• Key Principle:
o The verification plan is to the verification effort what the specification is to the design: it defines what must be verified and what "complete" means.
• Confidence: Ensures the design is thoroughly tested, reducing risks of failure in the
field.
This section explains how the verification process should be organized, structured, and
implemented to ensure a hardware design meets all its requirements. It emphasizes the
importance of planning, levels of granularity in testing, and the role of the team in creating a
robust design. Let’s break it down.
o A detailed verification plan acts as a "line in the sand", ensuring all essential
tests are completed before the design is shipped.
o Rule: The design should only be shipped after passing all tests and meeting
coverage and bug-rate metrics.
• Team Involvement:
o The goal is not just to create RTL (hardware description code) but to deliver a
fully functioning design.
• The approach to creating a verification plan has been used in ultra-reliable systems, like
those from NASA and the aerospace industry, for decades.
• These methods ensure reliability and are applied to both hardware and software
designs.
3. Levels of Verification
• To make progress in testing, the interfaces (connections between design parts) and
functionality of partitions must remain stable.
• Frequent changes to interfaces slow down the process because testbenches (testing
setups) must be updated constantly.
4. Unit-Level Verification
o Units are tested informally by their designers, using simple checks or embedded
assertions.
o There are too many units in a project, and creating detailed tests for each would
be inefficient, so formal, standalone testbenches are rarely written at this level.
• Reusable Cores:
o Some blocks, like reusable cores, are designed for use in multiple projects.
o The design should group related features into blocks for standalone verification.
• Standardized Interfaces:
o Blocks and cores should use standard interfaces to simplify testing and promote
reusability.