
Introduction to UVM (Part 1)

Topics discussed:
• UVM Testbench Structure
• Parts of a UVM Testbench
• UVM Phases
o Common Phases
o Scheduled Phases
o Phase Synchronization
• UVM Testbench Template

Daniyal Tahsildar
UVM Testbench Structure:
A UVM-based testbench has a similar structure to that of an SV (SystemVerilog)
testbench, comprising various components for verification, stimulus generation, and
result analysis. However, there are several notable differences that set UVM testbenches
apart and enhance their capabilities.
Standardized Methodology:
UVM provides a standardized verification methodology that promotes reusability
and consistency in testbench development. It offers a set of predefined classes and a
clear hierarchical structure for creating verification environments.
Component-Based Design:
In UVM, testbenches are typically organized using a component-based approach.
Components such as agents, drivers, monitors, and sequencers are used to encapsulate
specific functionality, making the testbench more modular and scalable.
Transaction-Level Modelling:
UVM encourages the use of transaction-level modelling (TLM) for
communication between different modules, allowing for more abstract and efficient
modelling of stimulus and responses.
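
As a hedged illustration of TLM-style communication, the sketch below shows a monitor broadcasting observed transactions through a uvm_analysis_port. The transaction class my_txn and the component names are assumptions made for this example, not names taken from any particular library or design.

    import uvm_pkg::*;
    `include "uvm_macros.svh"

    // Hypothetical transaction carried at the transaction level.
    class my_txn extends uvm_sequence_item;
      rand bit [7:0] data;
      `uvm_object_utils(my_txn)
      function new(string name = "my_txn");
        super.new(name);
      endfunction
    endclass

    // Hypothetical monitor publishing each observed transaction on an analysis port.
    class my_monitor extends uvm_monitor;
      `uvm_component_utils(my_monitor)
      uvm_analysis_port #(my_txn) ap;
      function new(string name, uvm_component parent);
        super.new(name, parent);
        ap = new("ap", this);
      endfunction
      // In run_phase the monitor would sample DUT pins, assemble a my_txn,
      // and broadcast it to all subscribers with ap.write(txn).
    endclass

Any number of subscribers (scoreboard, coverage collector) can connect to ap without the monitor knowing about them, which is what makes TLM communication abstract and reusable.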
Reusability:
UVM promotes reusability by facilitating the creation of generic verification IP
(VIP) components that can be reused across different projects and designs.
Test Hierarchy:
UVM introduces the concept of test classes (e.g., uvm_test) that serve as
organized containers for test scenarios. This hierarchical approach simplifies test
selection and execution.
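
A minimal sketch of such a hierarchy, with hypothetical class names (base_test, sanity_test); uvm_pkg and the UVM macros are assumed to be imported as in the earlier example.

    class base_test extends uvm_test;
      `uvm_component_utils(base_test)
      // The environment would be created here in build_phase in a real test.
      function new(string name = "base_test", uvm_component parent = null);
        super.new(name, parent);
      endfunction
    endclass

    // A specific scenario extends the base test and overrides only what it needs.
    class sanity_test extends base_test;
      `uvm_component_utils(sanity_test)
      function new(string name = "sanity_test", uvm_component parent = null);
        super.new(name, parent);
      endfunction
    endclass

The test to run is then selected at the command line, for example with +UVM_TESTNAME=sanity_test, after the top module calls run_test().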
Advanced Reporting and Debugging:
UVM provides advanced reporting and debugging capabilities, including built-in
transaction recording, messaging, and customizable reporting. This aids in identifying
issues and generating detailed verification reports.
Functional Coverage:
UVM emphasizes functional coverage, making it easier to track and analyze the
completeness of test scenarios. Coverage-driven verification is a fundamental aspect of
UVM testbenches.
Phases and Tasks:
UVM introduces a phased execution model, where different phases (e.g., build,
connect, run, extract, report) allow fine-grained control over testbench activities. This
enhances simulation control and management.
Configuration Database:
UVM incorporates a configuration database that facilitates the sharing of
configuration settings and parameters among different testbench components, offering
a flexible and efficient means of configuration management.
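
A minimal sketch of the configuration database in use, assuming a hypothetical virtual interface type dut_if and the key name "vif": the top module publishes the handle, and a component retrieves it in its build_phase (uvm_pkg and the UVM macros are assumed to be imported).

    // In the top module (illustrative):
    //   uvm_config_db #(virtual dut_if)::set(null, "uvm_test_top.*", "vif", dif);

    // In a component's build_phase:
    class cfg_aware_driver extends uvm_driver;
      `uvm_component_utils(cfg_aware_driver)
      virtual dut_if vif;   // handle to the assumed interface
      function new(string name, uvm_component parent);
        super.new(name, parent);
      endfunction
      virtual function void build_phase(uvm_phase phase);
        super.build_phase(phase);
        if (!uvm_config_db #(virtual dut_if)::get(this, "", "vif", vif))
          `uvm_fatal("NOVIF", "virtual interface not found in the configuration database")
      endfunction
    endclass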

Parts of a UVM Testbench:

Test_top:
Refers to the top-level module or component within a UVM-based testbench. It
is the highest-level module that orchestrates and controls the overall verification
process.
Test:
Specific type of UVM component that represents test cases or a test scenario.
Each test has its own verification environment.
Environment:
The environment coordinates various verification components and ensures a
systematic approach to verifying the DUT, connecting different components, managing
resources, and controlling simulation sequences.
Agent:
Agents act as intermediaries between the testbench and the DUT. They
encompass drivers, monitors, and sequencers and help in organizing and managing the
verification process.
Sequencer:
Sequencers control the flow of transactions within the testbench. They generate
transaction sequences and send them to drivers for execution.
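
For illustration, a sequence such as the hypothetical my_seq below generates transactions and hands them to the driver through the sequencer (my_txn is the transaction class sketched earlier).

    class my_seq extends uvm_sequence #(my_txn);
      `uvm_object_utils(my_seq)
      function new(string name = "my_seq");
        super.new(name);
      endfunction
      virtual task body();
        repeat (10) begin
          req = my_txn::type_id::create("req");
          start_item(req);                      // wait for the sequencer to grant access
          if (!req.randomize())
            `uvm_error("RAND", "randomization failed")
          finish_item(req);                     // hand the item to the driver, wait for item_done
        end
      endtask
    endclass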
Driver:
The driver component is responsible for translating high-level transactions into
low-level signals to drive the DUT. It plays a crucial role in stimulus generation.
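
A hedged sketch of that handshake from the driver's side, again assuming the my_txn transaction class:

    class my_driver extends uvm_driver #(my_txn);
      `uvm_component_utils(my_driver)
      function new(string name, uvm_component parent);
        super.new(name, parent);
      endfunction
      virtual task run_phase(uvm_phase phase);
        forever begin
          seq_item_port.get_next_item(req);   // pull the next transaction from the sequencer
          // ... translate 'req' into pin-level activity on the virtual interface ...
          seq_item_port.item_done();          // signal completion back to the sequencer
        end
      endtask
    endclass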
Monitor:
The monitor continuously observes and records signals from the DUT. It is a passive entity that samples DUT signals but never drives them, and it is essential for collecting data and checking correctness during simulation.
Scoreboard:
Scoreboards compare the expected results with actual results from the DUT. They
play a crucial role in verifying the correctness of the design.
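
A minimal scoreboard sketch, assuming the my_txn class and leaving the reference model as a placeholder:

    class my_scoreboard extends uvm_scoreboard;
      `uvm_component_utils(my_scoreboard)
      uvm_analysis_imp #(my_txn, my_scoreboard) item_export;   // receives monitor transactions
      function new(string name, uvm_component parent);
        super.new(name, parent);
        item_export = new("item_export", this);
      endfunction
      // Called automatically for every transaction the monitor writes.
      virtual function void write(my_txn t);
        // Compare 't' against the value predicted by a reference model and
        // report mismatches with `uvm_error(...).
      endfunction
    endclass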
Coverage:
Coverage components track which parts of the design have been exercised during
simulation. This is essential for ensuring that the verification environment thoroughly
tests the DUT.
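
One common way to implement this, shown here as a hedged sketch assuming the my_txn class, is a uvm_subscriber with an embedded covergroup sampled on every observed transaction:

    class my_coverage extends uvm_subscriber #(my_txn);
      `uvm_component_utils(my_coverage)
      my_txn trans;
      covergroup cg;
        cp_data : coverpoint trans.data;   // track which data values were exercised
      endgroup
      function new(string name, uvm_component parent);
        super.new(name, parent);
        cg = new();
      endfunction
      virtual function void write(my_txn t);
        trans = t;
        cg.sample();
      endfunction
    endclass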
Interface:
Construct used to define a collection of signals that describe the interaction
between different modules or components within a design. It specifies how data and
control information are exchanged between these modules.
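
A small hypothetical example, with made-up signal names, of how such an interface might be declared in SystemVerilog:

    interface dut_if (input logic clk);
      logic       rst_n;
      logic [7:0] data;
      logic       valid;
      logic       ready;
      // Modports and clocking blocks can be added to distinguish driver
      // and monitor access and to avoid races at the DUT boundary.
    endinterface

A virtual handle to an instance of this interface is what the driver and monitor use to reach the DUT pins, typically distributed through the configuration database as shown earlier.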

UVM Phases:
UVM defines a set of standardized phases and processes that help streamline the
verification process. These phases play a crucial role in managing and controlling the
simulation and testing of a design.
There are mainly two types of phases:
Common simulation phases:
These phases are common to all UVM components; examples are the build phase, connect phase, and run phase.
Scheduled phases:
These are not necessarily implemented in every component; they are used to schedule sequence execution within a specific phase (primarily the run phase) so that sequences run in the required order.
Phase synchronization:
All components executing at run time must drop all objections for a phase before any component can move on to the next phase. Phase synchronization ensures that all relevant components and processes are coordinated in their execution across the different phases. This coordination helps maintain the order and timing required for effective verification.
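
A hedged sketch of the objection mechanism inside a test's run_phase (the class name is illustrative; uvm_pkg and the UVM macros are assumed to be imported):

    class objection_demo_test extends uvm_test;
      `uvm_component_utils(objection_demo_test)
      function new(string name = "objection_demo_test", uvm_component parent = null);
        super.new(name, parent);
      endfunction
      virtual task run_phase(uvm_phase phase);
        phase.raise_objection(this);   // keep the run phase open while stimulus is active
        #100ns;                        // placeholder for starting and awaiting sequences
        phase.drop_objection(this);    // once every component has dropped, the phase ends
      endtask
    endclass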
Common simulation phases are classified as:
Build phase (creates testbench components):
In this phase, you set up the test environment and create the necessary
components like agents, drivers, monitors, and sequencers.

The build phase is used to create components and to obtain virtual interface handles.

Components using the build phase: test, environment, agent, driver, monitor, sequencer, scoreboard.
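
As a sketch, an environment's build_phase might create its sub-components through the factory (my_agent and my_scoreboard are the hypothetical classes used throughout these examples):

    class my_env extends uvm_env;
      `uvm_component_utils(my_env)
      my_agent      agt;
      my_scoreboard scb;
      function new(string name, uvm_component parent);
        super.new(name, parent);
      endfunction
      virtual function void build_phase(uvm_phase phase);
        super.build_phase(phase);
        // Factory creation (type_id::create) keeps the environment open to type overrides.
        agt = my_agent::type_id::create("agt", this);
        scb = my_scoreboard::type_id::create("scb", this);
      endfunction
    endclass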
Connect phase (connects components):
This phase involves connecting different UVM components together to create the
complete verification environment. It establishes communication paths between
components.

Components using connect phase: Agent, Environment.
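
Continuing the same hypothetical example, an agent's connect_phase is where the driver is wired to the sequencer (a plain uvm_sequencer specialization is assumed); the environment would similarly connect the monitor's analysis port to the scoreboard.

    typedef uvm_sequencer #(my_txn) my_sequencer;

    class my_agent extends uvm_agent;
      `uvm_component_utils(my_agent)
      my_driver    drv;
      my_sequencer sqr;
      my_monitor   mon;
      function new(string name, uvm_component parent);
        super.new(name, parent);
      endfunction
      virtual function void build_phase(uvm_phase phase);
        super.build_phase(phase);
        drv = my_driver::type_id::create("drv", this);
        sqr = my_sequencer::type_id::create("sqr", this);
        mon = my_monitor::type_id::create("mon", this);
      endfunction
      virtual function void connect_phase(uvm_phase phase);
        super.connect_phase(phase);
        drv.seq_item_port.connect(sqr.seq_item_export);   // driver pulls items from the sequencer
      endfunction
    endclass

    // In my_env::connect_phase one would likewise do:
    //   agt.mon.ap.connect(scb.item_export);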


End of Elaboration phase (determine testbench structure):
This phase runs after the complete testbench hierarchy has been built and connected. It is used to make final adjustments to, and to inspect, the testbench structure (for example, printing the topology), providing a stable environment before the simulation setup proper begins.

Components using End of Elaboration phase: test


Start of simulation phase (understand design and make any final changes):
During this phase, UVM components can perform tasks related to configuring the
simulation environment, such as setting up design hierarchies, setting initial values, and
preparing for the actual simulation.
Components using start of simulation phase: test
Run phase (start component functionality):
The run phase is where the actual simulation takes place. Sequences are executed,
and the design under test (DUT) is driven with test stimuli.

Components using run phase: test, driver, monitor, scoreboard


Extract phase (collect the outputs):
This phase extracts data from different points of the verification environment (the scoreboard and other testbench components). By this point the run phase has completed and simulation time no longer advances.
Components using extract phase: test
Check phase (compare outputs):
During this phase, the verification environment checks the results produced by
the DUT against expected values. It's where the verification components analyze and
compare the output with the expected behavior to detect any errors or unexpected
conditions.
Components using check phase: test
Report phase (report status):
In this phase, the test results and any errors or warnings are reported. It can also
include logging and data analysis to generate comprehensive verification reports.

Components using report phase: test


The "Start of Simulation" phase, "Extract" phase, and "Check" phase are typically
not explicitly programmed; instead, they are automatically invoked when the test is
executed.
Run-time sub-phases or scheduled phases:
Whenever the run phase starts, a set of predefined sub-phases executes in sequence, in parallel with the run phase itself. These phases run during the actual simulation, while the verification environment is actively testing the design. They provide additional functionality and control over the verification process without interfering with the core simulation activities, making the verification environment more flexible and adaptable to different verification requirements.
Run-phase mainly has 4 sub-phases:
Reset phase (Reset DUT):
Here the components are configured to drive outputs to their designated reset or
idle states and the state variables are initialized. Just before transitioning out of this
phase, reset signals are appropriately de-asserted.
Configure phase (Configure DUT):
In this phase, the components responsible for configuring the DUT execute the transactions needed to bring the DUT, and the surrounding environment, into the settings desired for the test.
Main phase (Test DUT):
Components should execute their transactions smoothly and the data stimulus sequences should be started; the phase then waits for these stimulus sequences to complete.
Shutdown phase (Wait for data in the DUT to be drained):
Wait for all data to be drained out of the DUT, and then proceed to extract any
remaining buffered data within the DUT, typically accomplished through read/write
operations or sequences.
There are also other scheduled phases such as pre_reset_phase, post_reset_phase, pre_main_phase, and post_main_phase, but they are rarely used in practice.
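
A hedged sketch of how a test might use two of these sub-phases; the class name and messages are illustrative, and each sub-phase raises its own objection:

    class phased_test extends uvm_test;
      `uvm_component_utils(phased_test)
      function new(string name = "phased_test", uvm_component parent = null);
        super.new(name, parent);
      endfunction
      virtual task reset_phase(uvm_phase phase);
        phase.raise_objection(this);
        `uvm_info("RESET", "drive reset here, then de-assert it before leaving the phase", UVM_LOW)
        phase.drop_objection(this);
      endtask
      virtual task main_phase(uvm_phase phase);
        phase.raise_objection(this);
        `uvm_info("MAIN", "start the main stimulus sequences here and wait for completion", UVM_LOW)
        phase.drop_objection(this);
      endtask
    endclass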

UVM testbench template:


In industry practice, it's rare to build a testbench entirely from the ground up.
Instead, the primary tasks typically involve adding test cases, sequences, and making
necessary adjustments. Nonetheless, having a strong understanding of the fundamental
structure of a UVM testbench is crucial. Employing a well-crafted template significantly
streamlines testbench development, making the process more efficient and reducing the
likelihood of errors.
In constructing a UVM testbench, it's essential to begin by understanding the
design requirements and creating the necessary components. Once the foundational
components are established, the next step involves connecting them to the interface and
initiating the test environment to identify any potential issues.
Following this, it's a recommended practice to focus on the sequence item class,
as it serves as a central repository for all signals utilized by both the design and the
testbench. Common functions, such as address width calculations, can also be added
here for efficiency.
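
A small sketch of such a sequence item, with made-up field names and a simple shared helper for address-width calculation:

    class bus_item extends uvm_sequence_item;
      rand bit [31:0] addr;
      rand bit [31:0] data;
      rand bit        wr_en;

      `uvm_object_utils_begin(bus_item)
        `uvm_field_int(addr,  UVM_ALL_ON)
        `uvm_field_int(data,  UVM_ALL_ON)
        `uvm_field_int(wr_en, UVM_ALL_ON)
      `uvm_object_utils_end

      function new(string name = "bus_item");
        super.new(name);
      endfunction

      // Example of a common utility kept with the item: address bits needed for a given depth.
      static function int unsigned addr_width(int unsigned depth);
        return (depth <= 1) ? 1 : $clog2(depth);
      endfunction
    endclass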
To validate progress, start with a straightforward test and ensure its functionality.
Gradually expand upon this foundation, adding functionalities to the testbench
components as needed.
In scenarios involving multiple master and slave components, it's crucial to verify
the testbench's performance with a single master and slave configuration before
extending its capabilities to accommodate multiple instances.
Efficiency can be further improved by incorporating components such as a test
library, sequence library, and virtual sequencer. Additionally, employing a shared file
that houses static variables, common classes, and typedef declarations used across the
entire testbench can streamline development. Defining the 'new' function (for
components and objects) in this shared file also simplifies coding efforts and reduces
the need for redundant code rewrites.
Template code for various testbench components has been provided, along with associated details. Upcoming documents will explore other crucial UVM concepts such as the configuration and resource databases, interfaces, and objections, offering a more comprehensive understanding of UVM's capabilities and implementation.

EDA Playground links:


UVM_TEMPLATE: This illustrates the fundamental structure of a UVM-based testbench. An efficiently designed template serves as a foundation for streamlined development of intricate testbenches.

UVM_STRUCTURED_TB: A sophisticated testbench, derived from the template, designed to test multiple scenarios. Do note that this is merely a demonstration; real AHB IC testbenches are far more extensive.

The EDA Playground links showcase how a sophisticated testbench can be constructed using a well-structured template. I recommend exploring them and experimenting with the provided examples to gain insights into efficient testbench development.

This document serves as an introductory overview of UVM, and I look forward to sharing more in-depth
UVM concepts in the future. If you're eager to delve deeper into UVM, the UVM reference manual is the
best possible resource for comprehensive learning and understanding: UVM_Reference_Guide
