VERIFICATION OF RTL Design Using Formal Technique

SEMINAR REPORT
Seminar report submitted in partial fulfillment of the requirement for the award of the degree of
MASTER OF TECHNOLOGY
in Electronics and Communication Engineering
Submitted By
Archit Gupta
602362005
Under Supervision of
Dr. Anil Singh
(Assistant Professor)
Mr. Paras Gupta
(Formal Verification Engineer, Intel)

ELECTRONICS AND COMMUNICATION ENGINEERING DEPARTMENT


THAPAR INSTITUTE OF ENGINEERING AND TECHNOLOGY
(A DEEMED TO BE UNIVERSITY), PATIALA, PUNJAB
DECEMBER 2024
DECLARATION

I, Archit Gupta, hereby declare that the work presented in this report entitled “Verification
of RTL Design using Formal Technique”, submitted in partial fulfilment of the requirement for
the award of the degree of Master of Technology (VLSI Design) at the Electronics and
Communication Engineering Department, Thapar Institute of Engineering &
Technology (Deemed to be University), Patiala, is an authentic record of work carried out
under the supervision of Industry Mentor Paras Gupta (Formal Verification Engineer, Intel
Technology India Private Limited) and Dr. Anil Singh (Assistant Professor,
Electronics and Communication Engineering Department, Thapar Institute of
Engineering & Technology) from June 2024 to December 2024. The matter presented in
this report has not been submitted, either in part or in full, to any other university or
institute for the award of any other degree.

Archit Gupta Dr. Anil Singh


(602362005) (Assistant Professor)

Paras Gupta
(Formal Verification Engineer)
ACKNOWLEDGEMENT

This report would not have been possible without the guidance and help of several
individuals who, in one way or another, contributed and extended their valuable assistance
in the preparation and completion of this study.
First and foremost, my sincere gratitude to Dr. Anil Singh for his continuous support,
motivation, and immense knowledge. His guidance helped me throughout the research
and writing of this report.
I also thank Mr. Paras Gupta, Formal Verification Engineer, FVCTO, Intel Pvt Ltd, for his
patience and steadfast encouragement. His constructive criticism at different stages of my work
was thought-provoking and helped me focus my ideas. I am deeply grateful to him
for all the discussions that helped me understand the technical details of my work.
Last but not least, I thank my family and dear friends; none of this would have been possible
without their love. And to the one above all of us, the omnipresent God, for giving me the
strength to plod on: thank you so much, everyone.

iii
ABSTRACT

The complexity of chips has increased to the point where standard verification approaches are
becoming challenging and costly. To address the challenge of correctly implementing
complicated designs, current research is investigating the use of formal approaches in conjunction with
modifications to the design methodology. It has been proposed that formalizing abstract models early
in the design process can help identify design flaws and lower the cost of bug fixes. Understanding that
various verification problems require distinct approaches is essential to employing formal verification
successfully in the early stages of design. Every viewpoint provides a different method of reasoning in
response to the query, "Why is the design correct?" A set of perspectives can capture the design intuition
by utilizing distinct models and tools for each perspective. This allows the models to be made small
enough to be rapidly built, validated, and altered. When corner-case issues are discovered early in the
design process, the cost of re-design is lower than it would be if the bugs were discovered
later.

iv
TABLE OF CONTENTS
DECLARATION......................................................................................................................................... ii

ACKNOWLEDGEMENT ......................................................................................................................... iii

ABSTRACT ................................................................................................................................................ iv

LIST OF FIGURES ................................................................................................................................... vi

CHAPTER 1 .................................................................................................................................................I

1.1 Introduction to the area of work ............................................................................................................... I

1.2 Motivation of the work ............................................................................................................................ I

CHAPTER 2 .............................................................................................................................................. IV

2.1 Design Verification ............................................................................................................................... IV

2.2 Verification methodology ...................................................................................................................... V

2.3 Round robin arbiter ................................................................................................................................ V

CHAPTER 3 .............................................................................................................................................. VI

3.1 Introduction ........................................................................................................................................... VI

3.1.1 Theorem proving .............................................................................................................. VI

3.1.2 Equivalence checking ......................................................................................................VII

3.1.3 Model checking ................................................................................................................VII

3.2 Functional verification process .......................................................................................................... VIII

3.2.1 Simulation-Based Verification ......................................................................................... IX

3.2.2 Formal Method-Based Verification ............................................... XI

3.3 Simulation-Based Verification versus Formal Verification ................................................................. XII

3.4 Design assertions ............................................................................................................................... XIII

v
LIST OF FIGURES
FIGURE. 1.1 ROOT CAUSE OF FUNCTIONAL FLAWS................................................................................................. III
FIGURE. 2.1 CORRELATION BETWEEN DESIGN VERIFICATION AND DESIGN PROCESSING ........................................ IV
FIGURE. 3.1 FORMAL VERIFICATION TECHNIQUES ................................................................................................. VI
FIGURE. 3.2 HIGH LEVEL VIEW OF THE CHIP VERIFICATION FLOW ...................................................................... VIII
FIG. 3.3 APPROACHES TO VERIFICATION ............................................................................................................... IX
FIGURE. 3.4 FLOW OF SIMULATION BASED VERIFICATION ...................................................................................... X
FIGURE. 3.5 FLOW OF FORMAL BASED VERIFICATION ............................................................................................ XI
FIGURE. 3.6 AN OUTPUT SPACE PERSPECTIVE OF SIMULATION-BASED VERIFICATION VERSUS FORMAL
VERIFICATION .............................................................................................................................................. XII

FIGURE. 3.7 LAYERS OF SVA ASSERTION LANGUAGE ........................................................................................ XIII

vi
CHAPTER 1
INTRODUCTION
1.1 Introduction to the area of work
With the increasing design complexity made possible by better fabrication technologies,
verification becomes a bigger challenge. Traditionally, verification is performed using
simulation techniques, and simulation is still the primary way to verify design correctness. Advances
in simulation technology, such as coverage models, guided test generation, and faster simulation speeds,
have allowed simulation to zero in on bugs more efficiently. Formal methods have also had
various successes through advances in model checkers, which allow complete coverage of small
modules or conceptual protocols. These successes have also enabled hybrid
formal and informal verification approaches: simulations now use formal methods to guide their
verification, or formal methods replace simulation in certain parts of the verification process.
Even with these developments, both verification techniques still have drawbacks as design
complexity rises. The likelihood of overlooked corner cases that simulation tools struggle to analyze
increases with the design's complexity, particularly in light of the current trend toward parallel
processing units. Formal verification is a useful tool for identifying corner-case design problems, but
the available formal verification methods are currently limited to verifying small designs. The
consistent application of formal verification to bigger designs, and the identification of corner cases
arising from the interplay of several modules, remain unexplored. Large-scale, high-level designs have
been successfully verified using formal verification, yet these accomplishments are time-consuming
and design-specific.
Formal verification can also be used to quickly validate protocol concepts prior to the design phase.
However, in these verification efforts it is unclear how to connect the validated protocol models with
the actual implementation so as to guarantee that the design implements the proven model.

1.2 Motivation of the work


The idea of designing from an abstract model and successively adding in the details was the
predominant model for programming until the rise of object-oriented programming. It is proposed that
a program can be made more adaptive to changes resulting from extensions or corrections by gradually
refining its architecture. The only design decisions that will need to be re-implemented
are the final ones, as these adjustments or additions might not require alterations to the abstract
description. This idea is also known as the top-down design methodology. Formal verification appears
to be a good fit for this methodology because successive refinement layers differ little, making
it feasible to use formal methods to prove that two adjacent layers coincide. Before moving on to the next
level of refinement, each design layer can be verified to 100% coverage by formal verification,
ensuring that no bugs remain. By doing this, it is certain that errors will not propagate during the
refinement process. Formal verification nevertheless presents a challenge to the design process since, in
designing a project, designers must consider a multitude of factors in addition to the design's correctness. Because

I
implementation efficiency and performance are taken into consideration, the design frequently evolves
as it moves forward. Because it takes a lot of effort to formalize an abstract design, formalize the features
of interest, and identify methods for reducing the design to a manageable size, it is difficult to
undertake formal verification using refinement approaches during the design process. Moreover,
modifications to the abstract design have an impact on formal verification efforts: the labor-
intensive parts of formal verification must be abandoned and redone when the actual design calls
for modifications to the abstract model. Therefore, it is ineffective to use standard formal verification
approaches in conjunction with design refinement, since formal verification cannot keep up with the
rapid advancement of designs due to modifications made to the abstract-level model during the design
process.
The ability to assess the design early in the process, when changes are less expensive to make, is the
driving force behind the refinement methodology. From the standpoint of verification, it makes it
possible to identify and address design flaws early on. Additionally, using an abstract design for bug
hunting makes it simpler to track down the cause of design flaws and to troubleshoot using the abstract
model. The benefits of early design verification are so great that most design approaches aim to achieve
it, whether explicitly or implicitly.
According to several functional verification surveys, at least 50% of design projects require two spins
before manufacturing. The studies also demonstrate that functional defects in the RTL design
are the primary cause of respins. Respins can have a significant financial impact on the product and are
very costly in terms of time and money. Fig. 1.1 shows a chart that illustrates the underlying causes of
functional defects in design projects. The y-axis displays the proportion of design projects impacted by
each type of functional fault, while the x-axis displays the various underlying causes. Design mistakes were
the source of functional defects in seventy percent of the design projects. These kinds of design flaws
can be linked to the design engineers' misinterpretation of the specifications. Additionally, the
figure indicates that 50% of the projects are impacted by defects brought about by changes to
the requirements, and 50% of the projects are impacted by inaccurate or incomplete specifications. The
figures in Fig. 1.1 indicate a significant flaw in design flows, namely the use of informal
requirements to characterize a design's functions. Functional defects are mostly caused by insufficient
and ambiguous informal specifications. Creating properties based on loose specifications may result in
an inaccurate or partial set of properties, which ultimately affects the quality of the verification process.
Furthermore, when the DUV's parameters change frequently, it becomes challenging to sustain the
workflow, since manual property coding necessitates a significant amount of rework.

II
Figure. 1.1 Root cause of functional flaws [15]

The promise of formal verification (FV) is to achieve stronger, true verification of the DUV instead of
merely testing it with simulation-based methods. FV can provide full verification for at least part of a
DUV, checking properties over all DUV states, in contrast to the limited subset of states that simulation
can exercise. During the last decade, FV has established itself as a productive additional weapon in the
arsenal of the verification team. Many engineers still regard FV as a purely academic approach. However,
rapid improvements and much practical use of formal methods have yielded advances in areas like
specification and assertion languages. These advances have not only improved the productivity of
formal verification methods but have also influenced simulation methods.

III
CHAPTER 2
LITERATURE REVIEW
2.1 Design Verification
A design process transforms a set of specifications into an implementation of those specifications. The
specifications state the functionality that the design must provide but do not indicate how it is provided.
An implementation of the specifications gives a detailed description of how the functionality is
provided. Design verification is the reverse process of design: it begins with an implementation and
checks that the implementation aligns with its specifications. Consequently, each stage of the
design process has a corresponding phase to confirm its accuracy. For instance, a design phase that
transforms a functional specification into an algorithmic implementation necessitates a verification
phase to guarantee that the algorithm provides the features outlined in the specification. Likewise, a
physical design phase that generates a layout from a gate netlist requires layout verification to confirm
that the layout aligns with the gate netlist. Design verification encompasses many areas, such as functional
verification, timing verification, and layout verification, just to name a few. Figure 2.1 illustrates the
correlation between design processing and verification.

Figure. 2.1 Correlation between design verification and design processing [14]

Design verification is a methodical process that ensures and affirms that a design satisfies its defined
requirements and adheres to design rules. It is an essential phase in the product development process,
with the goal of detecting and resolving design problems at an early stage to prevent expensive and
time-consuming revisions in later stages of development. Design verification guarantees the accurate
and dependable operation of the final product, be it an integrated circuit (IC), a system-on-chip (SoC),
or any other electronic system. Verification of system-on-chip (SoC) and application-specific
integrated circuit (ASIC) designs is crucial for ensuring the dependability and optimal functioning of
integrated circuits.

IV
2.2 Verification methodology
Functional verification in design projects is a best-effort process to meet goals in the verification plan,
which are based on requirements and insights from architects, developers, and users. The process relies
on an approximate verification strategy, which is fundamentally imperfect. An effective verification
approach starts with a comprehensive test strategy that outlines the functionality to be verified, ensuring
all standards are met. A "scoreboard" technique is used to monitor progress and mark test plan items as
completed. Functional coverage and code coverage are commonly used metrics to quantify the quality
of verification. A verification methodology must also determine the specific language or languages to
be utilized during the verification process. Verilog and VHDL are widely used design languages, while
verification code typically requires its own language. An optimal verification language should resemble
a software language rather than a hardware language.
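The "scoreboard" technique mentioned above can be sketched as a minimal bookkeeping structure. This is an illustrative Python sketch, not any particular methodology's API; the plan-item names are invented for illustration:

```python
# Minimal scoreboard sketch: track verification-plan items and mark them
# complete as tests pass. Item names below are hypothetical examples.

class Scoreboard:
    def __init__(self, plan_items):
        # Each plan item starts out unverified.
        self.status = {item: False for item in plan_items}

    def mark_done(self, item):
        self.status[item] = True

    def coverage(self):
        # Fraction of plan items verified so far.
        return sum(self.status.values()) / len(self.status)

sb = Scoreboard(["reset_behavior", "back_to_back_requests", "overflow"])
sb.mark_done("reset_behavior")
print(f"{sb.coverage():.0%}")  # 33%
```

In practice the same bookkeeping is usually driven automatically by functional-coverage collection rather than by hand.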

2.3 Round robin arbiter


The arbiter is an electronic device that allocates access to shared resources. A crucial part of SoC
shared-bus communication is the arbiter block. Since multiple masters on a SoC bus may submit requests at
the same time, an arbiter is needed to determine which master gets bus access. The bus arbiter is
essential for managing requests from the masters and answers from the slaves (such as acknowledgement
signals and retries). The primary goal of arbitration algorithms is to guarantee that only one
master is ever able to access the bus at a time; all other masters must wait until they are
granted use of the bus. The arbiter has the following two schemes:
1. The round-robin scheme
2. The fixed-priority scheme
The round-robin method involves time-slicing, meaning that a specific period of time is set aside for
each requester in turn. For simplicity, it is typically done with equal priority. Round
robin is more effective when the jobs are of roughly similar significance, since each task has a greater
chance of being completed. This helps prevent the situation where the task with the lowest priority is
rarely completed because there is always another task with a higher priority. Consider, for example,
multiple sources of data that must be read in turn.
The arbiter block is crucial in SoC shared-bus communication, as multiple masters may submit requests
simultaneously. Masters may have bandwidth or real-time requirements, and the arbitration algorithm
can be distributed or centralized. Distributed arbitration systems have no slave side; each master's
side controls access. The primary goal of arbitration algorithms is to ensure that only one master can
access the bus at a time, with the other masters forced to wait until authorized. The power used by the
arbitration process in on-chip communications varies significantly, making comparing algorithms an
essential part of the SoC design process.
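The rotating-grant behavior described above can be sketched behaviorally. This is an illustrative Python model, not RTL; the function name and the request/grant encoding are hypothetical:

```python
# Behavioral sketch of a round-robin arbiter: on each cycle the search for
# a grant starts from the requester after the last one granted, so no
# requester is starved while others keep requesting.

def round_robin_grant(requests, last_grant):
    """requests: list of bools, one per master; last_grant: index or None.
    Returns the index of the granted master, or None if nobody requests."""
    n = len(requests)
    start = 0 if last_grant is None else (last_grant + 1) % n
    for offset in range(n):
        idx = (start + offset) % n
        if requests[idx]:
            return idx
    return None

# Three masters all requesting continuously: grants rotate 0, 1, 2, 0, ...
last = None
grants = []
for _ in range(4):
    last = round_robin_grant([True, True, True], last)
    grants.append(last)
print(grants)  # [0, 1, 2, 0]
```

A fixed-priority arbiter would instead always start the search at index 0, which is exactly what lets a low-index master starve the others.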

V
CHAPTER 3
FORMAL VERIFICATION
3.1 Introduction
Formal verification can be defined as the process of checking the correctness of a design implementation
against the design specification using mathematical theories. A temporal formula that captures a
particular behavior of the design is checked exhaustively on the mathematical model of the design for all
permissible input values. Commercial formal verification tools facilitate the use of languages with
expanded syntax in industrial practice, which makes it easier to define temporal formulae. The temporal
formulae, also known as properties, are manually derived from the specifications. An appropriate error
trace, commonly referred to as a counterexample, is supplied in cases where the mathematical model does
not satisfy the temporal formula. When the model holds true for the formula, it is demonstrated that the
design behaves as required by the specifications. The three main categories of formal
verification procedures are model checking, equivalence checking, and theorem proving. The formal
techniques are categorized as shown in Fig. 3.1.

Figure. 3.1 Formal verification techniques [14]

3.1.1 Theorem proving


Theorem proving is the subfield of formal verification that deals with automating formal reasoning based
on the rules of logic. In theorem proving, the system is represented as a collection of mathematical
definitions by applying the rules of mathematical logic. The theorems that follow from the mathematical
definitions are derived as the intended or expected properties of the system. Theorem provers use
first-order and higher-order logics to verify the behavior of the system. First-order logic is
semi-decidable and considered the most comprehensible of these logics; any logic
that is more expressive than first-order logic is undecidable. To effectively use first-order logic for
hardware verification, natural numbers must be used to model time (for sequential circuits). Nevertheless,
first-order logic has no built-in procedures that offer a comprehensive formalism for natural numbers. It
VI
has been shown that higher-order logics can be used for hardware verification. Higher-order logics have
undecidable validity; hence their proof systems are interactive and are usually used as proof assistants.
Consequently, theorem provers lack fully automated procedures and are needed in conjunction with other
formal techniques, even though they can be used to evaluate reactive digital systems.

3.1.2 Equivalence checking


Equivalence checking, or formal equivalence checking (FEC), can be defined as determining the
equivalence of two model representations using mathematical reasoning. The model representations can
be obtained from two distinct designs that are expected to always yield the same set of outputs, from
the same design expressed in two different languages (such as Verilog and VHDL), or at different
abstraction levels (such as RTL and gate level). Two FEC types that are frequently used in the industry
are sequential equivalence checking and combinational equivalence checking.
Combinational equivalence checking is used when comparing two versions of the same design (or circuit)
at different abstraction levels, for instance to ascertain whether a synthesized netlist is equivalent to
its RTL description. Combinational equivalence checking uses state matching to determine
the corresponding state variables of the two circuits and then performs an equivalence check on
them. In the state-matching process, all state variables are first gathered into a single equivalence
class. Some state variables are then shown to be non-equivalent, and the equivalence classes are divided
accordingly. Various techniques based on satisfiability (SAT), automatic test pattern generation
(ATPG), binary decision diagrams (BDD), and structural and logic modeling are used to demonstrate
the non-equivalence of state variables. BDD-based techniques do not scale well to large designs, and
designs with massive re-convergent fan-out structures are regarded as difficult for SAT-based
equivalence-checking methods.
Sequential equivalence checking is used to determine whether two models generate the same set of
outputs at all time points for an equivalent set of inputs. The internal nodes of the designs need not
be equivalent for the designs to be sequentially equivalent. The purpose of these equivalence tests is to
find out whether a collection of properties always determines the correct values for a design's outputs.
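The core idea behind combinational equivalence checking can be illustrated with a toy exhaustive check. Real checkers use BDD or SAT engines rather than enumeration, which scales only to tiny functions; the example functions below are invented for illustration:

```python
from itertools import product

# Toy combinational equivalence check: two implementations are equivalent
# iff they agree on every input assignment. Exhaustive enumeration is
# exponential in the number of inputs, which is why real tools rely on
# BDD- or SAT-based reasoning instead.

def equivalent(f, g, n_inputs):
    for bits in product([False, True], repeat=n_inputs):
        if f(*bits) != g(*bits):
            return False, bits   # counterexample input assignment
    return True, None

# Two descriptions of the same function: a XOR b, written two ways.
f = lambda a, b: a != b
g = lambda a, b: (a or b) and not (a and b)
print(equivalent(f, g, 2))   # (True, None)

# A buggy variant (plain OR) disagrees, and we get a counterexample.
h = lambda a, b: a or b
print(equivalent(f, h, 2))   # (False, (True, True))
```

The counterexample returned on mismatch plays the same role as the error trace produced by a formal tool: it is a concrete witness of non-equivalence.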

3.1.3 Model checking


Model checking, also referred to as property checking, is an algorithmic technique for demonstrating
that the behavior of a sequential system conforms to its specification. Model checking is the main method
by which FV tools examine the behavior of a design implementation. By examining the validity of
temporal formulae on the mathematical model of the implementation, a model checker confirms that
the system's implementation complies with its specification.
A model checker needs the following components to confirm that a design implementation complies with
the design specification:
1. a mathematical model of the implementation with appropriate expressiveness,
2. a suitable specification language to define the expected design behavior, and
3. an effective proof method (algorithm).
VII
Explicit model checking is a kind of model checking in which every possible state of a design is
explicitly enumerated. Because of state-space explosion, explicit model checking is virtually impossible
for architectures with moderate to high levels of complexity. In symbolic model checking, Boolean
functions are used to implicitly describe a system's reachable states. Symbolic model checking uses
satisfiability (SAT) and binary decision diagrams (BDD) as proof techniques to assess whether a design
behaves as intended with respect to the specification.
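The explicit-state approach can be illustrated with a small reachability sketch. This is a toy Python model with a hypothetical wrapping-counter "design"; real tools operate on models derived from the HDL and hit state-space explosion long before designs of practical size:

```python
from collections import deque

# Explicit-state model checking sketch: enumerate reachable states with a
# breadth-first search and check a safety property in every state found.
# Returns (True, None) if the property holds in all reachable states, or
# (False, state) with a violating state as the counterexample.

def check_safety(initial, next_states, property_holds):
    seen = {initial}
    frontier = deque([initial])
    while frontier:
        s = frontier.popleft()
        if not property_holds(s):
            return False, s
        for t in next_states(s):
            if t not in seen:
                seen.add(t)
                frontier.append(t)
    return True, None

# Hypothetical design: a counter that wraps from 5 back to 0.
succ = lambda s: [(s + 1) % 6]
print(check_safety(0, succ, lambda s: s != 7))   # (True, None)  -- 7 unreachable
print(check_safety(0, succ, lambda s: s != 3))   # (False, 3)    -- 3 reachable
```

The `seen` set here is the explicit state enumeration; symbolic model checking replaces it with a Boolean-function representation of the reachable set.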

3.2 Functional verification process


Once design specifications are developed for a processor design project, hardware designers proceed to
construct a comprehensive, functional representation of the design using a hardware description
language (HDL), such as Verilog or VHDL. Verification engineers construct a verification environment
in order to accomplish the objectives outlined in the verification plan. The various design blocks are
allocated to teams of design engineers, who construct them according to the specifications, and
verification engineers, who conduct unit-level functional verification. Once design blocks are available,
they are incorporated into subsystems and subjected to verification. The process continues until all the
design blocks are finalized and integrated into the system at the highest level. Design and verification
engineers identify and resolve bugs that are found during the verification process. Pre-silicon
verification refers to the functional verification process that takes place prior to the construction of a
silicon prototype. During the pre-silicon verification process, the hardware description language (HDL)
representation of the design is systematically compared to the specifications using the verification
environment. Formal verification, and to a larger extent, simulation-based verification, are the main
methods used for pre-silicon verification.

Figure. 3.2 High level view of the chip verification flow [16]

VIII
Formal verification tools transform the hardware description language (HDL) representation of the
design into Boolean functions. These functions are then compared to properties that are derived from
the specifications. If engineers can express design specifications as accurate Boolean properties, the
tools can mathematically verify or refute the correctness of the design. The robust assurance that formal
verification offers when the design is correct, and the counterexamples it produces when the design is
flawed, make it an exceedingly desirable verification instrument. Unfortunately, fully
capturing a design's specification as formally provable properties is a formidable task. Furthermore, the
computational requirements of formal verification tools are so high that they only allow for the analysis
of small design units and subsystems. Hence, the efficacy of formal verification is constrained to
small design blocks.
Simulation-based functional verification encompasses most of the verification work done before the
physical implementation of a chip. Software-based simulation systems transform the hardware
description of the design into an executable program that can run on a general-purpose computer. The
verification testbench, software that runs in parallel with the simulated design, directly interacts with
the simulation and has unrestricted access to all design signals. The versatility of software simulation
and the extensive visibility into design signals allow for robust verification and troubleshooting
capabilities. Engineers can create advanced checkers, analyze the design using debuggers, and capture
waveforms, among other tasks. Unfortunately, software simulators are slow when dealing
with complex designs, typically simulating at most hundreds of design cycles per second.
Simulating one second of a design's operation at these rates can require several months.


Fig. 3.3 Approaches to verification [14]

3.2.1 Simulation-Based Verification


Simulation-based verification is the most widely used method of verifying hardware designs. It
involves subjecting the design to a testbench, applying input stimuli, and comparing the output with a
reference output. The testbench can be pre-produced or generated during the simulation process, and
the reference output can be generated either beforehand or in real time.
A design is run through a linter program before simulation to identify static mistakes, potential errors,
and breaches of coding style guidelines. A project implements its own coding style rules to avoid design
flaws
IX
and enhance simulation performance. Test vectors representing test-plan elements are then created.
Directed tests are input vectors designed to test specific functionality and features, but they tend to
be biased toward the areas designers anticipate, missing the corner cases they are unaware of. To
compensate, pseudorandom tests are used alongside directed tests. A simulator, either event-driven or
cycle-based, is selected to conduct the simulations.
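The pseudorandom stimulus-and-compare loop described above can be illustrated with a minimal self-checking sketch. This is a hypothetical Python stand-in for an HDL testbench; the 8-bit adder models are invented for illustration:

```python
import random

# Sketch of simulation-based checking: apply pseudorandom stimuli to a
# "DUT" model and compare each output against a reference (golden) model.
# Both models here are hypothetical stand-ins for a simulated HDL design.

def reference_model(a, b):
    return (a + b) & 0xFF          # golden behavior: 8-bit wrapping adder

def dut_model(a, b):
    return (a + b) & 0xFF          # would wrap the simulated RTL in practice

def run_random_tests(n_tests, seed=1):
    rng = random.Random(seed)      # seeded so failures are reproducible
    failures = []
    for _ in range(n_tests):
        a, b = rng.randrange(256), rng.randrange(256)
        if dut_model(a, b) != reference_model(a, b):
            failures.append((a, b))  # would be filed in the bug tracker
    return failures

print(run_random_tests(1000))      # [] when the DUT matches the reference
```

Seeding the random generator is the standard trick that makes a pseudorandom failure reproducible when the bug is handed back to the designer.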

The effectiveness of a test is measured by the coverage it provides on the design. Coverage tools report code or functional coverage, allowing designers to identify untested
components and to create tests that address those areas.
When a bug is discovered, it is crucial to notify the designer and rectify the issue. This can be done by
entering the bug into a bug tracking system, which triggers notifications to the design owner. The system
monitors the bug's progress through multiple stages, including being opened, validated, fixed, and
closed.
The typical flow of simulation-based verification is summarized in Fig. 3.4.

Fig. 3.4 Flow of simulation-based verification [14]

3.2.2. Formal Method-Based Verification
Formal verification proves properties on mathematical models representing the system
implementation. In particular, the identification of design errors is addressed by means of model
checking tools. In this context, temporal logic properties are defined to formally check the correctness
of a design implementation with respect to the specification [1].
The formal method-based verification methodology differs from the simulation-based methodology in
that it does not require the generation of test vectors; otherwise, it is similar. Formal verification can be
divided into two distinct categories: equivalence checking and property verification.
Equivalence checking is the process of determining whether two implementations are functionally
identical. Verifying equivalence with a simulation-based approach is impractical due to the virtually
unlimited number of vectors in the input space. During formal verification, the outcome of an equivalence checker is unambiguous. Nevertheless, industrial equivalence checkers are not yet fully automated and frequently require the user to pinpoint corresponding nodes in the two circuits in order to narrow the checker's search space. The nodes that users designate as equivalent are referred to as cut points. At times, a checker can infer equivalent nodes from the node names.
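As a minimal illustrative sketch, not taken from the report's design, consider two hypothetical descriptions of a 2-to-1 multiplexer; proving them functionally identical for every input combination is exactly the task of an equivalence checker:

```systemverilog
// Hypothetical sketch: two implementations of a 2-to-1 multiplexer.
// An equivalence checker should prove these functionally identical.

// Behavioral description (e.g., the RTL "golden" model).
module mux_rtl (input logic s, a, b, output logic y);
  assign y = s ? a : b;
endmodule

// Gate-level style description (e.g., the synthesized netlist).
module mux_gate (input logic s, a, b, output logic y);
  assign y = (s & a) | (~s & b);
endmodule
```

For every combination of s, a, and b the two outputs agree, so the checker can report equivalence without ever enumerating input vectors.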
The other type of formal verification is property checking. Property verification takes a design and a property, which serves as a partial specification of the design, and determines whether the design satisfies the property. A property is thus a redundant description of the intended behavior, and checking it cross-verifies the design against that description. A program that verifies a property is commonly known as a model checker, because it operates on a computer model representing the real circuit.
The typical flow of formal-based verification is summarized in Fig. 3.5.

Fig. 3.5 Flow of formal-based verification [14]

3.3 Simulation-Based Verification versus Formal Verification
The primary difference between formal verification and simulation-based verification is that the former does not require input vectors while the latter does. In simulation-based verification, the mindset is to produce input vectors first and then derive the reference outputs. The formal verification procedure involves thinking in the opposite direction: the user first specifies the desired output behavior, which the formal checker then confirms or refutes, and input stimuli are not considered at all. The formal technique is output driven, whereas the simulation-based methodology is input driven. Input-driven thinking is more common, and this is reflected in the perceived difficulty of using a formal checker. Completeness, meaning the absence of any gaps in the input space, is another selling point of formal verification, while simulation-based verification struggles with this issue. This power of formal verification, however, can occasionally give rise to the false belief that a design is 100% bug-free after formal verification.

To see whether formal verification is accurately understood, let us compare it with simulation-based verification. Conceptually, simulating a vector is like confirming a single point in the input space. From this perspective, simulation-based methods verify by sampling the input space, and if not every point is sampled, there is a chance that a mistake will evade verification. Formal verification operates at the property level rather than the point level: given a property, it searches for failures under every possible input and state condition. From an output standpoint, formal verification verifies a set of output points at a time (a collection of output points constitutes a property), whereas simulation-based verification verifies a single output point at a time. This comparison is shown in Figure 3.6. From this angle, the formal verification approach differs from the simulation-based approach in that it verifies groupings of points in the design space rather than individual points. Therefore, to fully verify that a design satisfies its specification using formal methods, it must further be demonstrated that the collection of formally verified properties, taken as a whole, constitutes the specification. Because it verifies a set of points at a time, formal verification software is less intuitive and more difficult to use.

Fig. 3.6 An output space perspective of simulation-based verification versus formal verification [14]

3.4 Design assertions
Formal verification confirms that the design meets its specifications. To achieve this, the first
step is to find a way to express what it means for a design to be correct. This can be achieved
by writing assertions in the SystemVerilog Assertions (SVA) language. SVA can be thought
of as several layers of increasing complexity, as shown in Fig. 3.7.

Fig. 3.7 Layers of the SVA assertion language [17]

Booleans
The lowest layer consists of standard Boolean expressions. An expression might be a single logic variable or a Boolean formula such as a & b. Boolean expressions can be used in a sequence or property, as shown in Fig. 3.7, and are evaluated over the sampled values of the variables, where the sampled value of a variable is its value at the end of the previous simulation time step.

Sequences
These are statements in which Boolean expressions unfold over time. The simplest sequences are linear: a finite list of Boolean expressions that must hold in increasing time order. The passage of time is defined by a clocking event. Sequences are composed by concatenation, which specifies the time delay, using the ## operator, from the end of the first sequence to the beginning of the second, as shown below:

a ##N b

This means that signal b shall be true on the Nth clock tick after signal a was true.
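As a sketch, a named sequence can combine fixed and ranged delays; the sequence name and the signals req, busy, and ack, as well as the clock clk, are assumed here purely for illustration:

```systemverilog
// Hypothetical sequence: req is followed one tick later by busy,
// and ack arrives between 1 and 3 ticks after busy.
sequence s_handshake;
  @(posedge clk) req ##1 busy ##[1:3] ack;
endsequence
```

The ##[1:3] range delay matches any of the three possible arrival times, so a single sequence covers a family of timing behaviors.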

Properties
A property combines sequences with additional operators in order to capture the design behavior that is to be verified against the design specification. A property can be used as an assumption, an assertion, or a coverage specification, but it does not produce a result by itself. Shown below is an example of a named property, reqgnt:

property reqgnt;
  req |-> s_eventually gnt;
endproperty

It states that if there is a request req (the antecedent), then eventually a grant gnt must come for it (the consequent). The antecedent and consequent in a property are connected via the implication operators |-> or |=>. The first (|->) is overlapping, meaning that if the antecedent matches, the end point of the match is the start point for evaluating the consequent expression. The second (|=>) is non-overlapping, meaning that evaluation of the consequent starts one clock tick after the antecedent match.
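The difference between the two operators can be sketched with two assertions; the signal names req and gnt and the clock clk are assumptions for illustration:

```systemverilog
// Overlapping: gnt must be true in the SAME cycle in which req is true.
ap_overlap:    assert property (@(posedge clk) req |-> gnt);

// Non-overlapping: gnt must be true ONE cycle AFTER req is true.
// req |=> gnt is equivalent to req |-> ##1 gnt.
ap_nonoverlap: assert property (@(posedge clk) req |=> gnt);
```

The equivalence noted in the comment is the usual way to remember the distinction: |=> simply inserts one clock tick of delay before the consequent.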

Assertion statements
An assertion statement is used to validate the behavior of a design. As stated above, properties do not produce any result by themselves; they need to be placed in an assertion statement that uses one of the following keywords: assert, assume, or cover.
Assert: specifies the property as an obligation for the design, to be checked to verify that the property holds.
assert property (req |-> gnt);

Assume: specifies the property as an input constraint on the environment. Formal tools use these input constraints to generate the input stimuli.
assume property (!req |-> !gnt);

Cover: used to monitor property evaluation for coverage. A cover ensures that the intended behavior can occur at least once by finding a single trace that exhibits it.
cover property (req |-> gnt);

The assertions in general can be of two types - immediate and concurrent:


• Immediate assertion statements are simple assertions that follow simulation event semantics and are executed in procedural blocks. They have no clocking or reset and do not support many advanced property operators, so they cannot check conditions that involve the passage of time. Immediate assertions are defined using only the assert keyword, without the property keyword:
immediate1: assert (!req && !gnt);

• Concurrent assertion statements follow clock semantics and can describe behavior that includes the passage of time. They also support advanced property operators, such as logical implication over time intervals. Concurrent assertion statements use both the assert and property keywords, as shown below:
conc1: assert property (a ##2 req |=> gnt);
It is worth noting that immediate assertion statements are mostly used in simulation. In this work, only concurrent assertion statements were written to verify the design behavior.
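Putting the pieces together, a concurrent assertion as typically handed to a formal tool carries an explicit clocking event and a reset disable. The following is a hedged sketch with assumed signal names (clk, rst_n, req, gnt), not an assertion from the verified design:

```systemverilog
// Hypothetical concurrent assertion: after a request, a grant
// must arrive within 1 to 4 clock cycles, except during reset.
ap_req_gnt: assert property (
  @(posedge clk) disable iff (!rst_n)
  req |-> ##[1:4] gnt
);
```

The disable iff clause tells the tool to abandon any in-flight evaluation while reset is asserted, which avoids spurious failures during initialization.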

REFERENCES

[1] V. Bertacco, Ph.D. Dissertation, Stanford University, August 2003. [Online]. Available: http://web.eecs.umich.edu/~valeria/research/thesis/thesis2.pdf
[2] https://verificationacademy.com/seminars/2020-functional-verification-study
[3] W. K. Lam, Hardware Design Verification: Simulation and Formal Method-Based Approaches, section 1.3.1.
[4] E. Seligman, T. Schubert, M. V. Achutha Kiran Kumar, Formal Verification: An Essential Toolkit for Modern VLSI Design, Foreword.
[5] C. Kern and M. R. Greenstreet (1999). Formal verification in hardware design: a survey. Technical report, Univ. of British Columbia, Vancouver.
[6] J. Bergeron (2003). Writing Testbenches: Functional Verification of HDL Models. https://doi.org/10.1007/978-1-4615-0302-6
[7] N. Mansouri and R. Vemuri (2000). Automated correctness condition generation for formal verification of synthesized RTL designs. Formal Methods in System Design.
[8] A. Gupta, "Formal hardware verification methods: a survey," Formal Methods in System Design, Vol. 1.
[9] R. Stuber, Formal Verification: Not Just for Control Paths, Mentor, June 2017.
[10] J. Bromley, J. Sprott, Verilab, 2016. [Online]. Available: https://www.verilab.com/files/dvcon_eu_2016_fv_tutorial.pdf
[11] C. E. Cummings, SystemVerilog Assertions Design Tricks and SVA Bind Files, SNUG, San Jose, 2009.
[12] ARM, "AMBA Specification (Rev 2.0)," available at http://www.arm.com.
[13] Akhilesh Kumar, Richa Sinha, "Design and Verification Analysis of APB3 Protocol with Coverage," IJAET, Nov 2011.
[14] W. K. Lam (2009). Hardware Design Verification: Simulation and Formal Method-Based Approaches.
[15] https://verificationacademy.com/seminars/2018-functional-verification-study
[16] https://www.synopsys.com/content/dam/synopsys/services/whitepapers/delivering-functional-verification-engagements.pdf
[17] Nishant Gupta (October 10, 2019). Exploration of formal verification in GPU hardware IP.
