Politecnico di Torino

Porto Institutional Repository

[Article] IEEE Standard 1500 Compliance Verification for Embedded Cores

Original Citation:
Benso A., Di Carlo S., Prinetto P., Zorian Y. (2008). IEEE Standard 1500 Compliance Verification for Embedded Cores. In: IEEE TRANSACTIONS ON VERY LARGE SCALE INTEGRATION (VLSI) SYSTEMS, vol. 16, n. 4, pp. 397-407. ISSN 1063-8210

Availability:
This version is available at: http://porto.polito.it/1785476/ since: January 2009

Publisher:
IEEE

Published version:
DOI: 10.1109/TVLSI.2008.917412

Terms of use:
This article is made available under terms and conditions applicable to Open Access Policy Article ("Public - All rights reserved"), as described at http://porto.polito.it/terms_and_conditions.html

Porto, the institutional repository of the Politecnico di Torino, is provided by the University Library and the IT-Services. The aim is to enable open access to all the world. Please share with us how this access benefits you. Your story matters.

(Article begins on next page)



IEEE Standard 1500 Compliance Verification for Embedded Cores
Alfredo Benso, Senior Member, IEEE, Stefano Di Carlo, Member, IEEE, Paolo Prinetto, and
Yervant Zorian, Fellow, IEEE

Abstract—Core-based design and reuse are the two key elements for an efficient system-on-chip (SoC) development. Unfortunately, they also introduce new challenges in SoC testing, such as core test reuse and the need of a common test infrastructure working with cores originating from different vendors. The IEEE 1500 Standard for Embedded Core Testing addresses these issues by proposing a flexible hardware test wrapper architecture for embedded cores, together with a core test language (CTL) used to describe the implemented wrapper functionalities. Several intellectual property providers have already announced IEEE Standard 1500 compliance in both existing and future design blocks. In this paper, we address the problem of guaranteeing the compliance of a wrapper architecture and its CTL description to the IEEE Standard 1500. This step is mandatory to fully trust the wrapper functionalities in applying the test sequences to the core. We present a systematic methodology to build a verification framework for IEEE Standard 1500 compliant cores, allowing core providers and/or integrators to verify the compliance of their products (sold or purchased) to the standard.

Index Terms—Electronic design automation (EDA) tools, IEEE 1500 Standard, unified modeling language (UML), verification of embedded cores.

Manuscript received March 2, 2007.
A. Benso, S. Di Carlo, and P. Prinetto are with the Department of Information and Automation Technologies, Politecnico di Torino, 10129 Torino, Italy (e-mail: [email protected]; [email protected]; [email protected]).
Y. Zorian is with Virage Logic, Fremont, CA 94538 USA (e-mail: [email protected]).
Digital Object Identifier 10.1109/TVLSI.2008.917412

I. INTRODUCTION

THE INCREASED density and performance of advanced silicon technologies made system-on-a-chip (SoC) application-specific integrated circuits (ASICs) possible. SoCs bring together a set of functions and technology features on a single die of enormous complexity. Each component is available as a predesigned functional block that comes as an intellectual property (IP) embedded core, reusable in different designs. These so-called embedded cores make it easier to import technologies to a new system and differentiate the corresponding product by leveraging IP advantages. Most importantly, the use of embedded cores shortens the time-to-market for new systems thanks to heavy design reuse [1].
What makes designing systems with IP cores an attractive methodology (e.g., design reuse, heterogeneity, reconfigurability, and customizability) also makes testing and debugging of these systems a very complex challenge [2].
There exist strong functional similarities between the traditional system-on-board and the SoC design. However, their manufacturing test process is quite different.
In a system-on-board, the IC provider is responsible for the design, manufacturing, and testing of the IC components of the system. In this context, the system integrator is, with respect to testing for manufacturing defects, only responsible for the interconnections between the ICs. The boundary scan test, also known as JTAG or the IEEE Standard 1149.1 [3], is a well-known technique to address this board-level interconnect test problem. In the SoC design flow, the core provider delivers a description of the core design to the system integrator at different possible levels: soft (register-transfer level), firm (netlist), or hard (technology-dependent layout). Since the core is delivered only as a model, a manufacturing test is at this stage impossible. Therefore, the test responsibility of the system integrator now not only concerns the interconnect logic and wiring between the cores, but also the IP cores themselves.
For an SoC design, the test of embedded cores constitutes a large part of the overall IC test, and hence substantially impacts the total IC quality level as well as its test development effort and associated costs. The adoption and design of adequate test and diagnosis strategies is therefore a major challenge in the production of SoCs.
The IEEE Standard Testability Method for Embedded Core-Based Integrated Circuits (IEEE Standard 1500 [4]) addresses the specific challenges that come with testing deeply embedded reusable cores supplied by diverse providers, who often use different hardware description levels and mixed technologies [2], [5], [6]. The IEEE Standard 1500 defines a scalable standard architecture to facilitate and support the testing of embedded cores and the associated interconnect circuitry in a SoC.
The standard does not cover the core's internal test methods or chip-level test access configuration. The standardization effort focuses on non-merged cores (cores that are tested as standalone units) and provides the following two main supports:
• standardized core test language (CTL), capable of expressing all test-related information that needs to be transferred from the core provider to the core user;
• standardized (but configurable and scalable) core test wrapper, allowing easy test access to the core in an SoC design.
Several publications presented solutions to build SoCs with IEEE Standard 1500 testability features [7], [8]; nevertheless, by analyzing the IEEE Standard 1500, it is clear that implementing a fully compliant core is not trivial. The IEEE Standard 1500 is in fact articulated in a large set of architectural rules. Some of them are very specific on particular design aspects, whereas
others do not introduce particular restrictions but define general
characteristics of the design. It is very easy therefore to forget
one of the specific rules or misunderstand a general rule, imple-
menting it with a custom architecture that does not respect the
standard as a whole.
Verifying the actual compliance of a wrapped core to the
IEEE Standard 1500 is therefore mandatory. Nevertheless,
without a formalized approach, this verification task can be
more expensive than the design of the core itself.
Looking at the available literature, only a few authors tried to
address the problem of verifying the compliance of an IP core
to the IEEE Standard 1500. In [9], Diamantidis et al. present an
approach based on a dynamic, constrained-random coverage-driven verification methodology to verify the functionality of the complete test infrastructure within a given SoC. The main drawback of this contribution is that the authors verify the SoC and wrapper functionalities without systematically addressing every single aspect (rule) of the standard.
This paper shows a systematic methodology to build a verification framework for IEEE Standard 1500 compliant cores. This methodology does not aim at providing a complete implementation of the verification framework but it focuses on defining an abstract model that enables core providers and/or integrators to build their custom verification frameworks to verify the compliancy of their products (sold or purchased) to the IEEE Standard 1500. The model guarantees to systematically address the different aspects of the standard. It is in fact an abstraction of the standard itself and could be reused to build compliance verification frameworks for different standards as long as they are structured similarly to the IEEE Standard 1500 (e.g., the JTAG standard).
This paper is organized as follows. Section II overviews the basic concepts of the IEEE Standard 1500, whereas Section III introduces the basic elements composing the proposed IEEE Standard 1500 Verification Framework. Sections IV and V detail the different types of verification needed to build the framework. Finally, Section VI presents a prototype of the IEEE Standard Verification Framework and Section VII summarizes the main contributions and outlines future research activities.

II. AN OVERVIEW OF THE IEEE STANDARD 1500

This section introduces the main features of the IEEE Standard 1500 that will be extensively used in this paper.
The IEEE Standard 1500 defines a scalable standard design-for-testability architecture to facilitate testing and diagnosis of embedded cores and associated circuitry in an SoC. The architecture is independent from the underlying core's functionality and technology.
The IEEE Standard 1500 architecture always includes the following.
• Core Test Wrapper: A wrapper placed around the boundaries of the core that allows accessing its testing functionalities using a standard interface and protocol (see Fig. 1). The wrapper is completely transparent when the core is not in test mode.
• Information Model: A formal description of the IEEE Standard 1500 functionalities (mandatory and optional) implemented by the core test wrapper. The Information Model is the bridge between core providers and core users and facilitates the automation of test data transfer and reuse between these two entities. The Information Model is described using the IEEE 1450.6 CTL [10]. The information model includes the following:
— set of wrapper's signals;
— timing of the wrapper signals (wrapper communication protocol);
— information about the test patterns.

Fig. 1. IEEE 1500 core test wrapper architecture.

Fig. 1 shows the overall architecture of a general IEEE 1500 wrapper. It includes the following elements:
• wrapper instruction register (WIR) for controlling the wrapper operational mode;
• chain of wrapper cells called wrapper boundary register (WBR) to provide test functions at the core terminals;
• wrapper bypass register (WBY) for synchronously bypassing the wrapper;
• wrapper interface port (WIP) for serially controlling the wrapper using the wrapper serial input (WSI) and the wrapper serial output (WSO), and, optionally, a test access mechanism (TAM).
The IEEE Standard 1500 is an effort of reconciling and accommodating different test strategies and motives. The greatest effort has been put into supporting as many requirements as possible while still producing a cohesive and consistent standard.
In addition to the mandatory elements, the core designer is free to define a set of wrapper defined registers (WDR) and/or core defined registers (CDR). WDRs and CDRs are the mechanism used by the IEEE Standard 1500 to accommodate the different test strategies coming from different core providers.
The strong effort put into providing high flexibility can be finally translated into the definition of the following two levels of compliance to the standard.
• IEEE Standard 1500 Compliant Core: This notion refers to a core that incorporates an IEEE Standard 1500 wrapper function and comes with an IEEE Standard 1500 CTL program. The CTL program describes the core test knowledge, including how to operate the wrapper, at the wrapper's external terminals.
• IEEE Standard 1500 Ready Core: This notion refers to a core which does not have a complete IEEE Standard 1500 wrapper, but does have an IEEE Standard 1500 CTL description. The CTL description can be used to synthesize an IEEE Standard 1500 compliant wrapper to make the core
fully compliant. The CTL program describes the core test knowledge at the bare core terminals.

Fig. 2. IEEE 1500 Verification Framework UML use cases diagram.

III. BUILDING THE IEEE STANDARD 1500 VERIFICATION FRAMEWORK

This section introduces the basic concepts needed to build
a verification framework for the IEEE Standard 1500. Wher-
ever possible, the framework will be described resorting to the
unified modeling language (UML) [11]. UML is a semiformal
specification language standardized by the object management
group (OMG) [12]. The usage of UML allows building a model
of the framework not biased towards a specific software imple-
mentation.
The first step to perform is the identification of the different
scenarios in which the framework may be used. Fig. 2 shows the
UML use cases diagram for the IEEE Standard 1500 Verifica-
tion Framework.
The verification framework represents the system to model
(identified by a rectangle in the diagram) and for this system we have a single use case (identified by the oval in the diagram) consisting in the verification of the compliance of a core test wrapper with the IEEE Standard 1500.
An UML use case defines a sequence of interactions between one or more actors, i.e., candidate users of the system, and the system itself. For the IEEE Standard 1500 Verification Framework, we envision the following two different actors.
• Core Provider: Core providers have a strong interest in providing IEEE Standard 1500 compliant designs to facilitate the integration of their cores into system-level test infrastructures. From the core provider perspective, selling a product as IEEE Standard 1500 compliant when the core is actually not fully compliant is a high risk situation. Verifying the standard compliance after the core design has been signed-off can be really complex. In fact, if an error has been originated by an incorrect application of a rule, during the simulation and verification phase of the core the designer will very likely be unable to recognize the problem. Moreover, in case of errors appearing in corner cases of the core functionalities, the error can easily escape the debug phase but can be excited by the user application.
• Core Integrator: Core integrators need to be sure that IEEE Standard 1500 compliant cores actually comply with the required functionalities at both standalone and system level. In case of errors in the wrapper design, the whole SoC compliancy or even functionality may be compromised. Moreover, since core integrators usually have a very poor knowledge of the core's internal structure, the verification of the IEEE Standard 1500 compliance may be even harder than for a core provider.
Modelling the IEEE Standard Verification Framework means defining the actions performed by the "Verify the wrapper compliance" use case and defining the elements involved in those actions. The elements building the framework are modeled by an UML Class Diagram. A class diagram partitions the system into areas of responsibility (classes), and shows dependencies (associations) between them.

Fig. 3. IEEE 1500 Verification Framework UML class diagram.

Fig. 3 depicts the class diagram for the full IEEE Standard 1500 Verification Framework. The different elements (classes and associations) of the diagram will be deeply explained in Sections IV–VI and will be used to formalize the operations performed by the framework.
Let us now consider the structure of the IEEE Standard 1500. As already introduced in Section II, the standard defines a core test wrapper and an information model defined as a set of CTL [10] statements.
The information model is a key element of the standard. To verify the compliancy of a core test wrapper with the IEEE Standard 1500, we have to first verify the correctness of the syntax of the CTL description provided with the wrapper. This simple constraint identifies the first element of the framework: a module in charge of performing a syntax analysis of the information
model. This module is modeled by a class (Syntax_Verifier) in the class diagram of Fig. 3.
Besides the language used to specify the information model,
the IEEE Standard 1500 is then composed of a set of different
rules that define how an IEEE Standard 1500 compliant core
test wrapper has to be designed. Rules identify the following
two different aspects of the standard.
• Semantic Aspects: They mainly concern the information
model (CTL description) and can be verified without any
interaction with the actual core/wrapper implementation
(i.e., without any need of performing core/wrapper simu-
lations).
• Behavioral Aspects: They target the communication proto-
cols and the behavior of the wrapper. In general, the verifi-
cation of these aspects requires a functional simulation of
the wrapper.
The way semantic and behavioral aspects can be verified
is quite different; in the framework, they require two separate
modules in charge of performing the two types of verification.
The two modules are modeled by the Semantic_Verifier class
and the Behavioral_Verifier class of the diagram in Fig. 3.
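To make the roles of these classes more concrete, the following minimal Java sketch shows how the three verifiers and their common entry point could be declared. The Verifier interface, the Information_Model placeholder, and the constructor signature are assumptions introduced only for this sketch; the class names and the do_Verification() method come from the framework's UML model.

```java
// Sketch of the three verifier classes and their common entry point.
// Only the class names and do_Verification() come from the UML model
// described in this paper; everything else is an illustrative assumption.
interface Verifier {
    boolean do_Verification();   // true = the verification step passed
}

class Information_Model { /* parsed CTL description (see Section IV) */ }

class Syntax_Verifier implements Verifier {
    private final Information_Model model;
    Syntax_Verifier(Information_Model model) { this.model = model; }
    public boolean do_Verification() {
        // Run the CTL lexer/parser over the information model; the Boolean
        // result corresponds to OK_Syn in the sequence diagram of Fig. 4.
        throw new UnsupportedOperationException("sketch only");
    }
}

class Semantic_Verifier implements Verifier {
    public boolean do_Verification() {
        // Check semantic rules against the metadata model (Section V-A);
        // corresponds to OK_Sem.
        throw new UnsupportedOperationException("sketch only");
    }
}

class Behavioral_Verifier implements Verifier {
    public boolean do_Verification() {
        // Drive a functional simulation of the wrapped core (Section V-B).
        throw new UnsupportedOperationException("sketch only");
    }
}
```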
At this point, we have three main modules in the IEEE Stan-
dard 1500 Verification Framework that correspond to the fol-
lowing three verification processes:
• syntax verification;
• semantic verification;
• behavioral verification.
Before adding more details to the model, it is important to define a possible verification plan in order to highlight dependencies between the results of the different verification steps. Moreover, since the verification process is one of the main cost factors of a modern SoC, we have to define optimal verification plans targeting the reduction of the overall verification time.
Fig. 4 shows the verification plan for the syntax, semantic, and simulative verification in the proposed framework. It is modeled using an UML sequence diagram describing how groups of objects (instances of classes) collaborate to complete a given task. Typically, a sequence diagram captures the behavior of a single use case and in this particular case, it models the "Verify the wrapper compliance" use case of Fig. 2. The collaboration between the objects is represented by an exchange of messages between objects (calls to UML class methods).

Fig. 4. IEEE 1500 UML Verification Framework basic verification plan.

The verification process is managed by one of the actors of the model. At this abstraction level there is no difference between the two classes of users identified in the use case diagram of Fig. 2 (core integrator and core provider). As already introduced in this section, since the information contained in the information model is an essential element for any other verification action, the first action to perform is the syntax verification.
The action of performing the verification is modeled as an exchange of a do_Verification() message between the Actor and the Syntax_Verifier class. The Syntax_Verifier performs the required checks and returns a Boolean value (OK_Syn) to the actor indicating whether the verification was successful or not. If the syntax verification fails, it is not possible to proceed with the next steps. The verification process ends and the core test wrapper under analysis is considered not IEEE Standard 1500 compliant.
If the syntax verification passes, it is then possible to move to the next verification phases. In order to reduce the verification effort in the case of non-compliant cores, the next scheduled verification is the semantic verification. This verification works on information contained in the information model (it does not need a functional simulation of the wrapper) and can therefore be performed much faster than the behavioral verification. Moreover, the behavioral verification needs to use the information model, which then needs to be analyzed first.
Again, the semantic verification starts with a do_Verification() message exchanged between the actor and the Semantic_Verifier class. The result of the message is a Boolean value (OK_Sem) stating whether the verification passed or failed.
In the case of a positive response from the semantic verification, it is possible to perform the last verification step (behavioral verification) modeled in Fig. 4 as a message exchange between the actor and the behavioral verifier. Moreover, in some situations it may be useful to perform the behavioral verification even in the case of a negative response from the semantic verification in order to better diagnose the cause of the non-compliancy. This possibility is modeled in Fig. 4 by the self-message "?Perform behavioral verification" sent by the actor to itself.
In Sections IV–VI, to complete the definition of the IEEE Standard 1500 Verification Framework, each verification step modeled in Fig. 4 will be analyzed in detail.
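The control flow just described can be summarized by a small driver. The Java sketch below, building on the verifier skeletons shown earlier, only illustrates the scheduling of Fig. 4; the diagnoseOnSemanticFailure flag stands in for the actor's "?Perform behavioral verification" decision and is an assumption of this sketch.

```java
// Illustrative control flow of the basic verification plan (Fig. 4).
// The ordering and the early exit on a syntax failure follow the text;
// the diagnosis flag and the method signature are assumptions.
class VerificationPlan {
    boolean verify(Syntax_Verifier syn, Semantic_Verifier sem,
                   Behavioral_Verifier beh, boolean diagnoseOnSemanticFailure) {
        boolean okSyn = syn.do_Verification();
        if (!okSyn) {
            // Without a syntactically correct CTL description nothing else
            // can be checked: the wrapper is declared non-compliant.
            return false;
        }
        boolean okSem = sem.do_Verification();
        if (!okSem && !diagnoseOnSemanticFailure) {
            return false;
        }
        // Behavioral verification runs after a successful semantic step or,
        // optionally, to better diagnose a semantic failure.
        boolean okBeh = beh.do_Verification();
        return okSem && okBeh;
    }
}
```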
IV. SYNTAX VERIFICATION

As introduced in Section III, the syntax verification is the first action performed by the IEEE Standard 1500 Verification Framework (see Fig. 4).
This analysis verifies that the information model (CTL) [10] provided together with an IEEE Standard 1500 compliant core is syntactically correct. The syntax verification does not deal
with the information contained in the information model; it just checks the syntax of the statements in the model.
The problem of verifying the syntax of a set of statements, ac-
cording to a given language definition, is a well-known problem
in the field of programming language compilers. It is usually
solved using special programs called lexers and parsers [13].
In computer science and linguistics, parsing is the process of
analyzing a sequence of tokens to determine its grammatical
structure with respect to a given formal grammar. A parser is
the component of a compiler that carries out this task. Parsing
transforms input text into a data structure, usually a tree, which
is suitable for later processing and which captures the implied
hierarchy of the input. Lexical analysis creates tokens from a
sequence of input characters. Tokens are processed by the parser
to build a data structure such as parse tree or abstract syntax
trees.
The syntax verification is modeled by the Syntax_Verifier class (see Fig. 3) in the IEEE Standard 1500 Verification Framework. The class implements the do_Verification() method actually performing the syntax analysis of the CTL provided with the information model. We can therefore say that "the Syntax_Verifier class analyzes the syntax of the information model." This dependency is modeled in Fig. 3 by an association (Analize) between the Syntax_Verifier class and the Information_Model class that models the IEEE Standard 1500 information model.
The Syntax_Verifier class is finally composed of two main elements: 1) a CTL lexer and 2) a CTL parser that collaborate to analyze the syntax of the IEEE Standard 1500 information model. This property is modeled by the UML component diagram of Fig. 5, depicting the software components actually composing the Syntax_Verifier.

Fig. 5. IEEE 1500 syntax verification UML component diagram.

In order to implement the syntax verification it is enough to implement a parser and a lexer for the CTL language. Even if the CTL is an extension of the STIL language [14] and a standard itself [10], this implementation is not trivial since most of the lexical rules that define the CTL language are described using natural language and they have to be translated into a formal grammar allowing the implementation of the lexer and parser components.

V. IEEE STANDARD 1500 RULES VERIFICATION

The syntax verification, introduced in Section IV, is able to guarantee the syntactic correctness of the IEEE Standard 1500 information model only. In order to guarantee the compliancy of a core test wrapper with the IEEE Standard 1500 it is necessary to perform a deeper analysis. As introduced in Section III, this analysis has to be at the same time semantic, by using information obtained from the information model itself, and behavioral, by performing simulations of the wrapper using electronic design automation (EDA) tools. Both semantic and behavioral verification aim at proving that the features implemented in a core test wrapper are compliant with the definitions of the standard.
The IEEE Standard 1500 basically consists in a collection of rules. The concept of "IEEE 1500 rule" is a key point to model the semantic and behavioral verification. Fig. 6 shows an example of two different rules.

Fig. 6. IEEE Standard 1500 rules examples.

From the previous example, it is clear that some rules target the semantics of the IEEE Standard 1500 information model (e.g., Rule 17.2.1.d of Fig. 6), whereas other rules focus more on the architectural or functional aspects of a component (e.g., Rule 11.3.1.d of Fig. 6). There are also rules that have both characteristics. We therefore identify the following three different rule categories:
• semantic rules;
• behavioral rules;
• mixed rules.
Each IEEE Standard 1500 rule is modeled in the IEEE Standard 1500 Verification Framework by the IEEE_1500_rule class (see Fig. 3). This class allows the formalization of a set of concepts expressed in natural language (English) by the standard. Each rule is characterized by the following attributes (see Fig. 3).
• Name: The rule's number as it appears in the IEEE Standard 1500 [4].
• Standard_Desc: The rule explanation as it appears in the IEEE Standard 1500 (in natural language).
• Status: Identifies (for a given core test wrapper) whether the rule has been verified with success, it failed, or still has to be verified.
Each rule targets different architectural and functional aspects of the core test wrapper. This dependency is modeled by the Refer_To association between the IEEE_1500_Rule class and
the Wrapper_Element class in Fig. 3. The Wrapper_Element
class is an abstract class that identifies a general part of the
wrapper. It is then specialized into the different actual compo-
nents of the wrapper, e.g., WIR, WBR, etc. (see Section II).
Finally, each rule needs to have a verifier able to prove the
correct implementation of the rule in the core test wrapper. From
the classification of IEEE 1500 rules into semantic, behavioral,
and mixed rules, we can identify the following two categories
of rule verifiers.
• Semantic Rule Verifiers: In charge of verifying the semantic
aspects of a rule.
• Behavioral Rule Verifiers: In charge of verifying the archi-
tectural and functional aspects of a rule.
The two types of rule verifiers are modeled in the frame-
work with the Semantic_Rule_Verifier class and the Be-
havioral_Rule_Verifier class, respectively (see Fig. 3). The
modeling and implementation of these two rule verifiers will
be deeply analyzed in the following two subsections.
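As a concrete illustration of how a rule and its associations could be encoded, the sketch below models the IEEE_1500_Rule class with its Name, Standard_Desc, and Status attributes, the Refer_To association towards wrapper elements, the Depend_On hierarchy discussed next, and a link to its verifier (the Verifier interface from the earlier sketch). Field types, the Status enumeration, and the use of Java collections are assumptions of this sketch, not part of the standard.

```java
import java.util.List;

// Sketch of the IEEE_1500_rule class of the framework. Attribute and
// association names follow the UML model; all types are illustrative.
class Wrapper_Element { /* abstract wrapper part: WIR, WBR, WBY, ... */ }

class IEEE_1500_Rule {
    enum Status { TO_BE_VERIFIED, PASSED, FAILED }

    String name;                       // rule number, e.g., "11.3.1.d"
    String standardDesc;               // rule text as written in the standard
    Status status = Status.TO_BE_VERIFIED;

    List<Wrapper_Element> refersTo;    // Refer_To association
    List<IEEE_1500_Rule> dependsOn;    // Depend_On hierarchy (discussed below)

    Verifier verifier;                 // a Semantic_ or Behavioral_Rule_Verifier
}
```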
As already introduced in Section III, one of the requirements
in the implementation of an IEEE Standard 1500 Verification
Framework is the optimization of the verification process. In
particular, it is necessary to understand whether, upon a rule
failure, it is necessary to abort the whole verification process
or not. To address this issue, we introduced a very detailed rule
hierarchy. This hierarchy is represented in Fig. 3 as the De-
pend_On association. Depend_On is a one-to-many association that creates a relation between a rule and a set of other rules: if rule A depends on rule B, it means that in order to verify the compliancy to rule A it is necessary to first verify the compliancy to rule B. The identification of the optimal hierarchy is a key point in the definition of the verification plan and in the reduction of the verification costs. For the implementation of our prototype (see Section VI), to create the rule hierarchy, we took into account the following characteristics.
• Verification Effort: The time required to verify the rule. Faster rules (e.g., semantic rules) are, if possible, placed higher in the hierarchy.
• Component Complexity: The complexity of the components targeted by the rule. Rules that, in case of failure, require to fix the core itself are placed higher in the hierarchy and are usually critical for the continuation of the verification process.
• Component Isolation: The impact that the components targeted by the rule have in the wrapper. If the component is functionally isolated from the rest of the wrapper, then it is possible to continue at least part of the remaining verification process even if the rule fails.
An example of part of the rule hierarchy is presented in Fig. 7.

Fig. 7. Example of IEEE Standard 1500 rule hierarchy.

A. Semantic Rule Verifier

Semantic rules and part of the mixed rules can be verified by simply analyzing the content of the IEEE Standard 1500 information model provided with the core test wrapper. This is a fast and powerful way to verify at least part of the IEEE Standard 1500 compliancy since it does not require any simulation of the wrapper/core itself. Although simple and fast, this analysis is by no means enough to guarantee IEEE Standard 1500 compliancy. First of all, pure semantic rules are only a relatively small subset of the whole rule set. Moreover, the semantic analysis is performed on data contained in an information model supplied by the core provider; there is no guarantee that the CTL description perfectly matches the actual hardware implementation.
Looking at the content of the IEEE Standard 1500 information model, the following types of CTL statements are responsible for most of the required semantic information:
• ScanStructures;
• MacroDefs;
• environments.
The main issue with the information model is that it may include many user-defined structures; also, many mandatory or optional hardware structures may be mapped on hardware components already present in the core and therefore using different naming conventions. Nevertheless, in order to be useful to the semantic verifier, the semantic information needs to be organized in a more formal and less core-dependent way. To overcome this problem, we propose to translate the CTL description related to a particular core implementation into a more general metadata model that can be more easily analyzed and verified.
This operation can be efficiently performed during the parsing operation executed at the beginning of the syntax verification step (see Section IV). In particular, the semantic information of the information model has to be translated into the metadata structure and then renamed according to an internal and core-independent naming convention. In this way, all user-defined structures (and therefore signal/register names) can be mapped to their corresponding general templates into
the metadata model and all the semantic checks can now be performed on the metadata model, where the naming convention is independent from the corresponding core.
A very high level UML class diagram modeling an example of the metadata model is provided in Fig. 8.

Fig. 8. IEEE 1500 Metadata UML class diagram.

Thanks to its internal naming convention and corresponding signal mapping, it becomes much easier to run checks on the content of the metadata model. For example, a simple rule that says (...) could be easily verified performing a control like (...). This operation will automatically check if a mapping exists between the WS_Bypass template in the metadata model, and a (user-defined) register in the CTL information model.
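Since the displayed rule and check are not reproduced here, the following Java sketch only gives an idea of what such a control could look like on the metadata: it asks the metadata model whether the internal WS_Bypass template has been bound to some user-defined register of the CTL description. All class, method, and field names in this sketch are illustrative assumptions, not the actual prototype API.

```java
import java.util.Map;
import java.util.Optional;

// Hedged sketch of a semantic check run on the metadata model. The paper
// only states that the check verifies that the WS_Bypass template is
// mapped onto a (user-defined) register declared in the CTL description;
// the types below are invented for illustration.
class RegisterTemplate {
    String ctlName;      // name used in the core-specific CTL description
    int length;          // register length in bits
}

class Metadata {
    private final Map<String, RegisterTemplate> templates;
    Metadata(Map<String, RegisterTemplate> templates) { this.templates = templates; }

    // Returns the template mapped under the given internal name, if any.
    Optional<RegisterTemplate> lookup(String internalName) {
        return Optional.ofNullable(templates.get(internalName));
    }
}

class WsBypassMappingCheck {
    // The semantic rule passes when the internal WS_Bypass template has
    // been bound to some register of the CTL description during parsing.
    boolean verify(Metadata metadata) {
        return metadata.lookup("WS_Bypass").isPresent();
    }
}
```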
B. Behavioral Rules Verifier

The behavioral verification is the most complex step of the verification process. Behavioral and mixed rules can only be validated using a behavioral approach. Differently from the syntax and semantic verification, the behavioral approach is based on a functional simulation of the core test wrapper and of the core itself. Simulation is the only effective approach to verify compliancy of time-related rules, protocols, signal connections, and correct instruction implementation. The IEEE Standard 1500 Verification Framework models the behavioral rule verifier with the Behavioral_Rule_Verifier class in Fig. 3. The class implements the do_Verification() method that actually performs the behavioral verification. The behavioral rule verifier needs to simulate the core/wrapper; it therefore has a Simulate association with the IEEE_1500_Compliant_Core class (see Fig. 3). The IEEE_1500_Compliant_Core class models the core under verification (it usually corresponds to an HDL description of the core).
A well-known approach to perform this type of verification is the so-called dynamic, coverage-driven, constrained-random functional verification.
The term dynamic refers to the fact that the verification patterns/stimuli are generated and applied to the design over a number of clock cycles, and the corresponding results are collected and compared against a reference/golden model. An EDA simulator is used both to compute the values of all signals during the simulation and to compare the expected values with the calculated ones.
Simple dynamic verification has a main drawback: only a subset of all possible behaviors can be verified in a time-bound simulation run. Testing all possible behaviors under every possible combination of input stimuli is in most of the cases an unfeasible task since the test space is too large to be fully covered in a reasonable amount of time.
To overcome this problem, the number of verification patterns applied to the wrapper has to be statistically significant but not complete. To do this, verification input patterns are generated randomly under a set of constraints, which are expressed as mathematical expressions limiting the set of legal values on the input signals that drive the design. In this way, the simulator generates random values and the constraints ensure that the generated scenarios are valid and plausible.
To further optimize this constrained-random generation, coverage-driven verification is used. Functional coverage metrics are automatically recorded in real time in order to ascertain whether (and how effectively) a particular test verified (or is verifying) a given feature; this information can then be fed back into the generation process in order to drive additional verification efforts more effectively towards the required goal. The coverage metrics are evaluated on coverage monitoring points defined by the user. The market offers a number of tools that are able to support this dynamic (or functional) verification methodology. The most used ones are Specman Elite (Cadence) [15] and Vera (Synopsys) [16]. Besides the different verification and pattern generation engines, all of them apply the verification patterns to the target design using a verification component placed around the core under analysis. The verification component, which is a behavioral-level module described using a proprietary verification language (e for Specman Elite, OpenVera for Vera), performs the constrained-random generation of the verification patterns, applies them, and is directly controlled by the verification engine monitoring the current coverage reached in the verification process.
To efficiently apply this verification approach to perform the behavioral verification of IEEE Standard 1500 rules it is necessary to do the following.
• Create a rule verification component for each rule (or subset of similar rules). The rule verification component is in charge of generating the verification patterns that, applied during the simulation, will allow checking that all the architectural and behavioral aspects of the rule are correctly implemented in the design.
• Identify in the design the rule coverage points for that rule. A coverage point is a signal/register in the wrapper that needs to be monitored in order to evaluate the coverage reached during the verification process on that particular rule.
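To give a flavor of what a rule verification component does, the toy Java sketch below mimics, outside of any simulator, the constrained-random generation of wrapper instruction patterns and the recording of a coverage point. The real rule verification components of the prototype are written in e and run inside Specman Elite, so every name below is purely illustrative.

```java
import java.util.HashSet;
import java.util.Random;
import java.util.Set;

// Toy illustration of constrained-random, coverage-driven generation for a
// single rule: stimuli are constrained to legal wrapper instructions and a
// coverage point records which instructions have been exercised.
class RuleVerificationComponentSketch {
    private static final String[] LEGAL_INSTRUCTIONS = {
        "WS_BYPASS", "WS_PRELOAD", "WS_INTEST", "WS_EXTEST"   // constraint
    };
    private final Random random = new Random();
    private final Set<String> coveredInstructions = new HashSet<>(); // coverage point

    String nextPattern() {
        // Constrained-random choice: only legal instructions are generated.
        String instr = LEGAL_INSTRUCTIONS[random.nextInt(LEGAL_INSTRUCTIONS.length)];
        coveredInstructions.add(instr);   // record the coverage point
        return instr;
    }

    double coverage() {
        // Fraction of the instruction set exercised so far: the rule's
        // compliancy level in the sense discussed in the text.
        return (double) coveredInstructions.size() / LEGAL_INSTRUCTIONS.length;
    }
}
```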
The concept of coverage applied to the IEEE Standard 1500 rules is very important. It not only allows to compute the number
of verified rules over the set of available ones, but also allows to compute a compliancy level of each individual rule. As briefly explained before, if the combination of possible input patterns/internal states of the components involved in the rule is too large, the verification engine resorts to the constrained-random pattern generation. This will lead to the application of a subset of all possible patterns and therefore to a rule coverage or compliancy level possibly lower than 100%.
The most challenging issue in this phase is to write rule verification components and identify rule coverage points that are independent from the specific core or wrapper under analysis. Again, the name mapping stored in the metadata model described in Section V-A can be used to abstract the verification component from the specific core implementation.
Another very important issue to be considered in this phase is whether the core under verification is a black box or a white box. The difference is in the amount of available information on the core's internal structure. In a black-box core the only available information is the input/output (I/O) interface. For IP protection, the internal structure of a black-box core is unknown (except to the core designer, of course). A core integrator, who buys cores from different vendors, usually deals only with black-box cores. On the other hand, a core designer always has all the information regarding the core, and therefore uses the white-box approach.
From the IEEE Standard 1500 compliancy verification point of view, the difference between a black- and a white-box core directly impacts the degree of compliancy that can be verified: whereas in a white-box core all the internal signals of the core can be controlled and/or observed, and therefore all rules can be thoroughly verified, in a black-box core only the rules (or the portions of them) that do not require directly controlling or observing the core internal signals can be fully verified. Full IEEE Standard 1500 verification compliancy can only be achieved when dealing with white-box cores or with black-box cores implementing only the basic requirements of the standard.
This very important aspect of the behavioral verification is modeled in the IEEE Standard Verification Framework by the Black_Box_Core_Test_Wrapper class and the White_Box_Core_Test_Wrapper class.
Fig. 9 finally summarizes the main components of the behavioral verifier using an UML component diagram.

Fig. 9. IEEE 1500 behavioral verifier component diagram.

Fig. 10. Verification plan for semantic and behavioral rules.

C. Semantic and Behavioral Verification Plan

As already introduced in Section III, the definition of an efficient test plan is critical in reducing the overall verification costs. The sequence diagram of Fig. 4 introduced a first level of scheduling among the three different verification activities performed by the IEEE Standard Verification Framework. We can now enter into more details and model how these activities can be integrated together.
1) Syntax Verification Test Plan: The syntax verification is a one-step verification process. It just performs a single action that consists in parsing the CTL files provided with the IEEE Standard 1500 information model. No particular scheduling is needed for this verification.
2) Semantic and Behavioral Verification Test Plan: Both semantic and behavioral verification involve the verification of a set of IEEE Standard 1500 rules. The order in which the rules are verified is extremely important, first of all to guarantee the correctness of the result of the verification process and second to reduce the overall verification time by performing different actions in parallel.
The definition of the rule hierarchy described in Section V and modeled using the Depend_On association in the UML class diagram of Fig. 3 implicitly defines a verification plan for both semantic and behavioral rules. The plan is modeled by the UML sequence diagram of Fig. 10. The semantic/behavioral verifier sends a message to each rule object asking to perform the verification. The message is an asynchronous message (half arrow), which means that the verification of the full set of rules is performed in parallel.
Each rule, before performing its actual verification, first controls the status of the rules it is dependent on. It is possible to have the following three different situations:
• all rules are already verified: it is possible to proceed with the test;
• all rules are already verified but some of them failed: the
whole verification process may be aborted or not, de-
pending on the relationship severity of the involved rules;
• some rule is still working on its verification: the rule has to
wait.
When all rules have completed the verification, the semantic/
behavioral verifier collects the results and can inform the actor
about the compliancy of the wrapper.
Obviously, the two types of verification (semantic/behav-
ioral) will work on different sets of rules and the set of rules
that will be verified will depend on the features actually imple-
mented in the wrapper.
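A compact way to see this scheduling policy is the following Java sketch: rules whose dependencies have all completed are verified in parallel, a rule whose dependency is still running waits for a later round, and a rule whose dependency failed is not verified. The round-based loop and the "mark as failed" policy are simplifying assumptions of this sketch (the prototype drives the same logic through the sequence diagram of Fig. 10), and it reuses the IEEE_1500_Rule sketch given earlier.

```java
import java.util.List;
import java.util.stream.Collectors;

// Sketch of the dependency-aware scheduling of rule verification.
// Relies on the IEEE_1500_Rule sketch (status, dependsOn, verifier);
// the scheduling loop and the failure policy are simplifications.
class RuleScheduler {
    void verifyAll(List<IEEE_1500_Rule> rules) {
        boolean progress = true;
        while (progress) {
            // Rules whose dependencies have all been resolved can run now,
            // in parallel (the asynchronous messages of Fig. 10).
            List<IEEE_1500_Rule> ready = rules.stream()
                    .filter(r -> r.status == IEEE_1500_Rule.Status.TO_BE_VERIFIED)
                    .filter(this::dependenciesCompleted)
                    .collect(Collectors.toList());
            ready.parallelStream().forEach(this::verifyOne);
            progress = !ready.isEmpty();
        }
    }

    private boolean dependenciesCompleted(IEEE_1500_Rule rule) {
        return rule.dependsOn.stream()
                .allMatch(d -> d.status != IEEE_1500_Rule.Status.TO_BE_VERIFIED);
    }

    private void verifyOne(IEEE_1500_Rule rule) {
        boolean dependencyFailed = rule.dependsOn.stream()
                .anyMatch(d -> d.status == IEEE_1500_Rule.Status.FAILED);
        // Depending on the severity of a failed dependency the whole process
        // may be aborted; here the rule is simply marked as failed.
        rule.status = (!dependencyFailed && rule.verifier.do_Verification())
                ? IEEE_1500_Rule.Status.PASSED
                : IEEE_1500_Rule.Status.FAILED;
    }
}
```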

VI. IMPLEMENTATION

This section overviews a prototype implementation of the abstract UML model of the IEEE Standard 1500 Verification Framework presented in the previous sections. The full framework has been implemented using the Java [17] development platform. The Java language is a fully object-oriented programming language and it is a perfect candidate to implement UML models.
Besides the general interface provided by the implementation, we will focus in this section on a possible implementation of the three types of verifiers (i.e., syntax, semantic, and behavioral), these components being the real core of the verification framework. As already introduced in Section V-B, the most critical component in the framework is the behavioral rule verifier. This component needs to perform a dynamic functional verification. Many commercial platforms are available to perform this type of verification; for our implementation, we used the Specman Elite [15] suite.

A. Syntax Verification

Several technologies allow the automatic implementation of parsers starting from the description of a formal grammar. We successfully implemented a syntax analyzer by using two open-source tools. JFlex [18] is a lexical analyzer generator for Java, developed by G. Klein. JFlex is designed to work together with the parser generator CUP [19] by S. Hudson. The two tools allow the description of a formal language grammar (in our case the CTL grammar) and the automatic generation of a Java environment (a collection of Java classes) able to perform the syntax analysis of a text file, according to the defined language. The definition of the set of lexical rules and of the grammar needed to implement the parser has been performed by analyzing the IEEE CTL definition [10] and by translating a set of informal definitions contained in the standard into a formal definition suitable for the generation of the parsers.

Fig. 11. Lexical rules example.

Fig. 11 shows an example of how syntax definitions of the IEEE Standard 1450.6 (CTL) are translated into the formal definitions needed to generate the parser by using JFlex and CUP. In the example, we have three types of informal rules: rules #1 and #2 describe, with a simple, precise, and intuitive vocabulary, how the functional rules will be and directly indicate the definition of a set of lexical definitions (i.e., "newline" and "tab"). Rule #3 is more complex: formally, it gives a global surface description without providing, for instance, what type of characters are allowed or not. To formally describe these types of rules, an extensive analysis of the IEEE Standard 1450.6 has been necessary.
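As an illustration of how the generated analyzer might be invoked from the Syntax_Verifier, the sketch below drives a JFlex-generated scanner and a CUP-generated parser over a CTL file. The class names CtlLexer and CtlParser are assumptions (the actual names depend on the grammar specification files), while java_cup.runtime.Symbol and the parse() entry point belong to the standard CUP runtime.

```java
import java.io.FileReader;
import java_cup.runtime.Symbol;

// Sketch of a driver for the generated CTL analyzer. CtlLexer and
// CtlParser stand for the classes emitted by JFlex and CUP from the CTL
// grammar; their names here are illustrative assumptions.
class CtlSyntaxCheck {
    boolean isSyntacticallyCorrect(String ctlFileName) {
        try (FileReader reader = new FileReader(ctlFileName)) {
            CtlLexer lexer = new CtlLexer(reader);       // JFlex-generated scanner
            CtlParser parser = new CtlParser(lexer);     // CUP-generated parser
            Symbol parseTree = parser.parse();           // builds the syntax tree
            return parseTree != null;                    // corresponds to OK_Syn
        } catch (Exception e) {
            // Any lexical or syntax error surfaces here: the CTL description
            // is not a valid IEEE 1450.6 information model.
            return false;
        }
    }
}
```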
B. Semantic Verification

The implementation of the semantic verification mainly consists in the implementation of the metadata structure defined in Section V-A. We implemented the metadata as a collection of Java classes as defined by the UML class diagram in Fig. 8. The task of populating the metadata with the actual CTL information is delegated to the CTL parser described in Section VI-A. Finally, each IEEE Standard 1500 semantic rule has been translated into a set of queries performed on the metadata. This type of implementation is actually not the most efficient one. Due to the number of rules to verify and to the complexity of a real core, the memory size and the complexity of the metadata structure in memory may become very high. For a commercial implementation, the use of a database management system (DBMS) would be recommended.

C. Behavioral Verification

The implementation of the behavioral verification process relies on the use of the Specman Elite verification environment [15] interfaced with the Synopsys VCS simulator [20]. Specman Elite allows the definition of functional verification plans by using an object-oriented language named e. e is a complete verification language that allows the definition of the following:
• built-in data generation from object definitions and constraints;
• a notion of time, like HDL simulators;
• built-in parallel execution;
• an HDL interface: read from and write to HDL signals at any hierarchical level; call HDL tasks;
• predefined verification capabilities: automated handling of checks without having to write complex procedural code;
• predefined flow execution.


In addition, it defines a so-called e-Reuse Methodology
(eRM) aiming at designing reusable, consistent, extensible,
plug-and-play verification environments, and “e” Verification
Components (eVCs). In the implementation of our prototype,
we followed this design methodology.

D. Application Cases
The functionalities of the IEEE Standard 1500 Verification
Framework prototype have been tested on a simple IEEE Stan-
dard 1500 compliant core implementing a four bit counter with
the following characteristics:
• CLOCK input used as counting clock;
• RESET input to reset the counting state;
• LOAD input to force a new start value for the counting;
• 4-bit input DIN that indicates the start value used when
LOAD is high;
• 4-bit output COUNT that indicates the actual counting value.
This core has been wrapped with an IEEE Standard 1500 core test wrapper having the following characteristics:
• instruction register (WIR), 3 bits long;
• bypass register (WBY), 1 bit long;
• boundary register (WBR), 8 bits long;
• optional TransferDR wrapper serial control;
• four implemented instructions: WS_BYPASS, WS_PRELOAD, WS_INTEST, WS_EXTEST.
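For this example core, the wrapper configuration above could be captured in the framework's metadata roughly as follows; the WrapperConfig class and its field names are invented for this sketch and do not reflect the actual prototype code, only the register lengths and the instruction set come from the example wrapper.

```java
import java.util.List;

// Hypothetical snapshot of the example wrapper as framework metadata.
// Class and field names are illustrative; the values mirror the wrapper
// characteristics listed above.
class WrapperConfig {
    int wirLength;                 // wrapper instruction register
    int wbyLength;                 // wrapper bypass register
    int wbrLength;                 // wrapper boundary register
    boolean hasTransferDR;         // optional TransferDR serial control
    List<String> instructions;

    static WrapperConfig exampleCounterWrapper() {
        WrapperConfig w = new WrapperConfig();
        w.wirLength = 3;
        w.wbyLength = 1;
        w.wbrLength = 8;
        w.hasTransferDR = true;
        w.instructions = List.of("WS_BYPASS", "WS_PRELOAD", "WS_INTEST", "WS_EXTEST");
        return w;
    }
}
```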
The verification process run on the wrapper/core was successful from the very beginning, when it allowed us to discover design bugs we had involuntarily introduced in the wrapper design.
In order to carefully validate the verification capabilities of the prototype, we generated a set of different core test wrappers, systematically violating different rules of the standard. As an example, Fig. 12 shows the result of the behavioral verification in the case of a non-compliant wrapper. In the example, rule number 10.2.1.c fails. The prototype highlights this violation and also provides the waveform obtained by the simulation to provide a better understanding of the reasons that led to the rule violation. Ongoing work is focusing on creating a violation-programmable wrapper, where different violations can be enabled or disabled in order to verify the efficiency of the verification framework also in the presence of multiple violations in the same wrapper.

Fig. 12. Behavioral rule violation example.

VII. CONCLUSION

In this paper, we presented a systematic methodology and an associated formal model to build a verification framework for IEEE Standard 1500 compliant cores. The framework is able to check if the implementation of the wrapper provided with an IP core correctly follows the architectural and behavioral rules defined in the IEEE Standard 1500. The proposed framework targets different possible users, from the core designer to the core integrator, and therefore is able to guarantee various levels of compliancy depending on the amount of information about the internal core structure available to the user. We also presented a proof-of-concept of the proposed model, implementing a prototype of the verification framework in Java with the support of the Specman Elite verification toolkit. We believe that in the near future, with the introduction of IEEE Standard 1500 compliant wrappers in all IP cores on the market, a verification framework following the model presented in this paper will be able to increase productivity, reduce design time, and optimize the test plan of very complex SoCs.

ACKNOWLEDGMENT

The authors would like to thank D. Scollo, G. Politano, A. Mouth, and L. Melchionda for their help in the development of this manuscript.

REFERENCES

[1] R. Gupta and Y. Zorian, "Introducing core-based system design," IEEE Des. Test Comput., vol. 14, no. 2, pp. 15–25, Oct. 1997.
[2] Y. Zorian, E. Marinissen, and S. Dey, "Testing embedded-core-based system chips," IEEE Computer, vol. 32, no. 6, pp. 52–60, Jun. 1999.
[3] IEEE Standard Test Access Port and Boundary-Scan Architecture, IEEE Std. 1149.1, 1990.
[4] IEEE Standard Testability Method for Embedded Core-Based Integrated Circuits, IEEE Std. 1500, 2005.
[5] L. Jin-Fu, H. Hsin-Jung, C. Jeng-Bin, S. Chih-Pin, W. Cheng-Wen, S.-I. C. C. Cheng, H. Chi-Yi, and L. Hsiao-Ping, "A hierarchical test methodology for systems on chip," IEEE Micro, vol. 22, no. 5, pp. 69–81, Sep. 2002.
[6] Y. Zorian, "Test requirements for embedded core-based systems and IEEE P1500," in Proc. IEEE Int. Test Conf. (ITC), Nov. 1997, pp. 191–199.
[7] T. McLaurin and S. Ghosh, "ETM10 incorporates hardware segment of IEEE P1500," IEEE Des. Test Comput., vol. 19, no. 3, pp. 6–11, May 2002.
[8] S. Picchiotino, M. Diaz-Nava, B. Forest, S. Engels, and R. Wilson, "Platform to validate SoC designs and methodologies targeting nanometer CMOS technologies," in Proc. IP-SOC, Dec. 2004, pp. 39–44.
[9] I. Diamantidis, T. Oikonomou, and S. Diamantidis, "Towards an IEEE 1500 verification infrastructure: A comprehensive approach," in Proc. ClubV, Mar. 2005.
[10] IEEE Core Test Language, IEEE Std. 1450.6, 2005.
[11] Object Management Group, Needham, MA, "UML official website," 2008 [Online]. Available: http://www.uml.org/
[12] Object Management Group, Needham, MA, "Object Management Group official website," 2008 [Online]. Available: http://www.omg.org/
[13] N. Wirth, Compiler Construction. Reading, MA: Addison-Wesley, 1996.
[14] Standard Test Interface Language (STIL), IEEE Std. 1450.0, 1999.
[15] Cadence, San Jose, CA, "Specman Elite home page," 2008 [Online]. Available: http://www.verisity.com/products/specman.html
[16] Synopsys, Mountain View, CA, "Vera web site," 2008 [Online]. Available: http://www.synopsys.com/products/vera/vera.html
[17] Sun Microsystems, "Java official website," 2008 [Online]. Available: http://www.java.com
[18] G. Klein, "JFlex home page," 2008 [Online]. Available: http://jflex.de/index.html
[19] GVU Center, Georgia Institute of Technology, Atlanta, "CUP home page," 2008 [Online]. Available: http://www2.cs.tum.edu/projects/cup/
[20] Synopsys, Mountain View, CA, "VCS home page," [Online]. Available: http://www.synopsys.com/products/simulation/simulation.html

Alfredo Benso (SM'07) currently holds a tenured Associate Professor position with the Department of Computer Engineering, Politecnico di Torino, Torino, Italy, where he teaches Microprocessor Systems and Advanced Programming Techniques. In his scientific career, mainly focused on hardware testing and dependability, he coauthored more than 60 publications between books, journals, and conference proceedings. He is also actively involved in the Computer Society, where he has been the leading volunteer for several projects such as the Technical Committees Archives (TECA) database, and Conference Information Management Application (CIMA).
Prof. Benso is a Computer Society Golden Core Member.

Stefano Di Carlo (M'03) received the M.S. degree in computer engineering and the Ph.D. degree in information technologies from Politecnico di Torino, Torino, Italy, in 1999 and 2004.
He is an Assistant Professor with the Department of Computer Engineering, Politecnico di Torino. His research interests mainly focus on DFT techniques, SoC testing, BIST, and memory testing. He coauthored more than 30 publications between journals and conference proceedings.
Prof. Di Carlo is a Golden Core Member of the IEEE Computer Society.

Paolo Prinetto received the M.S. degree in electronic engineering from Politecnico di Torino, Torino, Italy.
He is a Full Professor with the Department of Computer Engineering, Politecnico di Torino, and a Joint Professor with the University of Illinois, Chicago. His research interests include testing, test generation, BIST, and dependability.
Prof. Prinetto is a Golden Core Member of the IEEE Computer Society and he has served on the IEEE Computer Society TTTC: Test Technology Technical Council as an elected chair.

Yervant Zorian (F'99) received the M.Sc. degree from the University of Southern California, Los Angeles, and the Ph.D. degree from McGill University, Montreal, QC, Canada.
He is the Vice President and Chief Scientist of Virage Logic Corporation, Fremont, CA, and an Adjunct Professor with the University of British Columbia, Vancouver, BC, Canada. He was previously a Distinguished Member of the technical staff with AT&T Bell Laboratories and Chief Technology Advisor of LogicVision. He served as the IEEE Computer Society Vice President for Conferences and Tutorials, Vice President for Technical Activities, Chair of the IEEE Test Technology Technical Council, and Editor-In-Chief of the IEEE DESIGN AND TEST OF COMPUTERS. He has authored over 300 papers and holds 16 U.S. patents.
Dr. Zorian was a recipient of numerous Best Paper Awards, a Bell Labs' R&D Achievement Award, the 2005 prestigious IEEE Industrial Pioneer Award, and the 2006 IEEE Hans Karlsson Award. He was selected by EE Times among the top 13 influencers on the semiconductor industry.
