
TECHNICAL REPORT

COmpanion Specification
for Energy Metering

DLMS/COSEM

Conformance
Testing Process

DLMS User Association

device 
language
message
specification

DLMS User Association 2015-06-19 EXCERPT FROM DLMS UA 1001-1 Ed. 5.0 V 1.0
© Copyright 2001-2015 DLMS User Association

Table of Contents
Foreword...................................................................................................................................................4
1 Scope ...............................................................................................................................................5
2 Referenced documents ....................................................................................................................5
3 Terms, definitions and abbreviations ...............................................................................................5
3.1 Terms and definitions ........................................................................................................5
3.2 Abbreviations ................................................................................................................. 10
4 Conformance testing – overview ................................................................................................... 11
4.1 OSI conformance testing ............................................................................................... 11
4.2 DLMS/COSEM conformance testing ............................................................................. 11
4.3 Main features of DLMS/COSEM conformance testing process..................................... 12
5 The conformance test plans .......................................................................................................... 13
5.1 Scope of testing ............................................................................................................. 13
5.2 IUT testing ...................................................................................................................... 14
5.3 Structure of the abstract test plans ................................................................................ 14
5.4 Abstract test cases ......................................................................................................... 15
5.5 Outcomes and verdicts .................................................................................................. 16
5.6 The HDLC based data link layer ATS ............................................................................ 17
5.7 The DLMS/COSEM application layer ATS .................................................................... 17
5.8 The COSEM interface objects ATS ............................................................................... 17
5.9 The Security Suite 0 (SYMSEC_0) ATS ........................................................................ 18
5.10 Executable test suites and test cases ............................................................................ 18
6 The DLMS/COSEM conformance test tool ................................................................................... 18
6.1 Overview ........................................................................................................................ 18
6.2 CTT versions and editions ............................................................................................. 19
6.3 Operating system and hardware requirements.............................................................. 19
6.4 Licensing the CTT .......................................................................................................... 19
6.5 Installing the CTT ........................................................................................................... 19
7 The conformance assessment process ........................................................................................ 20
7.1 Overview ........................................................................................................................ 20
7.2 Preparation for testing.................................................................................................... 21
7.2.1 Preparation of the IUT ................................................................................ 21
7.2.2 Preparation of the conformance test information ....................................... 22
7.3 Test operations .............................................................................................................. 22
7.3.1 The CTT user interface .............................................................................. 22
7.3.2 Miscellaneous settings ............................................................................... 23
7.3.3 The CTI file ................................................................................................. 23
7.3.4 The COSEM object definition file ............................................................... 24
7.3.5 Selection of the test cases ......................................................................... 26
7.3.6 Connection of the IUT to CTT .................................................................... 27
7.3.7 Test sessions.............................................................................................. 29
7.3.8 Production of the Test Result ..................................................................... 29
7.4 Repeatability of results ................................................................................................... 34
7.5 Requirements for test laboratories ................................................................................. 34


8 The certification process ............................................................................................................... 35


8.1 General .......................................................................................................................... 35
8.2 Initiation of the certification process ............................................................................... 35
8.3 Submission of conformance test documents ................................................................. 35
8.4 Technical and administrative checks ............................................................................. 35
8.5 The Certification ............................................................................................................. 35
8.6 Scope and validity of the Certification ............................................................................ 36
8.7 Disclaimer ...................................................................................................................... 37
9 The quality program ...................................................................................................................... 37
9.1 General .......................................................................................................................... 37
9.2 Validation of the Abstract Test Suites and CTT ............................................................. 37
9.3 Assistance provided to users ......................................................................................... 37
9.4 Maintenance .................................................................................................................. 37
9.5 Use cases ...................................................................................................................... 38
9.5.1 Use case 1 – introducing a new standard OBIS code ............................... 38
9.5.2 Use case 2 – modification of an existing test ............................................. 38
9.5.3 Use case 3 – adding a test for a new standard feature ............................. 38
9.5.4 Use case 4 – revision of the specification .................................................. 38
Annex A Certification template (with sample data) (informative) .......................................................... 39
Annex B (normative) Conformance Test Plans ..................................................................................... 41
Annex C (informative) Bibliography ....................................................................................................... 42

List of Figures
Figure 1 – DLMS/COSEM conformance testing process ...................................................................... 12
Figure 2 – DLMS/COSEM interface object model and communication profiles .................................... 13
Figure 3 – Test suite structure ............................................................................................................... 15
Figure 4 – Structure of the HDLC based data link layer ATS................................................................ 17
Figure 5 – Structure of the DLMS/COSEM application layer ATS ........................................................ 17
Figure 6 – Structure of the COSEM interface objects ATS ................................................................... 18
Figure 7 – Structure of the Security Suite 0 (SYMSEC_0) ATS............................................................ 18
Figure 8 – Conformance assessment process overview ...................................................................... 20
Figure 9 – Miscellaneous settings ......................................................................................................... 23
Figure 10 – The CTI window (illustration) .............................................................................................. 24
Figure 11 – COSEM object definition file cover sheet ........................................................................... 25
Figure 12 – Selection of the test cases ................................................................................................. 27
Figure 13 – Communication settings ..................................................................................................... 28
Figure 14 – Fragment of a sample Test Report .................................................................................... 31
Figure 15 – Basic log ............................................................................................................................. 32
Figure 16 – Detailed log presenting COSEM APDUs in XML format .................................................... 33
Figure 17 – The Traffic window ............................................................................................................. 34
Figure 18 – The DLMS/COSEM compliant logo.................................................................................... 36


Foreword
Copyright

© Copyright 2001–2015 DLMS User Association.

This document is confidential. It may not be copied, nor handed over to persons outside the
standardisation environment.

The copyright is enforced by national and international law. The "Berne Convention for the
Protection of Literary and Artistic Works", signed by 166 countries worldwide, and other
treaties apply.

Acknowledgement

The present document has been established by the DLMS UA Maintenance Working Group
Conformance Testing Task Force.

Revision History

Versions are kept within the DLMS UA WG Conformance Testing.

Version | Date | Author | Status | Comment

Edition 1.0 | 2001-05-01 | DLMS UA | Released | In line with: DLMS UA 1000-1 (Blue Book) Edition 4.0; DLMS UA 1000-2 (Green Book) Edition 2.0.
Edition 2.0 | 2002-06-04 | DLMS UA | Released | In line with: DLMS UA 1000-1 (Blue Book) Edition 4.0; DLMS UA 1000-2 (Green Book) Edition 2.0; CTT v 1.0.
Edition 2.0 Amd. 1 | 2003-01-09 | DLMS UA | Released | Brought in line with CTT v 1.01.
Edition 3.0 | 2007-08-28 | DLMS UA | Released | In line with: DLMS UA 1000-1 (Blue Book) Edition 8.0; DLMS UA 1000-2 (Green Book) Edition 6.0; CTT v 2.0.
Edition 4.0 | 2010-12-15 | DLMS UA | Released | In line with: DLMS UA 1000-1 (Blue Book) Edition 10.0; DLMS UA 1000-2 (Green Book) Edition 7.0 (except security); CTT v 2.3.
Edition 5.0 | 2015-06-19 | DLMS UA | Released | In line with: DLMS UA 1000-1 (Blue Book) Edition 11.0; DLMS UA 1000-2 (Green Book) Edition 7.0 + Amendment 3; CTT 3.0.


1 Scope
This document specifies the conformance testing process of metering equipment
implementing the DLMS/COSEM specification for meter data exchange.

This document only focuses on testing and certifying the implementation of the DLMS/COSEM
specification. Other functional and performance tests are outside the scope of this document.

This Edition 5.0 applies to CTT 3.0 and is in line with Blue Book Edition 11.0 and Green Book
Edition 7.0 plus Amendment 3.

It cancels and replaces Edition 4.0, published in 2010.

2 Referenced documents

Reference No. | Title

DLMS UA 1000-1 Ed. 11.0:2013 | COSEM Interface Classes and OBIS Identification System, the Blue Book
DLMS UA 1000-2 Ed. 7.0:2009 + Amendment 3:2013 | DLMS/COSEM Architecture and Protocols, the Green Book
DLMS UA 1002 Ed. 1.0:2003 | Glossary of terms, the White Book
DLMS UA 1001-3: ATS_DL V 5, released 2010-12-15 | DLMS/COSEM conformance testing – Conformance test plans – Data link layer using HDLC protocol
DLMS UA 1001-6: ATS_AL_COSEM_SYMSEC_0 V 1.3, released 2015-06-18 | DLMS/COSEM conformance testing – Abstract Test Plans – DLMS/COSEM application layer – Symmetric key security suite 0 – COSEM interface objects
DLMS UA 1001-7 | COSEM conformance testing – Object definition tables
IEC 62056-21 | Electricity metering – Data exchange for meter reading, tariff and load control – Part 21: Direct local data exchange
ITU-T X.290 (11/1998) | OSI conformance testing methodology and framework for protocol Recommendations for ITU-T applications – General concepts

3 Terms, definitions and abbreviations


3.1 Terms and definitions
For the purposes of this document the following definitions apply:
NOTE Most of the following definitions have been taken from ITU-T Recommendation ITU-T X.290. Some definitions have
been modified to better adapt to the DLMS/COSEM conformance assessment process.
3.1.1
abstract test case
A complete and independent specification of the actions required to achieve a specific test purpose,
defined at the level of abstraction of a particular abstract test method. It includes a preamble and a
postamble to ensure starting and ending in a stable state (i.e., a state which can be maintained almost
indefinitely, such as the “idle” state or “data transfer” state) and involves one or more consecutive or
concurrent connections.
NOTE 1 – The specification should be complete in the sense that it is sufficient to enable a verdict to be assigned
unambiguously to each potentially observable outcome (i.e., sequence of test events).
NOTE 2 – The specification should be independent in the sense that it should be possible to execute the derived executable
test case in isolation from other such test cases (i.e., the specification should always include the possibility of starting and
finishing in the “idle” state – that is without any existing connections except permanent ones). For some test cases, there may
be pre-requisites in the sense that execution might require some specific capabilities of the IUT, which should have been
confirmed by results of the test cases executed earlier.

[ITU-T X.290 3.6.3]


3.1.2
abstract test suite (ATS)
A test suite composed of abstract test cases. [ITU-T X.290 3.6.16]

3.1.3
basic interconnection test (BIT)

Limited testing of an IUT to determine whether or not there is sufficient conformance to the main
features of the relevant protocol(s) for interconnection to be possible, without trying to perform
thorough testing. [ITU-T X.290 3.5.5]

3.1.4
behaviour testing
Testing the extent to which the dynamic conformance requirements are met by the IUT. [ITU-T X.290
3.5.8]

3.1.5
capabilities of an IUT
That set of functions and options in the relevant protocol(s) and, if appropriate, that set of facilities and
options of the relevant service definition which are supported by the IUT. [ITU-T X.290 3.4.5]

3.1.6
capability testing
Testing to determine the capabilities of an IUT.
NOTE This involves checking all mandatory capabilities and those optional ones that are stated in the CTI as being
supported, but not checking those optional ones which are stated in the CTI as not supported by the IUT.

[ITU-T X.290 3.5.6, modified]


3.1.7
conformance assessment process
The complete process of accomplishing all conformance testing activities necessary to enable the
conformance of an implementation to one or more OSI* Recommendations* to be assessed. It
includes the production of the CTI documents, preparation of the real tester and the IUT, the execution
of one or more test suites, the analysis of the results and the production of the appropriate protocol
conformance test reports. [ITU-T X.290 3.5.10, modified]

3.1.8
conformance log
A record of sufficient information necessary to verify verdict assignments as a result of conformance
testing. [ITU-T X.290 3.7.15]

3.1.9
conformance test information (CTI)
A statement made by the supplier or implementor of an IUT stating the capabilities and options that
have been implemented and additional information necessary to select and parameterize the
executable test cases.
NOTE 1 Part of this information on the implementation may be taken from the IUT itself.
NOTE 2 X.290 uses the terms Protocol Implementation Conformance Statement (PICS) and Protocol Implementation Extra
Information for Testing (PIXIT).
3.1.10
conformance testing
Testing the extent to which an IUT is a conforming implementation. [ITU-T X.290 3.5.9]

3.1.11
conforming implementation
An IUT, which is shown to satisfy conformance requirements, consistent with the capabilities stated in
the CTI.


NOTE In case of DLMS/COSEM, the capabilities are partly declared in the CTI and they are partly provided by the IUT.

[ITU-T X.290 3.4.10, modified]

3.1.12
executable test case
A realization of an abstract test case. [ITU-T X.290 3.6.4]

3.1.13
executable test case error
A test case error in the realization of an abstract test case.

3.1.14
executable test suite (ETS)
A test suite composed of executable test cases. [ITU-T X.290 3.6.17]

3.1.15
“fail” verdict
A verdict given when the observed outcome is syntactically invalid or inopportune with respect
to the relevant Recommendation(s)* or the CTI. [ITU-T X.290 3.7.13, modified]

3.1.16
foreseen outcome
An outcome identified or categorized in the abstract test case specification. [ITU-T X.290 3.7.4]

3.1.17
idle testing state
A stable testing state in which there is no established connection of the relevant protocol(s) and in
which the state of the IUT is independent of any previously executed test cases.

3.1.18
implementation under test (IUT)
That part of a real open system which is to be studied by testing, which should be an implementation
of one or more OSI* protocols in an adjacent user/provider relationship.
NOTE In DLMS/COSEM, the IUTs are DLMS/COSEM servers.

[ITU-T X.290 3.4.1]

3.1.19
“inapplicable” test
A test case, which cannot be performed because the necessary conditions are not available.

3.1.20
“inconclusive” verdict
A verdict given when the observed outcome is valid with respect to the relevant Recommendation(s)*
but prevents the test purpose from being accomplished. [ITU-T X.290 3.7.14]

3.1.21
initial testing state
The testing state in which a test body starts.
NOTE This may be either a stable testing state or a transient state.
3.1.22
inopportune test event
A test event which, although syntactically correct, occurs or arrives at a point in an observed outcome
when not allowed to do so by the protocol Recommendation*. [ITU-T X.290 3.7.11]


3.1.23
means of testing (MOT) (IUTs)
The combination of equipment and procedures that can perform the derivation, selection,
parameterization and execution of test cases, in conformance with a reference standardized ATS, and
can produce a conformance log.

3.1.24
negative test
Test to verify the correct response of the IUT to:

• DLMS/COSEM conformant information and services which are not implemented;
• non-conformant communication traffic.
3.1.25
outcome
A sequence of test events together with the associated input/output, either identified by an abstract
test case specifier, or observed during test execution. [ITU-T X.290 3.7.3]

3.1.26
parameterized executable test case
An executable test case, in which all appropriate parameters have been supplied with values in
accordance with a specific CTI. [ITU-T X.290 3.6.23, modified]

3.1.27
“pass” verdict
A verdict given when the observed outcome satisfies the test purpose and is valid with respect to the
relevant Recommendation(s)* and with respect to the CTI. [ITU-T X.290 3.7.12, modified]

3.1.28
positive test
Test to ensure the correct implementation of the capabilities of the IUT as defined by the supplier. A
positive test has a described and defined response.

3.1.29
postamble
The test steps needed to define the paths from the end of the test body up to the finishing stable state
for the test case. [ITU-T X.290 3.6.9]

3.1.30
preamble
The test steps needed to define the path from the starting stable state of the test case up to the initial
state from which the test body will start. [ITU-T X.290 3.6.7]

3.1.31
protocol conformance test report (PCTR)
A document written at the end of the conformance assessment process, giving the details of the
testing carried out for a particular protocol, including the identification of the abstract test cases for
which corresponding executable test cases were run and for each test case the test purpose and
verdict. [ITU-T X.290 3.7.8]

3.1.32
repeatability (of results)
Characteristic of a test case, such that repeated executions on the same IUT lead to the same verdict,
and by extension a characteristic of a test suite. [ITU-T X.290 3.7.1]

3.1.33
semantically invalid test event
A test event which is neither inopportune nor syntactically invalid, but which contains a semantic error
with respect to the relevant protocol specification (e.g. a PDU containing a parameter value outside
the negotiated range for that parameter).


3.1.34
stable testing state
A testing state which can be maintained, without prescribed Lower Tester behaviour, sufficiently long
to span the gap between one test case and the next in a test session.

3.1.35
static conformance review
A review of the extent to which the static conformance requirements are met by the IUT, by comparing
the static conformance requirements expressed in the relevant Recommendation(s)* with the PICS
and the results of any associated capability testing. [ITU-T X.290 3.5.7 modified]

3.1.36
syntactically invalid test event
A test event which syntactically is not allowed by the protocol Recommendation*. [ITU-T X.290 3.7.10]

3.1.37
test body
The set of test steps that are essential in order to achieve the test purpose and assign verdicts to the
possible outcomes. [ITU-T X.290 3.6.8]

3.1.38
test case
A generic, abstract or executable test case. [ITU-T X.290]

3.1.39
test case error
The term used to describe the result of execution of a test case when an error is detected in the test
case itself.

3.1.40
test event
An indivisible unit of test specification at the level of abstraction of the specification (e.g. sending or
receiving a single PDU). [ITU-T X.290 3.6.11]

3.1.41
test group
A named set of related test cases. [ITU-T X.290 3.6.14]

3.1.42
test group objective
A description of the common objective which the test purposes within a specific test group are
designed to achieve.

3.1.43
test laboratory
An organization that carries out conformance testing. This can be a third party, a user organization, an
Administration*, or an identifiable part of the supplier organization. [ITU-T X.290 3.4.13]

3.1.44
test purpose
A description of the objective which an abstract test case is designed to achieve. [ITU-T X.290 3.6.5]

3.1.45
test step (sub-test)
A named subdivision of a test case, constructed from test events and/or other test steps, and used to
modularize abstract test cases. [ITU-T X.290 3.6.10, modified]


3.1.46
test session
The process of executing the Parameterized Executable Test Suite for a particular IUT and producing
the conformance log.

3.1.47
test suite
A complete set of test cases, possibly combined into nested test groups, that is necessary to perform
conformance testing or basic interconnection testing for an IUT or protocol within an IUT.
[ITU-T X.290 3.6.12]

3.1.48
unforeseen test outcome
An outcome not identified or categorized in the abstract test case specification. [ITU-T X.290 3.7.5]

3.1.49
valid test event
A test event which is allowed by the protocol Recommendation*, being both syntactically correct and
occurring or arriving in an allowed context in an observed outcome. [ITU-T X.290 3.7.9]

3.1.50
verdict
Statement of “pass”, “fail” or “inconclusive” concerning conformance of an IUT with respect to a test
case that has been executed and which is specified in the abstract test suite. [ITU-T X.290 3.7.6]

3.2 Abbreviations
Abbreviation Explanation

AA Application Association
APDU Application Protocol Data Unit
ATS Abstract Test Suite
COSEM Companion Specification for Energy Metering
COSEM object An instance of an interface class
CO Connection oriented
CTI Conformance Test Information
CTT Conformance Test Tool
DLMS Device Language Message Specification
ETS Executable Test Suite
HDLC High-level Data Link Control
IEC International Electrotechnical Commission
IP Internet Protocol
ITU International Telecommunication Union
IUT Implementation Under Test
LD Logical Device
MOT Means of Testing
OBIS OBject Identification System
OSI Open System Interconnection
PDU Protocol Data Unit
SAP Service Access Point
TCP Transmission Control Protocol


4 Conformance testing – overview


4.1 OSI conformance testing
The concept and methodology of OSI conformance testing is described in the
Recommendation ITU-T X.290.

The objective of conformance testing is to establish whether the Implementation Under Test
(IUT) conforms to the relevant specification(s).

Practical limitations make it impossible to be exhaustive, and economic considerations may
restrict testing still further.

The primary purpose of conformance testing is to increase the probability that different
implementations are able to interwork. While conformance is a necessary condition, it is not
on its own a sufficient condition to guarantee interworking capability. Even if two
implementations conform to the same protocol specification, they may fail to interwork fully.

What conformance testing does do is give confidence that an implementation has the required
capabilities and that its behaviour conforms consistently in representative instances of
communication.

4.2 DLMS/COSEM conformance testing


The DLMS/COSEM specification, as a global standard for data exchange with utility metering
equipment, includes standardized conformance tests to ensure that IUTs comply with
applicable requirements. It is based on the principles developed for OSI conformance testing.

The main elements of the DLMS/COSEM specification are the following:

• the Blue Book, specifying the COSEM interface object model, see DLMS UA 1000-1; and
• the Green Book, specifying the communication profiles, see DLMS UA 1000-2.

NOTE The contents of the Blue Book and the Green Book are internationally standardized, see the Bibliography.

The DLMS/COSEM conformance testing process comprises the following, see Figure 1:

• the Yellow book – this document – specifying the conformance testing process;
• the conformance test plans, i.e. the Abstract Test Suites (ATSs);
• the Conformance Test Tool (CTT);
• the conformance assessment process;
• the certification process;
• the quality program.
The conformance test plans, i.e. the Abstract Test Suites (ATSs), describe, at an abstract
level, the tests to be performed. See Clause 5.

The DLMS/COSEM Conformance Test Tool (CTT) implements the ATSs in the form of
Executable Test Suites. See Clause 6.

The conformance assessment process consists of the phases of preparation for testing, test
operations and the production of the conformance Test Result. See Clause 7.

The certification process consists of examining the conformance Test Results and the
publication of the Certifications. See Clause 8.

The quality program includes handling comments and questions and, when necessary,
initiating the maintenance of the DLMS/COSEM specification, the ATSs and/or the CTT. See
Clause 9.


[Figure: the DLMS/COSEM specification (Blue Book and Green Book), maintained by the DLMS UA WG Maintenance, is the basis for the DLMS UA Yellow Book conformance test plans (Abstract Test Suites) and for the DLMS/COSEM Conformance Test Tool (Executable Test Suites). Test parameters taken from the CTI and from the IUT yield the parameterized Executable Test Suites used in the conformance assessment. The resulting test report leads either to correcting defects or, if no defects are found, to Certification; comments and questions are fed back to the maintenance of the specification.]

Figure 1 – DLMS/COSEM conformance testing process

4.3 Main features of DLMS/COSEM conformance testing process


The main features of the DLMS/COSEM conformance testing process are summarized below:

• it covers DLMS/COSEM servers implementing the COSEM interface object model and the
DLMS/COSEM application layer, including the security suites. Conformance testing is limited to
the server’s functionality as presented at the communication interface(s). Other functions of the
server are out of scope;
• testing can be performed using either the 3-layer, CO, HDLC based profile or using the TCP/IP
based profile. When the 3-layer, CO, HDLC based profile is used with direct HDLC connection, the
implementation of the HDLC layer is also tested;
• the CTT can be used for self-testing or third party testing;
• to obtain a Certification, the manufacturer of the IUT shall possess a registered three-letter
manufacturer ID, see http://dlms.com/organization/flagmanufacturesids/index.html;
• the CTT automatically generates the Test Result necessary for the Certification, see 7.3.8;
• the Certification is issued by the DLMS UA to the manufacturer, see Clause 8;
• the DLMS UA operates a Quality program to maintain the test plans and the CTT, see Clause 9.


5 The conformance test plans


5.1 Scope of testing
The communication model of DLMS/COSEM servers to be tested is shown in Figure 2.

[Figure: the COSEM interface object model (messaging) sits on top of two alternative communication profiles. In the 3-layer, connection-oriented, HDLC based profile, the COSEM application layer (ACSE and xDLMS) is transported by the HDLC based data link layer over the physical layer. In the TCP/IP based profile, the COSEM application layer is transported by the TCP based COSEM transport layer over the network layer (IP), a data link layer and a physical layer.]

Figure 2 – DLMS/COSEM interface object model and communication profiles

The COSEM Interface object model, specified in DLMS UA 1000-1, and the DLMS/COSEM
Application layer, specified in DLMS UA 1000-2, are used in all IUTs.

The selection of the lower layers depends on the communication profile:

• in the 3-layer, connection-oriented, HDLC based communication profile, the DLMS/COSEM
Application layer is supported by the data link layer using HDLC protocol, specified in DLMS UA
1000-2 Ed. 7.0:2009 Clause 8, and this is supported by the physical layer specified in DLMS UA
1000-2 Ed. 7.0:2009 Clause 5;
• in the TCP/IP based communication profile, the DLMS/COSEM Application layer is supported by
the DLMS/COSEM transport layer specified in DLMS UA 1000-2 Ed. 7.0:2009 Clause 7, and this
is supported by a set of lower layers appropriate for the communication media.
See also 7.3.5.
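
For readers who prefer a compact view, the layering described above can be written down as simple data. The sketch below is illustrative only; the class and variable names are not part of the DLMS UA specification.

```python
# Illustrative only: the two DLMS/COSEM communication profiles described
# above, modelled as simple layer stacks (top to bottom). The names used
# here are ad hoc and not defined by the DLMS UA.
from dataclasses import dataclass
from typing import Tuple


@dataclass(frozen=True)
class CommunicationProfile:
    name: str
    layers: Tuple[str, ...]  # ordered from the application layer down to the physical layer


HDLC_PROFILE = CommunicationProfile(
    name="3-layer, connection-oriented, HDLC based",
    layers=(
        "COSEM application layer (ACSE + xDLMS)",
        "Data link layer using HDLC (DLMS UA 1000-2 Ed. 7.0, Clause 8)",
        "Physical layer (DLMS UA 1000-2 Ed. 7.0, Clause 5)",
    ),
)

TCP_IP_PROFILE = CommunicationProfile(
    name="TCP/IP based",
    layers=(
        "COSEM application layer (ACSE + xDLMS)",
        "COSEM transport layer, TCP based (DLMS UA 1000-2 Ed. 7.0, Clause 7)",
        "Network layer (IP)",
        "Lower layers appropriate for the communication media",
    ),
)

if __name__ == "__main__":
    for profile in (HDLC_PROFILE, TCP_IP_PROFILE):
        print(profile.name)
        for layer in profile.layers:
            print("   ", layer)
```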


The conformance test plans cover:

• the data link layer using the HDLC protocol;


• the DLMS/COSEM Application layer;
• the COSEM Interface objects; and
• the Security Suite 0.
The TCP and IP layers, when used, are implicitly tested.

5.2 IUT testing


A single IUT is tested against a single test source.

For the purposes of testing, the IUT is considered as a black box. The test session consists of
sending messages by the CTT to the IUT and observing the responses.

As access to protocol layer boundaries is not available, the interface object model and the
protocol stack are tested in combination. Therefore, the following assumptions are made:

• for testing the data link layer using the HDLC protocol, it is assumed that the physical layer works
correctly;
• for testing the DLMS/COSEM Application layer, it is assumed that the supporting layers work
correctly;
• for testing the COSEM Interface object model, it is assumed that the protocol stack works
correctly.
5.3 Structure of the abstract test plans
The abstract test plan comprises abstract test suites (ATSs).
NOTE The remaining part of 5.3 and 5.4 applies strictly to the protocol layer test plans only.

The abstract test suites have a hierarchical structure (see Figure 3) in which an important
level is the test case.

Each test case has a specified test purpose, such as verifying that the IUT has a certain
required capability (e.g. the ability to support certain packet sizes) or exhibits a certain
required behaviour (e.g. behaves as required when a particular event occurs in a particular
state).

Within a test suite, nested test groups are used to provide a logical ordering of the test cases.

Associated with each test group is a test group objective.

Test cases may be modularised by using named subdivisions called subtests. Test events are
indivisible units of specification within a test step (e.g. the transfer of a single PDU to or from
the IUT).


[Figure: a test suite contains test groups; test groups contain test cases; test cases are built from test steps; test steps consist of test events.]

Figure 3 – Test suite structure

Test suites include test cases falling in the following categories (the list is not exhaustive):

• capability tests;
• tests of valid behaviour (positive tests);
• tests of syntactically invalid or inopportune behaviour (negative tests);
• tests related to each protocol state;
• PDU encoding variations;
• variations in values of individual parameters and/or combination of parameters.
5.4 Abstract test cases
An abstract test case is derived from a test purpose and from the relevant specifications. An
abstract test case:

• has a Test case name, used as a reference and relating the test case to the test group and the
test suite;
• gives the References pointing to the relevant clauses of the Blue Book DLMS UA 1000-1 and the
Green Book DLMS UA 1000-2 constituting the base specification the test case is related to and
derived from;
• describes the Test purpose;
• specifies the Prerequisites;
• specifies the Expected result, i.e. the expected behaviour of the IUT;
• specifies the test steps needed to define the path from the starting stable state of the test case up
to the initial state from which the test body will start; this test sequence comprises the Preamble;


• specifies the sequences of foreseen test events necessary in order to achieve the test purpose.
These sequences comprise the Test body. It may consist of one or more subtests;
• specifies the verdict to be assigned to each foreseen test outcome;
• specifies the test steps needed to define the paths from the end of the test body up to the finishing
stable state for the test case; this test sequence comprises the Postamble.
The abstract test cases are formatted using the template shown in Table 1.

Table 1 – Template for test cases

Test case

References

Test purpose

Prerequisites

Expected result

Preamble

Test body Subtest 1

Subtest n:

Postamble

Comments
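
Read as a record, the template of Table 1 gives every abstract test case a name, references, a purpose, prerequisites, an expected result, a preamble, a test body of one or more subtests and a postamble. The following non-normative sketch renders that template in code; all class, field and example names are invented for illustration.

```python
# Non-normative sketch of the abstract test case template of Table 1.
# Field and class names are illustrative and not defined by the DLMS UA.
from dataclasses import dataclass
from typing import List


@dataclass
class Subtest:
    name: str
    test_events: List[str]          # e.g. "send request", "check response"


@dataclass
class AbstractTestCase:
    name: str                        # relates the case to its test group and suite
    references: List[str]            # relevant Blue Book / Green Book clauses
    test_purpose: str
    prerequisites: List[str]
    expected_result: str
    preamble: List[str]              # from the starting stable state to the initial state
    test_body: List[Subtest]         # one or more subtests achieving the test purpose
    postamble: List[str]             # back to the finishing stable state
    comments: str = ""


# Hypothetical example, loosely modelled on an application layer test:
example = AbstractTestCase(
    name="APPL_OPEN_1 (hypothetical)",
    references=["DLMS UA 1000-2 Ed. 7.0, application layer clause (illustrative)"],
    test_purpose="Verify that the IUT accepts a valid AA establishment request.",
    prerequisites=["IUT reachable in the idle testing state"],
    expected_result="The IUT returns a valid response and the AA is established.",
    preamble=["bring the IUT to the idle state"],
    test_body=[Subtest("establish AA", ["send request", "check response"])],
    postamble=["release the AA", "return to the idle state"],
)
```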

5.5 Outcomes and verdicts


The outcome is the sequence of test events observed during test execution.

A foreseen test outcome is one which has been identified in the abstract test case, i.e. the
events which occurred during test execution matched a sequence of test events defined in the
abstract test case. A foreseen test outcome always results in the assignment of a test verdict
to the test case.

The test verdict will be PASSED, FAILED, INAPPLICABLE or INCONCLUSIVE:

• PASSED – Means that the observed test outcome gives evidence of conformance to the
conformance requirement(s) on which the test purpose of the test case is focused, and is valid
with respect to the relevant specification(s);
• FAILED – Means that the observed test outcome either demonstrates non-conformance with
respect to (at least one of) the conformance requirement(s) on which the test purpose of the test
case is focused, or contains at least one invalid test event, with respect to the relevant
specification(s);
• INAPPLICABLE – Means that the test case cannot be run with the given CTI declarations and with
the information taken from the IUT;
• INCONCLUSIVE – Means that the observed test outcome is such that neither a pass nor a fail
verdict can be given.
An unforeseen test outcome is one which has not been identified by the abstract test case,
i.e. the events which occurred during execution of the test case did not match any sequence
of test events defined in the abstract test case. An unforeseen test outcome always results in
the recording of a test case error or an abnormal test case termination for the test case.

A test case error is recorded if an error is detected either in the abstract test case itself (i.e.
an abstract test case error) or in its realization (i.e. an executable test case error).


An abnormal test case termination is recorded if the execution of the test case is prematurely
terminated by the test system for reasons other than test case error.

The results of executing the relevant individual test cases will be recorded in the conformance
Test Results.
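
A minimal sketch of the result handling described in this subclause is given below: foreseen outcomes map to one of the four verdicts, while unforeseen outcomes are recorded as a test case error or an abnormal test case termination. The code structure is illustrative and is not the CTT's internal representation.

```python
# Illustrative sketch of the result handling described in 5.5; not the CTT.
from enum import Enum
from typing import Optional


class Verdict(Enum):
    PASSED = "PASSED"
    FAILED = "FAILED"
    INAPPLICABLE = "INAPPLICABLE"
    INCONCLUSIVE = "INCONCLUSIVE"


class UnforeseenResult(Enum):
    TEST_CASE_ERROR = "test case error"
    ABNORMAL_TERMINATION = "abnormal test case termination"


def record_result(outcome_is_foreseen: bool,
                  verdict: Optional[Verdict],
                  terminated_prematurely: bool = False):
    """Return what is recorded for one executed test case."""
    if outcome_is_foreseen:
        # A foreseen test outcome always results in one of the four verdicts.
        assert verdict is not None
        return verdict
    # An unforeseen test outcome never yields a verdict; it is recorded as a
    # test case error or, if the test system stopped the case prematurely for
    # another reason, as an abnormal test case termination.
    if terminated_prematurely:
        return UnforeseenResult.ABNORMAL_TERMINATION
    return UnforeseenResult.TEST_CASE_ERROR


print(record_result(True, Verdict.PASSED))    # Verdict.PASSED
print(record_result(False, None))             # UnforeseenResult.TEST_CASE_ERROR
print(record_result(False, None, True))       # UnforeseenResult.ABNORMAL_TERMINATION
```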

5.6 The HDLC based data link layer ATS


The HDLC based data link layer ATS is specified in DLMS UA 1001-3: ATS_DL V 5. Its
structure is shown in Figure 4.

[Figure: the data link layer (HDLC) test suite contains the test groups HDLC_FRAME, HDLC_ADDRESS, HDLC_NDM2NRM, HDLC_INFO and HDLC_NDMOP, with subgroups for positive (_P) and negative (_N) tests: HDLC_FRAME_P, HDLC_FRAME_N, HDLC_ADDRESS_P, HDLC_ADDRESS_N, HDLC_NDM2NRM_P, HDLC_INFO_P, HDLC_INFO_N and HDLC_NDMOP_N, each containing the test cases.]

Figure 4 – Structure of the HDLC based data link layer ATS

5.7 The DLMS/COSEM application layer ATS


The DLMS/COSEM application layer ATS is specified in DLMS UA 1001-6:
ATS_AL_COSEM_SYMSEC_0 V 1.3. Its structure is shown in Figure 5.

[Figure: the DLMS/COSEM application layer test suite contains the test groups APPL_IDLE, APPL_OPEN, APPL_DATA_LN, APPL_DATA_SN and APPL_REL, with subgroups APPL_IDLE_N, APPL_DATA_LN_N, APPL_DATA_SN_N and APPL_REL_P, each containing the test cases.]

Figure 5 – Structure of the DLMS/COSEM application layer ATS

5.8 The COSEM interface objects ATS


The COSEM interface objects ATS is specified in DLMS UA 1001-6:
ATS_AL_COSEM_SYMSEC_0 V 1.3. Its structure is shown in Figure 6.


[Figure: the COSEM interface objects test suite is organised per object (COSEM_X_Y) and covers the mandatory objects as well as objects with multiple references.]

Figure 6 – Structure of the COSEM interface objects ATS

5.9 The Security Suite 0 (SYMSEC_0) ATS


The Security Suite 0 (SYMSEC_0) ATS is specified in DLMS UA 1001-6:
ATS_AL_COSEM_SYMSEC_0 V 1.3. Its structure is shown in Figure 7.

[Figure: the DLMS/COSEM Security Suite 0 (SYMSEC) test suite contains the test groups Basic capability, FrameCounter, Key transfer, Secure data exchange, Secure AA release and Security policy, with subgroups Key_Tx_P, Key_Tx_N, DataX_P and DataX_N, each containing the test cases.]

Figure 7 – Structure of the Security Suite 0 (SYMSEC_0) ATS

5.10 Executable test suites and test cases


The executable test suites and test cases are derived from the relevant ATSs. There is one
ETS for each ATS and one executable test case for each abstract test case.

In each executable test case, the sequences of test events and the verdict assignments are
the same as in the corresponding ATS. The depth of analysing the PDUs is as specified in the
ATS.

6 The DLMS/COSEM conformance test tool


6.1 Overview
The DLMS/COSEM conformance test tool (CTT) is a Means Of Testing (MOT) as defined in
3.1.23, i.e. an implementation of the abstract test suites (ATS) in the form of executable test
suites (ETS).


It is a computer program running on a PC under 64-bit Windows (versions 7 and 8.1), acting as
a test DLMS/COSEM client, whereas the Implementation Under Test (IUT) acts as a
DLMS/COSEM server, see 6.3. The CTT is available in two editions:

• the standard edition produces the Test Result files (Report, Log and Line Traffic) necessary for
Certification;
• the extended edition provides a detailed log by decoding the messages exchanged between the
CTT and the IUT and thereby it facilitates interpreting the log.
The CTT can perform the following:

• the selection of test options, see 7.3.2;


• the parametrization of the test cases taking information from a Conformance Test Information
(CTI) file and the IUT itself, see 7.3.3 and 7.3.4;
• the selection of test cases to be performed, see 7.3.5;
• running test sessions, i.e. automatically executing the test cases selected; see 7.3.7;
• the generation of the Test Result, see 7.3.8.
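
The capabilities listed above can be pictured as a small pipeline: parameterize the selected test cases from the CTI and the IUT, skip what is not applicable, execute the rest and collect the Test Result. The sketch below is schematic only; the function names are invented and do not correspond to the CTT's actual interface.

```python
# Schematic of the CTT workflow listed above (options, parameterization,
# selection, execution, report). All names are invented for illustration.
from typing import Dict, List


def applicable(case_name: str, params: Dict) -> bool:
    # Placeholder: applicability depends on the CTI declarations and the IUT configuration.
    return True


def execute(case_name: str, params: Dict) -> str:
    # Placeholder for running one parameterized executable test case.
    return "PASSED"


def run_conformance_session(cti: Dict, iut_info: Dict,
                            selected: List[str], options: Dict) -> Dict:
    """Schematic flow: parameterize, check applicability, execute, report."""
    report = {"options": options, "results": {}}
    for case_name in selected:
        params = {**cti, **iut_info}   # information from the CTI file and the IUT itself
        if not applicable(case_name, params):
            report["results"][case_name] = "INAPPLICABLE"
            continue
        report["results"][case_name] = execute(case_name, params)
    return report                       # basis of the Test Result (7.3.8)


demo = run_conformance_session(cti={}, iut_info={}, selected=["EXAMPLE_CASE"], options={})
print(demo["results"])   # {'EXAMPLE_CASE': 'PASSED'}
```
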
6.2 CTT versions and editions
CTT 2.X is suitable for testing IUTs implementing Blue Book Ed. 10.0 and Green Book Ed. 6.0
(except the authentication mechanism using HLS).

CTT 3.0 is suitable for testing IUTs implementing Blue Book Ed 11.0 and Green Book Ed. 7.0
+ Amendment 3. It is available now and replaces CTT 2.X.

CTT 3.1 BB12_GB8 (X) – under development – will be suitable for testing IUTs implementing
Blue Book Ed. 12.0 and Green Book Ed. 8.0.

A new version of the CTT obsoletes all earlier versions of the CTT after a transition period.

Earlier versions of the CTT can be used for re-testing earlier implementations.

The CTT standard edition allows performing all tests and produces the Report, Log and Line
Traffic files.

The CTT Extended edition produces a more detailed log file and it allows logging and viewing:

• in the case of the 3-layer, CO, HDLC based profile the HDLC frames in a decoded form;
• in the case of the TCP/IP profile the wrapper frames in a decoded form;
• the COSEM APDUs in XML format.
6.3 Operating system and hardware requirements
The CTT runs on a host computer under the 64-bit versions of Windows 7 and Windows 8.1.

6.4 Licensing the CTT


The CTT can be licensed to any member of the DLMS UA.

The “Rules for availability and use” are published at the homepage of the DLMS UA, at
www.dlms.com under the “Conformance” menu.

6.5 Installing the CTT


The CTT can be downloaded from www.eurodcs.com and installed following the process
described by EuroDCS.


7 The conformance assessment process


7.1 Overview
The conformance assessment process is the complete process of accomplishing all
conformance testing activities necessary to enable the conformance of the IUT to be
assessed.

The test can be performed by the manufacturer or by another (third) party:

• when the test is performed by the manufacturer itself, the CTT licensee and the manufacturer are
the same: this is known as self-testing;
• when the test is performed by another (third) party, the CTT licensee and the manufacturer are
different: this is known as third party testing. Any CTT licensee may act as a third party test
laboratory.
An overview of the conformance assessment process is given in Figure 8.

[Figure: flow from Start through preparation of the IUT for testing, followed by the test operations (test selection and parametrization using the CTI, the test session, and the conformance test report and log), to End.]

Figure 8 – Conformance assessment process overview


The preparation for testing phase involves:

• the preparation of the IUT, see 7.2.1;
• the preparation of the Conformance Test Information (CTI) file, see 7.2.2.

The test operations include:

• selection and parameterization of the test cases, see 7.3.3, 7.3.4 and 7.3.5;
• connection of the IUT to the CTT, see 7.3.6;
• running the test, see 7.3.7;
• production of the Test Result, see 7.3.8.
In the following, the elements of the conformance test process are described and the use of
the CTT is explained.

7.2 Preparation for testing


7.2.1 Preparation of the IUT
The configuration of the IUT for the test is the responsibility of the manufacturer. To facilitate
system integration, it is advisable that the test is performed on a configuration that is
representative for the intended application(s) so that all required features for which
DLMS/COSEM compliance is claimed are tested.

The following provides a guideline:

• if the IUT supports more than one logical device, then at least two logical devices should be
configured;
• if the IUT supports more than one application context, authentication security mechanism and
xDLMS context then the set of AAs declared shall cover each context and mechanism declared.
These AAs may be in the same logical device or spread across the logical devices;
• an AA between the same client and the server may be declared several times with different
application contexts, authentication mechanisms, xDLMS contexts and security contexts as
needed;
• if the IUT is a complete – fully integrated or modular – meter, then the mandatory Management
Logical Device (Server SAP = 0x01) shall be present and it shall support an AA with the public
client (Client SAP = 0x10, Server SAP = 0x01, no ciphering, no authentication security);
• if the IUT is a communication module, then the Management Logical Device (Server SAP = 0x01)
does not have to be present: it is assumed that it is present in the base meter. However a Logical
Device of the communication module shall support an AA with the public client, see above;
• the set of xDLMS services and capabilities (i.e. the conformance block) should be representative
for the intended application;
• the set of security features should be representative for the intended application;
• the set of interface objects available should be representative for the intended application;
• the AAs shall provide access to the objects and attributes to be tested, with appropriate access
rights;
• if load profiles with selective access are to be tested, then a sufficient amount of data should be
present. The conditions are specified in the COSEM objects test plan;
• it is the responsibility of the manufacturer to restrict access rights to attributes so that the CTT
cannot unduly modify them. This can be done by providing interface class and/or instance related
extra information in the CTI.
For testing IUTs supporting Security Suite 0, additional requirements are specified in the
Security Suite 0 (SYMSEC_0) ATS.


For testing IUTs supporting push operation, additional requirements are specified in the
COSEM objects ATS.

See also 8.6, Scope and validity of the Certification.
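
Some of the guidelines above are concrete enough to be pre-checked mechanically, for example the presence of the Management Logical Device and of its AA with the public client in a complete meter. The sketch below is only an illustration of such a pre-check; the data layout is an assumption made for the example and is not prescribed by this document.

```python
# Illustrative pre-check of two of the IUT preparation guidelines in 7.2.1:
# a complete meter shall contain the Management Logical Device (server SAP 0x01)
# supporting an AA with the public client (client SAP 0x10, no ciphering, no
# authentication security). The dict layout below is an assumption for this sketch.
MANAGEMENT_LD_SAP = 0x01
PUBLIC_CLIENT_SAP = 0x10


def check_complete_meter(logical_devices):
    """logical_devices: list of dicts with 'sap' and 'associations' keys."""
    problems = []
    mgmt = next((ld for ld in logical_devices if ld["sap"] == MANAGEMENT_LD_SAP), None)
    if mgmt is None:
        problems.append("Management Logical Device (SAP 0x01) is missing.")
        return problems
    public_aa = any(
        aa["client_sap"] == PUBLIC_CLIENT_SAP
        and not aa.get("ciphering") and not aa.get("authentication")
        for aa in mgmt["associations"]
    )
    if not public_aa:
        problems.append("No AA with the public client (SAP 0x10) on the Management LD.")
    return problems


config = [{"sap": 0x01,
           "associations": [{"client_sap": 0x10, "ciphering": False, "authentication": False}]}]
print(check_complete_meter(config))   # [] means both checks pass
```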

7.2.2 Preparation of the conformance test information


For the parametrization of the executable test cases, the following information is necessary:

• information on the manufacturer;


• information on the IUT:
• logical devices;
• application associations;
• authentication security mechanisms;
• xDLMS context;
• security context and security material;
• media / energy types supported;
• COSEM interface objects.
Part of this information is obtained by the CTT from the IUT itself during the test session through
the negotiation of the application context, the authentication mechanism and the xDLMS
context, as well as by reading the object_list attribute of the “Association LN” / “Association SN”
objects and, in the case of SN referencing, the access_rights_list attribute of the “Association SN”
object (supported from version 1 of that IC).

Another part of this information has to be declared in the CTI.

The CTI file identifies the manufacturer and the IUT and contains specific information
necessary for testing. It can be prepared using the CTI template provided by the CTT. A Help,
explaining the syntax and the contents of the CTI is provided.

The CTI template is shown in DLMS UA 1001-6: ATS_AL_COSEM_SYMSEC_0 V 1.3, Annex A.
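
The division between declared and IUT-provided information can be illustrated with a plain data structure. The sketch below does not reproduce the CTI file syntax (which is described in the CTT Help and in DLMS UA 1001-6 Annex A); it only groups the items listed in this subclause, with invented field names.

```python
# Illustration only: the information needed to parameterize the executable test
# cases (7.2.2), grouped into what is declared in the CTI and what the CTT
# obtains from the IUT itself. This is NOT the CTI file syntax.
declared_in_cti = {
    "manufacturer": {"three_letter_id": "XYZ"},   # hypothetical registered manufacturer ID
    "logical_devices": [],                         # declared logical devices
    "application_associations": [],                # declared AAs
    "authentication_mechanisms": [],
    "xdlms_context": {},
    "security_context_and_material": {},
    "media_energy_types": [],
    "extra_object_information": {},                # e.g. access restrictions, see 7.2.1
}

obtained_from_iut = {
    # negotiated with the IUT during the test session:
    "application_context": None,
    "authentication_mechanism": None,
    "xdlms_context": None,
    # read from the "Association LN" / "Association SN" objects:
    "object_list": None,
    # SN referencing only, Association SN version >= 1:
    "access_rights_list": None,
}

print(sorted(declared_in_cti))       # the declared part
print(sorted(obtained_from_iut))     # the part taken from the IUT
```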

7.3 Test operations


7.3.1 The CTT user interface
The CTT Main Menu comprises five items:

• the File menu allows saving and loading test results and exiting the CTT application;
• the Run menu allows running a test session and aborting it, see 7.3.7;
• the View menu allows opening the Test Plans, i.e. the Abstract Test Suites, and the Line Traffic
window;
• the Settings menu allows setting the communication parameters and selecting some other
parameters and choices (Miscellaneous), see 7.3.6 and 7.3.2;
• the Help menu provides information on the CTT and its use.
The CTT also provides a number of panes:

• the REPORT pane displays the Test Report;


• the LOG pane displays a listing of the actions executed by the CTT;
• the CTI pane displays the CTI file;
• the TEST CASES pane allows selecting the test cases;


• the Traffic window – to be opened from the View menu – shows the messages sent (shown in
green) and received (shown in red) by the CTT. Time stamps are provided so that the Line Traffic
can be correlated with the Log. In the case of the 3-layer, CO, HDLC profile it shows the HDLC
frames. In the case of the TCP/IP profile, it shows the TCP streams.
Right-clicking in the REPORT, LOG, CTI panes and in the Traffic window opens a contextual
menu.
7.3.2 Miscellaneous settings
The Miscellaneous tab in the Settings menu – see Figure 9 – offers the following possibilities:

Figure 9 – Miscellaneous settings

• the COSEM object definition .dat file – see 7.3.4 – to be used by the CTT during the COSEM
object tests shall be selected;
NOTE The selections cannot be confirmed (OK) if this field is empty.

• the tester identity e.g. an email address. This will appear in the Test Report;
• the font used in the Test Result documents can be chosen;
• logging of lower layer frames and COSEM APDUs (in XML format) can be enabled. This feature is
available in the extended edition, see 6.2;
• the options on the occurrence of a fatal failure can be chosen, see 7.3.7.
7.3.3 The CTI file
The CTI file can be edited, saved and loaded from the CTI pane.


Figure 10 shows the CTI template loaded, with the contextual menu open, which allows loading,
saving and editing CTI files.

Figure 10 – The CTI window (illustration)

The CTI file content and syntax are described in the Help menu.

The CTI allows declaring some DoNotTest options:

• ATTRIBUTES_TYPE_CHOICES: when this element is present, then CTT does not check the type
of COSEM object attributes specified as a CHOICE type;
• ATTRIBUTES_VALUES: when this element is present, then CTT does not check the values
(ranges, and sub-ranges) of COSEM object attributes.
If the purpose of the test session is to obtain a Certification, then DoNotTest has to be omitted
or empty.
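
A minimal sketch of how the two DoNotTest options could gate the corresponding checks, and of the certification rule above, is shown below; the option names are taken from this subclause, but the logic is illustrative and not the CTT's implementation.

```python
# Illustrative only: effect of the two DoNotTest options described in 7.3.3.
def checks_to_run(do_not_test):
    checks = {"attribute_choice_types": True, "attribute_values": True}
    if "ATTRIBUTES_TYPE_CHOICES" in do_not_test:
        checks["attribute_choice_types"] = False   # CHOICE types not checked
    if "ATTRIBUTES_VALUES" in do_not_test:
        checks["attribute_values"] = False          # values, ranges and sub-ranges not checked
    return checks


def eligible_for_certification(do_not_test):
    # For a Certification run, DoNotTest has to be omitted or empty.
    return not do_not_test


print(checks_to_run(["ATTRIBUTES_VALUES"]))   # {'attribute_choice_types': True, 'attribute_values': False}
print(eligible_for_certification([]))         # True
```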

7.3.4 The COSEM object definition file


The COSEM object definition file, DLMS UA 1001-7, contains all information for testing the
COSEM objects, including:

• the valid OBIS codes that – together with the COSEM interface class – provide the semantic
meaning of all data. In some cases, an OBIS code may be used with alternative interface classes;
• the valid data types of the attributes in the cases where the Blue Book DLMS UA 1000-1 allows
choices.
NOTE The COSEM object definition file currently covers abstract, electricity and gas related COSEM objects.
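
As an illustration of how such a definition table can be used, the sketch below checks an object (OBIS code, interface class, CHOICE attribute types) against a small, hypothetical definition entry. The table layout, field names and example values are invented for this sketch and do not reflect the actual .dat file format.

```python
# Illustration only: checking an IUT object against an object definition entry
# (valid OBIS code / interface class combinations and allowed CHOICE types).
# The layout and values below are hypothetical, not the DLMS UA 1001-7 format.
OBJECT_DEFINITIONS = {
    # (OBIS code, class_id): {attribute index: allowed data types for CHOICE attributes}
    ("1-0:1.8.0.255", 3): {2: {"double-long-unsigned", "long-unsigned"}},  # example entry
}


def check_object(obis, class_id, choice_attributes):
    """choice_attributes: {attribute index: data type declared by the IUT}."""
    definition = OBJECT_DEFINITIONS.get((obis, class_id))
    if definition is None:
        return ["unknown OBIS code / interface class combination"]
    return [
        f"attribute {idx}: type {dtype!r} not allowed"
        for idx, dtype in choice_attributes.items()
        if idx in definition and dtype not in definition[idx]
    ]


print(check_object("1-0:1.8.0.255", 3, {2: "double-long-unsigned"}))   # [] means no problem found
```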

Figure 11 shows the COSEM object definition file cover sheet (version 2.9).



COSEM conformance testing – Object definition tables

Author: DLMS UA WG Maintenance (Gyozo Kmethy, Victoria Varju)
Version: V2.9
Filename: Object_defs_v2.9_released_GKVV141201.xlsx
Revision date: 01/12/2014
Status: Released
Digital signature of .dat file: 389E-C19C-6A7F-0C42-32AE-5C1C-1DBA-2106
Copyright: © Copyright DLMS UA 1997-2014
Classification: DLMS User Association use only
Replaces: Object_defs_v2.8_released_141023.xlsx
Date: 22nd October 2014

Object definitions are valid combinations of logical names, interface classes and the list of data
types for attributes which are declared in the relevant interface class definitions as type "CHOICE".

This version contains object definitions of abstract, electricity and gas related objects.

The textual name of each object is described in a modular way.

For the changes in the various versions, see the "Change log" sheet.

A special tool converts the worksheets to a .dat file for use by the Conformance Test Tool.

CTT 2.X reports all objects found with their logical name, textual name and, in the case of
"CHOICE" attributes, with data type. The access rights to all attributes are also reported.

References:
[1] DLMS UA 1000-1 Ed. 12.0 2014-09-10, COSEM Interface Classes and OBIS Object
Identification System, "Blue Book"
[2] EN 13757-1 Communication system for meters and remote reading of meters – Part 1: Data
exchange

DLMS User Association DLMS UA 1001-7 Ed. 2.9: 2014-12-01

Figure 11 – COSEM object definition file cover sheet

The COSEM object definition file is updated whenever new COSEM objects (OBIS codes
and/or interface classes) are defined by the DLMS UA.

The various versions are publicly available at www.dlms.com as Excel files under the CONFORMANCE and DOCUMENTATION menus.


The CTT uses a .dat file generated from the Excel file. The various versions can be downloaded by licensed CTT users from http://www.eurodcs.com/. The .dat file has to be copied to the same folder where the CTT.exe file is located, see 6.5.
NOTE The Excel file is not used by the CTT.

The Test Report contains the file name and the hash value of the Excel file.

It is recommended to always use the latest version. Earlier versions can be used for re-testing whenever necessary.

7.3.5 Selection of the test cases


The TEST CASES window allows selecting the test cases. The test suites are:

• HDLC, see DLMS UA 1001-3: ATS_DL V 5;


• Application layer (APPL), COSEM object (COSEM) and Symmetric key security (SYMSEC_0) see
DLMS UA 1001-6: ATS_AL_COSEM_SYMSEC_0 V 1.3;
The test cases can be selected by test suite and, within each test suite, one by one; see Figure 12.

During a test session, the CTT executes only those test cases that have been selected and
that are applicable based on the IUT configuration and the CTI declarations.

Test cases that are not selected are marked in the Report and the Log as “SKIPPED”.

If the purpose of the test session is to obtain a Certification, then all test cases shall be
selected. The CTT will automatically run all test cases applicable with the given CTI file and
IUT configuration. The tests that cannot be run are marked as “INAPPLICABLE”.


Figure 12 – Selection of the test cases

7.3.6 Connection of the IUT to CTT


The connection of the IUT to CTT depends on the communication profile used:

• when the 3-layer, CO, HDLC based communication profile is used, the IUT can be connected directly or via an optical probe to a communication port of the host computer running the CTT. This port shall not be shared by any other application while the tests are running. If an optical probe is used, it shall be checked whether the probe is echoing or not.
Battery operated meters can also be tested using the wake-up sequence specified in IEC
62056-21.


• to connect the IUT via a pair of modems, the connection (com) – MODEM – MODEM – IUT has to be established before testing, i.e. the remote modem has to be dialled and the connection established before testing;

• when the TCP/IP based communication profile is used over the GPRS network, the connection OS(com) – GPRS-MODEM – GPRS-MODEM – IUT has to be established before testing.

Figure 13 – Communication settings

The communication parameters can be set from the Settings menu, Communication tab as
shown in Figure 13:

• COM port: here, the communication port used in the case of the 3-layer, CO, HDLC based profile can be selected;
• Format: refers to the physical layer of the 3-layer, CO, HDLC based communication profile and shall be set to 8N1: asynchronous transmission with 1 start bit, 8 data bits, no parity and 1 stop bit (a serial-port sketch with these settings follows this list);
• Echoing: this shall be set if the optical probe used is echoing;
• Server IP: when the TCP/IP based profile is used, the IP address of the IUT shall be entered here;
NOTE The way to obtain the IP address depends on the provider and the SIM card; the process is outside the scope of this document.

• “My IP addresses” holds the IP address of the PC running the CTT. It is needed only if the IUT has to establish the TCP connection to the CTT, i.e. when the IUT is pushing. It shall be configured in the IUT.
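
Purely as an illustration of the 8N1 setting above – this is not part of the CTT, and the port name and baud rate are placeholders – the equivalent configuration using the Python pyserial package would be:

# Illustrative sketch only: open a serial port with the 8N1 settings described
# above (8 data bits, no parity, 1 stop bit). Port name and baud rate are placeholders.
import serial  # requires the pyserial package

port = serial.Serial(
    port="COM1",                   # placeholder: the COM port selected in the CTT
    baudrate=9600,                 # placeholder: depends on the connection used
    bytesize=serial.EIGHTBITS,     # 8 data bits
    parity=serial.PARITY_NONE,     # no parity
    stopbits=serial.STOPBITS_ONE,  # 1 stop bit
    timeout=1.0,
)
port.close()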


7.3.7 Test sessions


If the IUT supports more than one communication profile, then a test session shall be
performed for each communication profile for which compliance is claimed.

If the IUT supports more than one communication interface, then a test session may be
performed on each interface.

A test session can be started from the Run menu in two ways:

• “Run”: the test session runs on the Logical Devices and AAs enabled, with the test cases, media
and “Do not test” options selected. At the end of the test session a “Test Result” .zip file can be
created. Test cases not selected are marked in the Test Report and Log as “SKIPPED”;
• “Run for Certification”: for this, all Logical Devices and AAs must be enabled, all tests cases shall
be selected and there shall be no “Do not test” options specified. At the end of the test session a
“Test Result” .zip file can be created. Test cases that could not be run on the IUT with the given
IUT configuration and CTI declarations are marked in the Test Report and the Log as
“INAPPLICABLE”.
The progress of the test can be followed via the REPORT and LOG panes and the Traffic
window (to be opened from the View menu).

When a failure occurs in a test case that may affect the running of other test cases, the CTT raises a fatal failure. The CTT allows the user to choose one of a predefined set of options:

• “Abort the test session”: if this choice is taken, the test session is aborted;
• “Continue”: if this choice is taken, the test session is continued;
• “Ask the user”: if this choice is taken, the test session is suspended and a dialog box is displayed.
If the operator chooses to continue the test session, it is resumed. If the operator chooses not to
continue, then the test session is aborted.
In all cases, the reason for raising the fatal failure is logged.

In some cases, an EXCEPTION can occur during the test session. These may be caused by
inappropriate settings or by abnormal data received from the IUT. The Help file provides more
information on this.

The test session can be aborted any time from the Run menu.

7.3.8 Production of the Test Result


7.3.8.1 Overview
The Test Result comprises five files that can be produced at the end of the test session via File / Save Test-Result:

• _CTI.txt is the CTI file;


• _Report.txt is a text file holding the content of the "REPORT" pane;
• _Log.txt is a text file holding the content of the "LOG" pane;
• _Traffic.rtf is a rich text file holding the content of the Traffic window;
• _hash.txt is a text file containing a digest of the 4 other files.
The files are best viewed from the CTT itself or using a suitable large-text viewer program.

An existing Test Result file can be loaded using File / Load Test-Result. The files of the zip
archive are displayed in their respective panes and windows. The integrity of the archive is
verified using the _hash file.
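
The digest algorithm used for the _hash file is not described here; purely as an illustration of the idea, and assuming SHA-256 over each of the four files, a verification of an extracted Test Result could look like this (file names as listed above):

# Illustrative sketch only: recompute digests of the extracted Test Result files
# and compare them with the values recorded in the _hash file.
# The SHA-256 algorithm is an assumption for this sketch.
import hashlib
import pathlib

def file_digest(path):
    # Hex digest of one extracted Test Result file.
    return hashlib.sha256(pathlib.Path(path).read_bytes()).hexdigest()

for name in ("_CTI.txt", "_Report.txt", "_Log.txt", "_Traffic.rtf"):
    print(name, file_digest(name))
# Compare the printed values with the content of _hash.txt.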


7.3.8.2 The Report


The main elements of the Report are the following:

1) General information on the test session:


• date of testing;
• CTT version;
• information on the licensee;
• information on the tester;
• whether the test session was started as “Run for Certification”;
2) Identification of the IUT;
3) A summary of results and the features supported (aggregated over all logical devices and AAs):
• number of tests executed for each test suite, and the summary of the verdicts;
• the communication profile supported;
• the application context names supported;
• the ACSE and xDLMS features supported;
• the security features supported;
• information on the Logical Devices found;
• the COSEM interface classes tested;
• the COSEM interface classes found but not tested;
4) the result of HDLC tests (when applicable);
5) the result of the Application layer test cases;
6) the result of the COSEM test cases;
7) the result of the SYMSEC_0 test cases;
8) the CTI file;
9) the name of the Object definition file used and its hash value.
A fragment of a sample report is shown in Figure 14.


********************************************************************************
DLMS Conformance test report
27-MAY-2015 17:17:04
CTT 3.0 extended edition, 64bits (100)
Licensed to: i-cube (21-Jul-05)
Tester: Christian
RUN FOR CERTIFICATION
********************************************************************************

******************
* Identification *
******************

Identification = {
Manufacturer = "\i-cube"
FLAGid = "\ICU"
Type = "\simulation"
SerialNr = "000102"
Comment = 1234
Comment = "\comment 2"
}

***********
* Summary *
***********

TYPE    TOTAL  SKIPPED  INAPPLICABLE  INCONCLUSIVE  PASSED  FAILED
----    -----  -------  ------------  ------------  ------  ------
HDLC    All (Communication profile not HDLC)
APPL    52     0        10            0             42      0
COSEM   988    0        1             0             986     1
SYMSEC  55     0        10            0             44      1

Communication profile supported: TCP

Application context names supported: LONG_NAMES,LONG_NAMES_WITH_CIPHERING

Security mechanisms supported: NO_SECURITY,HIGH_LEVEL_SECURITY_GMAC

Features supported: ACTION,ACTIVATE_SECURITY_POLICY
                    GENERAL_BLOCK_TRANSFER
                    GENERAL_GLO_CIPHERING,GET
                    MULTIPLE_REFERENCES,RLRQ_RLRE
                    SELECTIVE_ACCESS
                    SERVICE_SPECIFIC_BLOCK_TRANSFER,SET

Logical device(s) found: SAP = 1 is "4943553030303030" (ICU00000)
                         SAP = 2 is "4943553030303032" (ICU00002)
                         SAP = 3 is "4943553030303033" (ICU00003)

Tested COSEM classes: 1,3,4,5,6,7(1),8,9,10,11,12(3),15(2),17,18
                      19(1),20,21,22,23(1),24,24(1),25,26,27,28
                      28(2),29,29(1),29(2),40,41,42,43,44,45,46
                      47,48,50(1),51,52,53,55(1),56,57,58,59,61
                      63,64,65,70,71,72,73(1),74,80,81,82,83,84
                      85,86,90(1),91(1),92(1),101,102,103,104,105

COSEM classes found but not tested: (none)

**************
* HDLC Tests *
**************

**************
* APPL Tests *
**************

T_APPL_IDLE_N1

Figure 14 – Fragment of a sample Test Report
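
The summary table of the Report lends itself to simple post-processing. Purely as an illustration – the file name and function name are chosen for this sketch, and the plain-text layout shown in Figure 14 is assumed – a script could extract the per-suite verdict counts and confirm that no test case is FAILED before the Test Result is submitted (see 8.4):

# Illustrative sketch only: parse the summary table of a saved _Report.txt
# (layout as in Figure 14) and report any FAILED test cases.
def parse_summary(report_path):
    counts = {}
    with open(report_path, encoding="utf-8") as f:
        for line in f:
            parts = line.split()
            # Summary rows look like: "APPL 52 0 10 0 42 0"
            if len(parts) == 7 and parts[0] in ("HDLC", "APPL", "COSEM", "SYMSEC") \
                    and all(p.isdigit() for p in parts[1:]):
                counts[parts[0]] = dict(zip(
                    ("total", "skipped", "inapplicable", "inconclusive", "passed", "failed"),
                    map(int, parts[1:])))
    return counts

summary = parse_summary("_Report.txt")  # placeholder file name
failed = {suite: c["failed"] for suite, c in summary.items() if c["failed"]}
print("FAILED test cases per suite:", failed or "none")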


7.3.8.3 The Log


The Log displays the listing of all actions executed by CTT, see Figure 15.
STARTED 27-May-15 17:35:47
0'00.001 Starting server listening on port 4059
0'00.013 HDLC Tests
0'00.014 WARNING: All HDLC tests skipped, communication profile is TCP
0'00.017 APPL Tests
0'00.020 Starting T_APPL_IDLE_N1
0'00.021 T_APPL_IDLE_N1 0/0, ServerSAP = 1, ClientSAP = 16, LONG_NAMES, NO_SECURITY
0'00.569 VERDICT: Data exchange in IDLE state PASSED

0'00.572 Starting T_APPL_OPEN_1
0'00.574 T_APPL_OPEN_1 0/0, ServerSAP = 1, ClientSAP = 16, LONG_NAMES, NO_SECURITY
0'00.576 SubTest 1.Establish an AA using the parameters declared
0'01.744 VERDICT: 1.Establish an AA using the parameters declared PASSED
0'01.746 SubTest 2.Check that the AA has been established
0'01.858 VERDICT: 2.Check that the AA has been established PASSED
0'01.860 SubTest 3.Release the AA
0'01.962 VERDICT: 3.Release the AA PASSED
0'01.965 T_APPL_OPEN_1 0/1, ServerSAP = 1, ClientSAP = 1, LONG_NAMES_WITH_CIPHERING, HIGH_LEVEL_SECURITY_GMAC
0'01.967 SubTest 1.Establish an AA using the parameters declared
0'03.259 VERDICT: 1.Establish an AA using the parameters declared PASSED
0'03.261 SubTest 2.Check that the AA has been established
0'03.374 VERDICT: 2.Check that the AA has been established PASSED
0'03.376 SubTest 3.Release the AA
0'03.480 VERDICT: 3.Release the AA PASSED

Figure 15 – Basic log

The CTT extended edition provides a more detailed log. Figure 16 shows the detailed log
presenting COSEM APDUs in XML format.

STARTED 27-May-15 17:17:04


0'00.001 Starting server listening on port 4059
0'00.002 HDLC Tests
0'00.008 WARNING: All HDLC tests skipped, communication profile is TCP
0'00.008 APPL Tests
0'00.010 Starting T_APPL_IDLE_N1
0'00.011 T_APPL_IDLE_N1 0/0, ServerSAP = 1, ClientSAP = 16, LONG_NAMES, NO_SECURITY
REQUEST:
<GetRequest>
<GetRequestNormal>
<InvokeIdAndPriority Value="C1" />
<AttributeDescriptor>
<ClassId Value="000F" />
<InstanceId Value="0000280000FF" />
<AttributeId Value="01" />
</AttributeDescriptor>
</GetRequestNormal>
</GetRequest>
WRAPPER[1] Sent 000100100001000DC001C1000F0000280000FF0100
+400 WRAPPER[1] Rec 0001000100100003D80101
RESPONSE:
<ExceptionResponse>
<StateError Value="ServiceNotAllowed" />
<ServiceError Value="OperationNotPossible" />
</ExceptionResponse>
0'00.552 VERDICT: Data exchange in IDLE state PASSED

0'00.555 Starting T_APPL_OPEN_1
0'00.557 T_APPL_OPEN_1 0/0, ServerSAP = 1, ClientSAP = 16, LONG_NAMES, NO_SECURITY
0'00.559 SubTest 1.Establish an AA using the parameters declared


REQUEST:
<AssociationRequest>
<ApplicationContextName Value="LN" />
<InitiateRequest>
<ProposedDlmsVersionNumber Value="06" />
<ProposedConformance>
<ConformanceBit Name="Action" />
<ConformanceBit Name="EventNotification" />
<ConformanceBit Name="SelectiveAccess" />
<ConformanceBit Name="Set" />
<ConformanceBit Name="Get" />
<ConformanceBit Name="DataNotification" />
<ConformanceBit Name="MultipleReferences" />
<ConformanceBit Name="BlockTransferWithAction" />
<ConformanceBit Name="BlockTransferWithSetOrWrite" />
<ConformanceBit Name="BlockTransferWithGetOrRead" />
<ConformanceBit Name="Attribute0SupportedWithGet" />
<ConformanceBit Name="PriorityMgmtSupported" />
<ConformanceBit Name="Attribute0SupportedWithSet" />
<ConformanceBit Name="GeneralBlockTransfer" />
<ConformanceBit Name="GeneralProtection" />
</ProposedConformance>
<ProposedMaxPduSize Value="FFFF" />
</InitiateRequest>
</AssociationRequest>
WRAPPER[1] Sent
000100100001001F601DA109060760857405080101BE10040E01000000065F1F040060FE9FFFFF
+530 WRAPPER[1] Rec
00010001001000346132A109060760857405080101A203020100A305A103020100890760857405080200BE10040E08
00065F1F0400601A9D05000007
RESPONSE:

Figure 16 – Detailed log presenting COSEM APDUs in XML format
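
The hexadecimal strings on the WRAPPER lines are the frames exchanged over TCP: an 8-byte wrapper header (version, source wPort, destination wPort, APDU length) followed by the COSEM APDU. Purely as an illustration – the variable names are chosen for this sketch – the request frame shown in the log above can be split as follows:

# Illustrative sketch only: split a WRAPPER frame from the log above into
# its 8-byte wrapper header and the COSEM APDU it carries.
frame = bytes.fromhex("000100100001000DC001C1000F0000280000FF0100")

version  = int.from_bytes(frame[0:2], "big")   # wrapper version: 1
src_port = int.from_bytes(frame[2:4], "big")   # source wPort: 16 (ClientSAP)
dst_port = int.from_bytes(frame[4:6], "big")   # destination wPort: 1 (ServerSAP)
length   = int.from_bytes(frame[6:8], "big")   # APDU length: 13 octets
apdu     = frame[8:8 + length]                 # the GetRequest-Normal APDU

print(version, src_port, dst_port, length, apdu.hex().upper())
# prints: 1 16 1 13 C001C1000F0000280000FF0100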

7.3.8.4 Traffic
The Traffic window – see Figure 17 – can be opened from the View menu. As the test progresses, it shows the frames sent to (in green) and received from (in red) the IUT.
TCPServer 0'00.001 +88851 Start Listening
TCPClient 0'00.022 +21 Connecting
TCPClient 0'00.147 +124 Connected to 127.0.0.1:4058
TCPClient 0'00.149 +1 000100100001000DC001C1000F0000280000FF0100
TCPClient 0'00.549 +400 0001000100100003D80101
TCPClient 0'00.558 +8 Disconnecting
TCPClient 0'00.559 +0 Disconnected
TCPClient 0'00.560 +1 Waiting TCP disconnect to connect delay 500 ms
TCPClient 0'01.060 +499 Connecting
TCPClient 0'01.185 +124 Connected to 127.0.0.1:4058
TCPClient 0'01.191 +5
000100100001001F601DA109060760857405080101BE10040E01000000065F1F040060FE9FFFFF
TCPClient 0'01.721 +530
00010001001000346132A109060760857405080101A203020100A305A103020100890760857405080200BE10040E08
00065F1F0400601A9D05000007
TCPClient 0'01.746 +24 000100100001000DC001C1000F0000280000FF0100
TCPClient 0'01.836 +90 000100010010000CC401C10009060000280000FF
TCPClient 0'01.864 +27 00010010000100056203800100
TCPClient 0'01.944 +80 00010001001000056303800100
TCPClient 0'01.966 +21 Disconnecting
TCPClient 0'01.967 +0 Disconnected
TCPClient 0'01.986 +19 Waiting TCP disconnect to connect delay 485 ms
TCPClient 0'02.471 +485 Connecting
TCPClient 0'02.596 +124 Connected to 127.0.0.1:4058
TCPClient 0'02.602 +5
000100010001005A6058A109060760857405080103A60A040843545430303030308A0207808B0760857405080205AC
0D800BAC93A248C8FC1CA5C20B27BE230421211F3026ECF3DAC94C41B52F6118D3FAAF68BA253187948CF66D01FD93
264BDCC9
TCPClient 0'03.112 +510
000100010001006B6169A109060760857405080103A203020100A305A10302010EA40A040849435530303030308802
0780890760857405080205AA1280106A495F8D185723E288A794A19AE64024BE230421281F3026ECF3DFBA974DD84A
A9262FA3FD4349F7A268D0B41B4BBCDA8448715093


TCPClient 0'03.145 +32
000100010001003CDB084354543030303030313026ECF3E01ACF25914E9576DC0DA746172438C8242066F456E2D73F
4EAB5264C69B2608DD37694CCF5BEB4FA71D51424D
TCPClient 0'03.235 +90
0001000100010035DB0849435530303030302A3026ECF3E1E385CBA72F04AF871CE9C0760AC57172D7F69120791D8F
C5E0A2C5E74BB43B53D2E1560671
TCPClient 0'03.267 +32 000100010001000DC001C1000F0000280000FF0100
TCPClient 0'03.347 +79 000100010001000CC401C10009060000280000FF
TCPClient 0'03.382 +34
000100010001002A6228800100BE230421211F3026ECF3E1B2CC46BB00C666E7B147F7C2005397473D1862BC5D0257
F64359

Figure 17 – The Traffic window

7.4 Repeatability of results


In order to achieve the objective of credible conformance testing, the result of executing a test case on an IUT should be the same whenever it is performed. Experience shows, however, that executing a complete conformance test suite may not always produce observed test outcomes identical to those obtained on another occasion.

Nevertheless, at the test case level, every effort has been made to minimize the possibility
that a test case produces different test outcomes on different occasions.

7.5 Requirements for test laboratories


Conformance assessment may be performed by a manufacturer (self-testing), a third party or
a user.

If the test is done by the manufacturer, the test laboratory should be an identifiable part of the
manufacturer’s organisation.


8 The certification process


8.1 General
The purpose of the certification process is to obtain a “DLMS/COSEM compliant” Certification.
This clause describes the necessary steps.

8.2 Initiation of the certification process


The certification process may be initiated by any member of the DLMS UA.

The manufacturer of the device to be certified shall be a member of the DLMS UA and shall
possess a three-letter manufacturer ID.

For more information on the manufacturer IDs (FLAG ID) see www.dlms.com,
ORGANIZATION menu.

8.3 Submission of conformance test documents


The Test Results generated by the CTT (as .zip files) shall be submitted to
[email protected]. The DLMS UA registers and examines them to confirm their
authenticity and to verify that all technical and administrative acceptance conditions are met.
If so, it prepares the Certification. The DLMS UA does not publish the test results; these may
be obtained from the manufacturer.

The DLMS UA reserves the right to discuss the contents of the conformance test result with the organization having initiated the certification process.

8.4 Technical and administrative checks


The technical verification and acceptance criteria are the following:

• the Test Result .zip file includes the Report, the Log, the Line traffic, the CTI and the hash files, and the hash value is correct;
• a recent version of the COSEM object definitions file has been used;
• the Logical Device Name of the Management Logical Device is syntactically correct and the Three Letter Manufacturer ID (also known as the FLAG ID) matches the one registered for the manufacturer (see the sketch after this list);
• the test session has been started with “Run for Certification”;
• there are no FAILED test cases;
• the reasons for INCONCLUSIVE test results are acceptable, for example selective access – which is not mandatory – is not available.
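
As the sample report in Figure 14 shows, the Logical Device Name is reported as a hex-encoded octet string (e.g. "4943553030303030" for ICU00000) whose first three characters are the FLAG ID. Purely as an illustration of this particular check – the function name is chosen for this sketch – the comparison could be written as:

# Illustrative sketch only: decode a Logical Device Name reported as a hex
# string (see Figure 14) and compare its FLAG ID with the registered one.
def flag_id_matches(ldn_hex, registered_flag_id):
    ldn = bytes.fromhex(ldn_hex).decode("ascii")  # e.g. "ICU00000"
    return ldn[:3] == registered_flag_id

print(flag_id_matches("4943553030303030", "ICU"))  # True for the sample report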
The administrative checks before issuing the Certification are the following:

• the test result has been generated by a test laboratory having purchased the CTT. This is verified
by checking the license owner’s name reported;
• the manufacturer and the tester – if different – are active members of the DLMS UA;
• in the case of third party testing, the license fee is paid;
• there was no Certification issued for the same type.
8.5 The Certification
A Certification is issued if the IUT passed all applicable tests. It is prepared using the data
taken from the Test Result(s) and contains the following elements:

• a unique Certification number assigned by the DLMS UA;


• the identification of the IUT;
• the identification of the Management Logical Device (bound to SAP = 1);


• the identification of the manufacturer as declared in the CTI;


• the identification of the CTT version;
• the identification of the licensee;
• the version of the COSEM Object definitions file used;
• the media (energy types) used for COSEM object testing;
• for each test performed:
• the communication profile and the opening mode (when applicable);
• the application contexts;
• the security suite;
• the date and time of testing; and
• the hash value of the Test Result
• an indication that the Certification is only valid for the functions successfully tested;
• an indication that the test is executed on one specimen of the product and that the test results may
not be applicable for other test specimens;
• any remarks added by the DLMS UA, as seen fit;
• date of issue and signature;
• a summary of features for each test session extracted from the Test Report.
The Certification with example data is included as Annex A.

The Certification is always issued to the manufacturer as given in the Test Report. The
Certifications are published on the website of the DLMS UA at www.dlms.com,
CONFORMANCE menu.

The data in a published Certification cannot be changed. If the data of the manufacturer or the IUT change, a new Certification is necessary.

The Certification obtained entitles the manufacturer to place the “DLMS/COSEM compliant
mark” on its products and documentation. See Figure 18.

Figure 18 – The DLMS/COSEM compliant logo

The test results are filed by the DLMS UA.

8.6 Scope and validity of the Certification


The DLMS/COSEM Certification certifies that the IUT as identified by the manufacturer/test
house passed the tests applicable for the given configuration.

The supporting evidence is the conformance Test Result.

DLMS/COSEM compliance can be claimed only for the features tested.

The DLMS UA does not verify whether the meters manufactured are identical to the IUT tested.


The Certification remains valid as long as no design or manufacturing changes in communication hardware and firmware with essential influence on the implementation have been made. If changes have been made, a re-test is necessary. This is left to the judgement of the manufacturer.

8.7 Disclaimer
The DLMS UA makes every effort to ensure that the conformance test plans and the CTT are in line with the DLMS/COSEM specification and provide a reasonable depth of testing.

However, the Certification does not constitute an absolute proof of conformance.

9 The quality program


9.1 General
An important element of the DLMS/COSEM conformance testing process is the quality
program. It includes:

• validation of the conformance test plans and the CTT;


• the support provided to users;
• maintenance.
9.2 Validation of the Abstract Test Suites and CTT
The validation of the ATSs and of their implementation, the CTT, has been done in several steps:

1. the test plans have been written by experts from different members of the DLMS UA WG
Maintenance based on DLMS UA 1000-1 and DLMS UA 1000-2.
2. the executable Test Suites have been validated by running them against several implementations.
9.3 Assistance provided to users
The DLMS UA, upon request, provides support to the users of the tool. For this purpose, test
results can be sent to the DLMS UA: [email protected].

9.4 Maintenance
The DLMS UA maintains the conformance testing process to eliminate any problems with the tool
found during testing, to enhance tests and to accommodate changes in the DLMS/COSEM
specification. The procedure is the following:

1. a proposal, together with a justification is made to modify or to add a test. This can be initiated by
any member of the DLMS UA or by the DLMS UA itself;
2. the request is investigated by the DLMS UA;
3. if the request is accepted, the relevant abstract test cases are amended by the DLMS UA;
4. the new abstract test cases are validated by the DLMS UA;
5. the new abstract test cases are implemented;
6. the amended ATSs are published;
7. a new version of the CTT is made available to the licensed tool users.
This process is supported by the DLMS UA website.

In the following, the process is illustrated by use cases.


9.5 Use cases


9.5.1 Use case 1 – introducing a new standard OBIS code
A manufacturer needs a new standard OBIS code to support a new functionality in the
metering equipment.

The proposal is submitted to the DLMS UA. The DLMS UA checks whether the proposal is in line with, and fits into, the DLMS/COSEM specification. If approved, the COSEM Object definition
tables DLMS UA 1001-7 are amended and a new .dat file is made available for download. See
also 7.3.4.

9.5.2 Use case 2 – modification of an existing test


If it is found that, despite careful validation, a test case is not fully compliant with the DLMS/COSEM specification, or if it is necessary to enhance a test, then a proposal may be submitted to the DLMS UA and the process described above is followed.

9.5.3 Use case 3 – adding a test for a new standard feature


A manufacturer implements a feature described in the DLMS/COSEM specification that is not yet covered by the CTT.

The manufacturer submits the proposed test plan to the DLMS UA and the process described
above is followed.

9.5.4 Use case 4 – revision of the specification


The DLMS UA initiates a revision or amendment of the DLMS/COSEM specification (e.g.
introducing a new protocol stack).

The conformance requirements and the test plans are prepared together with the standard, or at the latest upon the acceptance of the new standard.

The DLMS UA initiates the maintenance of the tool.


Annex A
(informative)
Certification template (with sample data)


Annex B
(normative)
Conformance Test Plans
The Conformance test plans – Abstract Test Suites – used in CTT 3.0 are the following:

• DLMS/COSEM conformance testing – Conformance test plans - Data link layer using HDLC
protocol: DLMS UA 1001-3: ATS_DL V 5;
• DLMS/COSEM conformance testing – Abstract Test Plans – DLMS/COSEM application layer –
COSEM interface objects – Symmetric key security suite: DLMS UA 1001-6:
ATS_AL_COSEM_SYMSEC_0 V 1.3.
The Conformance test plans are attached to the complete Yellow Book …


Annex C
(informative)
Bibliography
ETSI ETR 021: 1991, Advanced testing methods (ATM); Tutorial on protocol conformance
testing (especially OSI standards and profiles) ETR/ATM-1002

IEC 60870-5-6:2006, Telecontrol equipment and systems – Part 5-6: Guidelines for
conformance testing for the IEC 60870-5 companion standards

IEC 61850-10: 2005, Communication networks and systems in substations – Part 10:
Conformance testing

IEC 62056-1-0, Electricity metering data exchange – The DLMS/COSEM suite – Part 1-0: Smart metering standardisation framework

IEC 62056-21, Electricity metering – Data exchange for meter reading, tariff and load control
– Part 21: Direct local data exchange

IEC 62056-46, Electricity metering – Data exchange for meter reading, tariff and load control
– Part 46: Data link layer using HDLC protocol

IEC 62056-5-3, Electricity metering data exchange – The DLMS/COSEM suite – Part 5-3:
DLMS/COSEM application layer

IEC 62056-6-1, Electricity metering data exchange – The DLMS/COSEM suite – Part 6-1:
Object Identification System (OBIS)

IEC 62056-6-2, Electricity metering data exchange – The DLMS/COSEM suite – Part 6-2: COSEM interface classes

IEC 62056-7-6, Electricity metering data exchange – The DLMS/COSEM suite – Part 7-6: The
3-layer, connection-oriented HDLC based communication profile

IEC 62056-8-3, Electricity metering data exchange – The DLMS/COSEM suite – Part 8-3:
Communication profile for PLC S-FSK neighbourhood networks

IEC 62056-9-7, Electricity metering data exchange – The DLMS/COSEM suite – Part 9-7:
Communication profile for TCP-UDP/IP networks

EN 13757-1:2014, Communication system for meters – Part 1: Data exchange

EN 13757-3:2004, Communication systems for meters and remote reading of meters – Part 3: Dedicated application layer
NOTE This standard is referenced in the “M-Bus client setup” interface class version 0.

EN 13757-3:2013, Communication systems for meters and remote reading of meters – Part 3: Dedicated application layer
NOTE This standard is referenced in the “M-Bus client setup” interface class version 1.

ITU-T X.291:1992, OSI conformance testing methodology and framework for protocol Recommendations for ITU-T applications – Abstract test suite specification

ITU-T X.293:1995, OSI conformance testing methodology and framework for protocol Recommendations for ITU-T applications – Test realization


INDEX

3-layer, CO, HDLC based communication profile 13, 27
Abnormal test case termination 16, 17
Abstract Test Case 5, 15, 16
Abstract Test Case error 16
Abstract Test Suite 6, 10, 11, 14, 18, 37
Access right 21
Application Association 10, 22
Application context 21, 36
Ask the user 29
Authentication security mechanism 21, 22
base specification 15
Basic interconnection test (BIT) 6
Behaviour 14
Behaviour test 6, 15
Black box 14
Capabilities, required 11
Capability 6, 14
Capability test 6, 15
Certification 26, 35
Certification process 11, 35
Certification, scope and validity 36
Communication interface 12, 29
Communication model 13
Communication port 28
Communication profile 11, 29, 36
Companion Specification 10
Conformance assessment process 5, 6, 11, 20
Conformance log 6
Conformance test information 21
Conformance test information (CTI) 6
Conformance test plan 11, 13, 14, 37
Conformance test result 17, 36
Conformance Test Tool 11, 18
Conformance testing 6, 11, 12, 20, 34, 37
Conforming implementation 6
COSEM interface object 17, 18
COSEM interface object model 11, 12, 13
COSEM object definition file 23, 24, 36
CTI file 23
CTT, installation 19
CTT, licensing 19
Data link layer using HDLC protocol 13
DLMS/COSEM application layer 12
DLMS/COSEM compliant 35
DLMS/COSEM compliant mark 36
DLMS/COSEM conformance testing 11
DLMS/COSEM transport layer 13
DoNotTest options 24
Executable Test Case 7
Executable Test Case error 7, 16
Executable Test Suite 7, 10, 11, 18
Expected result 15
Extra information 21
FAILED 16
Foreseen test event 16
Idle testing state 7
Implementation Under Test (IUT) 7, 11
Inapplicable 7
INAPPLICABLE 16, 26
INCONCLUSIVE 16
Inconclusive (verdict) 7
Initial testing state 7
Inopportune test event 7
Intended application 21
Interface object 21
Interworking 11
Invalid test event 16
IP address 28
IUT, identification 35
IUT, preparation 21
Line traffic 33
Load profile 21
Log 32
Logging, basic 32
Logging, detailed 33
Logical device 21
Maintenance 37
Management Logical Device 21
Manufacturer 35
Manufacturer identification 36
Means Of Testing 8, 18
Negative test 8, 15
Observed test outcome 8, 16, 34
OSI conformance testing 11
Parameterized executable test case 8
Pass (verdict) 8
PASSED 16
Physical layer 14
Positive test 8, 15
Postamble 16
Preamble 15
Protocol conformance test report (PCTR) 8
Protocol Data Unit 10
Protocol stack 14
Quality program 11, 12, 37
Repeatability of results 8, 34
Run 29
Run for Certification 29
Security context 22
Security features 21
Security material 22
Security suite 21, 36
Selective access 21
Self-testing 12, 20, 34
Semantically invalid test event 8
Service Access Point 10
SKIPPED 26
Stable testing state 9
Static conformance review 9
Supporting layer 14
Syntactically invalid test event 9
TCP/IP based communication profile 13, 28


Template, test case 16
Test 16
Test body 9, 16
Test case 9, 14, 15, 34
Test case error 9, 16
Test case name 15
Test event 9, 14, 16
Test group 9, 14, 15
Test group objective 9, 14
Test laboratory 9, 34
Test outcome 16
Test outcome, foreseen 7, 16
Test outcome, unforeseen 10, 16
Test purpose 9, 14, 15
Test result, hash value 36
Test session 10, 26
Test step 14
Test step (sub-test) 9
Test suite 10, 15, 34
Test tool version 36
Third party testing 12, 20, 34
Three-letter manufacturer ID 12, 35
Valid test event 10
Validation of the conformance test plan 37
Verdict 10, 16
Verdict, FAIL 7
xDLMS context 21, 22
xDLMS services 21
