EUROCONTROL Specification
for ATM Surveillance System Performance
(Volume 1)
Edition: 1.1
Edition date: September 2015
Reference nr: EUROCONTROL-SPEC-147
ISBN: 978-2-87497-022-1
EUROPEAN ORGANISATION
FOR THE SAFETY OF AIR NAVIGATION
DOCUMENT CHARACTERISTICS
The following table records the complete history of the successive editions of the present document.
Volume 2: Alignment of the Executive Summary with that of Volume 1.
Publications
EUROCONTROL Headquarters
96 Rue de la Fusée
B-1130 BRUSSELS
CONTENTS
1 Introduction
1.1 Aim, scope and object of the document
1.2 The supported Air Traffic Services and functions
1.3 Category of surveillance system
1.4 Structure of the document
1.5 Intended readers
1.6 Relationship with ICAO approach
4 Conformity assessment
4.1 Generalities
4.1.1 Conformity assessment approaches
4.1.2 Conformity assessment volume
4.1.3 Conformity assessment datasets
4.1.4 Conformity assessment periodicity
4.1.5 Conformity assessment measurement point
4.1.6 Definitions
4.1.7 Specific events to be investigated
4.2 Conformity assessment procedures and criteria
4.2.1 Measurement interval
4.2.2 Data item(s) probability of update
4.2.3 Ratio of missed 3D/2D position involved in long gaps
4.2.4 RMS error of the horizontal position
4.2.5 Ratio of correlated horizontal position errors
4.2.6 RMS value of the relative time of applicability of target reports in close proximity
4.2.7 Average and maximum data age of forwarded pressure altitude
4.2.8 Ratio of incorrect forwarded pressure altitude
EXECUTIVE SUMMARY
This document provides performance requirements for ATM surveillance systems when supporting 3
and 5 NM horizontal separation applications. This specification has been developed by an
international group of experts from Air Navigation Service Providers (ANSP), system manufacturers
and National Supervisory Authorities (NSA).
This specification was developed in parallel with the draft Surveillance Performance and
Interoperability Implementing Rule (SPI IR). On 21 November 2011 the final rule (Commission
Implementing Regulation (EU) No 1207/2011) was published in the Official Journal of the European
Union. This specification therefore complements and refines the requirements included in this Single
European Sky (SES) regulation.
This document can be used by air navigation service providers to define, as required by Commission
Implementing Regulation (EU) No 1207/2011 of 22 November 2011, the minimum performance that
their surveillance system must meet. This specification also defines how the associated conformity
assessment must be performed.
This specification is generic and independent of technology. It must be supplemented by specific local
requirements that may arise from safety constraints, local technological choices, the need to support
other services and functions, or other local considerations. This specification is written to be
compatible with recently published industry standards (EUROCAE) applicable to specific surveillance
sensor technologies (ADS-B NRA, ADS-B RAD and WAM).
The requirements defined in this specification are mainly derived from practical experience,
operational needs analysis studies and technical studies.
Particular attention was paid to ensuring that each performance requirement is measurable and
accompanied by an associated conformity assessment process. In this regard, measurements made
on the basis of opportunity traffic are preferable as they fully reflect the system performance in its
operational environment. Alternatively, flight trials may also be undertaken. Proof offered through
system design files or by system design assurance, the use of a test transponder or an injected test
target are also acceptable when the other options are impracticable.
For the time being this specification addresses the ATM surveillance system performance needed
to support 3 and 5 NM horizontal separation. In the future this specification may be extended to
address other air traffic services (e.g. other horizontal separation minima) and/or functions.
This volume 1 contains the mandatory and recommended requirements whereas volume 2 [RD 1]
contains informative appendices.
The changes introduced to raise this specification to Edition 1.1 correct minor errors and address
clarification requests which have become apparent since the publication of Edition 1.0.
It should be noted that further work to develop a Generic Surveillance Safety and Performance
Requirement (GEN SUR SPR) document is being performed in the frame of EUROCAE Working
Group 102. Upon its formal publication, the GEN SUR SPR document will complement, and in some
aspects is expected partially to supersede, this EUROCONTROL Specification. It is therefore foreseen
that a further update to this specification may be required in the future to reflect and incorporate
elements of the published GEN SUR SPR.
It may be of interest to the reader to note that, whilst Commission Implementing Regulation
(EU) No 1207/2011 of 22 November 2011, referred to in this document, was amended by Commission
Implementing Regulation (EU) No 1028/2014 of 26 September 2014, the amendment does not affect
the surveillance system performance requirements.
1 INTRODUCTION
1 In the context of this document, surveillance system is restricted to equipment only; it does not cover people and procedures.
This document considers both cooperative and non-cooperative categories of surveillance systems:
• A cooperative surveillance system relies on and requires equipment on board the aircraft.
Such a system can provide all the surveillance data items pertaining to an aircraft
including information coming from the aircraft itself (e.g. pressure altitude, aircraft identity).
• A non-cooperative surveillance system does not require equipment on board the aircraft
but cannot provide information coming from the aircraft.
This document takes into account the lessons learnt from the application of the EUROCONTROL
Standard Document for Radar Surveillance in En-route Airspace and Major Terminal Areas [RD 2],
which are:
• Difficulties in practically assessing some specified requirements.
• Applicability only to SSR and PSR, whereas new surveillance technologies (Mode S, WAM,
ADS-B) are now available, and difficulty in transposing requirements to other
technologies (e.g. MSPSR).
• Imposition of high-level implementation choices (2 SSR for en-route and 1 PSR + 2 SSR for
major TMAs) and difficulty in transposing requirements to other architectures.
• Lack of traceability between supported air traffic services or functions (i.e. user needs)
and technical requirements.
• …
It also takes into account lessons learnt from past and ongoing EUROCONTROL surveillance
deployment programmes and surveillance performance appraisal activity.
As such the provisions detailed herein support the following Essential Requirements:
• Seamless operation by:
o Ensuring seamless operation of aircraft with surveillance systems over all of
Europe.
o Ensuring a defined minimum level of performance of surveillance systems in
Europe and therefore facilitating and enhancing surveillance data sharing in
Europe.
• Support for new concepts of operations by:
o Facilitating the introduction of new surveillance technologies.
The provisions detailed herein could also be used as an input to a surveillance system safety
assessment, the production of which is required by [AD1].
The objective is to define these requirements for each supported air traffic service or function.
Being service/function specific permits ANSPs to tailor the surveillance system requirements in
accordance with its intended use (e.g. the services and functions it supports).
The objective of this specification is to define requirements that are as independent of the environment
as possible, i.e. applicable everywhere in Europe. Whereas the number of supported services and
functions is limited and they are well defined, the range of environments encountered in Europe is
currently wide and difficult to classify objectively. Thus a generic approach has been adopted.
2.2.3 Measurable
To be of use, the requirements specified in this document must be easily measurable and capable of
being regularly monitored.
Section 4 specifies the conformity assessment method for each specified requirement.
This specification defines a level of performance (quality of service) that a surveillance system shall
provide to ensure both a defined minimum level of interoperability with neighbouring systems and the
seamless operation of flights over all of Europe.
It is to be noted that this document does not address interoperability from a data format point of view.
The objective is to define these requirements at a level which will allow as much design flexibility as
possible. For this reason the surveillance system performance requirements are defined end-to-end
(see Annex A - 2). The objective is to leave the maximum freedom to system designers in their
choices.
2.3 Role of this document within the surveillance system design process
The performance requirements detailed in this document are an initial input in the complex process of
designing a surveillance system.
The document contains requirements to cover generic scenarios for identified air traffic services.
These requirements should be supplemented by local criteria addressing particular features of the
local surveillance system environment and/or local business objectives. Such criteria may include, for
example:
• system capacity (business objectives)
• additional data items (e.g. Downlink Aircraft Parameters -DAP)
Surveillance systems have been developed and are used to improve ATM safety. However, infrequent
failures of their functions may contribute to ATM risk. A role of surveillance system safety assessment is
to analyse such failures, to verify that the potential contribution of surveillance system failures to ATM
risk remains within agreed limits and to define, if necessary, mitigations.
As an integral part of the design process, any surveillance system either being put in operation or
being modified, will be subject to a complete safety assessment process as required in [RD 28].
[AD1] also introduces mandatory requirements for a safety assessment to be conducted for existing
surveillance systems.
The surveillance system mandatory performance requirements defined in this document can be used
as an input to local surveillance system safety assessment. For example, when using the
EUROCONTROL SAME (Safety Assessment Made Easier) framework ([RD 4]) these requirements
can be used as an input to the “Success approach”.
An association of cooperative and non-cooperative surveillance systems may also be used to cope
with a mixed environment (equipped and non-equipped aircraft) provided that the non-cooperative
traffic density is compatible with the additional ATCO workload described above. The required safety
assessment demonstrating that the system (equipment, procedure and people) can support the
intended services and functions in its environment shall take into account this specific workload.
In that case, the two systems may be more or less integrated into a single system or may even be
operated as two independent systems providing two parallel data streams to the ATCO.
The following chapters define separate performance requirements for cooperative and non-
cooperative surveillance systems. The conformity assessment procedures describe how to separate
the assessment of cooperative and non-cooperative performance requirements in case of association
of the two categories of surveillance systems.
In summary, the choice of the category of surveillance system(s) to be deployed, cooperative, non-
cooperative or association of both, is the decision of the ANSP depending on the local environment
and constraints such as the percentage of transponder equipped aircraft, traffic density, airspace
structure and design, business objectives, etc. Therefore there are no generic criteria to define which
category of system needs to be deployed.
From the previous list the following quality of service characteristics have been selected and further
refined:
• Time is translated into the processing delay for the data items that are forwarded from the
aircraft to the surveillance system user on the ground.
• Coherence is translated into the time consistency of the provided aircraft positions.
• Capacity is not retained because it depends on surveillance system environment and
cannot be defined generically.
• Integrity is further refined into three different performance characteristics: core errors,
correlated errors, and spurious and large errors of data items.
• Safety and security are deliberately not addressed in this document, but must be
addressed separately.
• Reliability is further refined into availability and continuity of the data items and of the
complete surveillance system.
• Priority has not been retained because it was not found applicable to the current
applications addressed in this document.
For each data item and for the complete system, performance metrics will be chosen within the 7
columns corresponding to the different quality of service that have been considered in this document.
For each of the addressed application, a table (see example Table 1) will map for the provided data
items and for the system (rows), the specified performance requirements/metrics onto the retained
quality of service (columns).
| | Availability | Continuity | Integrity (core errors) | Integrity (correlated errors) | Integrity (spurious and large errors) | Time | Coherence |
| Data item 1 | X | X | X | X | X | - | X |
| Data item 2 | X | X | - | - | - | X | - |
| System | X | X | - | - | - | - | - |
The following OPA scenarios have been defined for 3 NM and 5 NM horizontal separation:
• Crossing track scenarios (3 and 5 NM separation) see Volume 2 [RD 1] Appendix V - 1
and 2
• Same track scenario (3 and 5 NM separation) see Volume 2 [RD 1] Appendix V - 3 and 4
• Reciprocal track scenario (3 and 5 NM separation) see Volume 2 [RD 1] Appendix V - 5
and 6
• Vertical crossing track scenario (3 and 5 NM separation) see Volume 2 [RD 1] Appendix V
- 7 and 8
• Vertically separated track scenario (3 and 5 NM separation) see Volume 2 [RD 1]
Appendix V - 9 and 10
This environment description, together with the operational service described in § 3.1, forms the OSED
(Operational Service and Environment Description) for 3/5 NM separation provided by an ATCO using a
cooperative surveillance system.
A fundamental assumption of the OSED is that the operational service is provided to cooperative
aircraft that are fully compliant with the avionics requirements detailed in [AD1]. These requirements
will be further detailed in a forthcoming EASA Certification Specification ACNS.
The local surveillance system safety assessment will therefore address instances in which the
aircraft’s avionics presents an anomaly as well as the possible intrusion of aircraft that are not
equipped in accordance with the requirements detailed in [AD1] and in the upcoming EASA
Certification Specification ACNS.
Any differences in local environments from that defined in this sub-section shall be accounted for in
accompanying analysis prior to local implementation.
The airspace classes in which separation services must be provided are described in Annex C - 4.1.
The airspace structure is further defined in Annex C - 4.2.
The following information elements are required from the cooperative surveillance system for the
provision of surveillance separation. This list does not include flight plan elements.
The following data items shall be provided by the cooperative surveillance system in the form of
message-structured and digitised information:
Positional data:
• Horizontal (2D) position;
• Time of applicability of horizontal position (for conformity assessment);
• Vertical position based upon pressure altitude received from the aircraft;
• Time of applicability of vertical position (for conformity assessment).
Operational identification data:
• Aircraft identity (ICAO Aircraft Identification and/or Mode 3/A code) reported by the
aircraft.
Supplementary indicators:
• Emergency indicator (General emergency, radio failure and unlawful interference);
• Special Position Identification (or Indicator) SPI.
Surveillance data status:
• Cooperative/non-cooperative/combined;
• Coasted/not coasted (position).
The provision of the above data items is compliant with Annex I § 1.1 and 1.2 of [AD1] when using a
cooperative surveillance system.
The following data items should be provided:
• Track velocity vector;
• Rate of climb/descent (this data item may be reduced to a trend).
These data items are further described in Annex C - 1.
3.4.3 Mandatory and recommended performance requirements for 5 NM horizontal separation provided by ATCO using cooperative surveillance
system
Table 3: Cooperative surveillance system requirements for supporting 5 NM horizontal separation (5N_C)
3.4.4 Mandatory and recommended performance requirements for 3 NM horizontal separation provided by ATCO using cooperative surveillance
system
Table 4: Cooperative surveillance system requirements for supporting 3 NM horizontal separation (3N_C)
These performance requirements are mainly derived from the experience gained in Europe over the
last decades on the basis of radar technology.
In addition some specific studies have been undertaken to further refine performance requirement
specifications when necessary.
The following conventions are applied in Table 3 and Table 4:
• Mandatory performance requirements are in bold font
• Recommended performance requirements are in normal font
The 5th column “Ref./Justif.” provides the corresponding paragraph of Volume 2 [RD 1] Appendix II
where further references and justifications can be found.
Volume 2 [RD 1] Appendix II provides links between the requirements specified in this document and
requirements specified in similar documents, in particular the EUROCONTROL Standard Document
for Radar Surveillance in En-route Airspace and Major Terminal Areas ([RD 2]) and the ADS-B
specifications developed by the ADS-B Requirement Focus Group ([RD 14], [RD 15]).
Volume 2 [RD 1] Appendix II also provides links with study reports that have been produced to support
the development of this document and with decisions of the Surveillance Standard Task Force (SSTF),
which is the group in charge of drafting this document.
The last column “Conf. method” provides the corresponding sub-section of section 4 Conformity
assessment.
A mapping of the performance requirements detailed in Table 3 and Table 4 on the quality of services
described in sub-section 2.5 is provided in Table 5 below and additional justifications are provided in
Volume 2 [RD 1] Appendix I. A cell populated by a “-“ indicates that it was considered not necessary to
define corresponding detailed requirements.
| | Availability | Continuity | Integrity (core errors) | Integrity (correlated errors) | Integrity (spurious and large errors) | Time | Coherence |
| Horizontal position | R1-R2 | R3 | R4 | R5 & R20 | R19 | Note 1 | R6 |
| Pressure altitude | R1-R7 | R3 | R11 & R7 | Note 2 | R10 | R8 & R9 | Note 3 |
| SPI/Emergency indicator | Note 4 | | | | | R12 | - |
| Rate of climb/descent | Note 5 | Note 6 | R16 | | Note 7 | Note 1 | - |
| Track velocity | Note 5 | Note 6 | R17 & R18 | | Note 7 | Note 1 | - |
Table 5: Mapping of performance metrics on quality of service characteristics for 5N_C and
3N_C applications
Note 1: Impact of information latency is taken into account within error calculation method.
Note 2: Pressure altitude correlated error due to the ground surveillance system is assumed to be
addressed by R10. Pressure altitude correlated error due to the airborne system cannot be assessed
by a ground surveillance system. It has to be assessed by specific systems such as the Height
Monitoring Units (HMU) deployed in the frame of RVSM monitoring.
Note 3: Time consistency of pressure altitude data item is partly addressed through R8 and R9.
Note 4: Data items are checked procedurally by ATCO (SPI is requested by ATCO, if it does not
appear the ATCO will check with the pilot – in case of emergency indicator the ATCO will call the pilot
for further information).
Note 5: Requirements are to be defined locally when these data items are provided and used.
Note 6: Because of the way these data items are calculated (by a tracker), once started they will
continue to be provided, so continuity is 100% by design.
This environment description, together with the operational service described in § 3.1, forms the OSED
(Operational Service and Environment Description) for 3/5 NM separation provided by an ATCO using a
non-cooperative surveillance system.
The operational service can be provided to all aircraft provided they have the minimum physical
characteristics (e.g. Radar Cross Section) that need to be locally defined in accordance with the
specifications of the non-cooperative surveillance system. The presence and the possible proximity of
aircraft not meeting these minimum physical characteristics shall be taken into account in the
surveillance system safety assessment.
When a non-cooperative surveillance system is used stand-alone, the traffic environment is
assumed to be of low density. The local surveillance system safety assessment will define the limit in
terms of quantity of traffic that can be managed with a non–cooperative surveillance system. This is
locally defined taking into account the complexity of the airspace and the complexity of the
environment.
3.5.2 Required data items for 3/5 NM horizontal separation provided by ATCO using non-
cooperative surveillance system
The following information elements are required from the surveillance system for the provision of
surveillance separation. This list does not include flight plan elements.
The following data items shall be provided by the surveillance system in the form of
message-structured and digitised information 2:
Positional data:
• Horizontal position (2D).
Surveillance data status:
• Coasted/not coasted;
• Time of applicability (for conformity assessment).
The provision of the above data items is compliant with Annex I § 1.1 of [AD1] when using a non-
cooperative surveillance system.
The following data item should be provided:
• Track velocity vector.
These data items are further described in Annex C - 1.
2 Although excluded for supporting the separation application, analogue or digitised video can be presented to the ATCO as a
mitigation and/or as a confidence reinforcement.
3.5.3 Mandatory and recommended performance requirements for 5 NM horizontal separation provided by ATCO using non-cooperative surveillance
system
Id | Requirement | Value(s) | Ref./Justif. | Conf. method
5N_N-R1 | Measurement interval for probability of update assessment (R2) | Less than or equal to 8 seconds / less than or equal to 6 seconds | 2.1.1, 2.1.2 | 4.2.1
5N_N-R2 | Probability of update of horizontal position in accordance with selected measurement interval | Greater than 90 % global / greater than or equal to 97 % for 100 % of the flights, any flight below 97 % shall be investigated as defined in R10 | 2.1.3, 2.1.4 | 4.2.2.1
5N_N-R3 | Ratio of missed horizontal position involved in long gaps (larger than 26.4 s = 3 x 8 s + 10 %) | Less than or equal to 0.5 % | 2.1.4 | 4.2.3
5N_N-R4 | Horizontal position RMS error | Less than or equal to 500 m global / less than or equal to 350 m global | 2.1.5, 2.1.6 | 4.2.4
5N_N-R5 | Ratio of target reports involved in series of at least 3 consecutive horizontal position correlated errors larger than 926 m - 0.5 NM | Less than or equal to 0.03 % | 2.1.7 | 4.2.5
5N_N-R6 | Relative time of applicability of horizontal position for aircraft in close proximity (less than 18520 m - 10 NM) | Less than or equal to 0.3 second RMS for relative data age | 2.1.9 | 4.2.6
5N_N-R7 | Track velocity RMS error | Less than or equal to 4 m/s for straight line and less than or equal to 8 m/s for turn | 2.1.22 | 4.2.14
5N_N-R8 | Track velocity angle RMS error | Less than or equal to 10° for straight line and less than or equal to 25° for turn | 2.1.22 | 4.2.14
5N_N-R9 | Continuity (probability of critical failure) | Less than or equal to 2.5 × 10⁻⁵ per hour of operation | 2.1.25 | 4.2.17
Table 6: Non-cooperative surveillance system requirements for supporting 5 NM horizontal separation (5N_N)
3.5.4 Mandatory and recommended performance requirements for 3 NM horizontal separation provided by ATCO using non-cooperative surveillance
system
Id | Requirement | Value(s) | Ref./Justif. | Conf. method
3N_N-R1 | Measurement interval for probability of update assessment (R2) | Less than or equal to 5 seconds / less than or equal to 4 seconds | 2.2.1, 2.2.2 | 4.2.1
3N_N-R2 | Probability of update of horizontal position in accordance with selected measurement interval | Greater than 90 % global / greater than or equal to 97 % for 100 % of the flights, any flight below 97 % shall be investigated as defined in R10 | 2.2.3, 2.2.4 | 4.2.2.1
3N_N-R3 | Ratio of missed horizontal position involved in long gaps (larger than 16.5 s = 3 x 5 s + 10 %) | Less than or equal to 0.5 % | 2.2.4 | 4.2.3
3N_N-R4 | Horizontal position RMS error | Less than or equal to 300 m global / less than or equal to 210 m global | 2.2.5, 2.2.6 | 4.2.4
3N_N-R5 | Ratio of target reports involved in series of at least 3 consecutive horizontal position correlated errors larger than 555 m - 0.3 NM | Less than or equal to 0.03 % | 2.2.7 | 4.2.5
3N_N-R6 | Relative time of applicability of horizontal position for aircraft in close proximity (less than 11110 m - 6 NM) | Less than or equal to 0.3 seconds RMS | 2.2.9 | 4.2.6
3N_N-R7 | Track velocity RMS error | Less than or equal to 4 m/s for straight line and less than or equal to 8 m/s for turn | 2.2.22 | 4.2.14
3N_N-R8 | Track velocity angle RMS error | Less than or equal to 10° for straight line and less than or equal to 25° for turn | 2.2.22 | 4.2.14
3N_N-R9 | Continuity (probability of critical failure) | Less than or equal to 2.5 × 10⁻⁵ per hour of operation | 2.2.25 | 4.2.17
Table 7: Non-cooperative surveillance system requirements for supporting 3 NM horizontal separation (3N_N)
These performance requirements are identical to the performance requirements specified for a
cooperative surveillance system, adapted to take account of the data items that are not provided.
The following conventions are applied in Table 6 and Table 7:
• Mandatory performance requirements are in bold font
• Recommended performance requirements are in normal font
The 5th column “Ref./Justif.” provides the corresponding paragraph of Volume 2 [RD 1] Appendix II
where further references and justifications can be found.
Volume 2 [RD 1] Appendix II provides links between the requirements specified in this document and
requirements specified in similar documents, in particular the EUROCONTROL Standard Document
for Radar Surveillance in En-route Airspace and Major Terminal Areas ([RD 2]).
Volume 2 [RD 1] Appendix II also provides links with study reports that have been produced to support
both the development of this document and the decisions of the Surveillance Standard Task Force
(SSTF), the group responsible for the drafting of this document.
The last column “Conf. method” provides the corresponding sub-section of section 4 Conformity
assessment.
A mapping of the performance requirements detailed in Table 6 and Table 7 on the quality of services
described in sub-section 2.5 is provided in Table 8 below and additional justifications are provided in
Volume 2 [RD 1] Appendix I. A cell populated by a “-“ indicates that it was considered not necessary to
define corresponding detailed requirements.
| | Availability | Continuity | Integrity (core errors) | Integrity (correlated errors) | Integrity (spurious and large errors) | Time | Coherence |
| Horizontal position | R1/R2 | R3 | R4 | R5 | Note 1 | Note 2 | R6 |
| System | Note 3 | R9 | - | - | - | - | - |
Note 6: In principle the required probability of update of horizontal position should be the same for
cooperative and non-cooperative surveillance systems. However, because the probability of update of
horizontal position of non-cooperative surveillance systems is a performance characteristic which
depends on the environmental conditions, it is only recommended to apply the same performance
specification to non-cooperative surveillance systems as to cooperative ones. For consistency, 90 % is
required, as this has been the usual required value for primary surveillance radars since 1997. In any
case this performance characteristic will have to be confirmed by the system safety assessment.
4 CONFORMITY ASSESSMENT
4.1 Generalities
4.1.1 Conformity assessment approaches
The conformity assessment of surveillance systems can be undertaken on the basis of one or more of
the five following approaches and in accordance with its associated priority:
• Opportunity traffic (priority 1),
• Flight trials (priority 2),
• Proof offered through system design files or by system design assurance (priority 3),
• Test transponder (priority 3),
• Injected test target (priority 3).
The priorities have been allocated on the basis of the operational relevance of each approach. The
approach based on opportunity traffic has priority 1 as it is fully representative of the operational traffic
and of the operational environment. The remaining 3 approaches are only partially representative of
the operational traffic and operational environment and therefore have the lowest priority.
The conformity assessment of a surveillance system against the cooperative surveillance performance
requirements shall be performed on the basis of cooperative and, if provided, combined target reports
delivered by the system.
The conformity assessment of a surveillance system against the non-cooperative surveillance
performance requirements shall be performed on the basis of non-cooperative target reports delivered
by the system except for requirements R2 & R3 for which combined target reports, if provided, shall
also be taken into account.
It is to be noted that a statistical measurement uncertainty may be generated if a low number of data
samples are used when performing the assessment of an individual aircraft. The application of an
additional measurement margin or concession may be required to address such an eventuality.
The conformity assessment measurements shall be performed within the volume of airspace where
the corresponding application/service is supported/provided and limited to the aircraft to which the
separation service is provided (see Annex C - 4.1 for the identification of these aircraft with respect
to airspace classes). This set of aircraft target reports is called the Conformity Assessment Volume
(CAV).
A target report belongs to the CAV and is to be assessed if its reference flight trajectory 3D
position is located in the CAV.
A false target report belongs to the CAV and is to be assessed if its 3D position is located in the
CAV or, if there is no pressure altitude data item, if its horizontal position is located in the largest
horizontal footprint of the CAV.
Some aircraft target reports, although located within the CAV, may be excluded from the conformity
assessment process:
• Aircraft to which the corresponding service is not provided based on individual aircraft or
on specific temporary area (e.g. military exercise).
• Aircraft whose avionics exhibit a functional anomaly (with respect to applicable
regulations), when assessing a cooperative surveillance system and if the anomaly is
confirmed by data analysis.
The aircraft to which the service is provided (IFR or VFR) can be identified taking into account the
class of the airspace (see Annex C - 4.1 and Figure 17).
For example, when analysing the performance to support the 3/5 NM separation service, aircraft not
expected to be in the controlled airspace (e.g. intruding non-equipped VFR) may be excluded from the
conformity assessment process. The detection of intrusion within the controlled airspace is a separate
application requiring a different level of performance.
Aircraft whose avionics exhibit a functional anomaly shall be analysed separately to verify that the
assumptions of the system safety assessment remain valid. Such cases and their consequences
(assessed or assumed) on the performance of the surveillance system shall be reported to the local
safety monitoring process. Additionally, as foreseen in [AD1] Article 4(4), ANSP will inform aircraft
operators of any aircraft whose avionics exhibit a confirmed functional anomaly. Similar requirements
are placed upon the aircraft operator to investigate and resolve anomalies identified in this manner.
In case of association of cooperative and non-cooperative surveillance systems to support the service,
the cooperative and non-cooperative surveillance performance may be assessed in different CAVs.
Assuming an assessment is made periodically (as opposed to continuously) using opportunity traffic
data, the performance requirements should be assessed on the basis of opportunity traffic datasets
containing at least 50 000 position reports from the system under assessment, obtained from flight
trajectories for which the system has provided target reports during at least 50 measurement intervals.
The assessment shall be made periodically on each ground surveillance system and after each
system or environment modification that may have an impact on its performance characteristics.
The periodicity of the conformity assessment is to be defined depending on the system design and the
type of technology used.
When assessing the surveillance system performance on the basis of opportunity traffic, the system is
only evaluated where there are flights. If airspace design modifications are to be implemented, a study
will have to be undertaken to check that the system will still meet the required performance with the
new traffic and specific flight trials may be needed.
The performance shall be assessed at the point where surveillance data is used to provide the service
(e.g. 3 or 5 NM horizontal separation).
In practice, performance shall be measured at a point where surveillance data can be recorded in a
digitised way and which is as close as possible to where the service is delivered.
If a data processing stage is located in between that recording point and the point where the service is
delivered, an analysis shall be performed to determine the contribution of this processing stage to the
surveillance system end-to-end performance.
Should the provider of the 3/5 NM horizontal separation not be the provider of the surveillance data
used to support the service, it is up to the separation service provider to derive the performance of the
provided surveillance data, in order to meet the requirements described in this document, provided
that it has chosen to apply the present specification.
4.1.6 Definitions
The measurement interval is a parameter that is used to assess the probability of update of data
items.
The surveillance system applicable measurement interval will be used to assess the probability of
update of horizontal position, pressure altitude and aircraft identity data items (§ 4.2.2).
PU = N_R / N_T          Equation 1

PU = 1 − (N_T − N_R) / N_T          Equation 2

PU = Σ_n N_R / Σ_n N_T          Equation 3
Figure 2: Numbers of target reports used for the calculation of PUC and the ratios of incorrect
pressure altitude / aircraft identity. The figure distinguishes, among the target reports with a pressure
altitude/aircraft identity (NV), those with a correct pressure altitude/aircraft identity (NC) and those with
an incorrect pressure altitude/aircraft identity (NI).
Figure 2 above illustrates the different numbers (N_X) that are used for the calculation of PUC and the
calculation of the ratio of incorrect pressure altitude (§ 4.2.8) and the ratio of incorrect aircraft identity
(§ 4.2.12).
[Figure: aircraft trajectory with target reports TP 1 to TP 4; legend: target report with horizontal position
and with pressure altitude, target report with horizontal position and without valid pressure altitude]
Notes
This method does not allow the detection of small areas where detection is missing or degraded. This
issue could be addressed with a requirement for cellular calculation of update probability.
In order to make a calculation of the PU that is the least sensitive to the possible variations of the
actual measurement interval between consecutive target reports, it is recommended to ensure that the
first target report is time-located in the middle of the first MI, as shown in Figure 3.
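As an illustration only, the following Python sketch shows one possible way of deriving the per-flight probability of update from the times of applicability of the received target reports. It assumes that N_T is the total number of measurement intervals covering the flight portion inside the CAV and N_R the number of those intervals containing at least one target report, with the first interval centred on the first target report as recommended above; the function and variable names are hypothetical.

```python
import math

def probability_of_update(report_times, mi, flight_end):
    """Per-flight probability of update (Equation 1), given the times of
    applicability of the received target reports (seconds, sorted) and the
    applicable measurement interval MI (seconds).

    Assumption: the first MI is centred on the first target report, as
    recommended above, and MIs are counted up to the end of the flight
    portion inside the CAV (flight_end)."""
    if not report_times:
        return 0.0
    start = report_times[0] - mi / 2.0                    # first MI centred on first report
    n_t = max(1, math.ceil((flight_end - start) / mi))    # total number of MIs (N_T)
    covered = set()
    for t in report_times:
        index = int((t - start) // mi)                    # MI in which this report falls
        if 0 <= index < n_t:
            covered.add(index)
    n_r = len(covered)                                    # MIs with at least one report (N_R)
    return n_r / n_t                                      # Equation 1: PU = N_R / N_T

# Example: 8 s measurement interval, one missed interval out of five
times = [0.0, 8.0, 16.0, 32.0]                            # no report in the 4th interval
print(probability_of_update(times, mi=8.0, flight_end=36.0))   # -> 0.8
```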
PUC = N_C / N_R          Equation 4

PUC = 1 − (N_R − N_C) / N_R          Equation 5

PUC = Σ_n N_C / Σ_n N_R          Equation 6
3 Correctness criteria for pressure altitude and aircraft identity are respectively defined in § 4.2.8 and § 4.2.12.
[Figure: aircraft trajectory with target reports TP 1 to TP 4 and the limit of the airspace where the
application is provided; legend: target report with horizontal position and with correct pressure altitude,
target report with valid horizontal position and with incorrect pressure altitude, target report with
horizontal position and without valid pressure altitude]
Notes
The requirement on the probability of update of the aircraft identity data item is a requirement on the
provision of correct information at the output of the system and does not require extraction from
the aircraft at each measurement interval. The ratio of incorrect values and the processing delay of
these two data items are assessed through other requirements (see § 4.2.7, 4.2.8, 4.2.11 and 4.2.12).
In order to make the calculation of the PUC as insensitive as possible to variations of the actual
measurement interval between consecutive target reports, it is recommended to ensure that the first
target report is time-located in the middle of the first MI, as shown in Figure 4.
A gap is a portion of aircraft reference trajectory between 2 consecutive target report updates
including full update of the position (i.e. with horizontal position and pressure altitude for cooperative
surveillance system and with horizontal position for non-cooperative surveillance system). The size of
the gap is the time difference between these two target report updates. If the gap is partially located
outside the CAV, it shall not be taken into account.
[Figure: example of a gap along the aircraft trajectory; legend: target report with horizontal position
and without pressure altitude, target report with horizontal position and with pressure altitude]
Method
Determine the gaps of a size (G_S) larger than 3 times the maximum measurement interval (8 s for 5
NM separation, 5 s for 3 NM separation) + 10 % (i.e. long gaps).
Count the number (N_G) of MIs (as determined in § 4.2.2.1) fully included in these gaps.
Calculate N_A as the sum of all the N_T calculated for the update probability.

N_A = Σ_n N_T          Equation 7

Calculate the long gap ratio R_G in accordance with Equation 8, where n is the number of flights and
g the number of long gaps.

R_G = Σ_g N_G / N_A          Equation 8

This ratio does not depend on the applicable measurement interval of the system since both N_G and
N_A are proportional to 1/MI.
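For illustration, a minimal Python sketch of the long gap ratio calculation is given below. It assumes each flight is represented by the sorted times of its full position updates inside the CAV, and it approximates N_T and N_G by simple division of durations by the measurement interval; the names are hypothetical.

```python
def long_gap_ratio(flights, mi):
    """Ratio of missed positions involved in long gaps (Equation 8).

    `flights` is a list of flights; each flight is a sorted list of times of
    full position updates (seconds) inside the CAV.  A long gap is a gap
    larger than 3 x MI + 10 % (e.g. 26.4 s for MI = 8 s)."""
    threshold = 3 * mi * 1.10
    n_g_total = 0     # MIs fully included in long gaps, summed over all long gaps
    n_a = 0           # sum of all N_T (total number of MIs), Equation 7
    for times in flights:
        if len(times) < 2:
            continue
        n_a += int((times[-1] - times[0]) / mi)          # approximate N_T for this flight
        for t0, t1 in zip(times, times[1:]):
            gap = t1 - t0
            if gap > threshold:
                n_g_total += int(gap // mi)              # approximate count of MIs inside the long gap
    return n_g_total / n_a if n_a else 0.0               # R_G = sum(N_G) / N_A

# Example: one flight with a 40 s hole, MI = 8 s -> threshold 26.4 s
print(long_gap_ratio([[0, 8, 16, 56, 64, 72]], mi=8.0))
```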
The error on the provided horizontal position is the 2D Euclidian distance between the horizontal
position provided by the surveillance system and the reference horizontal position of the
corresponding aircraft at the time when the updated position was output/delivered (see Figure 6
below). This error takes into account any uncompensated latency between the time of applicability of
the provided horizontal position and the time when the horizontal position was delivered to another
system.
[Figure 6: position errors at T1 to T5 between the target reports with horizontal position delivered at
T1, T2, T3, T4, T5 and the reference positions of the target at T1, T2, T3, T4, T5 along the aircraft
trajectory]
RMS = √( Σ_{i=1..N} E_i² / N )          Equation 9

where N is the number of position errors E_i taken into account.
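A minimal sketch of Equation 9, assuming the individual horizontal position errors E_i (2D Euclidean distances in metres against the reference trajectory) have already been computed; the function names are hypothetical.

```python
import math

def rms(errors):
    """Root mean square of a list of horizontal position errors (Equation 9)."""
    return math.sqrt(sum(e * e for e in errors) / len(errors))

def horizontal_error(delivered, reference):
    """2D Euclidean distance between the delivered and reference positions,
    both given as (x, y) in metres in a common local coordinate frame."""
    return math.hypot(delivered[0] - reference[0], delivered[1] - reference[1])

# Example: three reports with 100 m, 200 m and 300 m errors -> RMS ~ 216 m
print(rms([100.0, 200.0, 300.0]))
```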
[Figure: decomposition of the horizontal position error between the delivered horizontal position and
the aircraft reference trajectory into an along error component and an across error component, relative
to the along and across axes of the reference trajectory]
Four (4) main directions shall be considered: positive across error, negative across error, positive
along error, negative along error.
Identify sets of at least 3 consecutive errors in the same main direction greater than the specified
value. Count the number of target reports involved in such series and divide it by the total number of
target reports.
Note
As this kind of event may be relatively rare, it could be difficult to collect a reasonable number of
samples in a dataset of opportunity traffic so as to obtain a reliable statistical figure.
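The sketch below illustrates one possible implementation of this procedure, under the assumption that the signed along and across error components have already been computed for each target report and that the "main direction" of an error is taken as its dominant signed component; the names and data layout are hypothetical.

```python
from itertools import groupby

def correlated_error_ratio(flights, threshold):
    """Ratio of target reports involved in series of at least 3 consecutive
    horizontal position errors larger than `threshold` and in the same main
    direction (positive/negative along, positive/negative across).

    `flights`: list of flights, each a chronological list of signed
    (along_error, across_error) tuples in metres."""
    def main_direction(along, across):
        # Dominant signed component, or None if the error does not exceed the threshold.
        if max(abs(along), abs(across)) <= threshold:
            return None
        if abs(along) >= abs(across):
            return ('along', along > 0)
        return ('across', across > 0)

    involved = 0
    total = 0
    for errors in flights:
        total += len(errors)
        directions = [main_direction(a, c) for a, c in errors]
        for key, group in groupby(directions):
            run_length = sum(1 for _ in group)
            if key is not None and run_length >= 3:
                involved += run_length
    return involved / total if total else 0.0

# Example: a run of 4 large positive along errors in a flight of 10 reports
flight = [(50, 0)] * 3 + [(1000, 100)] * 4 + [(50, 0)] * 3
print(correlated_error_ratio([flight], threshold=926.0))   # -> 0.4
```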
4.2.6 RMS value of the relative time of applicability of target reports in close proximity
The time to be considered is the time of applicability (e.g. the time data item of the Asterix message).
Method
Identify pairs of target reports that are close in horizontal position (less than 18520/11110 m – 10/6 NM
horizontally for, respectively, the 5/3 NM horizontal separation service) and close in time (less than half
the applicable measurement interval).
For each pair calculate the unsigned difference between times of applicability of the two target reports.
Then calculate the RMS (see Equation 9) of these values.
Note
In case of rotating surveillance system the population may be limited to discard target reports located
near the rotation axis.
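For illustration, a brute-force sketch of this method is given below. It assumes target reports are given as (x, y, time of applicability) tuples in a common local frame and, for simplicity, does not distinguish between aircraft when forming pairs; the names are hypothetical.

```python
import math

def relative_toa_rms(reports, max_distance, max_dt):
    """RMS of the relative time of applicability for target reports in close
    proximity.

    `reports`: list of (x, y, time_of_applicability) tuples (metres, seconds).
    `max_distance`: proximity criterion (e.g. 18520 m for 5 NM separation).
    `max_dt`: half the applicable measurement interval (seconds)."""
    diffs = []
    for i in range(len(reports)):
        x1, y1, t1 = reports[i]
        for j in range(i + 1, len(reports)):
            x2, y2, t2 = reports[j]
            if abs(t2 - t1) < max_dt and math.hypot(x2 - x1, y2 - y1) < max_distance:
                diffs.append(abs(t2 - t1))          # unsigned difference of times of applicability
    if not diffs:
        return 0.0
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))   # RMS as per Equation 9
```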
Method
The age of forwarded pressure altitude shall be the time difference between when it was output and
when it was time stamped by the receiving sensor (it is assumed that in the case of the pressure
altitude data item, the airborne latency and the transmission latency are negligible).
In case of a single sensor system, the age is the difference between the time of output and the
pressure altitude time of applicability reported within the sensor target report.
In the case of a tracker, this age shall be derived from the time of output and from information
provided by the tracker (e.g. MFL sub-field of data item I062/295 “Track Data Ages” and “Time of
Track Information” data item I062/070 of Asterix category 062 for system track data ([RD 5])).
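As a simple illustration, and assuming the time of output and the time of applicability of the forwarded pressure altitude are both available for each target report (with tracker-specific age fields already converted to that convention), the average and maximum data age could be derived as follows (hypothetical names):

```python
def pressure_altitude_data_age_stats(reports):
    """Average and maximum data age of the forwarded pressure altitude.

    `reports`: list of (output_time, time_of_applicability) pairs in seconds
    for target reports carrying a forwarded pressure altitude."""
    ages = [out - toa for out, toa in reports]      # data age per target report
    return sum(ages) / len(ages), max(ages)
```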
Definitions
To determine the correctness of the forwarded pressure altitude, the value at the output of the
surveillance system shall be compared with the reference value.
This reference value is the altitude of the reference trajectory sampled at the time the target report
was output, minus the pressure altitude data age (as defined in § 4.2.7).
The tolerance is +/- 200 ft for aircraft reference positions located in airspace where VSM = 1000 ft
and +/- 300 ft for aircraft reference positions located in airspace where VSM = 2000 ft.
Method
The percentage of incorrect pressure altitude shall be calculated as the ratio R_I between the number
N_I of target reports including an incorrect (see definitions above) pressure altitude and the total
number of target reports including a pressure altitude (N_V).

R_I = N_I / N_V          Equation 10
Figure 2 illustrates the numbers used for the calculation of this indicator.
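A minimal sketch of this calculation is given below, assuming each target report carrying a pressure altitude is available as an (output time, data age, reported altitude, applicable VSM) tuple and that a reference-trajectory altitude function is supplied by the analysis tool; the names are hypothetical.

```python
def incorrect_pressure_altitude_ratio(reports, reference_altitude):
    """Ratio R_I = N_I / N_V of incorrect forwarded pressure altitude (Equation 10).

    `reports`: list of (output_time, data_age, reported_altitude_ft, vsm_ft)
    tuples for target reports that contain a pressure altitude.
    `reference_altitude`: function returning the reference trajectory altitude
    (ft) at a given time."""
    n_v = 0   # target reports including a pressure altitude
    n_i = 0   # of which the pressure altitude is incorrect
    for output_time, data_age, reported_ft, vsm_ft in reports:
        n_v += 1
        reference_ft = reference_altitude(output_time - data_age)
        tolerance = 200.0 if vsm_ft == 1000 else 300.0   # +/- 200 ft (VSM 1000 ft) or +/- 300 ft (VSM 2000 ft)
        if abs(reported_ft - reference_ft) > tolerance:
            n_i += 1
    return n_i / n_v if n_v else 0.0
```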
Notes
If it is not possible to calculate the pressure altitude data age, it will not be possible to perform the
assessment of pressure altitude correctness.
The assessment of the correctness of pressure altitude does not take into account the time needed to
process the information (i.e. its latency or data age); pressure altitude data age is assessed in
accordance with a specific performance characteristic (see § 4.2.7 above).
Therefore this method does not apply to extrapolated/calculated pressure altitude. Should an
extrapolated/calculated pressure altitude be provided, the requirement detailed in § 4.2.9 shall be
verified.
Definitions
This assessment can be performed for any type of pressure altitude (e.g. forwarded or
calculated).
The error of pressure altitude shall be calculated in accordance with the principle applicable for
horizontal position error (see 4.2.4 above).
Method
Unsigned pressure altitude error = |Output pressure altitude – Reference trajectory pressure altitude at
the time it was output|
The percentage of cases within the containment value shall be calculated separately for stable flights
and for climbing/descending flights.
A stable flight is one whose reference trajectory climbing/descending speed is lower than or equal to
300 ft/min.
A climbing/descending flight is one whose reference trajectory climbing/descending speed is greater
than or equal to 200 ft/min and lower than or equal to 8000 ft/min.
Note
There is some overlapping of target reports between stable flights and climbing/descending flights.
This is deliberate because trajectory reconstruction of transition between stable and
climbing/descending flights may be difficult; in any case the number of target reports belonging to the
overlapping area is in general very small and would not influence the measurements.
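As a small illustration of these definitions (hypothetical function name), a target report could be labelled as follows from the reference rate of climb/descent:

```python
def flight_phase_labels(rocd_ft_per_min):
    """Label a target report from the reference rate of climb/descent (ft/min).
    The 200-300 ft/min overlap is deliberate (see note above), so a report may
    carry both labels."""
    rocd = abs(rocd_ft_per_min)
    labels = []
    if rocd <= 300:
        labels.append("stable")
    if 200 <= rocd <= 8000:
        labels.append("climbing/descending")
    return labels

print(flight_phase_labels(250))   # -> ['stable', 'climbing/descending']
```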
Method
The delay of forwarded emergency indicator / SPI report shall be calculated as the difference between
the time when the new information is present at the output of the surveillance system and the time
when the emergency indicator / SPI report has been reported for the first time by one of the ground
sensors.
Method
The delay of change in aircraft identity shall be calculated as the difference between the time when the
new aircraft identity data item is present at the output of the surveillance system and the time when
new aircraft identity has been reported for the second time by a ground sensor.
Note
The reference assessment shall be performed on the specified data item, which is the aircraft identity
reported by the aircraft and which is not to be confused with the aircraft identity reported by the Flight
Data Processing System (FDPS). Further explanations are provided in Annex A - 2.
Aircraft identity (Mode A code or Aircraft Identification) shall be considered correct if the provided
value matches exactly (no tolerance) one of the values of the reference trajectory within the last
applicable measurement interval.
Method
The percentage of incorrect aircraft identity (Mode A or Aircraft Identification) shall be calculated as
the ratio R_I between the number N_I of target reports including an incorrect (see definitions above)
aircraft identity and the total number of target reports including an aircraft identity (N_V).

R_I = N_I / N_V          Equation 11
Figure 2 illustrates the numbers used for the calculation of this indicator.
The 3 following figures (Figure 8, Figure 9 and Figure 10) provide examples of correct and incorrect
aircraft identity based on the method above.
[Figures 8, 9 and 10: timeline examples of provided aircraft identity (AC1, AC2) compared with the
reference trajectory values over the applicable measurement intervals]
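For illustration, a sketch of the incorrect aircraft identity ratio calculation is given below, assuming the provided identities and the reference trajectory identities are available as time-stamped values; the names are hypothetical.

```python
def incorrect_identity_ratio(reports, reference_identities, mi):
    """Ratio R_I = N_I / N_V of incorrect aircraft identity (Equation 11).

    `reports`: list of (time, provided_identity) for target reports that
    include an aircraft identity (Mode A code or Aircraft Identification).
    `reference_identities`: list of (time, identity) values of the reference
    trajectory.  A provided identity is correct if it exactly matches one of
    the reference values within the last applicable measurement interval."""
    n_v = 0
    n_i = 0
    for t, provided in reports:
        n_v += 1
        window = {ident for (rt, ident) in reference_identities if t - mi <= rt <= t}
        if provided not in window:        # exact match, no tolerance
            n_i += 1
    return n_i / n_v if n_v else 0.0
```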
This is applicable when rate of climb/descent data item is provided. A conformity assessment
procedure is not yet defined when only the trend is provided.
Method
The calculation of the reference aircraft rate of climb/descent will follow the same principle as for
horizontal position error (§ 4.2.4 above) i.e. the provided value will be compared with the reference
value at the time the target report including the rate of climb/descent data item was output.
For a target report, the rate of climb/descent error is the difference, in absolute value, between the
reference aircraft rate of climb/descent (as defined above) and the aircraft rate of climb/descent
provided in the target report.
The reference trajectory rate of climb/descent shall be used to determine if the flight is stable and/or
climbing/descending.
Comment
If the system is only providing a trend and not the actual value of the rate of climb/descent, a specific
conformity assessment procedure will have to be defined.
This is applicable when track velocity data item is provided by the surveillance system.
Method
The calculation of the reference aircraft velocity amplitude and angle will follow the same principle as
for horizontal position error (§ 4.2.4 above) i.e. the provided value will be compared with the reference
value at the time the target report including track velocity data item was output.
For a target report, the track velocity error is the difference between the reference aircraft velocity
amplitude (as defined above) and the aircraft velocity amplitude provided in the target report.
For a target report, the track velocity angle error is the difference between the reference aircraft
velocity angle (as defined above) and the aircraft velocity angle provided in the target report. The
calculation of that error shall be performed in the system of coordinates in which the track velocity data
item has been provided.
The following Figure 11 provides an example of the calculation of track velocity error components
based on the method above.
[Figure 11: delivered track velocity and delivered horizontal position compared with the reference
trajectory horizontal position along the aircraft reference trajectory]
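A minimal sketch of the two error calculations, assuming the delivered and reference velocity vectors are expressed in the same Cartesian coordinate system; the names are hypothetical.

```python
import math

def track_velocity_errors(delivered_vx, delivered_vy, reference_vx, reference_vy):
    """Track velocity amplitude error (m/s) and angle error (degrees) between
    the delivered velocity vector and the reference velocity at the time of
    output, both expressed in the same coordinate system."""
    amp_err = abs(math.hypot(delivered_vx, delivered_vy)
                  - math.hypot(reference_vx, reference_vy))
    ang = math.degrees(math.atan2(delivered_vy, delivered_vx)
                       - math.atan2(reference_vy, reference_vx))
    ang_err = abs((ang + 180.0) % 360.0 - 180.0)     # wrap the difference to [0, 180] degrees
    return amp_err, ang_err
```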
The table below provides the turn rate corresponding to 1.5 m/s² acceleration at different speeds.
Speed (knots) | Turn rate (°/s)
100 | 1.67
200 | 0.84
300 | 0.56
400 | 0.42
500 | 0.33
600 | 0.28
700 | 0.24
800 | 0.21
900 | 0.19
1000 | 0.17
Table 9: Turn rate as a function of speed for an acceleration of 1.5 m/s²
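The values in Table 9 follow from the relation between turn rate, speed and lateral acceleration (turn rate = acceleration / speed). The sketch below reproduces them, assuming 1 knot = 0.514444 m/s; the names are hypothetical.

```python
import math

KNOT_TO_MPS = 0.514444          # 1 knot in metres per second

def turn_rate_deg_per_s(speed_knots, acceleration_mps2=1.5):
    """Turn rate (deg/s) corresponding to a given lateral acceleration at a
    given ground speed, as used for Table 9 (omega = a / v)."""
    v = speed_knots * KNOT_TO_MPS
    return math.degrees(acceleration_mps2 / v)

for speed in (100, 300, 600):
    print(speed, round(turn_rate_deg_per_s(speed), 2))
# -> 100 1.67, 300 0.56, 600 0.28 (matching Table 9)
```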
Definition
The uncorrelated false target reports are those false target reports which do not form a falsely
confirmed track as defined in § 4.2.16.
Method
Identify the uncorrelated false target reports in accordance with the definition above.
For each false target report count, over a period of one hour, how many other false target reports are
located in a circular area (900 NM² or 100 NM²) centred on the initial false target report and in the
CAV. The initial false target report shall also be counted.
The maximum value is the performance indicator.
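For illustration, the sketch below implements one possible reading of this method, interpreting the one-hour period as a window centred on each initial false report and converting the circular area into an equivalent radius; the names are hypothetical.

```python
import math

NM = 1852.0   # metres per nautical mile

def max_false_report_density(false_reports, area_nm2, window_s=3600.0):
    """Maximum number of uncorrelated false target reports found, within one
    hour, in a circular area of `area_nm2` (e.g. 900 or 100 NM^2) centred on
    each false report; the initial report is counted as well.

    `false_reports`: list of (time_s, x_m, y_m) tuples inside the CAV."""
    radius = math.sqrt(area_nm2 / math.pi) * NM      # radius of the circular area in metres
    worst = 0
    for t0, x0, y0 in false_reports:
        count = sum(
            1
            for t, x, y in false_reports
            if abs(t - t0) <= window_s / 2 and math.hypot(x - x0, y - y0) <= radius
        )
        worst = max(worst, count)
    return worst
```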
Definition
A falsely confirmed track is a time (during at least 16/10 seconds for respectively 5/3 NM separation)
and space (maximum the horizontal outlier criteria) correlated set of at least 3 false target reports with
the same aircraft identity and belonging to the CAV.
The identification of falsely confirmed tracks shall be performed independently of the tracking
information that may be provided by the surveillance system. For that reason, these 3 false target
reports may not necessarily belong to the same track as declared by the surveillance system.
Method
Identify the falsely confirmed tracks in accordance with the definition above.
Once the falsely confirmed tracks are identified; select those that are close to the true tracks
(corresponding to a true aircraft trajectory) and count them per time frames of one hour.
To determine if a falsely confirmed track is close to true tracks, the following process is proposed:
• Select a falsely confirmed track.
• Around each update of the falsely confirmed track open a geographical cylindrical analysis
window of a radius equal to the proximity criteria.
• Identify the true tracks that have at least one track update located within this geographical
window and within a time window of +/- half the applicable measurement interval centred
on the falsely confirmed track update time.
• Repeat the operation for each update of the falsely confirmed track.
• If a true track has been identified at the third stage for at least two updates, the falsely
confirmed track is considered to be close to that real track.
• Repeat the operation for each falsely confirmed track.
• Count, for each hour of operation, the number of falsely confirmed tracks that are close
to at least one real track and record the start and end times of these falsely confirmed
tracks.
• 2 falsely confirmed tracks are coincident if the time difference between the first update of
the later false track and the last update of the earlier false track is less than the applicable
measurement interval.
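A simplified sketch of the proximity test described above is given below; per-hour counting and the coincidence rule for consecutive falsely confirmed tracks are omitted, and the names are hypothetical.

```python
import math

def falsely_confirmed_tracks_close_to_true(false_tracks, true_tracks, proximity_m, mi):
    """Count the falsely confirmed tracks that are close to at least one true
    track, following the bulleted process above.

    Each track is a chronological list of (time_s, x_m, y_m) updates;
    `proximity_m` is the proximity criterion and `mi` the applicable
    measurement interval."""
    def is_close_to_a_true_track(false_track):
        hits = {}   # true-track index -> number of matching false-track updates
        for t0, x0, y0 in false_track:
            for idx, true_track in enumerate(true_tracks):
                if any(abs(t - t0) <= mi / 2 and math.hypot(x - x0, y - y0) <= proximity_m
                       for t, x, y in true_track):
                    hits[idx] = hits.get(idx, 0) + 1
        # close if some true track matches for at least two updates of the false track
        return any(n >= 2 for n in hits.values())

    return sum(1 for track in false_tracks if is_close_to_a_true_track(track))
```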
The continuity of the surveillance system shall be verified by design and if sufficient data is available
on the basis of operations. The definition of continuity is provided in Annex C - 2.4.
Assessing opportunity traffic provides a relatively cheap means to access a potentially large data set
exhibiting the ‘real life’ characteristics in the real environment that are beyond the most complex
simulations. However, to utilise the data for a specific trajectory, it is necessary to construct a
reference against which comparisons can be made. There are test tools available which can be used
for this purpose.
To permit an accurate assessment it is essential to ensure that the reference aircraft trajectories that
are created in the analysis tool are of better quality than the trajectories that could be derived from the
surveillance system outputs. This can be achieved as the construction of the reference trajectory can
be conducted off-line and can thereby benefit by using information from future plot data.
An accurate assessment requires a sufficiently large data sample to reduce the impact of spurious
data that could otherwise introduce statistical anomalies. The data sample used shall be of sufficient
duration to examine all the characteristics of the surveillance system under assessment. It is
recognised that to be able to assess some of the parameters identified in this document, a significant
amount of time and data recording would be required in order to obtain a statistically relevant
assessment. To assess the parameters described in this document it is recommended that a minimum
of 50 000 target reports from the system under assessment are used in the analysis. To assess some
parameters through the use of opportunity traffic would need considerably larger data sets. The
introduction of cheaper memory and improved processing has allowed many ANSPs to assess data
sets significantly larger than 50 000 reports. It is recognised that in areas of low traffic density
additional tests using simulated data can be used to supplement the verification process.
All portions of flights belonging to the CAV shall be taken into account within the assessment. This is
necessary to ensure that the Surveillance system is ‘fit for purpose’ and capable of supporting the
service. However if anomalies are noted, stemming from identified avionics failings or a lack of data
arising from a valid exemption, then the anomalous data may be discounted from the scope of the
performance assessment of the ground based surveillance system components, if such events are
covered by the system safety assessment. Similarly, the CAV may vary with time, for example if there
is a military exercise in a portion of the system coverage; in that case the CAV will be temporarily
reduced or the aircraft involved will be filtered out.
A valid exemption is an exemption that has been granted by an NSA of one of the EUROCONTROL
member states or, on behalf of one of these NSA’s, by a recognised and appropriate body (e.g. the
European Commission).
When conducting an assessment based upon targets of opportunity, it is recommended that:
• The conformity assessment process shall remove possible side effects due to the limited
duration of datasets (at the beginning and at the end).
• The conformity assessment process shall remove possible side effects due to analysing data at the boundary edges of the CAV (e.g. performing the trajectory reconstruction over a larger volume than the CAV, as sketched below).
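A minimal filtering sketch is given below. The report structure, the guard time and the predicate deciding whether a report lies inside the CAV are illustrative assumptions; the trajectory reconstruction itself may still use reports from the full recording period and from a larger volume.

```python
# Illustrative sketch (record layout and parameters are assumptions): reports too
# close to the start or end of the recording are excluded from scoring, and only
# reports inside the CAV are scored, while the trajectory reconstruction itself
# may still use reports from a larger volume and the full recording period.

def select_reports_for_scoring(reports, t_start, t_end, guard_time, inside_cav):
    """Keep only reports well inside the recording period and inside the CAV.
    `inside_cav` is an assumed caller-supplied predicate on a report dict."""
    return [
        r for r in reports
        if t_start + guard_time <= r["time"] <= t_end - guard_time
        and inside_cav(r)
    ]
```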
It is advised that data is recorded at various points in the system to permit traceability of the cause of anomalies.
Flight trials are normally conducted to address the performance of a specific sensor and its input into
the existing surveillance infrastructure rather than a test of an entire multi-sensor surveillance system.
Their extensive use is limited due to their high cost.
An objective of the flight trials is to check the performance of the system:
• in specific volume of airspace (e.g. areas of low traffic density);
• against specific aircraft characteristics (e.g. transponder power, radar cross section);
• to establish a repeatable baseline.
A further objective of a flight trial is to validate the performance simulations, particularly in areas where difficulties are predicted.
The route flown by the trials aircraft is chosen to probe specific points of weak coverage at various
heights and locations within the CAV. It should be noted that these may not only be at extremes of
instrumented range. Flights over or near wind farms and motorways can also provide an ANSP with
an improved appreciation of the impact that such environments could introduce to surveillance
operations.
The trials shall be designed to address the specific characteristics of the surveillance sensors under
test. A different approach to flight trial design may be necessary when testing a Mode S SSR type of
system compared with a WAM system.
The flight trials shall be designed to address the case where the aircraft presents the worst but still compliant characteristics, e.g. smallest radar cross section (RCS) in the case of primary radar, lowest transponder transmitter power output, or least transponder receiver sensitivity.
Flight trials aircraft may be equipped with an accurate position recording device to permit a
comparison of the surveillance data with the trajectory actually flown.
Flight trials may be used to prove performance when the system is configured to replicate failure
conditions e.g. if a WAM receiver is unserviceable. Such an approach can confirm the impact of a
degraded mode of operation.
Further information, mostly concerning radar systems but largely applicable to surveillance systems in general, can be found in document [RD 12] and its Appendix A, in particular regarding the different combinations of transponder transmitter output power and transponder receiver sensitivity.
4.4.3 Conformity assessment framework based on proof offered through system design
files or by system design assurance
The use of design files may be appropriate where demonstration of parameters is difficult, expensive or destructive. Design files may also be considered if the aspect has been tested before and no significant changes have been introduced.
Design files may also be assembled using previous tests based on injected test targets – e.g. simulations conducted using large Monte Carlo runs – or on aspects, such as software loading, that may not change from one configuration to another.
Cooperative surveillance systems often utilise remote, statically mounted transponders to support the built-in test equipment; these transponders may also be used to provide the Air Traffic Controller with a visual confidence check regarding the performance of the system. Similar techniques exist for non-cooperative surveillance systems.
Whilst the support offered by simple remote transponders is very limited, an ‘intelligent’ remote
transponder may provide a more comprehensive yet non-intrusive means of verifying certain
performance characteristics, such as the time taken to recognise a change of the Mode A code within
the system, without the need to take an operational system off-line.
Within surveillance sensors a self-generated test target is often used to support Built In Test (BIT)
assessments to provide a general indication of the ‘health’ of the system and to ensure that the
system is operating as required. This aspect of conformity assessment does not include the use of BIT
signals but refers to injected test targets and similar signals that are generated by laboratory
equipment that is not an integral part of the surveillance sensor. It refers typically to tests conducted
by the manufacturer to demonstrate system performance against specific requirements.
The benefit of an injected test target is that it permits detailed and specific tests to be conducted
relatively cheaply and consistently. This approach can be of particular use for load testing (peak and average) or for proving performance against complex scenarios that cannot, for cost and/or safety reasons, be proven through flight trials.
The use of detailed software controlled scenarios allows for the introduction of slight variations to
established test configurations. It also permits the introduction of a parameter change at a specific
point in time with the subsequent assessment of how long the system takes to reflect the change.
Such an approach can also be used to optimise system performance.
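A minimal sketch of such a latency measurement is given below, assuming the output reports are available as time-ordered records and that the injection time of the scripted change (here, a Mode A code change) is known; the record format and field names are illustrative assumptions.

```python
# Minimal sketch (assumed record format) of measuring how long the system takes
# to reflect a scripted change injected at a known time: find the first output
# report of the target, after the injection time, that carries the new value.

def change_reflection_delay(output_reports, injection_time, field, new_value):
    """output_reports: time-ordered iterable of dicts with at least 'time' and
    the given field. Returns the delay in seconds, or None if never reflected."""
    for report in output_reports:
        if report["time"] >= injection_time and report.get(field) == new_value:
            return report["time"] - injection_time
    return None

# Example: delay before an injected Mode A code change from 2000 to 7001 appears.
reports = [{"time": 100.0, "mode_a": "2000"},
           {"time": 104.0, "mode_a": "2000"},
           {"time": 108.0, "mode_a": "7001"}]
print(change_reflection_delay(reports, injection_time=101.0,
                              field="mode_a", new_value="7001"))  # -> 7.0
```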
Test targets are typically injected at the RF front end of the system; however, they may also be injected at opportune points within the system chain where they can be used to assess in detail the performance of a specific element of a surveillance sub-system.
As the antenna and several other front end components are ‘by-passed’ or simulated, such testing
may not reflect real life site effects nor exercise the entire surveillance chain. However, with that
limitation established, the use of an injected test target provides a comprehensive method of
conducting a detailed performance assessment of numerous system characteristics that it would not
be possible to test in any other way.
Such testing, which is to be conducted using appropriately calibrated test equipment, is to be
considered as supplementary testing and whilst acceptable for specific aspects of performance, it is
insufficient for determining actual total system performance achieved on site.
[Figure: surveillance sensor(s)/receiver(s) and the conformity assessment recording point]
It is to be noted that the correlation function between the surveillance information and the flight plan
information is considered outside the scope of the surveillance system. Similarly the function providing
QNH/QNE corrected altitude on the basis of the pressure altitude is also considered outside the scope
of the surveillance system. It is nevertheless recognised that the inputs of these functions can be
provided to the surveillance data users through the surveillance system.
Within some surveillance system architectures, the aircraft identity as reported by the aircraft may be sent to the Flight Data Processing System (FDPS), which sends back a “flight plan correlated aircraft identity”. This latter information is then sent to the controller by the surveillance system. As specified in [AD1], the aircraft identity data item to be provided by the surveillance system is the aircraft identity reported by the aircraft, and therefore not the aircraft identity reported by the FDPS.
In the case of such an architecture, the conformity assessment shall be adapted in such a way that the performance indicators corresponding to the aircraft identity data items are based on the aircraft identity reported by the aircraft and not on the aircraft identity reported by the FDPS.
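As a simple illustration of this adaptation (field names are assumptions made for the sketch, not part of any interface definition), an indicator computed over output reports would use the aircraft-reported identity field rather than the FDPS-correlated one:

```python
# Illustrative only (field names are assumptions): when assessing the aircraft
# identity data item, the indicator is computed against the identity reported by
# the aircraft, not against the flight plan correlated identity from the FDPS.

def acid_update_ratio(output_reports, reference_acid):
    """Fraction of output reports whose aircraft-reported identity matches the
    reference identity for that flight (a crude stand-in for a PUCV-style check)."""
    relevant = [r for r in output_reports if "acid_reported_by_aircraft" in r]
    if not relevant:
        return None
    matches = sum(r["acid_reported_by_aircraft"] == reference_acid for r in relevant)
    return matches / len(relevant)
```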
The next figures (Figure 13 and Figure 14) provide examples of current and future physical
implementations of local airborne and ground surveillance systems based on 1030/1090 MHz data
link. Figure 14 is fully consistent with the Generic RFG ADS-B Functional Architecture as described in
EUROCAE ED-126 (document [RD 14]) and in EUROCAE ED-161 (document [RD 15]).
If the remote aircraft is not cooperative, there is no remote surveillance sub-system.
The cooperative and remote surveillance sub-system receives information acquired locally (sensors or HMI) and compiles these data items to make them consistent before they are transmitted to the local surveillance sub-system through RF data links.
The local surveillance sub-system performs measurements (sensors) and receives (receivers) the data items transmitted by remote surveillance sub-systems. The surveillance data processing function compiles all these data items to make them consistent and to adapt them to the needs of the local users (synchronisation, format, etc.).
The surveillance system performance is considered end-to-end, therefore the performance
measurements are undertaken at the interface with the surveillance data users so as to be compared
with the corresponding needs/requirements.
The functional components of the previous diagram are still shown; in addition physical elements are
shown with a folded corner and examples of surveillance data sources and surveillance data users are
provided. On Figure 14 optional items are shown dotted.
[Figure: labels include SSR / Mode S, ADS-B, Surveillance System, the 1030/1090 MHz Radio Frequency (RF) data link and the conformity assessment recording point]
Figure 13: Current Air-Ground Surveillance systems implementation based on 1030/1090 MHz data link
[Figure: labels include SSR / Mode S, ADS-B, TIS-B, the 1090 MHz, 1030/1090 MHz and 1030 MHz Radio Frequency (RF) data links, other surveillance data sources, the interface to surveillance data users and the conformity assessment recording point]
Figure 14: Future Air-Ground, Ground-Air and Air-Air Surveillance system implementation based on 1030/1090 MHz data link
In the frame of this document, the considered Surveillance system encompasses all the components
and elements (either functional or physical) shown on the above figures and the RF data links used.
This document is independent of the environment; it is up to the surveillance system designer to
ensure that the designed system is capable of providing the required performance when operated
under the range of local environments.
For instance, the weather conditions may impact the quality of the RF data link; they will have to be
taken into account in the frame of the surveillance system design process to ensure that the required
performance can be met under all the locally specified weather conditions.
The performance of the surveillance system shall be measured at the input interface of the system(s) using the surveillance data. The measured performance characteristics can then be compared to the required performance characteristics.
From these diagrams one can see that the performance of the surveillance system not only depends
on the performance of its different components and elements but also on the performance
characteristics of its inputs.
Concerning the quality of the inputs to the aircraft domain it is assumed that they are in accordance
with the requirements specified in the Annex II of [AD1]. These requirements will be further detailed in
the forthcoming EASA Certification Specification ACNS.
[RD 15] Safety, Performance and Interoperability Requirements Document for ADS-B RAD
Application ED-161 Dated September 2009 (EUROCAE)
[RD 16] Technical specification for Wide Area Multilateration (WAM) systems ED-142 Dated
September 2010 (EUROCAE)
[RD 33] ICAO Annex 6 Operation of Aircraft Part 1 International Commercial Air Transport – Aeroplanes, 8th Edition, July 2001
[RD 34] Manual on Airspace Planning Methodology for the Determination of Separation Minima ICAO
Doc 9689-AN/953 First Edition 1998
[RD 35] ICAO Manual on Implementation of a 300 m (1 000 ft) Vertical Separation Minimum between FL 290 and FL 410 Inclusive, 2nd Edition 2002, Document 9574 AN/934.
[RD 36] ICAO Document 9536, Review of the General Concept of Separation (RGCSP).
[RD 37] ICAO Document 8168, Aircraft Operations Volume 1 Flight Procedures Fifth Edition 2006
[RD 38] ICAO Document 8168, Aircraft Operations Volume 2 Construction of Visual and Instrument
Flight Procedures Fifth Edition 2006
B-4 Acronyms
Acronym Definition
ACID AirCraft IDentification
ANSP Air Navigation Service Provider
ATC Air Traffic Control
ATCO Air Traffic COntroller
ATM Air Traffic Management
ATS Air Traffic Service
ATSU Air Traffic Service Unit
BIT Built In Test
CAV Conformity Assessment Volume
DAP Downlink Aircraft Parameter
DSNA/DTI Direction des Services de la Navigation Aérienne/Direction de la Technique et de
l’Innovation
EASA European Aviation Safety Agency
EATMN European Air Traffic Management Network
EC European Commission
ECAC European Civil Aviation Conference
FCU Flight Control Unit
FHA Functional Hazard Analysis
FL Flight Level
FMS Flight Management System
HMI Human Machine Interface
HSM Horizontal Separation Minima
IAS Indicated Air Speed
ICAO International Civil Aviation Organisation
IFR Instrument Flight Rules
ISA International Standard Atmosphere
ISO International Standardisation Organisation
JAA Joint Aviation Authorities
MCP Mode Control Panel
MRT Mean Response Time
MSPSR Multi-Static Primary Surveillance Radar
MTBCF Mean Time Between Critical Failure
MTBF Mean Time Between Failure
MTTR Mean Time to Repair/Restore
NM Nautical Mile
NRA Non Radar environment (in the context of the ADS-B RFG)
NSA National Supervisory Authority
OHA Operational Hazard Analysis
OPA Operational Performance Assessment
OSED Operational Service and Environment Description
PSR Primary Surveillance Radar
PSSA Preliminary System Safety assessment
PU Probability of Update
PUCV Probability of Update with Correct and Valid value
RAD RADar environment (in the context of the ADS-B RFG)
RCS Radar Cross Section
RF Radio Frequency
RFG Requirement Focus Group
RMS Root Mean Square
RPS Radar Position Symbol
RSP Required Surveillance Performance
RVSM Reduced Vertical Separation Minimum
SDP Surveillance Data Processing
SDPS Surveillance Data Processing System
SES Single European Sky
SPI Special Position Identification
SPI IR Surveillance Performance and Interoperability Implementing Rule
SSR Secondary Surveillance Radar
SSTF Surveillance Standard Task Force
TAS True Air Speed
TBC To Be Confirmed
TBD To Be Defined
TMA Terminal Manoeuvring Area
MI Measurement interval
UTC Coordinated Universal Time
VFR Visual Flight Rules
VSM Vertical Separation Minima
WGS World Geodetic System
Table 10: Acronym list
ANNEX - C DEFINITIONS
Ground Speed – This is the speed (amplitude) of the aircraft over the ground. This data item is
calculated on-board the aircraft. If available, this data item may be forwarded by the surveillance
system.
Indicated Airspeed (IAS) – This is the speed of the aircraft as shown on its pitot static airspeed
indicator calibrated to reflect standard atmosphere adiabatic compressible flow at sea level
uncorrected for airspeed system errors. This definition is extracted from the EASA document [RD 27].
This data item is calculated on-board the aircraft. If available, this data item may be forwarded by the
surveillance system.
Inertial vertical velocity – This is the vertical velocity of the aircraft as measured by an inertial device
on board the aircraft. This data item is calculated on-board the aircraft. If available, this data item may
be forwarded by the surveillance system.
Magnetic heading – This is the direction over the ground to which the aircraft axis is pointed. The
reference is the Magnetic North at the aircraft position. This data item is calculated on-board the
aircraft. If available, this data item may be forwarded by the surveillance system.
Pressure altitude rate – This is the variation over time of the aircraft pressure altitude. This data item
is calculated on-board the aircraft. If available, this data item may be forwarded by the surveillance
system.
Track Angle Rate – This is the variation over time of the aircraft True track angle. This data item can
be forwarded and/or calculated depending on surveillance system architecture.
True Airspeed (TAS) – This is the speed (amplitude) of the aircraft in the air. It can only be calculated
on board the aircraft on the basis of the Indicated Airspeed. This data item is forwarded by the
surveillance system.
True Track Angle (or Course) – This is the direction over the ground of the aircraft track. This data
item is calculated on-board the aircraft. If available, this data item may be forwarded by the
surveillance system.
Track velocity vector – This is the speed vector of the aircraft track as calculated by the ground surveillance system; it may take into account down-linked aircraft parameters such as the ground speed and the true track angle. The naming of this data item is consistent with the naming adopted in ASTERIX category 062 ([RD 5]) and has been chosen to avoid confusion with similar data items that can be down-linked by the aircraft.
Rate of climb/descent – This is the variation over time of the aircraft pressure altitude as calculated by the ground surveillance system; it may take into account down-linked aircraft parameters such as the pressure altitude rate. It may also be reduced to a trend with discrete values (climbing, descending,
straight level flight or unknown). The naming of this data item is consistent with the naming adopted in
ASTERIX category 062 ([RD 5]) and has been chosen to avoid confusion with similar data items that
can be down-linked by the aircraft.
Flight status – This indicates whether the aircraft is airborne or on the ground or unknown (either on
the ground or airborne). This data item is determined on board the aircraft, in general based on a weight-on-wheels mechanism. This data item is forwarded by the surveillance system.
Surveillance technique source (cooperative, non-cooperative, combined) – This reflects whether
the data contained in the target report is based on information provided by a cooperative surveillance
source only or by a non-cooperative surveillance source or both. This data item is calculated by the
ground surveillance system.
The diagram below shows the different times in the case of a data item that is forwarded by the
ground surveillance system.
[Figure: labels include airborne latency, data item age at SDPS output (as in I062/295) and the boundary of the surveillance system: time of recording]
Figure 15: The different stages of surveillance system data processing (forwarded data item)
Assuming a ground surveillance system that is composed of:
• Mode S sensor
• Sensor data distribution system (network)
• Surveillance data processing system (e.g. ARTAS tracker)
• Surveillance data distribution system (network)
• ATC centre display system
Applying the generic diagram above to the same surveillance system further illustrates the definitions in the case of the pressure altitude data item:
• Time of airborne measurement (T1) of aircraft pressure altitude by the aircraft altimeter.
• Time of arrival of a Mode S reply (DF5) from the aircraft containing the pressure altitude,
e.g. Mode C code.
• From this Mode S reply the Mode S sensor decodes the Mode C code and copies it in a
Mode S target report that is dated (Time of arrival at sensor level T2) in accordance with
the time of arrival of the reply.
• The Mode S target report is transmitted to the tracker through the sensor network.
• This Mode C code is copied into the next update of the track, dated Time of applicability (T3), that is output by the tracker. This track is processed by the ATC centre system and the aircraft vertical position is transferred through the surveillance network and delivered at the Time of output of the surveillance system (T4); the ATCO then uses this information to undertake vertical separation with another aircraft.
In this case the pressure altitude data age is equal to T4 – T1.
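As a simple numerical illustration of the timestamps defined above (the values are invented for the example and are not derived from any requirement):

```python
# Illustrative walk through the timestamps described above for a forwarded data
# item (pressure altitude). The numeric values are invented for this example.

t1 = 0.00   # T1: airborne measurement of pressure altitude by the altimeter
t2 = 0.15   # T2: time of arrival of the Mode S (DF5) reply at the sensor
t3 = 1.20   # T3: time of applicability of the track update output by the tracker
t4 = 1.60   # T4: time of output of the surveillance system towards the user

delay_t2_t1 = t2 - t1    # normally below a specified threshold (exact T1 not downlinked)
data_item_age = t4 - t1  # age of the pressure altitude as delivered to the user
print(f"delay T2-T1 = {delay_t2_t1:.2f} s, data age T4-T1 = {data_item_age:.2f} s")
```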
In the case where the measurement of a data item is performed on board the aircraft, the date of the
measurement is not forwarded to the ground surveillance system, nevertheless:
• In the case of an SSR/Mode S target report, the delay between T2 and T1 is normally
below a specified threshold.
• In an ADS-B report, all data items are dated “on the basis⁴” of the time of arrival of the last transmitted data item. In any case the exact time of measurement is not known.
The diagram below shows the different times in the case of a data item that is elaborated by the
ground surveillance system.
[Figure: label — boundary of the surveillance system: time of recording]
Figure 16: The different stages of surveillance system data processing (calculated data item)
Applying the generic diagram above to this specific surveillance system further illustrates the definitions in the case of the horizontal position data item:
• Time of arrival of a Mode S reply from an aircraft.
• From this Mode S reply the Mode S sensor elaborates the horizontal position (azimuth and range) of a Mode S target report that is dated (Time of arrival at sensor level T1) in accordance with the time of arrival of the reply.
• The Mode S target report is transmitted to the tracker through the sensor network.
• From the Mode S target report the tracker extrapolates a horizontal position for the Time of applicability (T2). The track data items are transferred through the surveillance network and the horizontal position is delivered at the Time of output of the surveillance system (T3); the ATCO then uses this information to undertake horizontal separation with another aircraft.
⁴ The position may be extrapolated; in that case the date will correspond to that of the extrapolation, which is itself based on the previously received position dated with its time of arrival.
In this case the data age of the horizontal position at the output of the surveillance system is equal to
T3 – T2.
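A simple numerical illustration of the calculated data item case is given below, assuming a constant-velocity extrapolation between the time of measurement and the time of applicability; the values and the extrapolation model are assumptions made for this example only.

```python
# Illustrative sketch for a calculated data item (horizontal position): the
# tracker extrapolates the measured position to the time of applicability, and
# the data age at the system output is T3 - T2. The values and the constant-
# velocity extrapolation model are assumptions made for this example only.

t1 = 0.00                 # T1: time of arrival of the Mode S reply at the sensor
t2 = 1.00                 # T2: time of applicability of the track update
t3 = 1.40                 # T3: time of output of the surveillance system

measured_xy = (10_000.0, 5_000.0)  # position measured at T1 (metres, local grid)
velocity_xy = (220.0, 0.0)         # estimated ground velocity (m/s)

# Constant-velocity extrapolation from the time of measurement to the time of applicability.
extrapolated_xy = (measured_xy[0] + velocity_xy[0] * (t2 - t1),
                   measured_xy[1] + velocity_xy[1] * (t2 - t1))

data_age = t3 - t2
print(extrapolated_xy, f"data age T3-T2 = {data_age:.2f} s")
```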
© September 2015 – European Organisation for the Safety of Air Navigation (EUROCONTROL)
This document is published by EUROCONTROL for information purposes. It may be copied
in whole or in part, provided that EUROCONTROL is mentioned as the source and it is not used for
commercial purposes (i.e. for financial gain). The information in this document may not be modified
without prior written permission from EUROCONTROL.
www.eurocontrol.int