

_______________________________________
SYNTHETIC VISION SYSTEMS

L.J. Prinzel and L.J. Kramer


Research and Technology Directorate, Crew Systems and Operations Branch (D-318), Mail Stop 152,
NASA Langley Research Center, Hampton, VA 23681, USA

1. INTRODUCTION

A “synthetic vision system” is an aircraft cockpit display technology that presents the visual
environment external to the aircraft using computer-generated imagery in a manner analogous to
how it would appear to the pilot if forward visibility were not restricted. The purpose of this
chapter is to review the state of synthetic vision systems, and discuss selected human factors
issues that should be considered when designing such displays.

1.1. Background

Aviation has been witness to the introduction of many new avionics systems (e.g., attitude
indicators, radio navigation, instrument landing systems, ground proximity warning systems) that
have sought to overcome the issues associated with limited outside visibility for the pilot. Still,
limited visibility remains the single most critical factor affecting both safety and capacity in
worldwide aviation operations. In commercial aviation alone, over 30% of fatal accidents
worldwide are categorized as Controlled Flight Into Terrain (CFIT), in which a normally
functioning, mechanically sound aircraft impacts terrain or obstacles that the flight crew could
not see due to the lack of outside visual reference or impaired crew terrain/hazard situational
awareness. In general aviation, the largest accident category is Continued Flight into Instrument
Meteorological Conditions, in which a non-instrument rated pilot continues to fly into
deteriorating weather and visibility, leading to a loss of the visual horizon and a potential impact
into unexpected terrain, or to spatial disorientation and loss of control. Finally, the greatest factor
affecting airport delays is limited visibility, which reduces runway capacity and increases the
distances required for air traffic separation when weather conditions drop below those needed for
visual flight rules operations.

Synthetic vision offers a solution to this visibility problem: it would allow all aircraft to be
flown under the virtual equivalent of visual meteorological conditions or clear daylight
operations.

2. DEFINITIONS

2.1. Enhanced Vision

Past solutions to enhance pilot visibility have been sought through imaging sensors. Such
systems are termed “enhanced vision systems” and consist of active or passive sensors
that are used to penetrate weather phenomena such as darkness, fog, haze, rain, and snow.
Enhanced vision systems have been installed on military aircraft but are infrequently
found on commercial transport aircraft due to cost, complexity, and technical
performance. Enhanced vision sensor imagery depends upon the external environment
and the sensor characteristics. For example, high-frequency radars (e.g., 94 GHz) and
infrared sensors may exhibit degraded range performance in heavy precipitation and
certain fog types. On the other hand, low-frequency (e.g., 9.6 GHz) and mid-frequency
(e.g., 35 GHz) radars have improved range, but often have poor display resolution.
Active radar sensors can suffer from mutual interference when multiple users are in close
proximity. Finally, present enhanced vision sensors do not extract color attributes, and they
may create misleading visual artifacts under certain temperature or radar-reflectivity conditions.
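
A rough way to see this frequency/resolution tradeoff, added here purely for illustration (the relation below is the standard diffraction-limited beamwidth approximation; the apertures of fielded sensors vary), is:

    θ ≈ λ / D,  with λ = c / f
    λ(9.6 GHz) ≈ 3.1 cm,  λ(35 GHz) ≈ 8.6 mm,  λ(94 GHz) ≈ 3.2 mm

For the same antenna aperture D, a 9.6 GHz beam is therefore roughly ten times wider than a 94 GHz beam, which is why lower-frequency radars penetrate weather well but produce coarser imagery.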

2.2. Synthetic Vision

A “synthetic vision system” is an electronic means of displaying the pertinent and critical
features of the environment external to the aircraft through a computer-generated image
of the external scene topography using on-board databases (e.g., terrain, obstacles,
cultural features), precise positioning information, and flight display symbologies that
may be combined with information derived from a weather-penetrating sensor (e.g.,
runway edge detection, object detection algorithms) or with actual imagery from
enhanced vision sensors.

All aircraft categories can benefit from synthetic vision system applications, including
general aviation aircraft, business jets, cargo and commercial airliners, military cargo and
fighter jets, and rotorcraft. These systems may be shown on head-down, head-up,
helmet-mounted, and navigation displays and be combined with runway incursion
prevention technology; database integrity monitoring equipment; enhanced vision
sensors; taxi navigation and surface guidance maps; advanced communication,
navigation, and surveillance technologies; and traffic and hazard display overlays. What
characterizes synthetic vision system technology is the intuitive representation of the visual
information and cues that the pilot or flight crew would normally have in day, visual
meteorological conditions.

Figure 1. Examples of Synthetic Vision System Displays (panels shown: Primary Flight Display, Navigation Display, Surface Guidance Map Display, Head-Up Display).

3. DESCRIPTION

3.1. Enabling Technologies

Several research and technological developments have made synthetic vision systems
possible. Fundamentally, these systems require only precise ownship location, a
database, graphics and computing capability, and display media. Additional
information and capability may be required depending upon the intended function. Many
technical breakthroughs are responsible for the growing efficacy of synthetic vision
systems, which include:

• Head-Up Display and Helmet-Mounted Display development


• Efficient and effective display symbology and presentation format development,
including Pathway/Tunnel/Highway-In-The-Sky development and ground
operations displays.
• Global Positioning System/Inertial Navigation System (GPS/INS) development
• Mapping, charting, and geodesy enhancements, including the Shuttle Radar
Topography Mission (SRTM) and published, accepted standards (e.g., RTCA
DO-255, -272, -276, -291)
• Datalink capability (ADS-B, CPDLC, TIS-B), enabling Cockpit Display of
Traffic Information (CDTI), Runway Incursion Prevention System (RIPS) and
digital transmission of Air Traffic Control instructions.
• Improved computer processing and graphic processors
• Database Integrity Monitoring Equipment (DIME) development
• Enhanced Vision System imaging sensors

3.2. Synthetic Vision System Elements

A synthetic vision system is composed of four elements: an enhanced intuitive view,
hazard detection and display, integrity monitoring and alerting, and precision navigation
guidance.

(a) Enhanced Intuitive View --- Synthetic vision systems display the pertinent and critical
features of the environment external to the aircraft through computer-generated imagery,
particularly when weather conditions prevent the pilot from effectively seeing these
features through the cockpit window. The display is intuitive because it presents these
data in the way the pilot would normally see them in day visual meteorological
conditions and includes symbology that reduces flight technical error and fosters instant
recognition and awareness.

(b) Hazard Detection and Display --- Terrain, cultural features, traffic, obstacles, and other
hazards are graphically represented to the pilot to maintain the pilot's situation awareness
and proactively ensure terrain and hazard separation. Synthetic vision systems provide
for pilot detection, identification, geometry awareness, prioritization, action decision and
assessment, and overall situation awareness not afforded by today's avionics, which
require the pilot to react to cautions, warnings, and alerts.
(c) Integrity Monitoring and Alerting --- Some level of integrity monitoring and alerting
is required in all SVS applications because pilots must trust that the synthetic vision
system provides an accurate portrayal (i.e., does not present hazardously misleading information). A
flight-critical level of integrity, redundancy, and the inclusion of reversionary modes may
be needed to achieve the ultimate potential for a Synthetic Vision System. In this case,
independent sources to verify and validate the synthetic vision presentation (e.g., radar
altimeters, enhanced vision sensors, TAWS) fashioned to create integrity monitoring
functions may be necessary. If the integrity monitoring discovers a mismatch, the
displays degrade gracefully to reversionary modes and trigger an alert to the pilot that
synthetic vision is no longer available or reliable. The system thereby prevents the
pilot from using erroneous or misleading synthetic vision information.
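
As an illustrative sketch only of how such a monitor might operate (comparing a radar-altimeter-derived terrain estimate against the onboard database is one commonly discussed approach; the threshold, persistence count, and all names below are assumptions, not part of any certified implementation):

# Illustrative sketch: a terrain-database integrity monitor that compares the terrain
# elevation predicted by the onboard database beneath the aircraft against an independent
# estimate formed from GPS altitude minus the radar-altimeter reading.
# The threshold, persistence count, and all names are hypothetical.

DISAGREEMENT_LIMIT_FT = 250.0    # assumed alert threshold
PERSISTENCE_SAMPLES = 5          # require several consecutive disagreements

class TerrainIntegrityMonitor:
    def __init__(self):
        self.consecutive_failures = 0
        self.svs_available = True

    def update(self, gps_altitude_ft, radar_altitude_ft, database_terrain_elev_ft):
        # Independent estimate of the terrain elevation directly below the aircraft.
        measured_terrain_elev_ft = gps_altitude_ft - radar_altitude_ft
        disagreement = abs(measured_terrain_elev_ft - database_terrain_elev_ft)

        if disagreement > DISAGREEMENT_LIMIT_FT:
            self.consecutive_failures += 1
        else:
            self.consecutive_failures = 0

        # On a persistent mismatch, revert the display and alert the crew.
        if self.consecutive_failures >= PERSISTENCE_SAMPLES and self.svs_available:
            self.svs_available = False
            self.alert_crew()

    def alert_crew(self):
        print("SVS TERRAIN MISCOMPARE - reverting to conventional display")

A real monitor would of course combine several independent sources and account for sensor error budgets; the persistence counter above simply illustrates graceful degradation rather than alerting on a single noisy sample.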

(d) Precision Navigation Guidance --- Synthetic vision system elements (e.g., surface
guidance, taxi maps, tunnels/pathways/highways-in-the-sky, velocity vectors, command
guidance cues) allow pilots to rapidly and accurately correlate ownship position to
relevant terrain, desired flight paths/plans, cultural features, and obstacles. These
elements enable the pilot to monitor navigation precision against Required Navigation
Performance (RNP) criteria and to comply with complex approach and departure
procedures (RNAV, GLS, curved, step-down, noise abatement) without the need for
land-based navigation aids (e.g., ILS, VOR, DME, ADF, NDB, LORAN) that are
expensive to install and maintain.
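
As a minimal sketch of what RNP compliance monitoring involves (a straight leg between two waypoints, a flat-earth approximation, and illustrative function names are assumed; this is not drawn from any avionics standard):

import math

# Minimal sketch: cross-track error from a straight leg between two waypoints,
# compared against an RNP containment value. Flat-earth approximation over short legs.

def cross_track_error_nm(own_lat, own_lon, wp1_lat, wp1_lon, wp2_lat, wp2_lon):
    # Project positions into a local flat-earth frame centered on the first waypoint.
    nm_per_deg_lat = 60.0
    nm_per_deg_lon = 60.0 * math.cos(math.radians(wp1_lat))
    px = (own_lon - wp1_lon) * nm_per_deg_lon
    py = (own_lat - wp1_lat) * nm_per_deg_lat
    tx = (wp2_lon - wp1_lon) * nm_per_deg_lon
    ty = (wp2_lat - wp1_lat) * nm_per_deg_lat
    # Signed perpendicular distance (nautical miles) from the aircraft to the leg.
    return (px * ty - py * tx) / math.hypot(tx, ty)

def rnp_status(xte_nm, rnp_value_nm):
    # Typical annunciation logic: flag when cross-track error exceeds the RNP value.
    return "UNABLE RNP" if abs(xte_nm) > rnp_value_nm else "RNP OK"

For example, an RNP 0.3 approach would call rnp_status(cross_track_error_nm(...), 0.3); synthetic vision guidance symbology (tunnels, velocity vectors) gives the pilot an intuitive, continuous view of this same error.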

3.3. Synthetic Vision System Components

There are many potential conceptualizations of synthetic vision systems, depending upon
the class of aircraft (CFR Title 14 Parts 23, 25, 27, 29) for which the system is being
designed. As an example, the National Aeronautics and Space Administration (NASA)
synthetic vision system concept for Part 25 aircraft has the following synthetic vision
system components:

Synthetic Vision Database/Sensors


• On-board synthetic vision databases
• Weather Radar
• Radar altimeter
• Forward Looking Infrared (option)
• Millimeter Wave Radar (option)

Synthetic Vision Displays


• Primary Flight Display, or embedded display features
• Navigation Display, or display features/pages
• Interface with other cockpit displays, e.g., TAWS
• Head-Up or Helmet-Mounted Displays (option)

Computers/Embedded Computational Functions

Image Object Detection and Fusion
• Data confidence, detection threshold filtering, expected error
• Source data reasonability and integrity estimation
• Hazard detection
• Data fusion (correlated position of potential hazards)
• Image enhancement and fusion, where appropriate
• Integrity self monitoring and alerting

System Integrity, Verification and Validation
• Database reliability, integrity, expected error
• Other source data reasonability and integrity estimation
• Generate appropriate system alert messages
• Integrity self monitoring and alerting

Computations and Symbology Generation
• Cleared and actual path depiction
• Hazard element display integration and depiction
• Runway Incursion Prevention System
• Hold Short and Landing Technology
• Navigation and hazard situation awareness enhanced display elements
• Alert and warning generation and presentation
• Overall display symbol generation and/or integration
• Integrity self monitoring and alerting

Equipment
• Dedicated synthetic vision system support equipment and crew interfaces
• Interface with other aircraft systems

Associated Aircraft Systems


• Differential Global Positioning System
• Inertial Reference Unit/Attitude Heading Reference Set (IRU/AHRS)
• Air Data Computer (ADC)
• Radio
• RADAR
• Traffic Collision and Avoidance System (TCAS)
• Data Link aggregate (e.g., IFF Mode S, ADS-B)
• Terrain Awareness and Warning System (TAWS)
• Laser Altimeter (option)
Figure 2. NASA Synthetic Vision Concept. [Block diagram: sensor and database inputs (optional FLIR and millimeter wave radar, weather radar, aircraft navigation data from the database/DGPS/DIME/FMS, aircraft state data from the INS/ADC/AHRS/radar altimeter, hazard information from datalink/TAWS/TCAS/weather radar, and other aircraft systems such as the FMS, GPWS, and CAWS) feed an SVS computer, dedicated or embedded, that performs sensor/imagery transformations, image/data fusion, image object detection, terrain feature extraction, symbology generation, integrity monitoring, and interface and communication functions, driving the primary flight, navigation, vertical situation, head-up or head-mounted, electronic moving map/RIPS, and auxiliary displays.]
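
As an architectural sketch only (every class, field, and method name below is a hypothetical placeholder chosen to mirror the component breakdown and the data flow of Figure 2, not NASA's implementation), the concept can be summarized as sensors and databases feeding a computation stage that fans out to the displays:

from dataclasses import dataclass
from typing import Optional

# Illustrative data-flow sketch mirroring the component breakdown and Figure 2.
# All names are hypothetical placeholders.

@dataclass
class SensorInputs:
    terrain_database: object
    weather_radar: object
    radar_altimeter: object
    flir: Optional[object] = None        # optional enhanced vision sensor
    mmw_radar: Optional[object] = None   # optional millimeter wave radar

@dataclass
class NavigationState:
    dgps_position: tuple    # from DGPS / FMS / DIME-checked database
    attitude: tuple         # from IRU/AHRS
    air_data: dict          # from the ADC

class SVSComputer:
    """Computation stage: fusion, hazard detection, integrity monitoring,
    and symbology generation feeding the cockpit displays."""

    def __init__(self, sensors: SensorInputs):
        self.sensors = sensors

    def render_frame(self, nav: NavigationState) -> dict:
        hazards = self.detect_hazards(nav)
        integrity_ok = self.check_integrity(nav)
        symbology = self.generate_symbology(nav, hazards, integrity_ok)
        # One computed frame fans out to the configured display surfaces.
        return {
            "primary_flight_display": symbology,
            "navigation_display": hazards,
            "head_up_display": symbology if integrity_ok else None,
        }

    def detect_hazards(self, nav): ...
    def check_integrity(self, nav): ...
    def generate_symbology(self, nav, hazards, integrity_ok): ...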


4. BENEFITS

4.1. Safety Benefits

Synthetic vision systems are characterized by the ability to represent, intuitively, visual
information and cues about the environment external to the aircraft that resemble visual
flight conditions with unlimited ceiling and visibility. In terms of safety benefits,
synthetic vision may help to reduce many accident precursors, including:

• Loss of vertical and lateral path and terrain awareness


• Loss of terrain and traffic awareness
• Unclear escape or go-around path even after recognition of problem
• Loss of altitude awareness
• Loss of situation awareness relating to the runway environment and incursions
• Unclear path guidance on the surface
• Unusual attitude / upset recognition
• Runway incursions
• Non-compliance with Air Traffic Control (ATC) clearances
• Transition from instruments to visual flight
• Spatial disorientation

These safety benefits are particularly evident during non-normal and emergency
situations. In these non-normal events, mental workload and tasking/attentional demands
placed on the pilot are high. Synthetic vision systems, through their intuitive display and
presentation methods, off-load the pilot from basic spatial awareness tasks (avoiding
terrain, traffic, and obstacles) and increase the speed of situation recognition.

4.2. Operational Benefits

The aviation safety benefits alone of synthetic vision may be reason enough to pursue the
technology, but operational and economic benefits must be considered for Part 121 and
135 operations because of the costs associated with implementation of these systems.
Analyses have demonstrated that synthetic vision could serve to increase national
airspace system capacity by enabling visual-like operations gate-to-gate even under
severely visibility-restricted weather conditions (e.g., Category IIIb minimums). For
example, a NASA-sponsored cost-benefit analysis of 10 major US airports calculated the
average cost savings to airlines for the years 2006 to 2015 to be $2.25 billion. While
these savings are predicated on several technology developments and on successful
implementation and certification, the analysis indicates the order-of-magnitude savings
and operational efficiencies offered by these technologies.
Operational benefits of synthetic vision systems may include:

• Intuitive depiction of ATC cleared flight paths and taxi clearances


• Enhanced surface operations (e.g., rollout, turn off and hold short, taxi)
• Reduced runway occupancy time in low visibility
• Reduced departure and arrival minimums
• Better allow for converging and circling approaches, especially for dual and triple
runway configurations
• Reduce inter-arrival separations
• Provide for independent operations on closely-spaced parallel runways
• Provide for precise noise abatement operations
• Required Navigation Performance adherence
• 4D navigation capability
• Oceanic route optimization, spacing, and ownship reporting
• Enhanced path guidance, compliance monitoring, and alerting
• Depiction of terminal, restricted and special use airspace
• Depiction of traffic and weather hazards and resolutions
• Mission planning / rehearsal capability
• Reduced training requirements
• Approach operations to Type I and non-ILS runways
• Virtual visual self-spacing and station keeping capability
• Piloting aid support (e.g., flare guidance, runway remaining, navigation guidance)
• Enhanced flight management

5. ONGOING RESEARCH EFFORTS

5.1. Government Research

There are several government research efforts to design synthetic vision systems. NASA
is pursuing research and development for commercial, business, and general aviation
aircraft under the Aviation Safety and Security Program's Synthetic Vision Systems
project, conducted principally at the NASA Langley Research Center. Human
performance modeling and synthetic vision rotorcraft research (a joint Army-NASA
project) are also conducted at the NASA Ames Research Center. The Federal Aviation
Administration (FAA) Capstone program, together with NASA, the Alaskan community,
and aviation industry partners, focuses on synthetic vision technology that seeks to
reduce Part 91 general aviation accidents. The Air Force Research Laboratory Human
Effectiveness Directorate is evaluating synthetic vision displays to enable U.S. Air Force
aircraft to fly with high situation awareness under instrument meteorological conditions
and to help prevent CFIT accidents. Finally, a significant amount of synthetic vision
research has been conducted by international government research agencies (e.g., the
German Aerospace Center, the National Aerospace Laboratory).

5.2. Industry Research

Industry research has partnered with government agencies to pursue development of
synthetic vision systems. Rockwell-Collins and BAE Systems have significant research
efforts toward commercial and military applications of synthetic vision, enhanced vision,
and sensor fusion technology. Part 23 aircraft are served by several companies, most
notably Universal, Chelton Flight Systems, and RTI International. Universal has
received an FAA technical standard order for the Vision-1 egocentric and exocentric
synthetic vision system displays. Chelton Flight Systems was selected for the FAA
Capstone program and has received Supplemental Type Certificate (STC) approval for
installation of its synthetic vision EFIS in the Cessna Citation 501, King Air
90/100/200/300, Conquest I and II, all Cheyenne, all Commander, MU-2, Pilatus PC-12,
TBM-700, Piaggio Avanti, and hundreds of other aircraft, including helicopters. Finally,
RTI International has integrated a 3-D virtual display depicting the flight path, a
worldwide terrain database, weather and traffic information, and GPS technology into a
single cockpit instrument that shows traffic, weather, obstacles, flight path, and
navigation information.

5.3. University Research

Numerous university researchers have contributed to the growing knowledge of the
human factors of synthetic vision displays. For example, Christopher Wickens
(University of Illinois at Urbana-Champaign), Eric Theunissen (Delft University of
Technology), Thomas Schnell (University of Iowa), Kevin Corker (San Jose State
University), Jacques Verly (University of Liege), Maarten Uijt de Haag (Ohio
University), and Andrew Barrows (Stanford University) are a few of the many who
have significantly advanced the understanding of human factors issues.

6. SELECT HUMAN FACTORS ISSUES

A panel was held at the Human Factors and Ergonomics Society Annual Meeting in
2004 to debate the human factors of synthetic vision systems. The panel members were
Lawrence Prinzel (NASA), Raymond Comstock
(NASA), Mica Endsley (SA Technologies), Christopher Wickens (UIUC), Kevin Corker
(San Jose State U.), Tim Etherington (Rockwell-Collins), Guy French (Wright-Patterson
AFB), and Michael Snow (Boeing). The consensus of the panel was that synthetic vision
systems have significant promise in achieving the aforementioned safety and operational
benefits. It was acknowledged that significant human factors research has been
conducted, but a number of human factors issues still remain.

Corker and Guneratne (2002) categorized the human factors issues into three research
areas: Image quality, information integration, and operational concepts. Based on a
literature review, they developed an extensive list of human factors issues and provided a
set of research priority recommendations which are presented below.

6.1. Image Quality

Human Factors Issue --- Research Recommendation

Field-of-View / Display Size --- What are the effects of display minification? Should field-of-view be automatically or manually determined? Can synthetic vision be retrofitted into smaller cockpit display sizes? Should different field-of-view options be made available? What are the minimum and maximum field-of-view settings for each display size?

Clutter --- What is the minimum number of curves or objects required to convey given information? How can the data be arranged to provide a clear view without obstructing the view? How can clutter be quantified on synthetic vision displays? What are the effects of non-iconic information and synthetic vision presentation on pilot scan of the cockpit and out-the-window environment?

Iconography --- What are effective symbol sizes for iconic representations of obstacles, traffic, guidance cues, etc.? What colors and standards should be used? What are the minimum resolution, brightness, and contrast? What conventions (color, size, shape, etc.) can be carried forward to support visual momentum and quick transition between synthetic vision and traditional instrumentation?

Display Contrast --- What is the minimum contrast necessary to convey synthetic terrain information? Should contrast be automatically adjusted for lighting conditions and/or background colors? Should contrast control be given to pilots?

Opacity --- Should HUD symbology and/or synthetic terrain be entirely opaque, transparent, or mixed? Should transparency be varied with lighting conditions? What are the effects of weather and lighting transitions?

6.2. Information Integration

Human Factors Issue --- Research Recommendation

Guidance --- How should pathway guidance formats be designed for synthetic vision systems? What are the best guidance cues for predictor vector information?

Terrain Presentation --- What are the best synthetic terrain formats? What level of realism is required for effective synthetic vision systems? Would photo-realism be sufficient for altitude and trend information, or would it lead the crew to a false confidence in the system? Should wireframe formats and overlays be used? What amount of texturing and object detail is needed to provide adequate depth-cueing? What would be the effect of combining display terrain texturing methods?

Cognitive Tunneling --- Will realistic terrain cause the pilot to focus on the artificial display to the exclusion of the outside world and backup instruments? Will synthetic vision displays be compelling and induce complacency?

Display Integration --- What is the best way of integrating synthetic vision systems with existing traffic, terrain, and other warning displays?

Trend Information --- How can synthetic vision better impart awareness of trends such as shallow climbing, descending, etc.? What is the best mix of trend and guidance information to avoid clutter?

Skill Retention --- How does a synthetic vision system change a pilot's interaction with traditional instruments? Do pilots retain the skills necessary to revert back to traditional instruments if the system fails?

Workload Demand --- Will synthetic vision create a measurable decrease in mental workload? What is the effect of the increase in data and information afforded by synthetic vision displays in the cockpit?

6.3. Operational Concepts

Human Factors Issue --- Research Recommendation

Flight Phase Transitions --- Which transitions will require switching between synthetic vision and other instrumentation? How can synthetic vision be designed to minimize the effect of the transitions?

Crew Interaction --- Should synthetic vision be designed to accommodate current operational procedures? Should pilot-flying or pilot-not-flying have different displays for their different roles? Or should they have the same displays for cross-checking? How much effort can be taken from aircraft management for display management?

Failure Modes --- When should the crew be alerted to potential failures? Too many alarms may cause an impression that the system is "buggy"; delaying an alert too long may leave the crew too little time to react to a dangerous system. What is the best way to alert the crew visually, aurally, or otherwise in a way that is clearly distinguishable from the other cockpit alarms?

Essential Information --- What information is absolutely necessary for which phases of flight? Should there be distinctly different sets of data for different phases, as the PFD has different modes? Should these modes be automatically set, or should the crew have the capability to determine the mode?

Effect at Various Workloads --- Does synthetic vision provide a benefit during both high and low workload? Are there problems with low workload over long periods of time?

Crew Confidence in System --- Does synthetic vision lend itself to overtrust and complacency? What factors are most important to convincing pilots that it is safe to follow synthetic vision display and guidance? What operating characteristics are likely to decrease confidence (e.g., minimum frame rates, power losses, sensor lag)? What is the proper balance of crew confidence in the system and ensuring that cross-checking other instruments is performed?

Resource Management --- How much control should pilots have over the synthetic vision system during flight?

7. CONCLUSIONS

Commercial aviation is among the safest modes of transportation. But the need to fly
regardless of the weather has led to an accident rate that is far from ideal. Aircraft
accidents serve as powerful reminders of the risks involved and of how much safer flying
can and should be. Technology has advanced to allow for the emergence of synthetic
vision systems that will fundamentally change how aircraft are operated in instrument
conditions. By creating virtual visual meteorological conditions, synthetic vision holds
the promise of eliminating a precursor to many accidents and incidents (limited visibility)
and of substantially improving the safety and operational efficiency of aviation.

8. ACKNOWLEDGEMENTS

The authors gratefully acknowledge the assistance of Randall Bailey and Dan Williams
(NASA Langley Research Center), R. Michael Norman (Boeing), and Kevin Corker (San
Jose State University).

9. RECOMMENDED FURTHER READINGS

Corker, K.M., & Guneratne, E. (2002). Human factors issues and evaluation of
commercial and business aircraft synthetic vision systems. NASA Contractor Final
Report (21-1214-2882).

Parrish, R.V., Baize, D.G., & Lewis, M.S. (2001). Synthetic vision. In C. Spitzer
(Ed.), The Avionics Handbook (pp. 16-1 – 16-8). Boca Raton, FL: CRC Press.

Prinzel, L.J., Comstock, J.R., Corker, K.M., Endsley, M.R., Etherington, T.,
French, G.A., Snow, M.P., & Wickens, C.D. (2004). Human factors of synthetic vision
systems. Proceedings of the Annual Meeting of the Human Factors and Ergonomics
Society, 48.

Prinzel, L.J., Comstock, J.R., Glaab, L.J., Kramer, L.J., Arthur, J.J., & Barry, J.S.
(2004). The efficacy of head-down and head-up synthetic vision display concepts for
retro- and forward-fit of commercial aircraft. International Journal of Aviation
Psychology, 14(1), 53-77.

Prinzel, L.J., Hughes, M.F., Arthur, J.J., Kramer, L.J., Glaab, L.J., Bailey, R.E.,
Parrish, R.V., & Uenking, M.D. (2003). Synthetic Vision CFIT Experiments for GA and
Commercial Aircraft: “A Picture Is Worth A Thousand Lives”. Proceedings of the
Human Factors & Ergonomics Society, 47, 164-168.

Prinzel, L.J., Kramer, L.J., Arthur, J.J., Bailey, R.E., & Comstock, J.R. (2004).
Comparison of head-up and head-down “highway-in-the-sky” tunnel and guidance
concepts for synthetic vision displays. Proceedings of the Annual Meeting of the Human
Factors and Ergonomics Society, 48.

Prinzel, L.J., Kramer, L.J., Comstock, J.R., Bailey, R.E., Hughes, M.F., & Parrish,
R.V. (2002). NASA synthetic vision EGE flight test. Proceedings of the Annual Meeting
of the Human Factors and Ergonomics Society, 46, 135-139.

Schnell, T., Kwon, Y., Merchant, S., & Etherington, T. (2004). Improved flight
technical performance in flight decks equipped with synthetic vision information system
displays. International Journal of Aviation Psychology, 14(1), 79-102.

Snow, M.P., & French, G.A. (2001). Human factors in head-up synthetic vision
display. SAE Technical Paper 2001-01-2652. Warrendale, PA: Society of Automotive
Engineers.

Snow, M.P., & Reising, J.M. (1999). Effect of pathway-in-the-sky and synthetic
terrain imagery on situation awareness in a simulated low-level ingress scenario.
Proceedings of the 4th Annual Symposium on Situation Awareness in the Tactical Air
Environment (pp. 198-207). Patuxent River, MD: NAWCAD.

Theunissen, E. (1997). Integrated design of a man-machine interface for 4-D
navigation. Delft, The Netherlands: Delft University Press.

Uijt de Haag, M., Young, S., Sayre, J., Campbell, J., & Vadlamani, A. (2002).
DEM integrity monitor equipment (DIME) flight test results. In J.G. Verly (Ed.),
Enhanced and Synthetic Vision 2002 (pp. 72-83). Bellingham, Washington: International
Society for Optical Engineering (SPIE).

Williams, D., Waller, M., Koelling, J., Burdette, D., Doyle, T., Capron, W.,
Barry, J., & Gifford, R. (2001). Concept of operations for commercial and business
aircraft synthetic vision systems. NASA Langley Research Center: NASA Technical
Memorandum TM-2001-211058.

Wickens, C.D., Alexander, A.L., & Hardy, T.J. (2003). The primary flight display
and its pathway guidance: Workload, performance, and situation awareness. Final
Technical Report AHFD-03-2/NASA-03-1. Savoy, IL: University of Illinois, Aviation
Research Laboratory.

Wickens, C.D., Alexander, A.L., Thomas, L.C., Horrey, W.J., Nunes, A., Hardy,
T.J., & Zheng, S.X. (2004). Traffic and flight guidance depiction on a synthetic vision
system display: The effects of clutter on performance and visual attention allocation.
Final Technical Report AHFD-04-10/NASA-04-1. Savoy, IL: University of Illinois,
Aviation Research Laboratory.

Verly, J.G. (Ed.). (1997-2004). Enhanced and Synthetic Vision (Vols. 1997-2004).
Bellingham, WA: International Society for Optical Engineering (SPIE).
