These limited angular and linear movements (or "throws") do not inhibit the realism of the motion cueing imparted to the simulator crew. This is because the human sensors of body motion are ''more'' sensitive to acceleration than to steady-state movement, and a six-cylinder platform can produce such initial accelerations in all six DoF. The body motion sensors include the vestibular system (the inner ear's semicircular canals and otoliths), muscle and joint sensors, and sensors of whole-body movement. Furthermore, because acceleration precedes displacement, the human brain senses motion cues before the visual cues that follow. These human motion sensors have low thresholds below which no motion is sensed; this is important to the way simulator motion platforms are programmed (and also explains why instruments are needed for safe flight in cloud). In the real world, after conditioning to a particular environment (in this case aircraft motion), the brain subconsciously expects a motion cue before noticing the associated change in the visual scene. If motion cues are not present to back up the visual cues, some disorientation ("simulator sickness") can result from the cue mismatch compared with the real world.
 
In a motion-based simulator, after the initial acceleration the platform movement is backed off so that the physical limits of the cylinders are not exceeded, and the cylinders are then reset to the neutral position ready for the next acceleration cue. This backing-off from the initial acceleration is carried out automatically by the simulator computer and is called the "washout" phase. Carefully designed "washout algorithms" ensure that both the washout and the subsequent reset to near-neutral take place below the human motion thresholds mentioned above, and so are not sensed by the simulator crew, who perceive only the initial acceleration. This process is called "[[acceleration onset cueing]]" and matches the way the sensors of body motion work. This is why an aircraft manoeuvre at, say, 300 knots can be effectively simulated in a replica cabin that itself moves only in a controlled way through its motion platform. These are the techniques used in civil Level D flight simulators and their military counterparts.
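The onset-then-washout behaviour described above can be sketched with a classical washout filter: the commanded aircraft acceleration is passed through a high-pass filter, so the platform reproduces the onset of the acceleration but the sustained part is "washed out" and the platform drifts back towards neutral within its limited throw. The minimal Python sketch below uses a second-order high-pass filter and simple Euler integration; the filter parameters (break frequency, damping) and the 2 m/s² step-input scenario are illustrative assumptions, not values from any particular Level D simulator, whose real washout algorithms are far more elaborate and tuned per axis.

```python
# Minimal sketch of acceleration-onset cueing via a classical washout
# filter (illustrative assumptions; not a real Level D implementation).

def simulate_washout(accel_cmd, dt=0.001, omega=1.0, zeta=1.0):
    """Pass commanded aircraft acceleration through a second-order
    high-pass filter s^2 / (s^2 + 2*zeta*omega*s + omega^2), then
    integrate (forward Euler) to platform velocity and displacement."""
    x1 = x2 = vel = pos = 0.0
    accels, vels, positions = [], [], []
    for u in accel_cmd:
        y = u - 2.0 * zeta * omega * x2 - omega**2 * x1  # filtered accel
        x1 += x2 * dt
        x2 += y * dt
        vel += y * dt       # platform velocity
        pos += vel * dt     # platform displacement
        accels.append(y)
        vels.append(vel)
        positions.append(pos)
    return accels, vels, positions

if __name__ == "__main__":
    dt = 0.001
    n = int(30.0 / dt)            # 30 s of a sustained 2 m/s^2 cue
    cmd = [2.0] * n
    acc, vel, pos = simulate_washout(cmd, dt=dt)
    # Onset: the crew feels (nearly) the full initial acceleration,
    # but the sustained cue is washed out: platform acceleration and
    # velocity return towards zero, and displacement settles at a
    # small finite excursion, keeping the jacks within their throw.
    print(acc[0], acc[-1], vel[-1], pos[-1])
```

Provided the washout rates stay below the human motion thresholds discussed above, the crew perceives only the initial acceleration; the visual system then sustains the impression of continued motion.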
 
The [[NASA Ames Research Center]] in "Silicon Valley" south of San Francisco operates the [http://ffc.arc.nasa.gov/vms/vms.html Vertical Motion Simulator] (VMS). This has a very large-throw motion system with 60 feet (+/- 30 ft) of vertical movement (heave). The heave system supports a horizontal beam on which are mounted 40-foot rails, allowing lateral movement of a simulator cab of +/- 20 feet. A conventional six-degree-of-freedom hexapod platform is mounted on the 40 ft beam, and an interchangeable cabin is mounted on the hexapod platform; this design permits quick switching between different aircraft cabins. Simulations have ranged from blimps and commercial and military aircraft to the Space Shuttle. In the case of the Space Shuttle, the VMS was used to investigate a longitudinal [[pilot-induced oscillation]] (PIO) that occurred on an early Shuttle flight just before landing. After the problem was reproduced on the VMS, it was used to trial different longitudinal control algorithms and to recommend the best for use in the Shuttle programme; no similar Shuttle PIO has occurred since. The ability to provide realistic motion cues was considered important in reproducing the PIO, and attempts on a non-motion simulator were unsuccessful. A similar pattern exists in simulating the roll-upset incidents that affected a number of early Boeing 737 aircraft, where a motion-based simulator was needed to replicate the conditions.