Mixed-reality for unmanned aerial vehicle operations in near earth environments
Description:
Future applications will bring unmanned aerial vehicles (UAVs) into near Earth environments such as urban areas, changing the way UAVs are currently operated. Of concern is that UAV accidents still occur at a much higher rate than the accident rate for commercial airliners. A number of these accidents can be attributed to a UAV pilot's low situation awareness (SA), which stems from the limitations of UAV operating interfaces. The main limitation is the physical separation between the vehicle and the pilot, which eliminates any motion and exteroceptive sensory feedback. These limitations, combined with the small field of view of the onboard camera, result in low SA, making near Earth operations difficult and dangerous. Autonomy has been proposed as a solution for near Earth tasks, but state-of-the-art artificial intelligence still requires very structured and well-defined goals to allow safe autonomous operation. There is therefore a need to better train pilots to operate UAVs in near Earth environments and to augment their performance so as to increase safety and minimize accidents.
In this work, simulation software, motion platform technology, and UAV sensor suites were integrated to produce mixed-reality systems that address current limitations of UAV piloting interfaces. The definition of mixed reality is extended in this work to encompass not only visual aspects but also motion. A training and evaluation system for UAV operations in near Earth environments was developed: flight simulator software was modified to recreate current UAV operating modalities (internal and external), and the system was combined with Drexel's Sensor Integrated Systems Test Rig (SISTR) to allow simulated missions that incorporate real-world environmental effects and UAV sensor hardware.
To address the lack of motion feedback to the UAV pilot, a system was developed that integrates a motion simulator into UAV operations. During flight, the angular rates of the UAV are captured by an onboard inertial measurement unit (IMU) and relayed to the pilot controlling the vehicle from inside the motion simulator.
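As a rough illustration of this telemetry path, the sketch below forwards scaled IMU angular rates to a motion platform at a fixed update rate. It is a minimal sketch only: the read_imu_rates and MotionPlatform names, the 50 Hz rate, and the washout gain are hypothetical placeholders, not details of the system described here.

```python
# Hypothetical sketch: relay UAV angular rates to a motion simulator.
import time

class MotionPlatform:
    """Placeholder interface for a motion simulator's rate commands."""
    def command_rates(self, roll_rate, pitch_rate, yaw_rate):
        print(f"platform cmd: p={roll_rate:.2f} q={pitch_rate:.2f} r={yaw_rate:.2f} rad/s")

def read_imu_rates():
    """Placeholder for the onboard IMU telemetry link (rad/s)."""
    return 0.05, -0.02, 0.01

def relay_loop(platform, gain=0.5, hz=50):
    """Scale the vehicle's angular rates and forward them to the platform.

    A gain below 1.0 acts as a crude washout so the platform stays
    within its physical travel limits.
    """
    period = 1.0 / hz
    while True:
        p, q, r = read_imu_rates()
        platform.command_rates(gain * p, gain * q, gain * r)
        time.sleep(period)

# relay_loop(MotionPlatform())  # runs until interrupted
```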
Efforts to further increase pilot SA led to the development of a mixed-reality chase view piloting interface. Chase view resembles the view one would have when towed behind the aircraft: it combines real-world onboard camera images with a virtual representation of the vehicle and the surrounding operating environment.
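A minimal sketch of this kind of image compositing is given below, assuming the virtual scene has already been rendered from a viewpoint behind the aircraft. The array sizes, inset placement, and blending weight are illustrative assumptions, not the interface's actual rendering pipeline.

```python
# Hypothetical sketch: inset the live onboard camera frame into a wider
# virtual scene rendered from a chase-camera viewpoint.
import numpy as np

def composite_chase_view(virtual_scene, camera_frame, inset_origin, alpha=0.85):
    """Overlay the real camera frame onto the virtual chase-view scene.

    virtual_scene : HxWx3 uint8 render of the environment from the chase camera
    camera_frame  : hxwx3 uint8 live onboard camera image
    inset_origin  : (row, col) where the camera frustum projects into the scene
    alpha         : opacity of the real imagery over the virtual backdrop
    """
    out = virtual_scene.copy()
    r0, c0 = inset_origin
    h, w = camera_frame.shape[:2]
    region = out[r0:r0 + h, c0:c0 + w].astype(np.float32)
    blended = alpha * camera_frame.astype(np.float32) + (1 - alpha) * region
    out[r0:r0 + h, c0:c0 + w] = blended.astype(np.uint8)
    return out

# Example with synthetic data: a 480x640 virtual scene and a 240x320 camera frame.
scene = np.full((480, 640, 3), 60, dtype=np.uint8)
frame = np.full((240, 320, 3), 180, dtype=np.uint8)
view = composite_chase_view(scene, frame, inset_origin=(120, 160))
print(view.shape)  # composited chase-view image
```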
A series of UAV piloting experiments was performed using the training and evaluation systems described above. Subjects' behavioral performance during missions was analyzed while they used the onboard camera view and the mixed-reality chase view interface. Subjects' cognitive workload during missions was also assessed, using subjective measures such as the NASA Task Load Index (TLX) and objective brain activity measurements from a functional near-infrared spectroscopy (fNIR) system.
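For context, a weighted NASA-TLX score is conventionally computed from six subscale ratings (0-100) and pairwise-comparison weights that sum to 15. The sketch below shows that standard calculation; the ratings and weights are made-up examples, not data from these experiments.

```python
# Standard weighted NASA-TLX calculation with illustrative inputs.
def nasa_tlx_score(ratings, weights):
    """Weighted NASA-TLX: sum(rating * weight) / 15, on a 0-100 scale."""
    assert set(ratings) == set(weights)
    total_weight = sum(weights.values())
    assert total_weight == 15, "pairwise comparisons yield weights summing to 15"
    return sum(ratings[k] * weights[k] for k in ratings) / total_weight

ratings = {"mental": 70, "physical": 20, "temporal": 55,
           "performance": 40, "effort": 60, "frustration": 35}
weights = {"mental": 5, "physical": 1, "temporal": 3,
           "performance": 2, "effort": 3, "frustration": 1}
print(nasa_tlx_score(ratings, weights))  # overall workload score
```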
Behavioral analysis showed that the chase view interface improved pilot performance in near Earth flights and increased situation awareness. fNIR analysis showed that subjects' cognitive workload was significantly lower while using the chase view interface. Real-world flight tests were conducted in a near Earth environment with buildings and obstacles to evaluate the chase view interface with real-world data. The interface performed well with real-world, real-time data in close-range scenarios.
The mixed-reality approaches presented are informed by studies of human factors performance and cognitive workload. The resulting designs serve as test beds for studying UAV pilot performance, creating training programs, and developing tools that augment UAV operations and minimize UAV accidents during operations in near Earth environments.
Related Results
Comparing cybersickness in virtual reality and mixed reality head-mounted displays
Introduction: Defence Research and Development Canada is developing guidance on the use of Mixed Reality head-mounted displays for naval operations in the Royal Canadian Navy. Virt...
An overview of various kinds of wind effects on unmanned aerial vehicle
Attitude, speed, and position of unmanned aerial vehicles are susceptible to wind disturbance. The types, characteristics, and mathematical models of the wind, which have great inf...
Application of Unmanned Flying Vehicle for Obtaining Digital Orthofotomaps
Nowadays, surveys using unmanned aerial vehicles are becoming popular. The resulting orthophotomap is the final product for creating digital plans and cardboard. The objectives of t...
Modeling and simulation on interaction between pedestrians and a vehicle in a channel
The mixed traffic flow composed of pedestrians and vehicles shows distinct features that a single kind of traffic flow does not have. In this paper, the motion of a vehicle is desc...
Persistent Unmanned Surface Vehicles for Subsea Support
This paper discusses the role of unmanned systems in subsea support. Recent developments in mobile unmanned vehicle networks are reviewed, demonstrating ...
Autonomous localized path planning algorithm for UAVs based on TD3 strategy
Unmanned Aerial Vehicles are useful tools for many applications. However, autonomous path planning for Unmanned Aerial Vehicles in unfamiliar environments is a challenging ...
Vehicle Theft Detection and Locking System using GSM and GPS
A vehicle tracking system is very useful for tracking the movement of a vehicle from any location at any time. An efficient vehicle tracking system is designed and implemented for ...
The Role of UAVs (Unmanned Aerial Vehicles) in Identifying High Landslide Threats along the Ci Durian River, Cigudeg Area, Bogor
Technological developments in the field of photography, especially the creation of 3D models from collections of aerial photographs, can be carried out quickly and effectively with Unman...

