Sunday, July 17, 2016

Sense And Avoid Selection

The ability of small unmanned aerial systems (sUAS) to sense and avoid has been a prevalent, and somewhat controversial, topic in unmanned aviation.  Products have come to market claiming to help meet this requirement; however, the Federal Aviation Administration (FAA) does not yet recognize these sensors as satisfying it.  This paper will look at ultrasonic sensors as well as recent advances in visual sensors.
Ultrasonic sensors emit high-frequency sound pulses and, by measuring how long the echo takes to return, compute the range to the object that reflected the sound (Ultrasonic Distance Sensor, 2016).  As depicted in Figure 1, one part of the sensor transmits the sound wave while the other side, known as the receiver, "listens" for the echo (Ultrasonic Distance Sensor, 2016).  By approximating the speed of sound at 1,100 feet per second, that factor becomes a known in the distance equation: distance (D) equals the echo return time (t) multiplied by the speed of sound and divided by 2 to account for the round trip, or D = (t x 1,100)/2 (Ultrasonic Distance Sensor, 2016).

Figure 1. Ultrasonic sensor working example.  Courtesy of Cornell University Electrical and Computer Engineering.
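To make the arithmetic concrete, here is a minimal Python sketch of the ranging equation above. It assumes the round-trip echo time has already been measured (on an Arduino this would come from timing the sensor's echo pin); the 1,100 ft/s figure is the approximation used in the source.

# Minimal sketch of the ultrasonic ranging equation D = (t x 1,100)/2.
# Assumes the round-trip echo time has already been measured in seconds.

SPEED_OF_SOUND_FT_S = 1100.0  # approximation used in the text

def range_feet(echo_time_s):
    """Distance to the reflecting object, halving for the round trip."""
    return (echo_time_s * SPEED_OF_SOUND_FT_S) / 2.0

# A 10 ms round trip puts the obstacle about 5.5 feet away.
print(range_feet(0.010))  # 5.5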


Visual sensors, or what we commonly think of as cameras, can also be used to help a system "see" and avoid.  Recent software advances have brought cameras to commercial products; for example, Subaru's EyeSight uses stereoscopic cameras to sense range and is available on select Subaru models (Subaru Debuts Next Generation EyeSight System, 2014).  Additionally, companies like Chinese manufacturer DJI have marketed commercial off-the-shelf (COTS) solutions such as the Phantom 4 quadcopter, which uses stereoscopic cameras mounted on the front to sense and avoid objects in its path (Sense and Avoid, 2016).
DJI has also introduced a product known as Guidance as part of its developer series.  This system incorporates both the ultrasonic and visual technologies discussed earlier (Guidance User Manual V1.6, 2015).  It includes five sets of ultrasonic and image sensors, all connected to a single Guidance Core, which can in turn be connected to any DJI control system or to other systems via USB or UART (Guidance User Manual V1.6, 2015).  The system requires 11.1-25 volts of input power and draws 12 watts with all five Guidance sensors attached (Guidance Specs, 2016).  The complete system weighs 282.4 grams with the Guidance Core, five sensors, and associated cables (Guidance Specs, 2016).  Since the system comes with five sensors, one pair could cover fore and aft, another pair port and starboard, and one sensor could face down, giving five sides of protection.  Lastly, the sensors' maximum effective range is 20 meters, or just over 65 feet (Guidance Specs, 2016).  If, depending on the processing power of the controller, an avoidance decision could be made in less than one second, covering the full 20-meter range in that second would allow a best-case flight speed of about 44 miles per hour.
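The sketch below shows that closure-speed arithmetic. Note that the one-second decision time is an assumption for illustration, not a published Guidance figure; only the 20-meter range comes from the specs.

# Back-of-the-envelope closure-speed check for the Guidance sensor.
# ASSUMPTION: the controller needs one full second to sense and decide,
# so the vehicle may consume the entire 20 m range in that second.

SENSOR_RANGE_M = 20.0    # maximum effective range (Guidance Specs, 2016)
DECISION_TIME_S = 1.0    # assumed worst-case decision latency
MPS_TO_MPH = 2.23694     # meters per second to miles per hour

max_speed_mph = (SENSOR_RANGE_M / DECISION_TIME_S) * MPS_TO_MPH
print(round(max_speed_mph, 1))  # ~44.7 mph, matching the figure above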
Currently these systems do not meet FAA requirements, but they are being utilized for obstacle avoidance of non-cooperative objects such as birds and debris, or in cases where line-of-sight position is in question due to viewing angles.  Products such as DJI's Guidance incorporate multiple sensors to complete one task.  As time goes on, visual recognition software will continue to evolve and progress, and in this author's opinion it will be a combination of sensors such as these that will meet the FAA's "see" and avoid, or in the case of unmanned systems "sense" and avoid, requirement.



References
Guidance user manual V1.6. (2015, October). DJI. Retrieved from http://download.dji-innovations.com/downloads/dev/Guidance/en/Guidance_User_Manual_en_V1.6.pdf
Guidance specs. (2016, July 12). DJI.com. Retrieved from http://www.dji.com/product/guidance/info#specs
Phantom 4 user manual V1.2. (2016, March). DJI. Retrieved from https://dl.djicdn.com/downloads/phantom_4/en/Phantom_4_User_Manual_en_v1.2_160328.pdf
Sense and avoid. (2016, June 29). DJI.com. Retrieved from https://www.dji.com/product/phantom-4
Subaru debuts next generation EyeSight system. (2014, January 23). Subaru.com. Retrieved from http://media.subaru.com/newsrelease.do?id=562&mid=123&allImage=1&teaser=subaru-debuts-next-generation-eyesight-system
Ultrasonic distance sensor. (2016, July 11). Arduino-info.wikispaces.com. Retrieved from http://arduino-info.wikispaces.com/Ultrasonic+Distance+Sensor

Sunday, July 10, 2016

Control Station Analysis

“OpenROV is an open-source, low-cost underwater robot for exploration and education. It's also a passionate community of professional and amateur ocean explorers and technologists” (Welcome to OpenROV!, 2016).  The OpenROV is capable of descending to depths of 328 feet of seawater and has an endurance of up to two hours (Welcome to OpenROV!, 2016).  David Lang, the co-creator of OpenROV, wanted to make the system simple, low-cost, and accessible so that more people could purchase one and discover underwater exploration.
The OpenROV 2.8 weighs 2.6 kg; is 30 cm long, 20 cm wide, and 15 cm tall; and has a maximum speed of 2 knots (OpenROV 2.8 Mini Observation Class ROV, 2016).  Additionally, it has a camera with a 120-degree field of view (FOV) that transmits video back over a 100-meter tether, which can be extended to 300 meters (OpenROV 2.8 Mini Observation Class ROV, 2016).  Onboard processing is handled by a BeagleBone Black and an Arduino Mega, and the system connects to a PC running OS X, Windows, or Linux through the Google Chrome browser, using the open-source OpenROV control software installed on the vehicle itself (OpenROV 2.8 Mini Observation Class ROV, 2016).
The OpenROV uses what the company calls a top-side adapter to connect the tether to the control computer; this adapter can also be connected to a wireless router to allow a wireless link (Jakobi, 2016).  Once connected to the top-side adapter, the operator opens Google Chrome and navigates to 192.168.254.1:8080, which brings up the onboard OpenROV control software (OpenROV, 2016).  The software is stored on the BeagleBone Black and can be updated via SD card (OpenROV, 2016).  It provides a plethora of telemetry and can be configured to accept keyboard or gamepad inputs for commanding the ROV (OpenROV, 2016).


Figure 1. OpenROV open-source control software screenshot.  Red is connectivity status, blue shows compass heading, orange shows latency, yellow is current draw, and green is battery voltage.  Courtesy of OpenROV.com.


Figure 2. OpenROV open-source control software screenshot.  Red shows compass heading, orange shows motor thrust, yellow is depth, and green is roll and artificial horizon.  Courtesy of OpenROV.com.
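Before opening the cockpit shown in Figures 1 and 2, it can be useful to confirm the top-side link is alive. The sketch below is a small illustrative check, not part of the OpenROV software; it simply polls the default cockpit address from the operator's manual until the web server answers.

# Polls the default OpenROV cockpit address until the web server answers.
# The URL is the default from the operator's manual; adjust if your
# network configuration differs.

import time
import urllib.request

COCKPIT_URL = "http://192.168.254.1:8080"

def cockpit_reachable(url, attempts=10, delay_s=2.0):
    """Return True once the cockpit web server responds with HTTP 200."""
    for _ in range(attempts):
        try:
            with urllib.request.urlopen(url, timeout=2) as resp:
                if resp.status == 200:
                    return True
        except OSError:  # covers URLError and socket timeouts
            pass
        time.sleep(delay_s)
    return False

if cockpit_reachable(COCKPIT_URL):
    print("Cockpit up; open", COCKPIT_URL, "in Chrome.")
else:
    print("No response; check the tether and top-side adapter.")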
Using visuals to communicate information to the surface operator is a very common approach.  However, given the amount of information these systems can send, the display can become visually intense.  Other methods of conveying information to the operator, such as aural warnings, are being utilized elsewhere, though OpenROV does not currently use them.  Additionally, when aboard a surface vessel, an operator at a surface control station can fall victim to spatial disorientation (SD).
Spatial disorientation is defined as "a failure to sense correctly the attitude, motion, and/or position of the aircraft with respect to the surface of the earth" (Cooke, 2006).  Because the operator is not physically inside the unmanned vehicle, false perceptions can arise, and these are the primary cause of SD (Cooke, 2006).  SD taxonomy in unmanned aerial systems (UAS) can be divided into three groups: Visual Reference (VR), Operator Platform (OP), and Control Method (CM).  These can be further subdivided: VR into exocentric (EX), egocentric (EG), and external view (EV); OP into mobile (M) and stationary (S); and CM into manual control (MC), supervisory control (SC), and fully autonomous (FA) (Cooke, 2006).
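As an illustration only, Cooke's taxonomy can be encoded as a simple lookup table so a given control-station setup can be tagged; the example classification at the end is this author's reading of the OpenROV setup, not something stated in the source.

# Cooke's (2006) SD taxonomy for UAS, encoded as a lookup table.

SD_TAXONOMY = {
    "Visual Reference (VR)": ["Exocentric (EX)", "Egocentric (EG)",
                              "External View (EV)"],
    "Operator Platform (OP)": ["Mobile (M)", "Stationary (S)"],
    "Control Method (CM)": ["Manual Control (MC)", "Supervisory Control (SC)",
                            "Fully Autonomous (FA)"],
}

# Illustrative tagging: an OpenROV pilot on a boat, hand-flying from video.
openrov_setup = {"VR": "EG", "OP": "M", "CM": "MC"}
print(openrov_setup)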
Haptic feedback is also being explored as a way to combat reduced situational awareness (SA) and SD.  “Haptic feedback, often referred to as simply "haptics", is the use of the sense of touch in a user interface design to provide information to an end user” (What is "haptic feedback"?, 2016).  It can be as simple as a vibrating wristband or as complex as the proposed Teslasuit, which provides full-body haptic feedback (Rigg, 2016).  The Army Aviation Association of America also experimented with a motion simulator that lets the operator feel as if they were in the cockpit of the aircraft (Bobryk, 2012).  Regardless of how simple or complex the haptic system is, it provides additional SA to the operator and eases processing, since so much information is already being handled visually and aurally.  A significant drawback, however, is the increased amount of information that must be sent back to the surface control station.
The OpenROV software could integrate aural warnings tied to depth to increase operator SA.  For example, if a warning depth were configured, an aural tone would let the operator know when that depth had been exceeded; this could be especially important when approaching the maximum operating depth.  While not beyond the scope of OpenROV's open-source software, haptic feedback might be harder to integrate, because the premise of the OpenROV is affordability and adding haptic devices would raise the overall cost.  Aural warnings, by contrast, could be integrated more easily by taking advantage of the speakers already built into the control PC.
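A minimal sketch of such a warning follows, under stated assumptions: the depth readings here come from a placeholder generator, since the real cockpit reports depth in the browser rather than through any Python hook, and the terminal bell stands in for a proper alert sound on the control PC's speakers.

# Sketch of a depth-based aural warning for the control PC.
# ASSUMPTION: simulated_depths() is a placeholder; a real integration
# would read depth from the cockpit's telemetry instead.

import sys
import time

WARN_DEPTH_M = 90.0  # warn before the ~100 m (328 ft) rated depth

def simulated_depths():
    """Placeholder telemetry: a dive that passes the warning threshold."""
    for depth in (10.0, 45.0, 80.0, 92.0, 95.0):
        yield depth

def monitor(depths, poll_s=0.5):
    for depth in depths:
        if depth >= WARN_DEPTH_M:
            sys.stdout.write("\a")  # terminal bell on the PC speaker
            print("WARNING: depth %.1f m exceeds %.1f m" % (depth, WARN_DEPTH_M))
        time.sleep(poll_s)

monitor(simulated_depths())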




References
Bobryk, B. (2012, June 13). UAV motion ground station [Video]. Retrieved from https://www.youtube.com/watch?v=z7dBJsLlq8E
Cooke, N. J. (2006). Human factors of remotely operated vehicles (1st ed.). Boston, Mass: JAI. Retrieved from http://site.ebrary.com.ezproxy.libproxy.db.erau.edu/lib/erau/detail.action?docID=10139446
Jakobi, N. (2016, July 5). How to build a WiFi enabled tether management system. Openrov.dozuki.com. Retrieved from http://openrov.dozuki.com/Guide/How+to+build+a+WiFi+enabled+Tether+Management+System/59
OpenROV. (2016, July 5). OpenROV operators manual. Openrov.dozuki.com. Retrieved from http://openrov.dozuki.com/Guide/OpenROV+Operators+Manual/80
OpenROV 2.8 mini observation class ROV. (2016, July 5). Openrov.com. Retrieved from http://www.openrov.com/products/2-8.html
Rigg, J. (2016, January 6). Teslasuit does full-body haptic feedback for VR. Engadget.com. Retrieved from https://www.engadget.com/2016/01/06/teslasuit-haptic-vr/
What is "haptic feedback"? (2016, July 4). Mobileburn.com. Retrieved from http://www.mobileburn.com/definition.jsp?term=haptic+feedback
Welcome to OpenROV! (2016, July 5). Openrov.com. Retrieved from http://www.openrov.com/index.html