Sunday, July 17, 2016

Sense And Avoid Selection

The ability of small unmanned aerial systems (sUAS) to sense and avoid has been a prevalent, and somewhat controversial, topic in unmanned aviation.  Products have come to market claiming to help meet this requirement; however, the Federal Aviation Administration (FAA) does not yet recognize these sensors as meeting it.  This paper will look at ultrasonic sensors as well as advances made in visual sensors.
Ultrasonic sensors emit high-frequency sound pulses and, by measuring how long the initial sound takes to echo back, compute the range to the object that reflected it (Ultrasonic Distance Sensor, 2016).  As depicted in Figure 1, one part of the sensor transmits the sound wave while the other side “listens”; this side is known as the receiver (Ultrasonic Distance Sensor, 2016).  Approximating the speed of sound as 1,100 feet per second makes this factor a known in the distance equation, which is as follows: distance (D) equals the time (t) it takes the sound to return, multiplied by the known speed of sound (1,100 ft/s), divided by 2, or D = (t x 1100)/2 (Ultrasonic Distance Sensor, 2016). 

Figure 1. Ultrasonic sensor working example.  Courtesy of Cornell University Electrical and Computer Engineering.
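Below is a minimal sketch of this echo-ranging calculation in Python, assuming the sensor reports the round-trip echo time in seconds; actual sensor I/O (for example, triggering an HC-SR04 module) is omitted and the function name is illustrative.

```python
SPEED_OF_SOUND_FPS = 1100.0   # approximate speed of sound used in the text

def range_feet(echo_time_s: float) -> float:
    """Distance D = (t x 1100) / 2: halve the round trip for one-way range."""
    return (echo_time_s * SPEED_OF_SOUND_FPS) / 2.0

# An echo returning after 10 ms puts the object about 5.5 feet away.
print(range_feet(0.010))   # 5.5
```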


Visual sensors, or what we commonly think of as cameras, can be used to help a system “see” and avoid as well.  Recent software advances have allowed cameras to be utilized on commercial products, for example Subaru’s “EyeSight,” which uses stereoscopic cameras to sense range and is available on select Subaru models (SUBARU DEBUTS NEXT GENERATION EyeSight SYSTEM, 2014).  Additionally, companies like Chinese manufacturer DJI have marketed commercial off-the-shelf (COTS) solutions such as their latest Phantom 4 quadcopter.  The Phantom 4 uses stereoscopic cameras mounted to the front in order to sense and avoid objects ahead of it (Sense and Avoid, 2016). 
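The core idea behind stereoscopic ranging is that an object's horizontal shift (disparity) between the two camera images is inversely proportional to its distance.  The sketch below shows the standard rectified-stereo relationship; the focal length, baseline, and disparity values are illustrative assumptions, not DJI or Subaru specifications.

```python
def stereo_range_m(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth Z = f * B / d for a rectified stereo camera pair."""
    if disparity_px <= 0:
        raise ValueError("no disparity: object unmatched or at infinity")
    return focal_px * baseline_m / disparity_px

# Example: 700 px focal length, 10 cm camera baseline, 14 px disparity -> 5 m
print(stereo_range_m(700.0, 0.10, 14.0))  # 5.0
```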
DJI has also introduced a product known as Guidance as part of their developer series.  This system incorporates both the ultrasonic and visual technologies discussed earlier (Guidance User Manual V1.6, 2015).  It includes five sets of ultrasonic and image sensors, all connected to a single Guidance Core, which can in turn be connected to any DJI control system or other systems via USB or UART (Guidance User Manual V1.6, 2015).  The system requires 11.1-25 volts for power and draws 12 watts with all five Guidance sensors attached (GUIDANCE SPECS, 2016).  Additionally, the system weighs in at 282.4 grams with the Guidance Core, five sensors, and associated cables (GUIDANCE SPECS, 2016).  Since the system comes with five sensor sets, one pair could be mounted fore and aft, one pair port and starboard, and one sensor facing down, giving five sides of protection.  Lastly, the sensors’ maximum effective range is 20 meters, or just over 65 feet (GUIDANCE SPECS, 2016).  Depending on the processing power of the controller, if an avoidance decision could be made in less than one second, that 20-meter range could allow for a best-case flight speed of about 44 miles per hour.
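A quick back-of-the-envelope check of that speed claim: an obstacle first detected at maximum sensor range must be avoidable within the system's reaction time, so speed times reaction time cannot exceed the range.  The 20 m figure is from the Guidance specs; the reaction times below are assumptions.

```python
MPS_TO_MPH = 2.23694

def max_speed_mph(sensor_range_m: float, reaction_s: float) -> float:
    """Highest speed at which the vehicle can still react within sensor range."""
    return (sensor_range_m / reaction_s) * MPS_TO_MPH

print(max_speed_mph(20.0, 1.0))   # ~44.7 mph with a 1 s decision loop
print(max_speed_mph(20.0, 0.5))   # ~89.5 mph if the decision takes only 0.5 s
```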
Currently these systems do not meet FAA requirements, but they are being utilized for obstacle avoidance of non-cooperative objects such as birds and debris, or in cases where line-of-sight position is in question due to viewing angles.  Products such as DJI’s Guidance incorporate multiple sensors to complete one task.  As time goes on, visual recognition software will continue to evolve and progress, and in this author’s opinion it will be a combination of sensors such as these that will meet the FAA’s “see” and avoid, or in the case of unmanned systems, “sense” and avoid requirement. 



References
Guidance User Manual V1.6. (2015 Oct). DJI. Retrieved from http://download.dji-innovations.com/downloads/dev/Guidance/en/Guidance_User_Manual_en_V1.6.pdf
GUIDANCE SPECS. (2016 Jul 12). DJI.com. Retrieved from http://www.dji.com/product/guidance/info#specs
Phantom 4 User Manual V1.2. (2016 Mar). DJI. Retrieved from https://dl.djicdn.com/downloads/phantom_4/en/Phantom_4_User_Manual_en_v1.2_160328.pdf
Sense and Avoid. (2016 Jun 29). DJI.com. Retrieved from https://www.dji.com/product/phantom-4
SUBARU DEBUTS NEXT GENERATION EyeSight SYSTEM. (2014 Jan 23). Subaru.com. Retrieved from http://media.subaru.com/newsrelease.do?id=562&mid=123&allImage=1&teaser=subaru-debuts-next-generation-eyesight-system
Ultrasonic Distance Sensor. (2016 Jul 11). Arduino-info.wikispaces.com. Retrieved from http://arduino-info.wikispaces.com/Ultrasonic+Distance+Sensor

Sunday, July 10, 2016

Control Station Analysis

“OpenROV is an open-source, low-cost underwater robot for exploration and education. It's also a passionate community of professional and amateur ocean explorers and technologists” (Welcome to OpenROV!, 2016).  The OpenROV is capable of descending to depths of 328 feet of seawater and has up to a two-hour battery life (Welcome to OpenROV!, 2016).  David Lang, the co-creator of OpenROV, wanted to make this system simple, low-cost, and accessible so that more people could purchase one and discover underwater exploration.
The OpenROV 2.8 weighs 2.6 kg; is 30 cm long, 20 cm wide, and 15 cm tall; and has a maximum speed of 2 knots (OpenROV 2.8 Mini Observation Class ROV, 2016).  Additionally, it has a camera with a 120-degree field of view (FOV) that transmits video back via a 100-meter tether (the system can support up to a 300-meter tether) (OpenROV 2.8 Mini Observation Class ROV, 2016).  Onboard processing is handled by BeagleBone Black and Arduino Mega microprocessors, and the system connects to a PC running OS X, Windows, or Linux via the Google Chrome browser, using OpenROV’s open-source software installed onboard the OpenROV (OpenROV 2.8 Mini Observation Class ROV, 2016). 
The OpenROV uses what it calls a top-side adapter to connect the tether to the control computer; this adapter can also be connected to a wireless router to allow a wireless connection (Jakobi, 2016).  Once connected to the top-side adapter, Google Chrome can be opened and pointed to the IP address 192.168.254.1:8080, which accesses the onboard OpenROV control software (OpenROV, 2016).  The OpenROV software is stored on the BeagleBone Black and can be updated via SD card (OpenROV, 2016).  The OpenROV software provides a plethora of information and can be configured to use keyboard or gamepad inputs for command functions to the ROV (OpenROV, 2016). 
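Since the cockpit is just a web application served from the vehicle, a simple reachability check against that address can confirm the tether link before launching Chrome.  A minimal sketch using only the Python standard library is below; the address is the one given in the operators manual, while the script itself is an illustrative convenience, not part of the OpenROV software.

```python
import urllib.request

COCKPIT_URL = "http://192.168.254.1:8080"

def cockpit_reachable(url: str = COCKPIT_URL, timeout_s: float = 3.0) -> bool:
    """Return True if the onboard control software answers over the tether."""
    try:
        with urllib.request.urlopen(url, timeout=timeout_s) as resp:
            return resp.status == 200
    except OSError:
        return False

if __name__ == "__main__":
    print("Cockpit up" if cockpit_reachable() else "Check tether / top-side adapter")
```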


Figure 1. OpenROV open source control software screenshot.  Red is connectivity status, Blue shows compass heading, Orange shows latency, Yellow is current draw, and Green is battery voltage.  Courtesy of OpenROV.com.


Figure 2. OpenROV open source control software screenshot.  Red is compass heading, Orange shows motor thrust, Yellow is depth, and Green is roll and artificial horizon.  Courtesy of OpenROV.com.
The use of visuals to communicate information to the surface controller is a very common means of communication.  However, with the amount of information that systems can send, this can become a very visually intense control method.  Other methods of communicating information to the operator are being utilized, such as aural warnings, which OpenROV does not currently use.  Additionally, when on board a surface vessel, an operator at a surface control station can fall victim to spatial disorientation (SD).
Spatial disorientation (SD) is defined as “a failure to sense correctly the attitude, motion, and/or position of the aircraft with respect to the surface of the earth” (Cooke, 2006).  Because the operator is not physically inside the unmanned vehicle, false perceptions can arise, and these are the primary cause of SD (Cooke, 2006).  SD taxonomy in unmanned aerial systems (UAS) can be divided into three groups: Visual Reference (VR), Operator Platform (OP), and Control Method (CM).  These groups can be further divided into VR: exocentric (EX), egocentric (EG), and External View (EV); OP: Mobile (M) and Stationary (S); and CM: Manual Control (MC), Supervisory Control (SC), and Fully Autonomous (FA) (Cooke, 2006).
Haptic feedback is being explored to help combat reduced situational awareness (SA) and SD.  “Haptic feedback, often referred to as simply ‘haptics’, is the use of the sense of touch in a user interface design to provide information to an end user” (What is "haptic feedback"?, 2016).  Haptic feedback can be as simple as a vibrating wrist band or as complex as the proposed Teslasuit, which allows for full-body haptic feedback (Rigg, 2016).  The Army Aviation Association of America also experimented with a motion simulator allowing the operator to feel as if they were in the cockpit of the aircraft (Bobryk, 2012).  Regardless of how simple or complex the haptic system is, it serves to provide more SA to the operator and makes for easier processing, since so much information is already being processed visually and aurally.  However, a large drawback is the increased amount of information that must be sent back to the surface control station.
The OpenROV software could integrate aural warnings associated with depth to increase operator SA.  For example, if a warning depth is configured, an aural tone would let the operator know that depth has been exceeded.  This could be especially important when approaching the maximum operating depth.  While not beyond the scope of OpenROV’s open-source software, haptic feedback might be harder to integrate: the premise of the OpenROV is affordability, and adding haptic feedback devices would add to the overall cost.  Aural warnings, by contrast, could be integrated more easily by taking advantage of the speakers already incorporated in the control PC. 
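A sketch of that depth alarm logic is below.  The telemetry accessor get_depth_m() is hypothetical, not an actual OpenROV API, and the console bell stands in for a real tone played through the control PC's speakers.

```python
import sys
import time

WARNING_DEPTH_M = 90.0   # alert margin below the ~100 m (328 ft) rating

def depth_alarm(get_depth_m, threshold_m: float = WARNING_DEPTH_M) -> None:
    """Sound an aural warning once each time the ROV sinks past the threshold."""
    warned = False
    while True:
        depth = get_depth_m()          # hypothetical telemetry accessor
        if depth > threshold_m and not warned:
            sys.stdout.write("\a")     # audible bell on the control PC
            print(f"WARNING: depth {depth:.1f} m exceeds {threshold_m} m")
            warned = True
        elif depth <= threshold_m:
            warned = False             # re-arm once back above the threshold
        time.sleep(1.0)
```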




References
Bobryk, B. (2012 Jun 13). UAV Motion Ground Station. Retrieved from https://www.youtube.com/watch?v=z7dBJsLlq8E
Cooke, N. J. (2006). Human factors of remotely operated vehicles (1st ed.). Boston, Mass: JAI. Retrieved from http://site.ebrary.com.ezproxy.libproxy.db.erau.edu/lib/erau/detail.action?docID=10139446
Jakobi, N. (2016 Jul 5). How to build a WiFi enabled Tether Management System. Openrov.dozuki.com. Retrieved from http://openrov.dozuki.com/Guide/How+to+build+a+WiFi+enabled+Tether+Management+System/59
OpenROV. (2016 Jul 5). OpenROV Operators Manual. Openrov.dozuki.com. Retrieved from http://openrov.dozuki.com/Guide/OpenROV+Operators+Manual/80
OpenROV 2.8 Mini Observation Class ROV. (2016 Jul 5). Openrov.com. Retrieved from http://www.openrov.com/products/2-8.html
Rigg, J. (2016 Jan 06). Teslasuit does full-body haptic feedback for VR. Engadget.com. Retrieved from https://www.engadget.com/2016/01/06/teslasuit-haptic-vr/
What is "haptic feedback"?. (2016 Jul 4). Mobileburn.com. Retrieved from http://www.mobileburn.com/definition.jsp?term=haptic+feedback
Welcome to OpenROV!. (2016 Jul 5). Openrov.com. Retrieved from http://www.openrov.com/index.html


Saturday, June 25, 2016

Unmanned System Data Protocol and Format


In September 2007 the United States Air Force (USAF) transferred two pre-production Global Hawk aircraft to the National Aeronautics and Space Administration (NASA) (NASA Center for AeroSpace Information, & United States. National Aeronautics and Space Administration, 2009b).  This came to fruition after failed plans for NASA to use USAF aircraft, because Department of Defense (DoD) priorities dictated otherwise (NASA Center for AeroSpace Information, & United States. National Aeronautics and Space Administration, 2008b).  This paper will examine the communications payloads onboard, the data formats, protocols, and storage methods used to make this platform effective and functional.  Additionally, the onboard sensors will be examined, along with the overall data strategy and possible improvements.
The NASA Global Hawk communication payload consists of: two UHF/LOS links, two Iridium links, and one Inmarsat link, which provide command and control (C2) communications; two Iridium links for communications with Air Traffic Control (ATC); and six Iridium links for payload C2 and health status (NASA Center for AeroSpace Information, & United States. National Aeronautics and Space Administration, 2009b).  These communications payloads are depicted in Figure 1.
Figure 1.  Concept of operations for NASA’s Global Hawk communications payloads. Courtesy of NASA Center for AeroSpace Information, & United States. National Aeronautics and Space Administration, 2009b.
The NASA Global Hawk can fly missions of up to 30 hours, and for this reason status packets can be monitored by the Mission Scientist and Payload Operator to maintain situational awareness (NASA Center for AeroSpace Information, & United States. National Aeronautics and Space Administration, 2008a).  These packets are in a comma-separated values American Standard Code for Information Interchange (CSV ASCII) format, which is similar to Interagency Working Group standard format number 1 (IWG1) (NASA Center for AeroSpace Information, & United States. National Aeronautics and Space Administration, 2008a).  The CSV ASCII format consists of “a leading identifier, with comma separated values, and with the first value being a timestamp” (NASA Center for AeroSpace Information, & United States. National Aeronautics and Space Administration, 2008a).  The instrument status code is the second parameter, and there shall be no more than 16 total parameters (NASA Center for AeroSpace Information, & United States. National Aeronautics and Space Administration, 2008a). 
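To make the layout concrete, the sketch below parses a packet with that structure: a leading identifier, a timestamp first, an instrument status code second, and at most 16 parameters.  The sample line is invented for illustration and is not an actual Global Hawk packet.

```python
MAX_PARAMS = 16

def parse_status_packet(line: str) -> dict:
    """Split a CSV ASCII status packet into identifier, timestamp, status, data."""
    identifier, *values = line.strip().split(",")
    if len(values) > MAX_PARAMS:
        raise ValueError(f"{len(values)} parameters exceeds the {MAX_PARAMS} limit")
    return {
        "id": identifier,
        "timestamp": values[0],
        "status_code": values[1],
        "data": values[2:],
    }

print(parse_status_packet("INST1,2008-11-20T18:32:10Z,0,45.2,18.7"))
```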
The Link Module system is the on-board file server and database that instruments can use for backup storage of data and caching of flight data requests (NASA Center for AeroSpace Information, & United States. National Aeronautics and Space Administration, 2008a).  Additionally, wideband satellite communications (SATCOM) is available when within the geographical footprint and can provide from 56 Kbps up to 50 Mbps of service (NASA Center for AeroSpace Information, & United States. National Aeronautics and Space Administration, 2008a). 
The NASA Global Hawk sensor suite can be changed to accommodate different sensors for different missions (NASA Center for AeroSpace Information, & United States. National Aeronautics and Space Administration, 2014).  For example, in 2011 a National Oceanic and Atmospheric Administration (NOAA) sponsored flight called for the deployment of dropsondes (NASA Center for AeroSpace Information, & United States. National Aeronautics and Space Administration, 2014).  A dropsonde is a tube about the size of a paper towel roll that transmits temperature, humidity, and its GPS location as it drifts down (Newman, 2015).  In 2014 a LIDAR instrument was fitted to the NASA Global Hawk, and both real-time and stored products were available to the ground users (McGill et al., 2014).  As a default practice, images were transferred and saved every five minutes during flight (McGill et al., 2014).
NASA’s Global Hawk sensor payload can vary depending on the mission and the customer’s requirements.  Onboard storage will usually give better fidelity in cases where large amounts of data cannot be transmitted or must be converted to a lower fidelity before transmission.  Sensors are continually evolving; for example, the ARGUS sensor can collect one million terabytes of high-definition video a day, which equates to 5,000 hours of video (Rise of the Drones, 2013).  This creates a need for onboard storage or more data links to stream information.  Data links tend to come at a premium because there is only so much of the frequency spectrum that can be used, and many everyday devices such as cell phones and Wi-Fi use these frequencies as well.  So, it would seem most practical to keep data on board and only transmit when requested by the ground station.
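The arithmetic behind that storage-versus-downlink tradeoff is easy to sketch: even the 50 Mbps wideband SATCOM ceiling moves bulk sensor data slowly.  The 1 TB capture size below is an illustrative assumption.

```python
def transmit_hours(data_terabytes: float, link_mbps: float) -> float:
    """Hours needed to move a capture over a link running at its full rate."""
    bits = data_terabytes * 8e12          # terabytes -> bits
    seconds = bits / (link_mbps * 1e6)    # megabits/s -> bits/s
    return seconds / 3600.0

# Moving just 1 TB at the 50 Mbps maximum takes ~44 hours, longer than the
# aircraft's 30-hour endurance, which is why onboard storage is so practical.
print(f"{transmit_hours(1.0, 50.0):.1f} h")   # 44.4 h
```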
NASA’s Global Hawk is a highly capable asset and can be reconfigured to meet the mission requirements set forth by the customer.  However, the Global Hawk faces an issue many unmanned platforms will begin to see: sensor technologies are moving faster than the ability to process, store, and transmit the data collected.

References
Graves, B. (2015). Special gear for global hawk? San Diego Business Journal, 36(28), 12. Retrieved from http://bi.galegroup.com.ezproxy.libproxy.db.erau.edu/essentials/article/GALE%7CA423235424?u=embry&sid=summon&userGroup=embry
McGill, M., Hlavka, D., Kupchock, A., Palm, S., Selmer, P., Hart, B. (2014, Apr 29). Cloud Physics Lidar on the Global Hawk. Greenbelt, MD: NASA Goddard Space Flight Center. Retrieved from http://ntrs.nasa.gov/search.jsp?R=20140017377&hterms=GLOBAL+HAWK&qs=N%3D0%26Ntk%3DAll%26Ntt%3DGLOBAL%2520HAWK%26Ntx%3Dmode%2520matchallpartial%26Nm%3D123%7CCollection%7CNASA%2520STI%7C%7C17%7CCollection%7CNACA
NASA Center for AeroSpace Information, & United States. National Aeronautics and Space Administration. (2008a Nov). Global Hawk: Payload Network Communications Guide. Edwards, CA: NASA Center for AeroSpace Information. Retrieved from https://www.eol.ucar.edu/raf/Software/iwgadts/DFRC-GH-0029-Baseline.pdf
NASA Center for AeroSpace Information, & United States. National Aeronautics and Space Administration. (2008b). NASA global hawk project overview. Hanover, MD: NASA Center for AeroSpace Information. Retrieved from http://ntrs.nasa.gov/archive/nasa/casi.ntrs.nasa.gov/20080017500.pdf
NASA Center for AeroSpace Information, & United States. National Aeronautics and Space Administration. (2009a). NASA global hawk: A new tool for earth science research. Hanover, MD: NASA Center for AeroSpace Information. Retrieved from http://ntrs.nasa.gov/archive/nasa/casi.ntrs.nasa.gov/20090019745.pdf
NASA Center for AeroSpace Information, & United States. National Aeronautics and Space Administration. (2009b). NASA global hawk: Project update and future missions. Hanover, MD: NASA Center for AeroSpace Information. Retrieved from http://ntrs.nasa.gov/archive/nasa/casi.ntrs.nasa.gov/20090001264.pdf
NASA Center for AeroSpace Information, & United States. National Aeronautics and Space Administration. (2014 Nov). NASA Global Hawk Overview November 2014. Edwards, CA: NASA Center for AeroSpace Information. Retrieved from http://ntrs.nasa.gov/search.jsp?R=20140017744&hterms=GLOBAL+HAWK&qs=N%3D0%26Ntk%3DAll%26Ntt%3DGLOBAL%2520HAWK%26Ntx%3Dmode%2520matchallpartial%26Nm%3D123%7CCollection%7CNASA%2520STI%7C%7C17%7CCollection%7CNACA
Newman, P. (2015 Jul 31). What the Heck is a Dropsonde? Nasa.gov. Retrieved from http://www.nasa.gov/content/goddard/what-the-heck-is-a-dropsonde
Optical Bar Camera. (2016). UTC Aerospace Systems. Retrieved from http://utcaerospacesystems.com/cap/products/Pages/optical-bar-camera.aspx
Rise of the Drones. (2013 Feb 12). Public Broadcasting Station. Retrieved from https://www.youtube.com/watch?v=HopKAYthJV4

Saturday, June 18, 2016

UAS Sensor Placement

In the world of small unmanned aerial systems (sUAS), sensor placement is a critical design decision based on the desired mission or use of the platform.  In this paper the author, an avid quadcopter hobbyist, will look at two different designs of sUAS platforms that have a similar set of sensors but are used in different ways, resulting in different sensor placement.  This paper will examine the DJI Phantom 2 Vision Plus and the Walkera Runner 250 Advance.  Both of these systems are GPS-enabled quadcopters, but the Phantom is designed for aerial photography, while the Runner is designed for what is known as first person view (FPV) racing.
The Phantom 2 Vision Plus is a 350 mm quadcopter, which means it measures 350 mm between its two furthest propellers (i.e., port forward to starboard rear).  It is equipped with a camera on a gimbal that can tilt and roll; this not only points the camera in a direction other than where the craft is facing, but also stabilizes the image independent of the aircraft’s attitude.  The image is sent back via 2.4 GHz so that the operator at the ground control station (GCS) can see in real time what the aircraft sees (Phantom2 Vision+ User Manual V1.8, 2015).  Furthermore, this camera is typically mounted underneath the aircraft for two reasons: the size of the gimbal and its arms, and keeping the aircraft out of the image (Phantom2 Vision+ User Manual V1.8, 2015).  Additionally, the Phantom is equipped with GPS, with the antenna built into the housing of the quadcopter, to allow for precise positioning while flying and for knowing where it started from (Phantom2 Vision+ User Manual V1.8, 2015).  Lastly, newer versions of the Phantom, such as the Phantom 4, are equipped with ultrasonic sensors and vision sensors that allow them to hold a much more precise position and to avoid obstacles if the camera is not facing the forward direction (Phantom 4 User Manual V1.2, 2016).
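The stabilization the gimbal provides can be pictured as a simple counter-rotation: the gimbal holds a commanded camera angle by subtracting out the aircraft's attitude on each axis.  The one-axis sketch below is an illustrative model only, not DJI's actual control law.

```python
def gimbal_command_deg(desired_cam_deg: float, aircraft_deg: float) -> float:
    """Angle the gimbal must hold on one axis to cancel aircraft motion."""
    return desired_cam_deg - aircraft_deg

# The aircraft pitches 12 degrees nose-up while the shot should stay level:
print(gimbal_command_deg(0.0, 12.0))   # gimbal tilts -12 degrees to compensate
```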
The Walkera Runner 250 Advance is a 250 mm quadcopter.  It is one of the few FPV racing quadcopters equipped with GPS from the factory (Walkera Runner 250(R) Quick Start Guide, 2015).  This is a fairly unusual option, since weight is the enemy in most any form of racing.  In a class where most racers are under 500 grams, the few additional grams from a GPS antenna can cost a race, but this platform was chosen to make a closer comparison of the two types of platforms.  The camera on the Runner is not attached to a gimbal but is mounted directly to the quadcopter at the forward point of the body; this gives the controller the feel of being inside the cockpit of an aircraft (Walkera Runner 250(R) Quick Start Guide, 2015).  The image is stabilized with two rubber bulbs that act as shocks so that vibrations from the propellers do not create what is known as the “Jello” effect, where the image moves as if the user is looking through moving gelatin or water.  Just like the Phantom, this image is sent back to the operator at the GCS and displayed on a screen or what are known as FPV goggles.  Additionally, the image is not stabilized independent of the aircraft, because this allows the users to infer the attitude of the aircraft as if they were actually in it.  Lastly, the GPS is not integrated into a body shell, as the Runner does not have one in the same way the Phantom does.  The Runner is a racer, so again, weight is the enemy; the carbon fiber frame is the main support, so the antenna is placed on top of the carbon fiber with a plastic post to set it up a little higher for a less obstructed view of the sky.  While the GPS obtains the same information, it is mainly used as a way for the user to see their position relative to the aircraft and allows for a return-home function, both features similar to the Phantom’s; but unlike the Phantom, the positioning is not meant to be as precise, is susceptible to drift from the wind, and altitude is only accurate to about +/- 3 meters (Walkera Runner 250(R) Quick Start Guide, 2015).
While these platforms offer similar sensors, the sensors are arranged and operated differently due to the intended use or mission of each system.


References
Phantom2 Vision+ User Manual V1.8. (2015 Jan) DJI. Retrieved from http://dl.djicdn.com/downloads/phantom_2_vision_plus/en/Phantom_2_Vision_Plus_User_Manual_v1.8_en.pdf
Phantom 4 User Manual V1.2. (2016 Mar). DJI.com. Retrieved from https://dl.djicdn.com/downloads/phantom_4/en/Phantom_4_User_Manual_en_v1.2_160328.pdf
Walkera Runner 250(R) Quick Start Guide. (2015 Oct 20). Walkera.

Wednesday, June 8, 2016

Unmanned Systems Maritime Search and Rescue

On October 1, 2015, the U.S.-flagged El Faro went missing as it traveled from Jacksonville, Florida to Puerto Rico.  Onboard was its crew of 33, all of whom were presumed dead.  At the time of the trip, Hurricane Joaquin, a category 4 storm, threatened its route.  During the trip the ship’s main propulsion failed, stranding the crew in the path of the storm.
The U.S. Navy used an unmanned underwater vehicle (UUV) called CURV 21, which was able to locate and identify the sunken El Faro (Almasy, 2015).  The CURV 21 is a 6,400-pound UUV capable of reaching depths of 20,000 feet.  It uses a “.680 fiber-optic umbilical cable and a shared handling system that can switch at sea between side-scan sonar and ROV operations” (CURV 21 - REMOTELY OPERATED VEHICLE, 2015).  Its exteroceptive sensors include a side-scan sonar, a CTFM sonar, a high-resolution still camera, and black-and-white and color video cameras (CURV 21 - REMOTELY OPERATED VEHICLE, 2015).  Its proprioceptive sensors include an electronic gyrocompass, an attitude and heading reference unit, a 1200 kHz Doppler velocity log, and a 200 kHz altimeter.
The CURV 21 could benefit from the elimination of the umbilical; however, the umbilical is required due to the amount of data that must pass to and from the remotely located operators.  Current underwater wireless systems would not allow the CURV 21 to operate at a depth of 20,000 feet.
With the declining prices of small unmanned aerial systems (sUAS), these systems are finding their way into search and rescue (SAR) as well.  “In the vast wilderness of the Everglades, the SAR operations are often conducted in remote areas accessible by boat or aircraft” (Safety, 2016).  In a search and rescue environment time is of the essence, and sUAS give support teams the ability to quickly launch air assets to begin the SAR process.  The relatively flat terrain of the Everglades would help keep operators within visual line of sight (VLOS); remaining within VLOS is a current restriction imposed by the Federal Aviation Administration (FAA) on sUAS usage.  Unmanned surface vehicles (USVs) could also be used to access areas covered by trees or inaccessible by boat.  Additionally, the use of USVs can keep searchers in one central area without the possibility of losing a rescue member or exposing rescuers to dangerous wildlife such as snakes, alligators, and mosquitoes.  While this is not the same environment where the El Faro sank, the benefits of a multi-sensor search and of keeping all of the searchers in one area could be similar.
The sensor suites of sUAS and UUVs can be similar, but they are often used differently.  For example, a camera on an sUAS or a larger UAS often serves as both a long-range and a short-range visual cue.  On a UUV, however, light does not penetrate the water as well and built-in lights have only a limited range, so the cameras are used for close-up viewing.  RADAR and SONAR work in similar ways as exteroceptive sensors, but sonar is the primary way a UUV is able to see underwater, whereas radar would most likely be a tertiary way for a UAS to see.
Platforms in the air, on the surface (ground and water), and underwater can work together to execute search and rescue efforts in a timely fashion while centralizing all information.  While collaboration will be key among unmanned platforms, it also holds true in the realm of people and sUAS products.  For example, an all-volunteer group called Search With Aerial RC Multirotors (SWARM) has “over 1,100 SAR Drone Pilots dedicated to searching for missing persons. Our primary mission is to offer and provide multi-rotor (drone) and fixed wing aerial search platforms for ongoing Search and Rescue operations at no cost to the SAR organization or to the family” (Search With Aerial RC Multirotors (SWARM), 2016).  Through continued advances in sensor technology, all missions of unmanned systems will continue to benefit.


References
Almasy, S. (2015 Nov 2). Sub with camera to dive on sea wreck believed to be missing ship El Faro. CNN. Retrieved from http://www.cnn.com/2015/11/01/us/el-faro-search/
CURV 21. (2016 Jun 7). Office of the Director of Ocean Engineering Supervisor of Salvage and Diving. Retrieved from http://www.supsalv.org/00c2_curv21Rov.asp?destPage=00c2
CURV 21 - REMOTELY OPERATED VEHICLE. (2015 Nov 13). US Navy. Retrieved from http://www.navy.mil/navydata/fact_display.asp?cid=4300&tid=50&ct=4
Safety. (2016 Feb 16). Everglades National Park, Florida. National Park Service. Retrieved from http://www.nps.gov/ever/getinvolved/supportyourpark/safety.htm
Search With Aerial RC Multirotors (SWARM). (2016 Feb 16). SAR Drones. Retrieved from http://sardrones.org/

Thursday, June 2, 2016

Are more sensors on sUAS better?

On May 3, 2016, a man from Ohio flying his small unmanned aerial system (sUAS) near Cape Marco, FL crashed it into a condominium (Video: After drone crash, Marco council nixes ordinance, 2016).  “The owners of the condo where the drone landed, fearing they were being spied upon, were very upset by the incident, according to a police report obtained by WBBH. Officials, however, did not find any evidence to support that fear” (Man will not face charges after drone crashes into Fla. high-rise condo, 2016).  Furthermore, the pilot was registered with the FAA, according to the article.  Additionally, my own research indicates he was not within 5 NM of an airport, and based on the video the conditions were Visual Meteorological Conditions (VMC).

The crash happened after the signal was lost and the fail-safe was triggered for the sUAS to return home (Video: After drone crash, Marco council nixes ordinance, 2016).  Victor Rios, a council member at the Belize, the condominium where the sUAS crashed, wrote of his concerns: “Based upon my experience and due to the position of the drone on the master bedroom lanai, I believe that it was hovering just above the railing and maneuvering for better position in an attempt to get closer - and this is when the drone hit the edge of the railing damaging the propellers” (Video: After drone crash, Marco council nixes ordinance, 2016).  The Ohio man was cooperative and allowed the police chief to play the video for the council, clearly showing that there was no intention of spying.

The DJI Phantom 4, a newer model than the one in the Cape Marco incident, was released in early 2016.  “The Phantom 4 is equipped with an Obstacle Sensing System that constantly scans for obstacles in front of it, allowing it to avoid collisions by going around, over or hovering. The DJI Vision Positioning System uses ultrasound and image data to help the aircraft maintain its current position” (Phantom 4 User Manual V1.2, 2016).  With this new feature, I can assume that the sUAS would not have run straight into the condo, but it makes me wonder what might have happened.  Could it have stopped and hovered, then crashed after losing battery life?  Or tried to go around and still crashed due to the inability to sense obstacles on either side of it?  How might the issue of spying have changed if it had stopped and hovered because of this new sensor?


References

Man will not face charges after drone crashes into Fla. high-rise condo. (2016 May 6). News Channel 8 (WFLA). Retrieved from http://wfla.com/2016/05/06/man-will-not-face-charges-after-drone-crashes-into-fla-high-rise-condo/
Phantom 4 User Manual V1.2. (2016 Mar). DJI.com. Retrieved from https://dl.djicdn.com/downloads/phantom_4/en/Phantom_4_User_Manual_en_v1.2_160328.pdf
Video: After drone crash, Marco council nixes ordinance. (2016 May 6). Sun Times. Retrieved from http://www.marcoislandflorida.com/story/news/2016/05/04/drone-crash-marco-council-nixes-ordinance/83921064/

Thursday, May 19, 2016

Finishing ASCI638

      As I round out the final week of my third nine-week term at Embry-Riddle Aeronautical University, I will be completing two courses.  One of them is Human Factors in Unmanned Systems, ASCI 638, toward my concentration in Human Factors.  As an active duty member of the armed forces, I have found this course challenging but very rewarding.  In this course we completed nine weeks’ worth of discussions and research assignments, which were insightful and eye-opening.  Additionally, we conducted a term-long Case Analysis, which has served as an excellent tool not only in preparing for the capstone, but in preparing for work outside of academia. 

      The Case Analysis was a term-long research project, for which I chose to examine the Standardization of a Single Measurement Unit in Unmanned Aviation.  Without going into all of the details, this issue is a holdover from manned aviation, and I believe it will become a concern as sUAS grow more and more popular, especially with integration into the NAS and parts continuing to come from overseas.  This project also required us to interact with our peers via peer reviews.  I like the idea of peer reviews in this format, as it helps prepare us for a career where we might be a project manager overseeing a project of a similar scope.


      In this course we have also started a blog to share our research and thoughts.  I plan to maintain this blog and continue to share my ideas, thoughts, and research as I work to obtain my Master’s Degree in Unmanned Systems with a concentration in Human Factors.  I believe this degree will not only broaden my horizons but will allow me to further contribute ideas and research to my employer.  I am looking forward to continuing this process of learning as I complete the remaining classes for my degree.