Wednesday, June 8, 2016

Unmanned Systems Maritime Search and Rescue

On October 1, 2015, the U.S.-flagged cargo ship El Faro went missing as it traveled from Jacksonville, Florida, to Puerto Rico.  Onboard was its crew of 33, all of whom were presumed dead.  At the time of the voyage, Hurricane Joaquin, a Category 4 storm, threatened its route.  During the trip the ship's main propulsion failed, stranding the crew in the path of the storm, and contact was lost.
The U.S. Navy used an unmanned underwater vehicle (UUV) called CURV 21, which was able to locate and identify the sunken El Faro (Almasy, 2015).  The CURV 21 is a 6,400 pound UUV capable of reaching depths of 20,000 feet.  It uses a ".680 fiber-optic umbilical cable and a shared handling system that can switch at sea between side-scan sonar and ROV operations" (CURV 21 - REMOTELY OPERATED VEHICLE, 2015).  Its exteroceptive sensors include a side-scan sonar, a CTFM sonar, a high-resolution still camera, and black-and-white and color video cameras (CURV 21 - REMOTELY OPERATED VEHICLE, 2015).  Its proprioceptive sensors include an electronic gyrocompass, an attitude and heading reference unit, a 1200 kHz Doppler velocity log, and a 200 kHz altimeter.
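To keep the two sensor classes straight, the CURV 21 suite described above can be summarized in a simple data structure; the sketch below only reorganizes the sensors listed in the Navy fact sheet, and the structure itself is my own illustration, not an official specification.

```python
# Hypothetical summary of the CURV 21 sensor suite, grouped by sensor class.
# The entries come from the Navy fact sheet cited above; the grouping is
# only an illustration.
curv21_sensors = {
    "exteroceptive": [            # sense the environment outside the vehicle
        "side-scan sonar",
        "CTFM sonar",
        "high-resolution still camera",
        "black-and-white video camera",
        "color video camera",
    ],
    "proprioceptive": [           # sense the vehicle's own state
        "electronic gyrocompass",
        "attitude and heading reference unit",
        "1200 kHz Doppler velocity log",
        "200 kHz altimeter",
    ],
}

for sensor_class, sensors in curv21_sensors.items():
    print(f"{sensor_class}: {', '.join(sensors)}")
```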
The CURV 21 could benefit from eliminating its umbilical; however, the umbilical is required because of the amount of data that must pass to and from the remotely located operators.  Current underwater wireless systems would not allow the CURV 21 to operate at a depth of 20,000 ft.
With the declining prices of small unmanned aerial systems (sUAS), these systems are finding their way into search and rescue as well.  "In the vast wilderness of the Everglades, the SAR operations are often conducted in remote areas accessible by boat or aircraft" (Safety, 2016).  In a search and rescue environment time is of the essence, and sUAS give support teams the ability to quickly launch air assets to begin the SAR process.  The relatively flat terrain of the Everglades would help operators keep their aircraft within visual line of sight (VLOS); remaining within VLOS is a current restriction imposed on sUAS usage by the Federal Aviation Administration (FAA).  Unmanned surface vehicles (USVs) could also be used to access areas covered by trees or inaccessible by boat.  Additionally, the use of USVs can keep searchers in one central area without the possibility of losing a rescue team member or exposing rescuers to dangerous wildlife such as snakes, alligators, and mosquitoes.  While this is not the same environment where the El Faro sank, the benefits of a multi-sensor search and of keeping all of the searchers in one area still apply.
Sensor suites on sUAS and UUVs can be similar, but they are often used differently.  For example, a camera on an sUAS or a larger UAS often serves as both a long-range and a short-range visual sensor.  On a UUV, however, light does not penetrate water well and built-in lights have only a limited range, so cameras are used for close-up viewing.  Radar and sonar work in similar ways as exteroceptive sensors, but sonar is the primary way a UUV is able to see underwater, whereas radar would most likely be a tertiary way for a UAS to see.
Platforms in the air, on the surface (ground and water), and underwater can work together to execute search and rescue efforts in a timely fashion and to centralize all information.  While collaboration will be key in unmanned operations, this also holds true in the realm of people and sUAS products.  For example, an all-volunteer group called Search With Aerial RC Multirotors (SWARM) has "over 1,100 SAR Drone Pilots dedicated to searching for missing persons. Our primary mission is to offer and provide multi-rotor (drone) and fixed wing aerial search platforms for ongoing Search and Rescue operations at no cost to the SAR organization or to the family" (Search With Aerial RC Multirotors (SWARM), 2016).  Through continued advances in sensor technology, all missions of unmanned systems will continue to benefit.


References
Almasy, S. (2015 Nov 2). Sub with camera to dive on sea wreck believed to be missing ship El Faro. CNN. Retrieved from http://www.cnn.com/2015/11/01/us/el-faro-search/
CURV 21. (2016 Jun 7). Office of the Director of Ocean Engineering, Supervisor of Salvage and Diving. Retrieved from http://www.supsalv.org/00c2_curv21Rov.asp?destPage=00c2
CURV 21 - REMOTELY OPERATED VEHICLE. (2015 Nov 13). US Navy. Retrieved from http://www.navy.mil/navydata/fact_display.asp?cid=4300&tid=50&ct=4
Safety. (2016 Feb 16). Everglades National Park, Florida. National Park Service. Retrieved from http://www.nps.gov/ever/getinvolved/supportyourpark/safety.htm
Search With Aerial RC Multirotors (SWARM). (2016 Feb 16). SAR Drones. Retrieved from http://sardrones.org/

Thursday, June 2, 2016

Are more sensors on sUAS better?

On May 3, 2016, a man from Ohio flying his small unmanned aerial system (sUAS) near Cape Marco, FL, crashed it into a condominium (Video: After drone crash, Marco council nixes ordinance, 2016).  "The owners of the condo where the drone landed, fearing they were being spied upon, were very upset by the incident, according to a police report obtained by WBBH. Officials, however, did not find any evidence to support that fear" (Man will not face charges after drone crashes into Fla. high-rise condo, 2016).  Furthermore, the pilot was registered with the FAA, according to the article.  Additionally, my research indicates he was not within 5 NM of an airport, and based on the video, visual meteorological conditions (VMC) prevailed.

The crash happened after the signal was lost and the fail-safe was triggered for the sUAS to return home (Video: After drone crash, Marco council nixes ordinance, 2016).  Victor Rios, a council member of the Belize, the condominium the sUAS crashed into, wrote of his concerns: "Based upon my experience and due to the position of the drone on the master bedroom lanai, I believe that it was hovering just above the railing and maneuvering for better position in an attempt to get closer - and this is when the drone hit the edge of the railing damaging the propellers" (Video: After drone crash, Marco council nixes ordinance, 2016).  The Ohio man was cooperative and allowed the police chief to play the video for the council, clearly showing that there was no intention of spying.

The DJI Phantom 4, a newer model than the one in the Cape Marco incident, was released early in 2016. "The Phantom 4 is equipped with an Obstacle Sensing System that constantly scans for obstacles in front of it, allowing it to avoid collisions by going around, over or hovering. The DJI Vision Positioning System uses ultrasound and image data to help the aircraft maintain its current position" (Phantom 4 User Manual V1.2, 2016).  With this new feature, I can assume the sUAS would not have flown straight into the condo, but it makes me wonder what might have happened.  Could it have stopped and hovered, then crashed after losing battery life? Or tried to go around and still crashed due to its inability to sense obstacles on either side of it?  How might the issue of spying have changed if the aircraft had stopped and hovered because of this new sensor?
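Purely as a thought experiment, the lost-link behavior I am wondering about might look something like the sketch below. This is not DJI's actual failsafe logic; the function name, thresholds, and behaviors are all assumptions used only to frame the hover-versus-go-around question.

```python
from enum import Enum, auto

class FailsafeAction(Enum):
    CONTINUE_RTH = auto()   # keep flying the return-to-home path
    CLIMB_OVER = auto()     # obstacle ahead, climb to clear it
    HOVER = auto()          # cannot clear the obstacle, hold position
    LAND = auto()           # battery critical, descend in place

def lost_link_step(obstacle_ahead: bool, obstacle_top_m: float,
                   max_altitude_m: float, battery_pct: float) -> FailsafeAction:
    """One hypothetical decision step for a lost-link return to home.

    A forward-only obstacle sensor (like the Phantom 4's) can stop or climb,
    but it cannot see to the sides, which is why no lateral go-around is
    offered here.
    """
    if battery_pct < 10.0:                        # assumed critical-battery threshold
        return FailsafeAction.LAND
    if not obstacle_ahead:
        return FailsafeAction.CONTINUE_RTH
    if obstacle_top_m + 5.0 <= max_altitude_m:    # assumed 5 m clearance margin
        return FailsafeAction.CLIMB_OVER
    return FailsafeAction.HOVER

# Example: a building ahead as tall as the altitude ceiling, healthy battery.
print(lost_link_step(obstacle_ahead=True, obstacle_top_m=120.0,
                     max_altitude_m=120.0, battery_pct=60.0))
# -> FailsafeAction.HOVER: the "stop and hover until the battery runs down"
#    outcome wondered about above.
```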


References

Man will not face charges after drone crashes into Fla. high-rise condo. (2016 May 6). News Channel 8 (WFLA). Retrieved from http://wfla.com/2016/05/06/man-will-not-face-charges-after-drone-crashes-into-fla-high-rise-condo/
Phantom 4 User Manual V1.2. (2016 Mar). DJI.com. Retrieved from https://dl.djicdn.com/downloads/phantom_4/en/Phantom_4_User_Manual_en_v1.2_160328.pdf
Video: After drone crash, Marco council nixes ordinance. (2016 May 6). Sun Times. Retrieved from http://www.marcoislandflorida.com/story/news/2016/05/04/drone-crash-marco-council-nixes-ordinance/83921064/

Thursday, May 19, 2016

Finishing ASCI638

As I round out the final week of my third nine-week term at Embry-Riddle Aeronautical University, I will be completing two courses.  One of them is Human Factors in Unmanned Systems, ASCI 638, which counts toward my concentration in Human Factors.  As an active duty member of the armed forces, I have found this course challenging but very rewarding.  In this course we completed nine weeks' worth of discussions and research assignments, which were insightful and eye opening.  Additionally, we conducted a term-long Case Analysis.  The Case Analysis has served as an excellent tool not only in preparing for the capstone, but in preparing for work outside of academia.

The Case Analysis was a term-long research project for which I chose to examine the standardization of a single measurement unit in unmanned aviation.  Without going into all of the details of the project, this issue is a holdover from manned aviation, and I believe it will become a growing concern as sUAS become more and more popular, especially with their integration into the National Airspace System (NAS) and with parts continuing to come from overseas.  The project also required us to interact with our peers via peer reviews.  I like the idea of peer reviews in this format because, in my mind, it helps prepare us for a career where we might be a project manager overseeing a project of a similar scope.


In this course we also started a blog to share our research and thoughts.  I plan to maintain this blog and continue to share my ideas, thoughts, and research as I work toward my Master's Degree in Unmanned Systems with a concentration in Human Factors.  I believe this degree will not only broaden my horizons but will also allow me to contribute further ideas and research to my employer.  I am looking forward to continuing this process of learning and completing the remaining classes in my degree.

UAS Crew Member Selection

In this paper we will examine a scenario in which we work at a company that has just purchased the Insitu ScanEagle and a variant of the General Atomics Ikhana to conduct oceanic environmental studies.  We are responsible for identifying the crew positions to be filled and determining the qualifications, certifications, and training requirements.  Additionally, we will establish a minimum and an ideal set of criteria for those potential operators.
In order to identify the crew positions needed, we will look at each platform's published figures, starting with the Insitu ScanEagle.  The ScanEagle uses a pneumatically actuated launcher and a recovery system called the SkyHook, so it does not require a runway like a conventional aircraft.  Additionally, its wingspan is only 10.2 ft and its maximum takeoff weight is 48.5 lb, making it fairly light and relatively portable.  It has a maximum speed of 80 knots and 24+ hours of endurance.  Its ground control station (GCS) is portable and allows for point-and-click control (ScanEagle, 2015).
With the above information, we can use the ScanEagle in somewhat mobile operations, such as placing the launcher on a ship or on the ground.  We will require two positions to launch, fly, and recover a ScanEagle: a launch crewman and an operator.  Depending on the length of operations, we may need more than one operator.  Additionally, the launch crewman can be trained in maintenance so that a traveling crew can be kept to a minimum of two if operations allow.
Next, we will look at the variant of the General Atomics Ikhana.  The Ikhana is an MQ-9 Predator B that has been adapted and instrumented for use by NASA (NASA Armstrong Fact Sheet: Ikhana Predator B Unmanned Science and Research Aircraft System, 2015).  The Predator B has a wingspan of 66 ft and a maximum gross takeoff weight of 10,500 lb, so the system is not as portable as the ScanEagle (Guardian Multi-Mission Maritime Patrol, 2015).  The Ikhana GCS is portable in that it fits inside a large trailer.  The trailer houses "the pilot control station, engineering monitoring workstations, science monitoring stations, and range safety oversight" (NASA - Large UAS Aircraft, 2008). "Predators and Reapers currently require two crew to function: a pilot and a sensor operator" (Whittle, 2015).  In this scenario, a pilot and a sensor operator will be required for Ikhana operations, along with a ground handling crew.  Depending on the length of a particular mission, more than one pilot may be required.
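To keep the published figures side by side while planning crew assignments, a simple comparison structure like the sketch below can help. The specification values come straight from the paragraphs above; the field names and the minimum-crew entries are my working assumptions for this scenario, not manufacturer guidance.

```python
# Working comparison of the two platforms for crew planning.
platforms = {
    "Insitu ScanEagle": {
        "wingspan_ft": 10.2,
        "max_takeoff_weight_lb": 48.5,
        "max_speed_kt": 80,
        "endurance_hr": "24+",
        "launch_recovery": "pneumatic launcher / SkyHook (no runway)",
        # Assumed minimum traveling crew discussed in this post.
        "min_crew": ["launch crewman (cross-trained maintainer)", "operator"],
    },
    "General Atomics Ikhana (Predator B variant)": {
        "wingspan_ft": 66,
        "max_takeoff_weight_lb": 10500,
        "launch_recovery": "conventional runway",
        "gcs": "trailer-housed pilot, engineering, science, and range safety stations",
        "min_crew": ["pilot", "sensor operator", "ground handling crew"],
    },
}

for name, spec in platforms.items():
    print(f"{name}: minimum crew = {len(spec['min_crew'])} "
          f"({', '.join(spec['min_crew'])})")
```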
In examining current Certificates of Waiver or Authorization (COA) issued by the Federal Aviation Administration, we will require pilots of either platform to hold a minimum of a private pilot certificate and to pass a third-class medical examination (Freedom of Information Act Responses, 2016).  While it is assumed that most flights will be conducted under due regard, where the pilot is responsible for safe separation, it can also be inferred that the aircraft may take off within the National Airspace System (NAS) before switching operations over to due regard.  Additionally, we will not let a pilot act as Pilot in Command (PIC) unless that pilot has conducted three proficiency flights in the past 90 days.  We will work with General Atomics and Insitu to build the required proficiency before we take delivery of the vehicles, and we will maintain that proficiency afterward.
Finally, minimum and ideal sets of criteria for these potential operators will be established.  The minimum criteria for selecting a pilot without UAS experience, on either platform, will be a private pilot certificate and a third-class medical with at least 18 months remaining.  Ideally, we would hire military operators with experience in these platforms, but such operators may not hold a private pilot certificate or a third-class medical.  In that case, as long as the potential operator was medically cleared for UAS flights upon military discharge, we can support them in obtaining a private pilot certificate.  In either case, training will have to be conducted before the operator is able to act as PIC on either platform.
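One way to express these minimums is as a simple eligibility check. The sketch below only restates the criteria from the last two paragraphs (private pilot certificate, a third-class medical with at least 18 months remaining, and three proficiency flights in the past 90 days); the record fields and the function itself are my own illustrative assumptions.

```python
from dataclasses import dataclass, field
from datetime import date, timedelta
from typing import List

@dataclass
class OperatorCandidate:
    """Hypothetical record of a candidate's qualifications."""
    has_private_pilot_certificate: bool
    medical_expiration: date                    # third-class medical expiration
    proficiency_flight_dates: List[date] = field(default_factory=list)

def eligible_to_act_as_pic(c: OperatorCandidate, today: date) -> bool:
    """Check the minimums set out above: a private pilot certificate, a
    third-class medical with at least 18 months remaining, and three
    proficiency flights within the past 90 days."""
    months_remaining = (c.medical_expiration - today).days / 30.44
    recent_flights = [d for d in c.proficiency_flight_dates
                      if today - d <= timedelta(days=90)]
    return (c.has_private_pilot_certificate
            and months_remaining >= 18
            and len(recent_flights) >= 3)

# Example: certificate held, medical good for about two more years,
# three recent proficiency flights.
candidate = OperatorCandidate(
    has_private_pilot_certificate=True,
    medical_expiration=date(2018, 6, 1),
    proficiency_flight_dates=[date(2016, 5, 1), date(2016, 5, 15), date(2016, 6, 1)],
)
print(eligible_to_act_as_pic(candidate, today=date(2016, 6, 8)))  # True
```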
While both platforms will presumably operate in the maritime environment under due regard, it is conceivable that they could also operate in the NAS.  In order for our company to ensure that we operate safely in the NAS and protect our investment in these UAS, we have set forth the above operator minimums.

References
Freedom of Information Act Responses. (2016 May 9). Federal Aviation Administration. Retrieved from https://www.faa.gov/uas/public_operations/foia_responses/
Guardian Multi-Mission Maritime Patrol. (2015). General Atomics Aeronautical Systems. Retrieved from http://www.ga-asi.com/Websites/gaasi/images/products/aircraft_systems/pdf/Guardian_032515.pdf
NASA - Large UAS Aircraft. (2008 Oct 3). NASA.gov. Retrieved from https://www.nasa.gov/centers/dryden/research/ESCD/ikhana.html#.VzEAO_krK00


Friday, May 13, 2016

sUAS Operational Risk Management

Enclosed is an Operational Risk Management (ORM) worksheet created for the DJI Phantom 2 Vision.  I chose this platform because it is a small unmanned aerial system (sUAS) that I fly.  Additionally, I am awaiting a Section 333 exemption, which will allow me to charge for services I provide with the DJI Phantom 2 Vision.
I started this ORM assessment tool by making a Primary Hazard List (PHL).  The PHL consists of a likelihood category and a severity category.  Each category has four possible ratings: not likely, possible, probable, and certain for likelihood; and negligible, moderate, high, and catastrophic for severity.  Each rating is assigned a value from 1 to 4, where 1 is low and 4 is high.  A matrix can then be built that gives a risk assessment value by multiplying the severity and likelihood values; see FIGURE 1.
FIGURE 1. PHL for the DJI Phantom 2 Vision.
Once the PHL was completed, I moved on to the Preliminary Hazard Assessment (PHA).  A list of hazards was compiled, and the likelihood and severity of each were assessed, giving a risk level (RL) value by multiplying the two values.  Mitigating actions are also listed for each hazard to decrease the risk level, resulting in the residual risk level (RRL).  An attempt was made to mitigate every RL to some degree, but not every hazard resulted in a lower RRL.
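As a rough illustration of the arithmetic behind the PHL and PHA, the sketch below scores a hazard before and after mitigation using the same 1-4 scales described above. The example hazard and its ratings are made up for illustration and are not taken from the attached worksheet.

```python
# 1-4 rating scales described above.
LIKELIHOOD = {"not likely": 1, "possible": 2, "probable": 3, "certain": 4}
SEVERITY = {"negligible": 1, "moderate": 2, "high": 3, "catastrophic": 4}

def risk_level(likelihood: str, severity: str) -> int:
    """Risk level (RL) = likelihood value x severity value, range 1-16."""
    return LIKELIHOOD[likelihood] * SEVERITY[severity]

# Illustrative hazard only (not from the actual worksheet): a flyaway after
# losing GPS. The assumed mitigation (practicing manual-mode flying) lowers
# the likelihood but not the severity, so the residual risk level (RRL) is
# lower than the RL but not zero.
rl = risk_level("possible", "high")       # before mitigation: 2 x 3 = 6
rrl = risk_level("not likely", "high")    # after mitigation:  1 x 3 = 3
print(f"RL = {rl}, RRL = {rrl}")
```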
An Operational Hazard Review and Analysis (OHR&A) was created but has not been filled out because no flights have taken place.

FIGURE 2. ORM Worksheet for the DJI Phantom 2 Vision.

Lastly, an ORM Worksheet was created.  The worksheet can be completed digitally or manually; the digital version automatically sums the entered line values.  The total value then gives the operator (or myself) an ORM level.  One caveat: the line listed as "AIRFIELD TOWER NOTIFIED" was created with the assumption that operations will take place within 5 NM of an airfield, where the airfield would need to be notified.  If the flight is outside of this radius, the line can be skipped.
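The digital worksheet's arithmetic is nothing more than summing the entered line values and mapping the total to an ORM level. The sketch below shows that behavior; the line items and the level cutoffs are illustrative assumptions, since the actual values live in the attached worksheet.

```python
def orm_total(line_values: dict) -> int:
    """Sum the entered line values, skipping lines marked None (for example
    the 'AIRFIELD TOWER NOTIFIED' line when flying more than 5 NM from an
    airfield)."""
    return sum(v for v in line_values.values() if v is not None)

def orm_level(total: int) -> str:
    """Illustrative cutoffs only; the real worksheet defines its own levels."""
    if total <= 10:
        return "low risk"
    if total <= 20:
        return "moderate risk"
    return "high risk - seek approval before flight"

# Example flight: the tower-notification line is skipped because the site is
# outside 5 NM of an airfield.
lines = {
    "pilot rest": 2,
    "wind / weather": 3,
    "airfield tower notified": None,   # skipped per the caveat above
    "crowd / bystanders": 4,
}
total = orm_total(lines)
print(total, orm_level(total))   # 9 low risk
```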

I will incorporate this ORM tool into the operation of my DJI Phantom 2 Vision, in addition to the checklist I already use.  The tool will also let me assess operational risk when flying under my Section 333 exemption.

Wednesday, May 11, 2016

Automatic Takeoff and Landing

There’s a saying in aviation: “Flying is the 2nd greatest thrill known to man. Landing is the 1st.”  This is because landing, along with takeoff, is often the most dynamic part of a flight.  Pilots are talking to approach or tower controllers, running checklists, scanning gauges, and physically controlling the aircraft during this phase.  Even in the early years of aviation, inventors and engineers sought to ease the demands on the aviator; this was seen as early as about ten years after the Wright brothers’ first flight, with Elmer Sperry’s gyrostabilizer (Elmer Sperry, Sr., 2016).  The world of aviation has seen many changes since the early days of flight, but engineers still face many challenges, including automation.
For pilots, talk of automating takeoff and landing can be a touchy subject, especially for those who pride themselves on being able to land an airplane with their own hand-eye skills.  The Airbus A320 is capable of landing through automation.  Once the pilot has aligned the aircraft for the approach, speed is adjusted through an auto-throttle system, whereby the pilot turns a dial to select an indicated speed and the aircraft adjusts the thrust levers accordingly (Airbus A320: Auto Landing Tutorial, 2012).  Once the appropriate speed is set and the navigation aid (NAVAID) is selected, the approach button can be pressed (Airbus A320: Auto Landing Tutorial, 2012).  During all of these steps the pilot is still receiving information from the aircraft and the NAVAID and should be backing up the automation by checking the speed and ensuring NAVAID guidance is being received.  Once the glideslope is intercepted, the pilot will input the final approach speed, lower the landing gear, set the flaps, and set the spoilers and autobrakes (Airbus A320: Auto Landing Tutorial, 2012).  The pilot keeps their hands on the thrust levers in the event of a go-around (Airbus A320: Auto Landing Tutorial, 2012).
This system helps to alleviate the additional demands placed on a pilot in the terminal environment by automating parts of the evolution, allowing the pilot to focus on tasks such as navigation (visual) and communication.  The system in the A320 is capable of landing with zero vertical visibility and 50 meters of horizontal visibility (Airbus A320: Auto Landing Tutorial, 2012).  Even with the automated system taking care of the aviating portion of the evolution, the pilot must never forget the basics of “aviate, navigate, communicate,” the priority of skills taught to aviators early in flight training.  That said, the pilot is always there in the cockpit, ready to take control if required or to make the judgment call for a go-around.
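For readers who think in code, the pilot's share of the sequence just described can be sketched as an ordered checklist. This is only an illustration distilled from the tutorial video cited above, not Airbus documentation, and the step names are my own.

```python
# Illustrative ordering of the pilot's actions during an A320 autoland,
# distilled from the cited tutorial video. Not an Airbus procedure.
AUTOLAND_PILOT_STEPS = (
    "align aircraft with the approach",
    "dial selected speed (auto-throttle adjusts the thrust levers)",
    "select the navigation aid (NAVAID) for the approach",
    "press the approach button",
    "monitor speed and NAVAID guidance",          # backing up the automation
    "on glideslope intercept: set final approach speed",
    "lower landing gear, set flaps, arm spoilers and autobrakes",
    "keep hands on the thrust levers in case a go-around is needed",
)

def next_step(completed: int) -> str:
    """Return the next pilot action, or note that monitoring continues."""
    if completed < len(AUTOLAND_PILOT_STEPS):
        return AUTOLAND_PILOT_STEPS[completed]
    return "continue monitoring until touchdown or go-around"

print(next_step(0))
print(next_step(len(AUTOLAND_PILOT_STEPS)))
```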
These systems have also found their way into unmanned aerial systems (UAS).  In September 2012, General Atomics boasted that the MQ-9 Reaper remotely piloted aircraft (RPA) had “successfully completed 106 full-stop Automatic Takeoff and Landing Capability (ATLC) landings, a first for the multi-mission aircraft. The milestone was first achieved with four ATLC landings on June 27 at the company’s Gray Butte Flight Operations Facility in Palmdale, Calif” (Predator B Demonstrates Automatic Takeoff and Landing Capability, 2012).  Other systems have been created to assist in this process, since in its simplest form the computer is simply following a predefined flight path.  “The Visually Assisted Landing System (VALS), lets the drones use their cameras to identify landmarks, adjust speed and direction accordingly, and navigate to a smooth landing. And since runways are clearly defined, flat, obvious pieces of topography, identifying them should be easier… By utilizing the drones existing cameras, the system can be used for both the larger UAVs like the Predator, and smaller drones like the Scan Eagle” (Fox, 2009).
FIGURE 1. Vision based landing technology called The Visually Assisted Landing System (VALS).  Courtesy of Popular Science.

Both manned aircraft and RPA can be equipped with systems that follow a preset published approach to get back on the ground.  But an RPA is missing one key thing a manned aircraft has: a pilot in the aircraft making judgment calls based on what they are seeing and feeling.  While an RPA operator can see and make decisions, these are typically made through a limited field of view and are subject to communications links.  If a communication link is broken, the RPA operator is, for all intents and purposes, blind.  “Vision-based landing has been found attractive since it is passive and does not require any special equipment other than a camera and a vision processing unit onboard” (Huh & Shim, 2010).  A vision-based system would let the RPA make a split-second decision, such as a go-around due to reduced visibility or a fouled runway.
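The kind of split-second call described here could be expressed as a simple onboard rule. The sketch below is hypothetical: it assumes the vision system already reports whether the runway is detected and clear (which is the hard part the cited research actually addresses), and the confidence threshold is an illustrative assumption.

```python
from enum import Enum, auto

class LandingDecision(Enum):
    CONTINUE = auto()
    GO_AROUND = auto()

def vision_landing_decision(runway_detected: bool, runway_clear: bool,
                            detection_confidence: float,
                            min_confidence: float = 0.8) -> LandingDecision:
    """Hypothetical onboard go-around rule for a vision-based landing."""
    if not runway_detected or detection_confidence < min_confidence:
        return LandingDecision.GO_AROUND      # reduced visibility or runway lost from view
    if not runway_clear:
        return LandingDecision.GO_AROUND      # fouled runway
    return LandingDecision.CONTINUE

# Example: runway seen clearly but an obstruction is on it -> go around.
print(vision_landing_decision(runway_detected=True, runway_clear=False,
                              detection_confidence=0.95))
```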
While aviation has changed dramatically since its beginnings in 1903, we still face some of the same basic issues: finding ways to assist the pilot and make aviation safer and easier.  We have even gone so far as to pull the pilot out of the aircraft, yet we still face some of the same challenges.

References
Airbus A320: Auto Landing Tutorial. (2012 Apr 6). BAA Training. Retrieved from https://www.youtube.com/watch?v=LIaMALJjOEc

Elmer Sperry, Sr. (2016 Apr 27). The National Aviation Hall of Fame. Retrieved from http://www.nationalaviation.org/sperry-sr-elmer/

Fox, S. (2009 Aug 4). Popular Science. Retrieved from http://www.popsci.com/military-aviation-amp-space/article/2009-08/new-system-allow-automated-predator-drone-landings

Huh, S., & Shim, D. H. (2010). A vision-based automatic landing method for fixed-wing UAVs. Journal of Intelligent and Robotic Systems, 57(1), 217-231. doi:10.1007/s10846-009-9382-2. Retrieved from http://search.proquest.com.ezproxy.libproxy.db.erau.edu/docview/873356418?pq-origsite=summon


Predator B Demonstrates Automatic Takeoff and Landing Capability. (2012 Sep 17). General Atomics Aeronautical. Retrieved from http://www.ga-asi.com/predator-b-demonstrates-automatic-takeoff-and-landing-capability

Tuesday, May 10, 2016

UAS Shift Work Schedule

As a human factors consultant for a United States Air Force (USAF) MQ-1B squadron, I will attempt in this paper to address concerns about fatigue brought to the attention of the squadron's Commanding Officer (CO).  Figure 1 has been submitted on behalf of the MQ-1B squadron, outlining current operations.
“Fatigue is a condition characterized by increased discomfort with lessened capacity for work, reduced efficiency of accomplishment, loss of power or capacity to respond to stimulation, and is usually accompanied by a feeling of weariness and tiredness,” (Salazar).  In military life, “[l]ate nights, deadlines, night-shift work, early briefs, time-zone travel, deployments, combat stress, and anxiety all compete for limited sleep time,” (Davenport, 2009).  These factors, combined with the stresses of home life and collateral jobs, create what is known as operational tempo (OPTEMPO).  “[C]ombat-aviation missions are presumably significantly more stressful than commercial air-transportation operations. For instance, although airline-transport pilots no doubt experience stress from their responsibility for the safety of up to 400 passengers, they are rarely targets of enemy aggression. Combat pilots, however, routinely perform their duties under imminent and palpable threats to their own safety and, in fact, their very lives,” (Caldwell, 2008).
Current OPTEMPO requires three shifts: Day, Swing, and Night.  Each team works one particular shift for a six-day week and then rotates to a different shift the following week.  While this helps establish a circadian rhythm during the week, that rhythm is desynchronized the following week; just as the body begins to synchronize by the end of the week, the shift changes again.

Figure 1
Figure 2
The proposed schedule in Figure 2 still has a six-day work week, starting with two Night shifts, then two Swing shifts, then two Day shifts.  This schedule allows for a clockwise-flowing shift change, allowing the body to adapt more quickly and easily.  Additionally, since the week starts on a Night shift and ends on a Day shift, it allows for 78.5 hours (just over three days) of continuous time off.
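The exact figure depends on the shift times defined in Figure 2; the sketch below only illustrates the calculation, and the end and start times used here are assumptions chosen for the example, not the squadron's actual shift times.

```python
from datetime import datetime

def continuous_time_off(last_shift_end: datetime, next_shift_start: datetime):
    """Hours (and days) of uninterrupted time off between work weeks."""
    hours = (next_shift_start - last_shift_end).total_seconds() / 3600
    return hours, hours / 24

# Assumed times for illustration only: the week ends on a Day shift and the
# next week begins on a Night shift (the actual times are in Figure 2).
last_day_shift_end = datetime(2016, 6, 11, 15, 30)       # Saturday 1530
first_night_shift_start = datetime(2016, 6, 14, 22, 0)   # Tuesday 2200

hours, days = continuous_time_off(last_day_shift_end, first_night_shift_start)
print(f"{hours:.1f} hours off ({days:.1f} days)")   # 78.5 hours off (3.3 days)
```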
Additionally, since this appears to be a high-OPTEMPO environment, it is suggested that the CO seek advice from the flight surgeon on the USAF “Go, No Go” program or on the prescribed use of stimulants and sedatives.  In this author's opinion, having flown ISR missions for the United States Navy (USN), the prescribed use of sedatives can help in a high-OPTEMPO environment.

References
Caldwell, J. A. (2008). Go Pills in Combat: Prejudice, Propriety, and Practicality. Air & Space Power Journal (Fall 2008). Retrieved from http://www.airpower.maxwell.af.mil/airchronicles/apj/apj08/fal08/caldwell.html

Davenport, N. (2009). MORE FATIGUE! (yawn). Approach, 54(3), 3. Retrieved from http://search.proquest.com.ezproxy.libproxy.db.erau.edu/docview/274584729?pq-origsite=summon


Salazar, G. Fatigue in Aviation. Federal Aviation Administration (Publication # OK-07-193) Retrieved from https://www.faa.gov/pilots/safety/pilotsafetybrochures/media/Fatigue_Aviation.pdf