God is my Co-Pilot……or is it Drone Deploy?

3D Model 1

Introduction

Recently I was approached about one of my previous blog posts on Vineyard Surveys https://theuavguy.wordpress.com/2014/09/26/drones-above-the-vineyards/ by a Drone Startup based in San Francisco, Drone Deploy. In my previous blogs I’ve stated the importance of a whole-system approach to UAV work with a smooth end-to-end flow. It’s not about the drone, or how sexy it is. It’s about using the drone as a tool to get actionable data that can be used to improve your crop yields and reduce your survey times; in a sense, to make you work more efficiently with less workload. What Drone Deploy suggested is that they had taken the processing section of the workflow and automated it. From my experiences, and from talking with other UAV operators, it’s clear that flying and capturing useful images/data is 20% of the time; the other 80% is data processing, analysis and decision making. However, Mike Winn, CEO and founder (Jono Millin and Nicholas Pilkington are also founders) at Drone Deploy, told me that they had devised a product that would take the data as the drone is flying, yes, as it’s flying, start processing it in the air, and have all the processed data available 15 to 30 minutes after landing. Really? This seemed too good to be true, so I set off on this investigation to see how well Drone Deploy’s system worked.

 

What is Drone Deploy?
I first came across Drone Deploy in May 2014 at the sUSB Expo in San Francisco, where they presented their Co-Pilot hardware and software post-processing system. The premise behind the presentation is that Drone Deploy gathers your images as they are taken and uploads them to the cloud, where they are then processed on extremely fast, optimized servers designed for this application. The results are then presented via the cloud to the user on any device with a web browser. In this way the cloud processing can achieve image post-processing times unobtainable with your normal system at home or the office.
So I hear you ask, how do you get the images from the aircraft to the cloud? Well, Drone Deploy have addressed this by using a cellular module, the Co-Pilot (thus the blog title), which runs at LTE data rates to allow images to be pushed to the cloud from the camera as they are taken. You start to realize now how much work Drone Deploy have put into the Co-Pilot and the associated cloud processing. The Co-Pilot itself has the LTE module for image transfer to the cloud, a WiFi module that talks to the survey camera, and a telemetry link that talks with the flight controller. That’s pretty cool. Why? Because the Co-Pilot takes the GPS coordinates from the flight controller when the camera is triggered and stamps them onto the image. As such, non-GPS cameras can be used to generate Orthomosaics, which can later be overlaid onto Google Earth.

However, what’s even cooler is that the LTE link is bi-directional. What’s the advantage of this? It allows you to talk to the aircraft from your smart device via a web browser, which at first doesn’t seem like a major deal. But what Drone Deploy have done is integrate the flight controller mission planning into their own web-browser mission planner. This allows you to plan a mission, fly the mission, and look at the post-processed mapping/survey data on the same device just by using a web browser, with no telemetry links etc. Drone Deploy have also addressed multiple markets and platforms with their web-based mission planning and data viewing app. First, they support both fixed-wing and rotary aircraft, including helicopters and multirotors. Secondly, they have devised mission planning categories for surveying, agriculture, construction and 3D modeling. Choose the category to best plan your mission.
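Stamping flight-controller GPS coordinates onto an image is essentially EXIF geotagging. As a rough illustration of what that involves (my own sketch, not Drone Deploy’s code), here is how a decimal coordinate pair converts into the degrees/minutes/seconds rationals that the EXIF GPS fields expect:

```python
def to_exif_dms(decimal_deg):
    """Convert a decimal coordinate to EXIF-style (deg, min, sec) rationals."""
    deg_abs = abs(decimal_deg)
    degrees = int(deg_abs)
    minutes = int((deg_abs - degrees) * 60)
    seconds = round((deg_abs - degrees - minutes / 60) * 3600, 4)
    # EXIF stores each value as a rational (numerator, denominator) pair
    return [(degrees, 1), (minutes, 1), (round(seconds * 10000), 10000)]

def geotag_fields(lat, lon):
    """Build the GPS fields a geotagger would write into an image's EXIF block."""
    return {
        "GPSLatitudeRef": "N" if lat >= 0 else "S",
        "GPSLatitude": to_exif_dms(lat),
        "GPSLongitudeRef": "E" if lon >= 0 else "W",
        "GPSLongitude": to_exif_dms(lon),
    }

# Example: a trigger point reported by the flight controller
fields = geotag_fields(37.7749, -122.4194)
```

A library such as piexif (or the exiftool CLI) would then write these fields into each JPEG, and the stitching software reads them back to place every image on the map.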

So what happens if you don’t have cellular coverage, or if you don’t live in North America? Well, Drone Deploy is working on both of those points. You can pre-plan a mission and fly without the cellular telemetry downlink. Once you are back where a cellular signal exists, turn on the UAV and it will upload all the images to the Drone Deploy cloud and process them. In the future Drone Deploy is also looking at adding WiFi/Bluetooth links for in-field communication from your smart device to the UAV.

OK, so what about international customers? Again, Drone Deploy has the foresight to see the UAV community as being global, and as such is rolling out deployment overseas in the coming months. Keep watching.

 

What does a Drone Deploy Mission Look Like?
So here is a test case using a multirotor:
1. Put the multirotor at its takeoff point, turn on your transmitter, turn on the UAV camera and then connect the UAV battery. At this point the UAV will connect to the Drone Deploy cloud server and upload any updates if necessary.
2. Get out your smart device, be it an Android phone, iPhone, laptop or tablet, log on to the Drone Deploy website and enter your account information. You will then be taken to the Dashboard, which holds all the missions you have planned off-line (you can plan missions in the warmth of your office/home) and the missions already flown with their generated maps.
3. You now have the choice to plan a new mission, fly a mission you planned off-line, or re-fly a mission you have already flown.
4. Let’s plan a new mission. Pick a category, such as Agricultural.
5. You will be asked which aircraft profile and camera you want to use and your present location.
6. Draw a perimeter around your required survey area and define your required ground resolution per pixel or altitude.
7. Drone Deploy will then generate a flight path that optimizes the image overlaps to allow correct stitching. What’s also important is that it takes only as many images as necessary; having too many extends processing time. The normal rules are: fly as high as you can, as fast as you can, and get just enough images to allow quality data stitching and analysis. Drone Deploy figures all that out for you. The Co-Pilot also triggers the camera when it reaches the trigger points in its computed flight path. Note, the flight path generation also checks flight duration against aircraft capability; if the mission exceeds the aircraft’s endurance, this is flagged as a safety issue.
8. By now the multirotor status will be visible on the webpage and will report connection, updates, GPS-lock and finally FLY NOW.
9. You now click on the symbol “FLY NOW” and the Co-Pilot will go through a preset checklist: it will connect to the camera, check the camera battery, check the multirotor battery, ensure you have GPS lock, and take a test image and download it from the copter to your browser. If all checks pass, you get “Ready to Launch”. If any check fails, the copter will not launch; this is an inherent safety feature so you do not take off with low batteries, no GPS lock, etc.
10. Now switch your transmitter to Auto, and press “Launch” on your browser screen. A notice will come up stating “Takeoff in 5 Seconds”. The countdown will begin and at 0 the copter will take off and climb to survey height. It will then start to fly the mission.
11. During the flight, the images are downloaded to the cloud and processed as you fly. The images are also presented on your web browser screen along with aircraft speed, height and battery level. The flight plan is shown, with the aircraft progress, direction of travel and the direction it’s pointing.
12. The next bit is truly impressive. During the flight and as images are processed in the cloud, stitched results are placed in realtime, yes in realtime on the flight path of your web browser. From anyone who has used stitching software this is like a miracle to watch, realtime stitching whilst the copter is still in the air!
13. At the end of the survey, the copter returns to the take-off point (or the point you want it to land) and lands.
14. After landing large sections of the area will be stitched or even fully stitched. Results may even be available already. However results such as stitched Orthomosaic, ENDVI maps, Digital Elevation Models and 3D models will be viewable in approx. 15 minutes depending on mapping area.
15. Now power off and pack up your copter. Then get out your smartphone, view the results while standing in the field, and use the geo-referenced survey map generated from the mission to go and ground-truth the crop.
16. On to the next site.
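The overlap planning in step 7 boils down to simple geometry. As a rough sketch (a simple pinhole-camera model with illustrative numbers, not Drone Deploy’s actual algorithm; a fisheye lens like the GoPro’s deviates from this model):

```python
def ground_footprint_m(altitude_m, sensor_dim_mm, focal_mm):
    """Ground distance covered by one image dimension at a given altitude."""
    return altitude_m * sensor_dim_mm / focal_mm

def gsd_cm_per_px(altitude_m, sensor_w_mm, focal_mm, image_w_px):
    """Ground sample distance: ground centimetres covered by one pixel."""
    return ground_footprint_m(altitude_m, sensor_w_mm, focal_mm) * 100 / image_w_px

def trigger_spacing_m(altitude_m, sensor_h_mm, focal_mm, forward_overlap):
    """Distance between shutter triggers for a target forward overlap (0-1)."""
    footprint = ground_footprint_m(altitude_m, sensor_h_mm, focal_mm)
    return footprint * (1 - forward_overlap)

# Illustrative GoPro-like numbers: 6.17 x 4.55 mm sensor, 2.77 mm focal length
gsd = gsd_cm_per_px(80, 6.17, 2.77, 4000)          # ~4.5 cm/pixel at 80 m
spacing = trigger_spacing_m(80, 4.55, 2.77, 0.7)   # ~39 m between triggers
```

The same spacing formula, using the cross-track sensor dimension and the side overlap, gives the distance between flight lines.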

Note: Safety is paramount, and Drone Deploy uses an extensive three-dimensional geo-fence that monitors aircraft deviation away from the expected flight corridor. Excessive deviations will trigger a return home and land. They also supply no-fly zones during mission planning.

It really is that simple and fast. Here is a video of the whole process using a preplanned flight; look how little time it takes (apologies for the iPad glare):
http://youtu.be/c28uEXJvW6U

And here is a video of 20 acres being stitched in realtime:
http://youtu.be/d4gqTXSollw

 

Test Bed

For the initial exercise a 3D Robotics Iris+ was used. The reasoning behind this decision was to prove that a small consumer drone could fly a commercial survey mission. Remember my initial point: the drone is just a tool. I see a lot of UAV manufacturers with excellent products, but with price tags in the $15k to $30k range. Again, all the drone does is act as a camera platform; it’s the images from the camera that are the important bit. If you want quality data, focus on the payload system, and just use the drone as a tool to carry it. Could a $750 consumer drone do a full commercial UAV survey using Drone Deploy? Most farmers and agronomists want an ROI on new technology, so why not give them a tool with which they can get a feel for UAVs in precision agriculture without breaking the bank? Only one way to find out.

Iris+_White

So next was the choice of camera and gimbal. Obviously, this is a tradeoff of resolution, weight, flight time and spectral coverage. The Iris+ can carry a GoPro Hero 3 or 4, either in a hard mount or on a Tarot gimbal. To ensure good data, a gimbal is used so that the camera is always pointed straight down in a nadir position. This position and the brushless gimbal ensure the images are sharp with the correct overlap. It is expected that the hard mount could also work; that is a future investigation.

Iris

Now on to the GoPro. Obviously we want to do vegetation health and stress analysis, so we need some type of Near-Infrared or Red Edge camera. This requires taking a stock GoPro, opening it up and removing the normal lens/filter used for RGB daylight photography/video:

http://youtu.be/_tUl-BzZN70

IR_Pro

(If you are not confident in doing this, please purchase from a vendor such as RageCams or IR-Pro instead.)
Once the old lens is removed, fit a new replaceable screw-in NIR-GB lens. A number of lenses exist on the market, from IR-Pro http://www.ir-pro.com/ and Peau http://www.peauproductions.com/main.html

IR_Pro_1

So what results should you expect from a GoPro NDVI? The most important thing to recognize is that this is not a referenced NDVI camera; to go that route you are looking at much more expensive solutions. Here we are talking about a cheap scouting copter for a farmer to take the first steps into UAVs, with a short ROI that does not break the bank, but which can generate actionable data and identify vegetation and crop issues that correlate with ground data. The holy grail of UAV NDVI is still just that. Speak with anyone with experience in this field (bad pun) and they will tell you that although everyone holds NDVI up as the only metric, that’s untrue; in reality it’s about using any combination of cameras to generate UAV images, which are combined with ground data to generate actionable data. Different crops work better with different cameras, processing and camera combinations, i.e. NGB plus FLIR for example. There is no single camera/formula that works for all solutions. Of course resolution, imager size, rolling shutter effects etc. play into this as well, but the question here is: can you use a GoPro for NDVI in a way that allows useful correlation with ground data?

So let’s do some comparisons. Below are two scenes, one of an open space and one of a plant, with photographs taken with a normal RGB Canon SX260HS, the GoPro with an IR-Pro InfraBlue22 NDVI lens, a Canon SX260HS with an older-style Event 38 filter, and finally a Canon SX260HS with a new-style Event 38 filter. That’s eight photographs in all. Obviously you cannot judge vegetation health from the RGB, but it’s our reference to understand what we are looking at.
RGB Photographs

RGB_FieldRGB_Plant

Now look at the GoPro NDVI photographs. You can see the vegetation is red/brown/pink, whilst the manmade objects appear mostly as in our RGB reference. However, you can see some pink tinge to the manmade objects.
GoPro Photographs

DCIM100GOPROGOPR0283. DCIM100GOPROGOPR0296.
Looking at the older Event 38 filter, you can see that it has a similar type of spectral separation, with some overlap between the vegetation and manmade objects. Here the manmade objects such as the road and buildings have a slight pink tinge to them, but less than with the GoPro. This is due to the filter’s red notch roll-off and corner frequency.
Old Event 38 Photographs

Old_E38_Field Old_E38_Plant

With the newer Event 38 filter, you can now see good separation of spectral content, with manmade objects having little or no pink, and the vegetation being much better defined pinks/browns. This is because the new filter has a steeper roll-off, and the corner is moved slightly up the spectrum.
New Event 38 Photographs

New_E38_Field New_E38_Plant

As such, you can see how three different NIR filters on good imagers, two of them on the same Canon imager, can generate significantly different results. For today, the flights are only concerned with the GoPro InfraBlue22 NDVI lens, but in later blogs I’ll be looking at filters for Canon point-and-shoot and the Sony Alpha range of cameras, as well as an Event 38 GoPro lens using the filter shown on the Canon SX260HS above. At that time I’ll be looking to do a fly-off of all the different lenses and cameras, showing the relative merits and drawbacks of each.
Now, one curious point about having a GoPro InfraBlue22 NDVI lens is that you can now shoot NIR video, and here is a sample. Again, look at the ground color, which is sparse due to the California drought, the accumulated water from the first rain in months, the vibrant color of the trees, and manmade objects like the path and buildings:
http://youtu.be/hWKzjpjLM4E
Now, one question I have been asked a lot is, “can I fly a GoPro NDVI and look at the video to see crop health? In this way I won’t have to do all this image stitching and processing.” I’m afraid not. The GoPro InfraBlue22 is not really an NDVI camera; what it does is capture NIR, Green and Blue spectra, which then need to be processed by software on a pixel-by-pixel basis to generate an NDVI image, using an NDVI, ENDVI or DVI formula. Many formulas exist, and each has its own merits in terms of results for different crops, sun and cloud conditions, filter type etc. To get an NDVI image of a large area, you need to capture a set of overlapped still images in NIR-G-B (or another band combination dependent on filters), stitch them together and then process them according to a formula. If anyone has NDVI video processing software let me know, or I’ve just given you your next Startup or Kickstarter idea.
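The pixel-by-pixel processing itself is straightforward once the bands are separated. As a minimal sketch (my own illustration, assuming an InfraBlue-type lens that puts NIR in the imager’s red channel; this is not the processing Drone Deploy or Pix4D actually use):

```python
import numpy as np

def ndvi_from_ngb(img):
    """Per-pixel NDVI from an NGB array (H x W x 3: NIR, Green, Blue).
    With an InfraBlue-type lens the imager's red channel records NIR,
    and the blue channel stands in for the visible band."""
    nir = img[..., 0].astype(np.float64)
    blue = img[..., 2].astype(np.float64)
    denom = nir + blue
    denom[denom == 0] = 1e-9          # guard dark pixels against divide-by-zero
    return (nir - blue) / denom

def endvi_from_ngb(img):
    """Enhanced NDVI: ((NIR + G) - 2B) / ((NIR + G) + 2B)."""
    nir = img[..., 0].astype(np.float64)
    green = img[..., 1].astype(np.float64)
    blue = img[..., 2].astype(np.float64)
    denom = nir + green + 2 * blue
    denom[denom == 0] = 1e-9
    return (nir + green - 2 * blue) / denom

# Healthy vegetation reflects NIR strongly, so its pixels score high;
# pavement reflects NIR and visible light about equally, scoring near zero.
sample = np.array([[[200, 100, 40], [100, 100, 100]]], dtype=np.uint8)
ndvi = ndvi_from_ngb(sample)   # vegetation pixel ~0.67, pavement pixel 0.0
```

The stitched Orthomosaic is just a very large array of this kind, which is why the formula has to run over stills rather than video.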

 

Test Cases
Three test cases were flown. The flights were preplanned away from the field, then flown using just an Apple iPad with an Internet connection whilst at the field. Two surveys covered 20 acres at an altitude of 80m and took approximately 9 minutes to fly, whilst the last survey covered 5 acres at 50m with an elapsed time of 5 minutes. The flights were flown with a converted GoPro 3 Black using the IR-Pro InfraBlue22 NDVI lens, in a downward-pointing nadir position in a Tarot gimbal, on a 3D Robotics Iris+ multirotor. The test sites were chosen as they consisted of a number of plant habitats, including wild and mown meadow, manmade objects, and seeded lawn under groundskeeper care. To act as a reference, the images from each survey were also processed in Pix4D Pro. Due to the nature of the test sites, stitching was not easy, and this was intended to test the efficiency of the algorithms.

 

Test Case 1
Description

Wild meadow in drought, with rolling hills and a general decline in elevation away from the takeoff point. A mixture of wild meadow, mowed meadow and brush, with fences, paths and cattle. The flight covered 20 acres at an altitude of 80m and took 9 minutes to fly, with 37% battery left.

Drone Deploy Flight Path

Flightpath_Meadow

Drone Deploy Orthomosaic

Mosaic_Meadow1

Drone Deploy NDVI

NDVI_Meadow1

Drone Deploy Digital Elevation Model

DEM_Meadow1

Drone Deploy 3D Model

3D_Meadow

Pix4D Ray Cloud

Pix4D_Ray_Meadow

Pix4D Orthomosaic

Pix4D_Moasic_Meadow

Pix4D NDVI

Pix4D_NDVI_Meadow

Ground Photographs

Picture 1 – Looking South over the Survey Site

IMG_0634

Picture 2 – Ground vegetation and brush

IMG_0644

Picture 3 – The hole in the ground and the bump

IMG_0652

Picture 4 – The path up the left side of survey area

IMG_0696

Picture 5 – The road and banking to the right of survey area

IMG_0835

Ground Truthing
Picture 1 shows a view looking over the survey site. You can see the rolling nature of the meadow, which is made up of grass, vegetation and brush, shown in Picture 3. This is common grazing meadow. Interestingly, if you look towards the center of the photograph at the next hill, just to the left you will see an off-road vehicle track which bends around the hill. Now look at the right side of the Orthomosaic and NDVI and you will see a blue area which bends. This is the same feature, but why the blue? Well, look further to the right towards the road and compare to Picture 5. You’ll see a bank, and on the NDVI where the bank is there is again a blue area. Now look at the Orthomosaic, NDVI, DEM and 3D model, and in the middle you will again see a blue area. From Picture 3 this can be seen to be due to a big hole in the ground. So what is causing the blue? Well, the survey was taken in the afternoon in late December, with the Sun low on the horizon; the blue is from shadows. Now look at the left side of the NDVI and you see another area of blue; this is actually shadow from the dead brush shown in Picture 2.
The rest of the NDVI shows varying degrees of green, which correlate to different densities of grass throughout the meadow. In Picture 4, you can see a path to the left of the hole, which is easily seen on the Orthomosaic, NDVI, DEM and 3D Model. And that’s what’s interesting: the amount of data produced, from different vegetation densities to elevation changes, depressions and water drainage areas, fence lines, paths etc. It’s data-rich information from a single 9-minute UAV flight over 20 acres.

 

Test Case 2
Description
Meadow in drought with wild and mowed areas, and buildings and a concrete path in the survey area; see the 3D model. Takeoff was from the top of a hill to the left center of the survey site, with the land dropping away in a rolling manner. Rain in the previous days is highlighted by areas of accumulated water in low-lying basins. A 9 min 10 sec flight at 80m, with 41% battery left at landing.

Drone Deploy Flight Path

Flightpath_Meadow

Drone Deploy Orthomosaic

Mosaic_Bike1

Drone Deploy NDVI

NDVI_Bike1

Drone Deploy Digital Elevation Model

DEM_Bike1

Drone Deploy 3D Model

3D Model_Bike

Pix4D Ray Cloud

Pix4D_Ray_Bike

Pix4D Orthomosaic

Pix4D_Mosaic_Bike

Pix4D NDVI

Pix4D_NDVI_Bike

Ground Photographs

Picture 1 – Looking South of take-off point

IMG_0610 - Copy

Picture 2 – Looking North of take-off point

IMG_0612 - Copy

Picture 3 – Looking at depression in ground South of take-off filled with water

IMG_0619 - Copy

Picture 4 – Looking to the West of take-off point, towards mowed area, treeline and Sun

IMG_0620

Ground Truthing
The DEM and 3D Model are good representations of the survey site, with the DEM showing the marked elevation changes, which can be seen by looking at the ground photographs. The buildings can be seen in Picture 2, and are visible at the top of the mosaic and NDVI. The tarmac bike paths can also be clearly seen.
To the middle left of the NDVI image a blue area can be seen. Closer examination shows this to be due to the shadow of the trees; look closely at the shadow shape. It can also be seen in the Orthomosaic. Looking around further, and comparing to the DEM, you can see that the significant areas of blue are all on the right side of elevation slopes in the DEM. These are again shadows due to elevation changes and plants/bushes. This observation falls in line with Test Case 1; the survey was run in late December at around 3pm, when the Sun cast long shadows. Another cause of blue is water. Look at Picture 3, with the depression filled with water, then look at the same location in the NDVI and Orthomosaic. So here you have similar colors for different reasons; another point to watch out for.
Different plant and vegetation densities show up as different shades or colors. In this case denser areas showed up as a stronger green, whilst mowed areas and paths were a darker green. You can see the differences by comparing Picture 4, where you can see unmowed/mowed areas, with the Orthomosaic and NDVI imagery.

 

Test Case 3
Description
Soccer field cared for by groundskeeping staff. Covers approx. 5 acres. Flight time of 5 minutes at 80m, with 62% battery left after landing.

Drone Deploy Flight Path

Flightpath_Soccer

Drone Deploy Orthomosaic

Mosaic_Soccer

Drone Deploy NDVI

NDVI_Soccer

Drone Deploy Digital Elevation Model

DEM_Soccer

Drone Deploy 3D Model

3D_Soccer

Pix4D Ray Cloud

Pix4D_Ray_Soccer

Pix4D Orthomosaic

Pix4D_Mosaic_Soccer

Pix4D NDVI

Pix4D_NDVI_Soccer

Ground Photographs

Soccer_Field

Ground Truthing
From the Orthomosaics and NDVIs from both Drone Deploy and Pix4D, the question is: what’s with the circle patterns? Well, that’s a good question. Just looking at the ground photograph, it’s hard to even see the pattern unless you already know it’s there. So what is it? To be truthful, I can say what the difference is, but not why. The area within the circles appears to have a different density of seeding, with growth more densely packed outside the circles than inside them. As to why a circular pattern, I have no idea; there are no sprinklers to generate such a well-defined pattern. At this time I’m going to approach the field owners to ask some questions.

 

Drone Deploy Observations
So let’s go over my observations from this investigation of Drone Deploy:
1. UAV data needs to be ground truthed. Without correlating ground and aerial data, you have no means to interpret what you are seeing. UAV images and in particular NDVI imagery however can highlight issue areas, and with experience seasoned UAV operators can pull on past knowledge to make educated assessments.
2. Drone Deploy is fast and simple. Its GUI is intuitive, and actionable data is delivered in a simple-to-understand format. Drone Deploy is so easy to use via a tablet that a user with no RC experience can easily fly a survey and get quality results in a very short time.
3. Stitching on the go, it’s unheard of and it’s not a gimmick. Having the ability to see maps generated as you fly means spotting issues with images whilst in-flight, or issues in the field straight away.
4. Actionable data, it’s not just a saying. Drone Deploy generates an Orthomosaic overlaid on Google Earth with great alignment. It’s amazing, as you zoom in, to see the quality of detail in the Drone Deploy area compared to the surrounding Google Earth imagery.
5. You get four sets of data in one go: Orthomosaic, NDVI, Digital Elevation Model and 3D Model. All the data can also be exported and shared.
6. The GoPro does not have GPS, and for this exercise I didn’t try to match the GoPro clock to the Pixhawk clock so I could geotag using the Iris+ logs. Instead, Pix4D stitched un-geotagged images. As such, I was quite amazed at the speed and ability of Pix4D to stitch non-geotagged images from a GoPro. At no time did Pix4D fail to stitch the images from any survey flight I flew.
7. Most parameters on the GoPro are controllable, except one: shutter speed. My expectation, and one raised by Agribotix in their blog in the past, was that the lack of shutter speed control combined with UAV flight speeds would generate blurring. In fact this was not the case; you’d be unable to stitch the images if it were.
8. Drone Deploy and Pix4D data correlates well, showing that both companies have done their homework on how to generate data that is quality, correct and actionable.
9. You can fly 20 acres with a consumer drone such as an Iris+, capturing NGB images to generate NDVI data whilst the aircraft is still in the air. Given the battery capacity left, and flying at 107m, it is feasible to cover 25 acres with such a setup.
10. NDVI data is not infallible, it is a function of camera quality, filters, Sun and lighting conditions. Be aware of the limitations of the technology you are using to get the best data available.
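The 25-acre estimate in observation 9 follows from simple scaling: for a fixed overlap percentage, the image footprint and hence the flight-line spacing grow linearly with altitude, so the area covered in the same flight time grows linearly too. A back-of-envelope sketch (my own arithmetic, not Drone Deploy’s planner):

```python
def scaled_coverage_acres(base_acres, base_alt_m, new_alt_m):
    """Area coverable in the same flight time scales roughly linearly with
    altitude, because line spacing (footprint * (1 - overlap)) grows with height."""
    return base_acres * (new_alt_m / base_alt_m)

# 20 acres were flown at 80 m; at 107 m the same endurance covers roughly:
estimate = scaled_coverage_acres(20, 80, 107)   # ~26.8 acres
```

The trade-off is resolution: ground sample distance also grows linearly with altitude, so the 107m imagery would be about a third coarser per pixel.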

 

Drone Deploy Conclusions
Overall, I’m very impressed with the Drone Deploy system. This is a disruptive technology in the commercial UAV space, where the UAV is viewed as a tool rather than a pretty piece of sexy technology. In this space the UAV is just a part of the whole system, an important part, but the end goal is actionable data. Using a consumer drone from 3D Robotics, the Iris+, flying an NDVI-converted camera, Drone Deploy facilitated a number of 20-acre NDVI crop scouting flights, with actionable data available in minutes that correlated with ground data. On top of that, Drone Deploy have made a system that a worker in the field with no previous experience of RC flight can be trained on and fly with a smartphone or tablet. Nothing else exists today to do this. I believe Drone Deploy has a very bright future.
If you are interested in learning more about this system, drop me an email at iain.butler@kextrel.com

 

Acknowledgements

I’d just like to personally thank the following people for making this investigation a success. Mike Winn for initially reaching out to me to discuss the vision of Drone Deploy. Jono Millin and Nicholas Pilkington for business and mapping support, and Jeremy Eastwood, Chase Gray and Manu Sharma for technical assistance. Best of luck and I’m sure that you will do well. Also congratulations on hiring Gretchen West.

In my next blog I aim to cover flying a number of different cameras on a 3D Robotics X8M platform with and without Drone Deploy. The cameras will range from GoPro, through Point and Shoot to high end consumer.

Regards

@theUAVguy

http://www.twitter.com/theUAVguy

http://www.kextrel.com

The LA Drone Expo – Is it really all about the Drone?


DSC_1190

On Saturday 13th December, I had the fortunate opportunity to attend the first Los Angeles Drone Expo held at the LA Sports Arena. The Expo was organized by the UAV Systems Association in collaboration with the Tesla Foundation. Doors opened at 10am to a full crowd and closed again at 7pm. The Expo crowd ranged from local to national visitors, exhibitors and speakers.

 IMG_0426IMG_0427

 

Exhibitors

The Expo was aimed at the small business consumer and commercial market, and it was well represented by the leading companies in this field. In particular booths that stood out were those by Aerial Media Pros, DJI and Drone Dudes. One point that was glaringly obvious is that it was virtually impossible to walk past a booth without seeing a DJI Phantom, they were just everywhere.

DSC_0151DSC_0145

Two drones were generating significant attention: the DJI Inspire 1 and the DreamQii. The Inspire has really caught people’s imagination, and even with the $3k price tag people are snapping them up, even though they acknowledge the camera presently has deficiencies. It was the sought-after drone to touch and play with. Here is a video I took in the Expo flight area; you have to be impressed with that retracting landing gear:

http://youtu.be/71oI0t4IIqo

IMG_0435

DreamQii got a lot of interest as well; from the crowds at their booth, it is easy to see why they have raised over $1.3 million on Indiegogo. The days of crowd-sourced drones are upon us, given the success of the DreamQii on Indiegogo, and AirDog and Hexo+ on Kickstarter.

IMG_0456

Talking more about the exhibitors, the Expo was split between FPV racers, GoPro-capable drones, and heavy-lift video drones carrying Red Epics, rounded out with numerous fixed-wing drones and software vendors.

DSC_0164DSC_0167

Two omissions did catch the eye: 3D Robotics and Precision Hawk. Although they were sponsors, they didn’t have booths at the Expo. 3D Robotics, though, was represented by Brandon Basso, who attended as a speaker.

DSC_0159

The Speakers

The speakers attending the LA Drone Expo were a well-rounded group of UAV specialists from different fields. Examples are Brandon Basso from 3D Robotics, Chad Colby from 360 Yield Center covering agriculture, Gretchen West, formerly Executive VP of AUVSI, covering the present UAV business environment, Jono Millin from Drone Deploy, Antoine Martin of Pix4D, and Lisa Ellman with her Polivation speech. The list goes on. Feedback from the Expo crowd was that the speeches and panels were well received, and attendance at the panels was high.

DSC_0140

There were two noticeable events in the speaking sessions. One occurred when a former government employee interacted with the crowd in a brash manner when talking about UAS regulations, which didn’t go down well. The room was full, as regulations are a focal point of interest for the commercial UAV community, but it was hard to swallow missives about careless UAV operation and how we as a community had to be regulated to protect ourselves. You can imagine the atmosphere. The other event was a protest near the main speaker stage. I wasn’t near the stage when this happened, but it appears the protest, by Code Pink, was about military drones, which was a bit off the mark, as this was a commercial UAV Expo.

IMG_0458

A couple of small issues did exist with the speaker sessions: the timetable was not well advertised apart from on the Expo website, speakers were shuffled in order, and to compound matters there were two speaking stages and most people didn’t know the second stage existed. A number of people commented to me how they had missed presentations and panels due to these issues. Also, the presentations were not videoed for later posting online. Having organized and attended conferences, I understand the significant work involved in putting on an event of this size, and I’m sure the organizers will correct these issues for next year.

 DSC_0169

The Discussions

Well, you cannot go to a consumer and commercial drone expo without the present FAA situation being discussed. I talked with a good cross-section of people at the Expo, ranging from FPV racers, Phantom flyers, commercial video operators and ex-military UAV operators to company CEOs and directors. There were many opinions but one common theme: commercial UAV technology is evolving at an accelerating pace, and the FAA is failing to keep up. As a matter of fact, most people believed the FAA is falling further behind each day, with one operator saying “The FAA is just in over its head and has no idea what to do.” Pretty sobering information.

DSC_0182

Sensors, Data, and Data Processing

I also sat down with Colin Snow (@DroneAnalyst) and chatted about our pasts, plus present observations and experiences and what the future holds.

IMG_0504

One theme was solidified by this discussion, something that has been coming to the forefront of my thoughts over the past months and in discussions with exhibitors. The drone is really just a tool or platform, much as a saw is a tool to a carpenter. The purpose of a drone is to add a new perspective, to give us information from above in a more efficient and cost-effective manner than manned and space-based platforms. This is true whether we are talking about photography, video, mapping, crop analysis, wildlife observation, etc. The drone is really a platform for a sensor, and what we are now seeing is that the information and data from that sensor payload is the focal point. Next, once we have the data, how do we process it quickly and efficiently to generate actionable data, data that is useful to the end user?

DSC_0172

To this end, you see companies such as MicaSense, which closed $2 million in funding from Parrot. Chad Colby was kind enough to show me a MicaSense camera, a multispectral camera with 5 sensors at 1.6-Mpixel resolution. This pitches it right into Tetracam territory, but with a $6,500 price tag, the ability to use GoPro batteries, and other useful functions. Using multispectral sensors you can generate more data on the same flight, which is useful because different crops and crop issues react in different spectral bands. Here the multispectral camera should come into its own, helping improve crop yields.
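The value of those extra spectral bands lies in the simple crop-health indices built from them; the classic example is NDVI, which compares near-infrared against red reflectance. Here is a minimal sketch in Python (the arrays and values are illustrative, not MicaSense’s actual output format):

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).
    Healthy vegetation reflects strongly in near-infrared and absorbs red,
    so values near +1 suggest vigorous crops and values near 0 bare soil."""
    nir = nir.astype(float)
    red = red.astype(float)
    denom = nir + red
    # avoid division by zero on dark / no-data pixels
    return np.where(denom > 0, (nir - red) / denom, 0.0)

# illustrative single-pixel reflectance values
healthy = ndvi(np.array([0.8]), np.array([0.1]))   # high NDVI
stressed = ndvi(np.array([0.4]), np.array([0.3]))  # low NDVI
```

Run per pixel across two aligned band images, this yields an NDVI map of the whole field, which is exactly the kind of actionable output growers are after.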

DSC_0185

FLIR and IR-DISTRO were also present, representing the IR sensor market with higher refresh frequencies and higher resolutions. Then you have Peau Productions, GoPro and Red Epic in the video market. The Peau Productions booth was always 2 to 3 people deep; interest in the GoPro modification market is very strong.

DSC_0154DSC_0165

Now data without processing is useless, and this was discussed a number of times. Bret Chilcott, owner of AgEagle, and Chad Colby from Yield 360 explained how flying the fields is great, but you generate so much data that present processing techniques take too long to produce actionable data. In essence, your processing cannot keep up with how fast the data is generated. This is only going to get worse as multispectral sensors like MicaSense’s become more widespread.

DSC_0184IMG_0462

To overcome this, companies like Pix4D and Drone Deploy are coming up with innovative ways to reduce the time from capturing data in flight to generating actionable data. Pix4D, which was at the Expo, has streamlined its processing and has new algorithms to address multispectral cameras. Here one camera is defined as the master, and the bulk of the camera-position and image processing is done on this camera; the remaining cameras are then treated as copies with defined X and Y distance offsets from the master. In this way the multispectral camera data can be processed much faster. I’m a Pix4D user, so these improvements look very promising, along with the new GPU and multi-processor options and a new 3D textured mesh model.
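The master-plus-offset idea can be sketched in a few lines. This is a simplified illustration of the concept, not Pix4D’s implementation: it assumes each secondary band is misaligned from the master by a fixed, known pixel offset, so re-aligning it is a cheap copy rather than a full position solve per camera.

```python
import numpy as np

def align_band(band: np.ndarray, dx: int, dy: int) -> np.ndarray:
    """Shift a secondary band image by a fixed (dx, dy) pixel offset so it
    lines up with the master band. Pixels shifted out of frame are zeroed."""
    aligned = np.zeros_like(band)
    h, w = band.shape
    # source and destination windows for the shifted copy
    src_y = slice(max(0, -dy), min(h, h - dy))
    src_x = slice(max(0, -dx), min(w, w - dx))
    dst_y = slice(max(0, dy), min(h, h + dy))
    dst_x = slice(max(0, dx), min(w, w + dx))
    aligned[dst_y, dst_x] = band[src_y, src_x]
    return aligned

# master band is solved once; secondary bands reuse that solution + an offset
master = np.arange(25, dtype=float).reshape(5, 5)
# simulate a secondary sensor seeing the scene 1 px down, 2 px right
secondary = np.roll(master, shift=(1, 2), axis=(0, 1))
realigned = align_band(secondary, dx=-2, dy=-1)
```

The expensive geometry is computed once for the master; every other band is just an offset copy, which is where the speedup comes from.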

DSC_0162

Drone Deploy, also present at the Expo, takes a more distributed approach and uses cloud processing. As the drone is flying, images are uploaded via the cellular network to the cloud, where high-speed servers process the data whilst the drone is still in the air. Fifteen to thirty minutes after landing, actionable data is generated. Additional functionality exists whereby the copter or plane can be programmed, flown, and the final data reviewed, all in the field, using just an iPad-style tablet, smartphone or laptop. So what Drone Deploy has done is make near-real-time actionable drone data possible for the farmer in the field, the inspection team looking at solar farms, and the construction inspector. What is also innovative is that, as the system is cloud based, the UAV can be flown by an operator whilst the data is reviewed by an analyst thousands of miles away. This means multiple global UAV operators can feed data back to one remote analysis person or group. Even more impressive, Drone Deploy also allows you to fly multiple drones from one site concurrently. You don’t need to be an RC hobbyist to generate actionable data from drones anymore. Just ask Gretchen West: she was the Executive VP of AUVSI and an advocate of commercial UAV operation for a decade. It was announced at the LA Drone Expo that she is now joining Drone Deploy as its Business Strategist and Regulatory Affairs spokesperson. I do believe Drone Deploy is on to something. I’ve recently tested Drone Deploy myself in the field and was impressed by how it simplified the end-to-end workflow for actionable drone data. It was good to find my impressions backed up by other Expo attendees.
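The “process while flying” pattern is essentially a producer-consumer pipeline: the camera keeps producing images while an uploader drains them in the background, so the cloud can start work on early frames before the last one is captured. A rough sketch of the pattern (a toy model, not Drone Deploy’s actual client):

```python
import queue
import threading

def capture_images(q: queue.Queue, image_names: list) -> None:
    """Stand-in for the camera: pushes each image onto the upload queue
    the moment it is written, instead of waiting for the flight to end."""
    for name in image_names:
        q.put(name)
    q.put(None)  # sentinel: flight over

def upload_worker(q: queue.Queue, uploaded: list) -> None:
    """Drains the queue concurrently with capture; in a real system each
    item would be POSTed to the cloud over the cellular link."""
    while True:
        item = q.get()
        if item is None:
            break
        uploaded.append(item)

q = queue.Queue()
uploaded = []
t = threading.Thread(target=upload_worker, args=(q, uploaded))
t.start()
capture_images(q, ["img_%03d.jpg" % i for i in range(5)])
t.join()
```

Because the queue decouples capture from upload, landing leaves only the tail of the dataset still in flight to the servers, which is what makes a 15-to-30-minute turnaround plausible.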

IMG_0428

Another innovative drone software company was the new startup Pixie Path. Again this company is looking towards the cloud, but with the intention of cloud operation of drones. This is in a sense along the same path as Drone Deploy, but here the operation is more autonomous than autopilot-plus-data-gathering. This style of autonomous, cloud-based flight operation removes the human and allows a cloud system to interface with and control the drone or drones. In a sense the human element, and hopefully human error, is removed, although I wonder how the FAA will react to that. The hope is that in the future this style of drone operation will allow easier integration into the National Airspace System (NAS), as cloud drone operations can interface directly with ATC. It’s an interesting concept and worth watching.

 

The Takeaway

So what was the key takeaway for me from the Expo? The North American commercial drone community is innovating and thriving; the barrier is policy and regulation. This is a long-told story.

What is an emerging trend, though, is how the drone community has matured over the past year. Drones have gone from being purely military, to the Phantom stage, and now we are seeing the emergence of small professional commercial businesses. The focus is shifting away from the drone itself to what applications we can use it for, how we do the work more efficiently, and how we generate actionable data.

So in the end the Takeaway is:

 

It’s not about the Drone, it’s really all about the data.

 

theUAVguy

https://twitter.com/theUAVguy

http://www.kextrel.com