God is my Co-Pilot……or is it Drone Deploy?


Introduction

Recently I was approached about one of my previous blog posts on Vineyard Surveys https://theuavguy.wordpress.com/2014/09/26/drones-above-the-vineyards/ by a drone startup based in San Francisco, Drone Deploy. In my previous blogs I’ve stated the importance of a whole-system approach to UAV work, with a smooth end-to-end flow. It’s not about the drone, or how sexy it is. It’s about using the drone as a tool to get actionable data that can improve your crop yields and reduce your survey times, in a sense making you work more efficiently with less workload. What Drone Deploy suggested is that they had taken the processing section of that workflow and automated it. From my experience, and from talking with other UAV operators, it’s clear that flying and capturing useful images/data is 20% of the time; the other 80% is data processing, analysis and decision making. However, Mike Winn, CEO and co-founder at Drone Deploy (Jono Millin and Nicholas Pilkington are the other co-founders), told me that they had devised a product that takes the data as the drone is flying, yes, as it’s flying, starts processing it immediately, and has all the processed data available 15 to 30 minutes after landing. Really? This seemed too good to be true, so I set off on this investigation to see how well Drone Deploy’s system worked.

 

What is Drone Deploy?
I first came across Drone Deploy in May 2014 at the sUSB Expo in San Francisco, where they presented their Co-Pilot hardware and cloud post-processing system. The premise is that Drone Deploy gathers your images as they are taken and uploads them to the cloud, where they are processed on extremely fast, optimized servers designed for this application. The results are then presented back via the cloud to the user, on any device with a web browser. In this way the cloud processing can achieve image post-processing times unobtainable with a normal system at home or in the office.
So I hear you ask, how do you get the images from the aircraft to the cloud? Drone Deploy have addressed this with a cellular module, the Co-Pilot (thus the blog title), which runs at LTE data rates and pushes images to the cloud as the camera takes them. You start to realize how much work Drone Deploy have put into the Co-Pilot and the associated cloud processing. The Co-Pilot itself has the LTE module for image transfer to the cloud, a WiFi module that talks to the survey camera, and a telemetry link that talks to the flight controller. That’s pretty cool. Why? Because the Co-Pilot takes the GPS coordinates from the flight controller when the camera is triggered and stamps them onto the image. As such, non-GPS cameras can be used to generate Orthomosaics, which can later be overlaid onto Google Earth.

What’s even cooler is that the LTE link is bi-directional. The advantage? It allows you to talk to the aircraft from your smart device via a web browser, which at first doesn’t seem like a major deal. But Drone Deploy have integrated the flight controller mission planning into their own web-browser mission planner. This allows you to plan a mission, fly the mission, and look at the post-processed mapping/survey data on the same device, just by using a web browser, with no separate telemetry links etc. Drone Deploy have also addressed multiple markets and platforms with their web-based mission planning and data viewing app. First, they support both fixed-wing and rotary aircraft, including helicopters and multirotors. Secondly, they have devised mission planning categories for surveying, agriculture, construction and 3D modeling. Choose the category that best fits your mission.
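As an aside, the geotagging step itself is conceptually simple. Here is a minimal Python sketch of stamping a flight controller’s GPS fix into a photo’s EXIF using the piexif library; this is purely illustrative of the idea, not Drone Deploy’s actual implementation, and the filename and coordinates are made up.

```python
import piexif

def to_dms_rationals(deg):
    """Decimal degrees -> EXIF degrees/minutes/seconds rationals."""
    d = int(deg)
    m = int((deg - d) * 60)
    s = round(((deg - d) * 60 - m) * 60 * 100)
    return ((d, 1), (m, 1), (s, 100))

def geotag(image_path, lat, lon, alt_m):
    """Write a GPS fix into the image's EXIF GPS IFD, in place."""
    gps = {
        piexif.GPSIFD.GPSLatitudeRef: b"N" if lat >= 0 else b"S",
        piexif.GPSIFD.GPSLatitude: to_dms_rationals(abs(lat)),
        piexif.GPSIFD.GPSLongitudeRef: b"E" if lon >= 0 else b"W",
        piexif.GPSIFD.GPSLongitude: to_dms_rationals(abs(lon)),
        piexif.GPSIFD.GPSAltitudeRef: 0,                   # 0 = above sea level
        piexif.GPSIFD.GPSAltitude: (int(alt_m * 100), 100),
    }
    exif = piexif.load(image_path)
    exif["GPS"] = gps
    piexif.insert(piexif.dump(exif), image_path)

# Hypothetical example: tag a capture with the fix at trigger time.
geotag("GOPR0283.JPG", 37.3861, -122.0839, 80.0)
```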

So what happens if you don’t have cellular coverage, or if you don’t live in North America? Well, Drone Deploy is working on both of those points. You can pre-plan a mission and fly without the cellular telemetry downlink. Once you are back where a cellular signal exists, turn on the UAV and it will upload all the images to the Drone Deploy cloud and process them. In the future Drone Deploy is also looking at adding WiFi/Bluetooth links for in-field communication from your smart device to the UAV.

OK, so what about international customers? Again Drone Deploy has the foresight to see the UAV community as being global, and as such is rolling out deployment overseas in the coming months. Keep watching.

 

What does a Drone Deploy Mission Look Like?
So here is a test case using a multirotor:
1. Put the multirotor at its takeoff point, turn on your transmitter, turn on the UAV camera and then connect the UAV battery. At this point the UAV will connect to the Drone Deploy cloud server and apply any updates if necessary.
2. Get out your smart device, be it an Android phone, iPhone, laptop or tablet, log on to the Drone Deploy website and enter your account information. You will then be taken to the Dashboard, which has all the missions you have planned off-line (you can plan missions in the warmth of your office/home) plus missions already flown, with their generated maps.
3. You now have the choice to plan a new mission, fly a mission you planned off-line, or re-fly a mission you have already flown.
4. Let’s plan a new mission. Pick a category, such as Agricultural.
5. You will be asked which aircraft profile and camera you want to use, and your present location.
6. Draw a perimeter around your required survey area and define your required ground resolution per pixel or altitude.
7. Drone Deploy will then generate a flight path that optimizes the image overlaps to allow correct stitching. What’s also important is that it only takes as many images as necessary; having too many extends processing time. The normal rules are: fly as high as you can, as fast as you can, and get just enough images to allow quality data stitching and analysis. Drone Deploy figures all that out for you (see the sketch after this list for a feel of the underlying math). The Co-Pilot then triggers the camera at the trigger points it determines along the computed flight path. Note, the flight path generation also checks flight duration against aircraft capability; if the mission exceeds the aircraft’s endurance, this will be flagged as a safety issue.
8. By now the multirotor status will be visible on the webpage and will report connection, updates, GPS-lock and finally FLY NOW.
9. You now click on the symbol “FLY NOW” and the Co-Pilot will go through a preset checklist: it will connect to the camera, check the camera battery, check the multirotor battery, ensure you have GPS lock, and take a test image and download it from the copter to your browser. If all checks pass, you get “Ready to Launch”. If any check fails, the copter will not launch; this is an inherent safety feature so you do not take off with low batteries, no GPS lock etc.
10. Now switch your transmitter to Auto and press “Launch” on your browser screen. A notice will come up stating “Takeoff in 5 Seconds”. The countdown will begin, and at 0 the copter will take off, climb to survey height and start to fly the mission.
11. During the flight, the images are uploaded to the cloud and processed as you fly. The images are also presented on your web browser screen along with aircraft speed, height and battery level. The flight plan is shown, with the aircraft’s progress, direction of travel and heading.
12. The next bit is truly impressive. During the flight, as images are processed in the cloud, stitched results are placed in realtime, yes, in realtime, on the flight path in your web browser. For anyone who has used stitching software this is like a miracle to watch: realtime stitching whilst the copter is still in the air!
13. At the end of the survey, the copter returns to the take-off point (or the point where you want it to land) and lands.
14. After landing, large sections of the area will be stitched, or even the whole area; some results may be available already. The full results, such as the stitched Orthomosaic, ENDVI maps, Digital Elevation Models and 3D models, will be viewable in approx. 15 minutes, depending on mapping area.
15. So now power off and pack up your copter. Then get out your smartphone etc., view the results as you stand in the field, and go crop ground truthing using the geo-referenced survey map generated from the mission.
16. On to the next site.
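To give a feel for the math behind step 7, here is a back-of-the-envelope sketch. The camera parameters below are rough GoPro-class numbers I have assumed for illustration; this is the general photogrammetry arithmetic, not Drone Deploy’s actual planner.

```python
def gsd_cm_per_px(sensor_w_mm, focal_mm, image_w_px, altitude_m):
    """Ground sample distance: how many centimetres of ground one pixel covers."""
    return (sensor_w_mm * altitude_m * 100.0) / (focal_mm * image_w_px)

def trigger_spacing_m(sensor_h_mm, focal_mm, altitude_m, front_overlap=0.75):
    """Along-track distance between camera triggers for a given front overlap."""
    footprint_m = altitude_m * sensor_h_mm / focal_mm  # along-track ground footprint
    return footprint_m * (1.0 - front_overlap)

# Assumed numbers, for illustration only:
print(gsd_cm_per_px(6.2, 3.0, 4000, 80))       # ~4.1 cm/pixel at 80 m
print(trigger_spacing_m(4.6, 3.0, 80, 0.75))   # trigger roughly every 31 m
```

Fly lower and the ground resolution improves, but the trigger spacing shrinks and the image count (and processing time) climbs, which is exactly the trade-off the planner makes for you.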

Note: Safety is paramount, and Drone Deploy uses an extensive three-dimensional geo-fence that monitors aircraft deviation away from the expected flight corridor. Excessive deviations will trigger a return home and land. They also supply No-Fly Zones during mission planning.

It really is that simple and fast. Here is a video of the whole process using a preplanned flight; look how little time it takes (apologies for the iPad glare):
http://youtu.be/c28uEXJvW6U

And here is a video of 20 acres being stitched in realtime:
http://youtu.be/d4gqTXSollw

 

Test Bed

For the initial exercise a 3D Robotics Iris+ was used. The reasoning behind this decision was to prove that a small consumer drone could fly a commercial survey mission. Remember my initial point: the drone is just a tool. I see a lot of UAV manufacturers with excellent products, however the price tags are in the $15k to $30k range. Again, all the drone does is act as a camera platform; it’s the images from the camera that are the important bit. If you want quality data, focus on the payload system, and just use the drone as a tool to carry it. Could a $750 consumer drone do a full commercial UAV survey using Drone Deploy? Most farmers and agronomists want an ROI on new technology, so why not give them a tool through which they can get a feel for UAVs in precision agriculture, without breaking the bank? Only one way to find out.


So next was the choice of camera and gimbal. Obviously, this is a tradeoff between resolution, weight, flight time and spectral coverage. The Iris+ can carry a GoPro Hero 3 or 4, either in a hard mount or in a Tarot gimbal. To ensure good data, a gimbal is used so that the camera always points straight down in a nadir position. This, plus the brushless gimbal’s stabilization, ensures the images are sharp with the correct overlap. That said, the hard mount might also work; that is a future investigation.


Now on to the GoPro. Since we want to do vegetation health and stress analysis, we need some type of Near-Infrared, or Red Edge, camera. This requires taking a stock GoPro, opening it up and removing the normal lens/filter used for RGB daylight photography/video:

http://youtu.be/_tUl-BzZN70


(If you are not confident in doing this, please purchase from a vendor such as RageCams or IR-Pro instead.)
Once the old lens is removed, fit a new replaceable screw-in NIR-GB lens. A number of lenses exist on the market, from IR-Pro http://www.ir-pro.com/ and Peau http://www.peauproductions.com/main.html


So what results should you expect from a GoPro NDVI? The most important thing to recognize is that this is not a referenced NDVI camera; to go that route you are looking at much more expensive solutions. Here we are talking about a cheap scouting copter for farmers taking their first steps into UAVs, with a short ROI that does not break the bank, but which can generate actionable data and identify vegetation and crop issues that correlate to ground data. The holy grail of UAV NDVI is still just that. Speak with anyone with experience in this field (bad pun) and they will tell you that although everyone holds NDVI up as the only metric, that’s untrue; in reality it’s about using any combination of cameras to generate UAV images, which are combined with ground data to produce actionable data. Different crops work better with different cameras, processing, and camera combinations, i.e. NGB and FLIR for example. There is no single camera/formula which works for all situations. Of course resolution, imager size, rolling shutter effects etc. play into this as well, but the question here is: can you use a GoPro for NDVI in a way that allows useful correlation with ground data?

So let’s do some comparisons. Below are 2 scenes, one of an open space, one of a plant, with photographs taken with a normal RGB Canon SX260HS, the GoPro with an IR-Pro InfraBlue22 NDVI lens, a Canon SX260HS with an older-style Event 38 filter, and finally a Canon SX260HS with a new-style Event 38 filter. That’s 8 photographs in all. Obviously you cannot judge vegetation health from the RGB, but it’s our reference to understand what we are looking at.
RGB Photographs


Now look at the GoPro NDVI photographs. You can see the vegetation is red/brown/pink, whilst the manmade objects appear mostly as in our RGB reference. However, you can see some pink tinge on the manmade objects.
GoPro Photographs

Looking at the older Event 38 filter, you can see that it has a similar type of spectral separation, with some overlap between the vegetation and manmade objects. Here the manmade objects such as the road and buildings have a slight pink tinge to them, but less than the GoPro. This is due to the red notch filter’s roll-off and corner frequency.
Old Event 38 Photographs


On the newer Event 38 filter, you can now see good separation of spectral content, with manmade objects having little or no pink, and the vegetation being much better-defined pinks/browns. This is because the new filter has a steeper roll-off and the corner is moved slightly up in the spectrum.
New Event 38 Photographs


As such, you can see how 3 different NIR filters on good imagers, two with the same Canon imager, can generate significantly different results. For today the flights are only concerned with the GoPro InfraBlue22 NDVI lens, but in later blogs I’ll be looking at filters for Canon point-and-shoot and the Sony Alpha range of cameras, as well as an Event 38 GoPro lens using the filter shown in the Canon SX260HS above. At that time I’ll be looking to do a fly-off of all the different lenses and cameras, showing the relative merits and drawbacks of each.
One curious point about having a GoPro InfraBlue22 NDVI lens is that you can now shoot NIR video, and here is a sample. Again, look at the ground color, which is sparse due to the California drought, the accumulated water from the first rain in months, the vibrant color of the trees, and manmade objects like the path and buildings:
http://youtu.be/hWKzjpjLM4E
Now, one question I have been asked a lot is, “Can I fly a GoPro NDVI and look at the video to see crop health? That way I won’t have to do all this image stitching and processing.” I’m afraid not. The GoPro InfraBlue22 is not really an NDVI camera; what it does is capture NIR, Green and Blue spectra, which then need to be processed by software on a pixel-by-pixel basis to generate an NDVI image, using an NDVI, ENDVI or DVI formula. Many formulas exist, and each has its own merits in terms of results for different crops, sun and cloud conditions, filter type etc. To get an NDVI image of a large area, you need to capture a set of overlapped still images in NIR-G-B (or another band combination dependent on filters), stitch them together and then process them according to a formula. If anyone has NDVI video processing software let me know, or I’ve just given you your next Startup or Kickstarter idea.
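To make the per-pixel processing concrete, here is a minimal Python sketch using ENDVI, ((NIR+G) − 2B) / ((NIR+G) + 2B), as the example formula. It assumes the NIR signal lands in the red channel of the NGB capture; it’s one band-math variant among many, not the specific pipeline Drone Deploy or Pix4D runs, and the filename is hypothetical.

```python
import numpy as np
from PIL import Image

def endvi(ngb_image_path):
    """Per-pixel ENDVI from an NGB still: ((NIR+G) - 2B) / ((NIR+G) + 2B)."""
    img = np.asarray(Image.open(ngb_image_path), dtype=np.float64)
    nir, g, b = img[..., 0], img[..., 1], img[..., 2]  # NIR sits in the red channel
    num = (nir + g) - 2.0 * b
    den = (nir + g) + 2.0 * b
    with np.errstate(divide="ignore", invalid="ignore"):
        return np.where(den != 0, num / den, 0.0)  # roughly -1..1; higher = healthier canopy

index = endvi("GOPR0296.JPG")  # hypothetical filename from a survey flight
```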

 

Test Cases
Three test cases were flown. The flights were preplanned away from the field, then flown using just an Apple iPad with an Internet connection whilst at the field. Two surveys covered 20 acres at an altitude of 80m and took approximately 9 minutes to fly, whilst the last survey covered 5 acres at 50m with an elapsed time of 5 minutes. The flights were flown with a converted GoPro 3 Black using the IR-Pro InfraBlue22 NDVI lens, in a downward-pointing nadir position in a Tarot gimbal, on a 3D Robotics Iris+ multirotor. The test sites were chosen as they consisted of a number of plant habitats, including wild and mown meadow, manmade objects, and seeded lawn under groundskeeper care. To act as a reference, the images from each survey were also processed in Pix4D Pro. Due to the nature of the test sites, stitching was not easy, and this was intended to test the efficiency of the algorithms.

 

Test Case 1
Description

Wild meadow in drought, with rolling hills and a general declining elevation away from the takeoff point. A mixture of wild meadow, mowed meadow and brush, with fences, paths and cattle. The flight covered 20 acres at an altitude of 80m and took 9 minutes to fly, with 37% battery left.

Drone Deploy Flight Path

Drone Deploy Orthomosaic

Drone Deploy NDVI

Drone Deploy Digital Elevation Model

Drone Deploy 3D Model

Pix4D Ray Cloud

Pix4D Orthomosaic

Pix4D NDVI

Ground Photographs

Picture 1 – Looking South over the Survey Site

Picture 2 – Ground vegetation and brush

Picture 3 – The hole in the ground and the bump

Picture 4 – The path up the left side of survey area

Picture 5 – The road and banking to the right of survey area

Ground Truthing
Picture 1 shows a view looking over the survey site. You can see the rolling nature of the meadow, which is made up of grass, vegetation and brush, shown in Picture 2. This is common grazing meadow. Interestingly, if you look towards the center of the photograph at the next hill, just to the left you will see an off-road vehicle track which bends around the hill. Now look at the right side of the Orthomosaic and NDVI and you will see a blue area which bends; this is the same feature. But why the blue? Look further right towards the road and compare to Picture 5. You’ll see a bank, and on the NDVI the bank is again a blue area. Now look at the Orthomosaic, NDVI, DEM and 3D model, and in the middle you will again see a blue area; from Picture 3 this can be seen to be due to a big hole in the ground. So what is causing the blue? The survey was taken in the afternoon in late December, with the Sun low on the horizon; the blue is from shadows. Now look at the left side of the NDVI and you see another area of blue. This is actually shadow from the dead brush shown in Picture 2.
The rest of the NDVI shows varying degrees of green, which correlates to different densities of grass throughout the meadow. In Picture 4 you can see a path to the left of the hole, which is easily seen on the Orthomosaic, NDVI, DEM and 3D Model. And that’s what’s interesting: the amount of data produced, from different vegetation densities to elevation changes, depressions and water drainage areas, fence lines, paths etc. It’s data-rich information from a single 9-minute UAV flight over 20 acres.

 

Test Case 2
Description
Meadow in drought with wild and mowed areas, with buildings and a concrete path in the survey area (see the 3D model). Takeoff was from the top of a hill to the left center of the survey site, with the land dropping away in a rolling manner. Rain on the previous days is highlighted by areas of accumulated water in low-lying basins. A 9 min 10 sec flight at 80m, with 41% battery left at landing.

Drone Deploy Flight Path

Drone Deploy Orthomosaic

Drone Deploy NDVI

Drone Deploy Digital Elevation Model

Drone Deploy 3D Model

Pix4D Ray Cloud

Pix4D Orthomosaic

Pix4D NDVI

Ground Photographs

Picture 1 – Looking South of take-off point

Picture 2 – Looking North of take-off point

Picture 3 – Looking at depression in ground South of take-off filled with water

Picture 4 – Looking to the West of take-off point, towards mowed area, treeline and Sun

Ground Truthing
The DEM and 3D Model are good representations of the survey site, with the DEM showing the marked elevation changes, which can be seen by looking at the ground photographs. The buildings can be seen in Picture 2, and are visible at the top of the mosaic and NDVI. The tarmac bike paths can also be clearly seen.
To the middle left of the NDVI image a blue area can be seen. Closer examination shows this to be the shadow of the trees; look closely at the shadow’s shape. It can also be seen in the Orthomosaic. Looking around further, and comparing to the DEM, you can see that the significant areas of blue are all on the right side of elevation slopes in the DEM. As such, these are again shadows due to elevation changes and plants/bushes. This observation falls in line with Test Case 1; the survey was run in late December at around 3pm, when the Sun cast long shadows. Another cause of blue is water. Look at Picture 3, with the depression filled with water, then look at the same location in the NDVI and Orthomosaic. So here you have similar colors for different reasons; another point to watch out for.
Different plant and vegetation densities show up as different shades or colors. In this case denser areas showed up as a stronger green, whilst mowed areas and paths were a darker green. You can see the differences by comparing Picture 4, where you can see unmowed and mowed areas, with the Orthomosaic and NDVI imagery.

 

Test Case 3
Description
A soccer field cared for by groundskeeping staff, covering approx. 5 acres. Flight time of 5 minutes at 80m, with 62% battery left after landing.

Drone Deploy Flight Path

Drone Deploy Orthomosaic

Drone Deploy NDVI

Drone Deploy Digital Elevation Model

Drone Deploy 3D Model

Pix4D Ray Cloud

Pix4D Orthomosaic

Pix4D NDVI

Ground Photographs

Ground Truthing
Looking at the Orthomosaic and NDVI from both Drone Deploy and Pix4D, the question is: what’s with the circle patterns? That’s a good question. Just looking at the ground photograph it’s hard to even see the pattern unless you already know it’s there. So what is it? To be truthful, I can say what the difference is, but not why. The area within the circles appears to have a different density of seeding, with growth more densely packed outside the circles than inside. As to why a circular pattern, I have no idea; there are no sprinklers that would generate such a well-defined pattern. I’m going to approach the field owners to ask some questions.

 

Drone Deploy Observations
So let’s go over my observations from this investigation of Drone Deploy:
1. UAV data needs to be ground truthed. Without correlating ground and aerial data, you have no means to interpret what you are seeing. UAV images, and in particular NDVI imagery, can however highlight issue areas, and seasoned UAV operators can pull on past knowledge to make educated assessments.
2. Drone Deploy is fast and simple. Its GUI is intuitive, and actionable data is delivered in a simple-to-understand format. Drone Deploy is so easy to use via a tablet that a user with no RC experience can easily fly a survey and get quality results in a very short time.
3. Stitching on the go: it’s unheard of, and it’s not a gimmick. Having the ability to see maps generated as you fly means spotting issues with images whilst in-flight, or issues in the field, straight away.
4. Actionable data, it’s not just a saying. Drone Deploy generates an Orthomosaic overlaid on Google Earth with great alignment. It’s amazing, as you zoom in, to see the quality of detail in the Drone Deploy area compared to the surrounding Google Earth imagery.
5. You get 4 sets of data in one go: Orthomosaic, NDVI, Digital Elevation Model and 3D Model. All the data can be exported and shared.
6. The GoPro does not have GPS, and for this exercise I didn’t try to match the GoPro clock to the Pixhawk clock so that I could geotag using the Iris+ logs. Instead, the Pix4D images were stitched un-geotagged. As such, I was quite amazed at the speed and ability of Pix4D to stitch non-geotagged images from a GoPro. At no time did Pix4D fail to stitch the images from any survey flight I flew.
7. Most parameters on the GoPro are controllable except one: shutter speed. My expectation, and one raised by Agribotix in their blog in the past, was that the lack of shutter speed control at UAV flight speeds would generate blurring. In fact this was not the case; you’d be unable to stitch the images if it were.
8. Drone Deploy and Pix4D data correlate well, showing that both companies have done their homework on how to generate high-quality, correct and actionable data.
9. You can fly 20 acres with a consumer drone such as an Iris+, capturing NGB images to generate NDVI data whilst the aircraft is still in the air. Given the battery capacity left, flying at 107m it would be feasible to cover 25 acres with such a setup (see the sketch after this list for the rough arithmetic).
10. NDVI data is not infallible; it is a function of camera quality, filters, Sun and lighting conditions. Be aware of the limitations of the technology you are using to get the best data available.
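For point 9 above, the rough arithmetic looks like this. The optics, survey speed and overlap are assumed round numbers for illustration, not measured values from these flights:

```python
def acres_per_flight(altitude_m, speed_ms, usable_flight_min,
                     sensor_w_mm=6.2, focal_mm=3.0, side_overlap=0.8):
    """Rough coverage estimate: effective swath width times distance flown."""
    swath_m = altitude_m * sensor_w_mm / focal_mm        # cross-track footprint
    effective_m = swath_m * (1.0 - side_overlap)         # new ground per pass
    distance_m = speed_ms * usable_flight_min * 60.0
    return effective_m * distance_m / 4046.86            # square metres per acre

print(acres_per_flight(80, 5, 9))    # ~22 acres: in the ballpark of the flights above
print(acres_per_flight(107, 5, 9))   # ~30 acres: more altitude, more swath
```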

 

Drone Deploy Conclusions
Overall I’m very impressed with the Drone Deploy system. This is a disruptive technology in the commercial UAV space, where the UAV is viewed as a tool rather than a pretty piece of sexy technology. In this space the UAV is just a part of the whole system, an important part, but the end goal is actionable data. Using a consumer drone from 3D Robotics, the Iris+, flying an NDVI-converted camera, Drone Deploy facilitated a number of 20-acre NDVI crop scouting flights, with actionable data available in minutes that correlated with ground data. On top of that, Drone Deploy have made a system that a worker in the field, with no previous experience of RC flight, can be trained on and fly with a smartphone or tablet. Nothing else exists today that can do this. I believe Drone Deploy has a very bright future.
If you are interested in learning more about this system, drop me an email at iain.butler@kextrel.com

 

Acknowledgements

I’d just like to personally thank the following people for making this investigation a success: Mike Winn, for initially reaching out to me to discuss the vision of Drone Deploy; Jono Millin and Nicholas Pilkington, for business and mapping support; and Jeremy Eastwood, Chase Gray and Manu Sharma, for technical assistance. Best of luck, I’m sure you will do well. Also, congratulations on hiring Gretchen West.

In my next blog I aim to cover flying a number of different cameras on a 3D Robotics X8M platform, with and without Drone Deploy. The cameras will range from GoPro, through point-and-shoot, to high-end consumer.

Regards

@theUAVguy

http://www.twitter.com/theUAVguy

http://www.kextrel.com

Drones above the Vineyards

So what are drones good for? Well, a lot actually, as it turns out. For example, I had the privilege of working with a top California vineyard before harvest time, to investigate how multirotor UAVs could be used in vineyards to improve efficiency and identify crop issues. In the following I’ll highlight the workflow used, the results, and some tips learnt the hard way.

Which UAV platform and Remote Sensing Equipment to use?
First off is what type of UAV to use; it’s not as easy as picking a drone up off a store shelf. To get useful photogrammetry results, a number of key issues need to be addressed:
1. Lift Capacity and Endurance – Although this seems obvious, the UAV has to lift itself, batteries and camera(s) into the air and fly the whole survey route. Working backwards, the UAV choice is therefore influenced by the payload, in this case the camera. I’ll explain more on this later, but for this test the payload weight was 8.15oz or 231g, a point-and-shoot size camera. Add onto this a gimbal of approx. 100g; we needed to lift approx. 330g for approximately 12 minutes. Looking through specifications, the 3D Robotics Y6 was capable of this scenario using a 4S 6000mAh battery http://store.3drobotics.com/products/3dr-rtf-y6-2014 Another option was the 3D Robotics X8; however, we also wanted a copter that could fold down for transport, so the Y6, with its foldable frame, was selected over the X8.

2. Autonomous Flight with Flight Planning – When flying large areas of crops, flying manually and getting the correct overlap on images is near impossible. As such, the UAV needs to be flown in an autonomous mode. This requires that the area to be mapped is stored in an electronic flight plan in the UAV, and that the UAV flies between assigned waypoints at a specific speed, altitude and orientation. This also allows the mapping mission to be flown again and again, days, weeks and months later. This is important, as it allows images from different dates to be compared side by side, allowing crop analysis over time. As we were flying a 3D Robotics Y6, we had a choice of Pixhawk or APM autopilots. The UAV was initially a Y6A with an APM2.6; for this mission the UAV was converted to a Y6B (mainly as this was better supported), still with the APM2.6. In the future we will look at using the Pixhawk for the Y6. The APM2.6 has a long history in UAV autonomous flight, so this was the chosen platform.

3. Camera – The data is only as good as the images, so the camera is critical. Data quality is a trade-off between a number of factors: weight, resolution, control, imager size, cost etc. We had a defined weight of approx. 200-300g as an acceptable payload, which places us in the point-and-shoot category. The camera must also have NDVI capability, be set up with the correct parameters, and be triggered by the UAV, which the APM2.6 can do. This combination led us to the Canon series of point-and-shoot cameras. Presently the SX260HS, S100 and S110 can be converted for NDVI and are used by companies such as Agribotix, Sensefly, Roboflight, Quest UAV etc. To simplify operation, the camera used was the SX260HS, as it has on-board GPS, allowing each image to be geo-tagged with GPS coordinates; this helps with the image processing later. The Canon range of cameras is also supported by an application called CHDK, which is placed on the camera’s SD card and supplies additional functionality, such as triggering from the UAV, interval timing shots, setting white balance etc. The camera is 12.1 Megapixel, so flying at a height of approx. 100 feet gives a ground resolution of approx. 1cm per pixel, more than enough for crop analysis, and flying up to 400 feet still gives excellent imagery for analysis. Finally, as previously mentioned, the camera needs to be NDVI capable. This was achieved using an Event 38 NGB (near-infrared, green, blue) filter with the red spectrum notched out.

http://www.event38.com/ProductDetails.asp?ProductCode=NGB2

A tutorial on how to convert the camera is linked here; the process is relatively simple.


http://droneyard.com/2013/08/24/infrablue-conversion-vegetation-stress/

And the results of the conversion can be seen in this blog:

http://droneyard.com/2013/08/10/ndvi-camera-for-remote-sensing/

Other camera filters exist from companies such as Max-Max:

http://www.maxmax.com/RemoteSensingcamerasi.htm

So the remote sensor is a Canon SX260HS with GPS, fitted with an Event 38 NGB NDVI filter, with CHDK software on the camera’s SD card.

4. Camera Gimbal – Flight time is a trade-off of thrust versus weight, so the lightest, simplest quality gimbal was researched. Gimbals can be split into 3 main categories: simple servo gimbals, high-quality servo gimbals (sometimes with gearing), and brushless gimbals. The purpose of the gimbal is to allow the camera to take high-quality images of the crop. To do this, a number of issues must be addressed. Firstly, the gimbal needs to keep the camera pointing straight down. This maintains the overlap between images, which I’ll explain later, even when the UAV is tilted flying forwards or into crosswinds. It also stops blurring, as the camera is kept stable in pitch and roll rather than being moved around by the UAV’s movements; an important factor here is that the gimbal movement should be smooth. Secondly, the camera needs to avoid any vibration that could blur the images; therefore the camera needs to be isolated from the vibrations of the multirotor. This is normally achieved using rubber isolation grommets between the camera gimbal and the airframe. Thirdly, it should be light and simple; the more complex it is, the more chance it will go wrong in the field. Finally, it should be cost effective.

Based on these criteria, we needed a gimbal with smooth movement in pitch and roll, good vibration isolation, light and simple construction, and a price of approx. $300. Simple servo gimbals, although simple and light, can have sloppy or jerky movement, whilst brushless gimbals, being designed for video work, are very smooth but tend to be more complex and heavier. As such, a high-quality servo gimbal was chosen, the GAUI Crane II, which cost $279. It does not have any associated electronics as per brushless gimbals, instead using the UAV flight controller for gimbal control. An advantage of this gimbal is that it can be removed easily from the isolation damper for packing/traveling.


5. Camera Setup – Again, the final analysis is only as good as the data used, which means you need good quality images. For UAV aerial images there are a number of trade-offs, such as ISO settings, aperture, auto-focus, shutter speed, white balance, image stabilization, image capture time etc. The main aim is to get a sharp image with the least amount of noise. Image quality is also affected by light quality, with results changing between a sunny day and a cloudy day for example (see Agribotix for further analysis on this). Normally the following settings work and were used: white balance set to sunny day; zoom set to wide angle to maximize the image view; auto-focus off, to speed up the time between images, with focus set to infinity; aperture set to automatic; image stabilization off; shutter speed set to a medium value such as 1/800; and ISO set as low as possible to avoid noise.
Summary of Setup
OK, so we have the UAV, a 3D Robotics Y6 with the 3D Robotics APM2.6 autonomous autopilot, carrying a Canon SX260HS with GPS, fitted with an Event 38 NGB filter, mounted in a GAUI Crane II gimbal.


Plan the Mission
The flight planning software for the 3D Robotics Y6 is called Mission Planner (http://copter.ardupilot.com/wiki/mission-planning-and-analysis/) and can be used to devise flight plans, configure the UAV, and monitor the UAV in flight using a telemetry link. One useful point when planning a mission is to do a site visit if nearby, to assess safety and understand the terrain and any obstacles or special circumstances that need to be taken into account. Once this is done, Mission Planner can be used to draw the survey map.

This survey grid is then converted to waypoints with flight altitudes, and uploaded into the UAV.
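Mission Planner generates the grid for you, but to illustrate what a survey grid boils down to, here is a toy lawnmower-pattern generator in Python. It works in local metres over a simple rectangle; a real planner works in lat/lon, handles arbitrary polygons, and derives the pass spacing from the camera footprint and side overlap.

```python
def lawnmower_waypoints(width_m, height_m, pass_spacing_m, altitude_m):
    """Toy survey grid: parallel back-and-forth passes over a rectangle.
    Returns (x, y, altitude) waypoints in metres from the south-west corner."""
    waypoints = []
    x, heading = 0.0, 1
    while x <= width_m:
        ys = (0.0, height_m) if heading > 0 else (height_m, 0.0)
        waypoints.append((x, ys[0], altitude_m))  # start of this pass
        waypoints.append((x, ys[1], altitude_m))  # end of this pass
        x += pass_spacing_m
        heading *= -1                             # reverse direction each pass
    return waypoints

# Hypothetical rectangle of roughly 7 acres, flown at 30 m with 40 m pass spacing:
for wp in lawnmower_waypoints(240, 120, 40, 30):
    print(wp)
```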


On the Day
Meeting the vineyard owner, a survey site of approximately 7 acres was identified, which kept the UAV away from trees, power lines, workers and an on-site event that was being set up. The site was focused on the middle of the vineyard, with some elevation change involved. This is a very common situation in the Santa Cruz area, where the vineyards propagate through the Santa Cruz Mountains region. Normally the vineyard is also surrounded by tall trees such as Redwoods, which in turn leads to a large bird population. The upshot is the vineyard headache of birds being pests, which leads most Santa Cruz vineyards to use netting to protect their crop. However, on this day only some of the crop was netted, so the majority of the work was over the un-netted area.
Here is the Y6 ready to fly:


Josh Metz, UAV Observer and Vineyard GIS Specialist @Geovine:


As mentioned before, the site had been pre-planned via Google Earth, and a number of possible missions planned and stored. Therefore all that was required was to upload the correct mission to the Y6. Firstly the NDVI NGB camera was loaded, turned on and allowed to get GPS lock. The mission was then started and the survey grid flown with no issues, except one.
When the mission was preplanned, the landing site was in the center of the vineyard; however, on the day the take-off and landing site was moved to the top of a hill to get better observability of the UAV during the flight. The problem was that the take-off site was changed, but the landing site did not reset correctly. The result was that at the end of the mission, the UAV attempted to land at the other end of the vineyard, in the old landing spot. This was easily overcome by switching to manual, flying it back and landing by hand. The moral is: when you program a mission into a UAV, always read it back to make sure all changes are correct. Also, always pay attention, have an observer (in my case Josh Metz, Vineyard GIS Specialist @Geovine), and always be ready to take control back.
After the NDVI NGB camera flight, the camera was swapped out for a normal RGB camera and the mission flown again. Again, this is the advantage of autonomous flight: two flights, one NGB and one RGB, over exactly the same flight path.

Processing Information
A number of software suites exist to process images and create NDVI information; two examples are Agisoft (http://agisoft.ru/) and Pix4D (http://pix4d.com/). Another interesting choice is from Agribotix, which offers a cloud-based NDVI service for post-processing UAV images http://agribotix.com/
This survey was based on Pix4D, who kindly gave us a demo license for this investigation. The purposes of this software are numerous. Firstly, it corrects for camera issues; the camera model used is added as input data, the separate images are uploaded, and the software joins all the separate images together in a point cloud. From this a single large image is generated, and then numerous other outputs such as elevation models, 3D models, plus NDVI data plots. The output formats are numerous, with GeoTIFF being a primary output. To help the software align the images, ground control points can also be added, which give known reference points for the software to stitch the images together.
To get a complete stitched image, all the images must overlap. The required overlap is normally 60% to 80% to allow the software to stitch properly. As such you need lots of images, and the lower you fly, the more images you need. The downside is that you must process more images, which takes more time. This process requires a fast computer with lots of memory, such as an Intel i7 running 32 or 64GB. So two lessons: fly as high as you can, but no higher than 400’, and have a very fast computer.
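To see why altitude matters so much for processing time, here is a rough image-count estimate. The sensor and focal values are approximations I have assumed for a Canon SX260HS-class camera, for illustration only:

```python
import math

def images_needed(area_m2, altitude_m, sensor_w_mm=6.17, sensor_h_mm=4.55,
                  focal_mm=4.5, front_overlap=0.75, side_overlap=0.65):
    """Rough photo count to cover an area with the stated overlaps."""
    w = altitude_m * sensor_w_mm / focal_mm           # ground footprint width
    h = altitude_m * sensor_h_mm / focal_mm           # ground footprint height
    new_ground_m2 = w * (1 - side_overlap) * h * (1 - front_overlap)
    return math.ceil(area_m2 / new_ground_m2)

ACRE_M2 = 4046.86
print(images_needed(7 * ACRE_M2, 120))  # ~17 images for ~7 acres at 400 ft
print(images_needed(7 * ACRE_M2, 60))   # ~65 images: half the altitude, ~4x the photos
```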
Pix4D did an excellent job of stitching the NGB NDVI images together, but issues did occur with the RGB images, although this was not a software issue: the GPS on the Canon SX260HS RGB camera had not geo-tagged the images correctly. However, just using the Pix4D ground control points, the software was still able to stitch the RGB images together.
NGB Point Cloud showing the UAV position when the image was taken.


NGB Mosaic Image, showing the separate images stitched together.


RGB Point Cloud, where the images were stitched using just ground control points with no GPS data.


RGB Mosaic with all the stitched images.


NDVI Image


The NDVI image was generated using the NGB bands processed by the Pix4D software. A quick explanation of the image brings to light some details of NDVI imagery in vineyards. Firstly, the dark blue is actually due to shadows on the ground between the vineyard rows. This is because the survey was flown in the morning, around 10:30am, rather than at noon with the sun directly overhead. The green indicates the ground. The red indicates the separate vines.

One of the main things to notice is that where you have good vine vigor and growth, you see red vines and strong blue shadows. Where growth is low, the vines (red) are less obvious, the shadows (blue) less strong, and the ground (green) more merged together. Using this information it can be seen that certain areas show lower growth and yields than other areas.

Correlating to Ground Data

After the data was processed, we went and talked with the vineyard owner and compared ground data with our results. It was obvious from the discussions that the ground data and the UAV NDVI and RGB images both highlighted the same low-yield areas, which were known to be lower yielding than the rest of the vineyard due to soil type, irrigation etc. As such, the UAV images with Pix4D processing were shown to correlate well with ground data.

It also became clear that the owner knew his vineyard very well; at approx. 17 acres, he and his staff could walk the property and identify issues on the ground. As such, UAV imagery only becomes effective as a business model when the property cannot be efficiently walked. At that point the UAV is indispensable in its ability to capture large areas and process the data.

One advantage of UAV imagery that needs further investigation, though, concerns netting: RGB images do not show yield issues easily when the vines are netted, whereas preliminary analysis shows NDVI NGB images can show yield issues even through the netting.


Lessons Learned
1. Keep it simple
2. If it can go wrong it will go wrong
3. Preplan your mission, do a site visit or use Google Earth for site info
4. The higher you fly, the fewer images you need, which means less processing time
5. The higher you fly, the larger the area you can map
6. Always check your images when in the field
7. Fly at noon to limit shadows from the vines
8. Use an observer
9. Crop analysis is 20% flying and 80% data processing
10. Image processing takes lots of computing power, get a fast processor with lots of memory
11. High quality images equate to high quality crop analysis; poor images mean poor data
12. Aerial images and analysis need to be correlated with ground data to be effective
13. Normal RGB photographs and video are almost as valuable to the vineyard owner as NGB
14. Drone NDVI mapping becomes effective with vineyards greater than 50 acres
Summary
So we have shown how a UAV such as a 3D Robotics Y6, carrying a simple Canon point-and-shoot camera modified with an NDVI filter, and using powerful software such as Pix4D, can generate useful crop analysis for vineyards. We’ve pointed out lessons learned and are now ready to keep on helping the Santa Cruz Mountain wineries stay the best in the world.

@theUAVguy

http://www.kextrel.com

iain.butler@kextrel.com