God is my Co-Pilot……or is it Drone Deploy?

[Image: 3D Model 1]

Introduction

Recently I was approached by Drone Deploy, a drone startup based in San Francisco, about one of my previous blog posts on vineyard surveys: https://theuavguy.wordpress.com/2014/09/26/drones-above-the-vineyards/. In my previous blogs I've stated the importance of a whole-system approach to UAV work, with a smooth end-to-end flow. It's not about the drone, or how sexy it is. It's about using the drone as a tool to get actionable data that can be used to improve your crop yields and reduce your survey times; in a sense, to make you work more efficiently with less workload. What Drone Deploy suggested is that they had taken the processing section of that workflow and automated it. From my experience, and from talking with other UAV operators, it's clear that flying and capturing useful images/data is 20% of the time; the other 80% is data processing, analysis and decision making. However, Mike Winn, CEO and co-founder at Drone Deploy (Jono Millin and Nicholas Pilkington are the other co-founders), told me that they had devised a product that takes the data while the drone is flying, yes, while it's flying, starts processing it immediately, and has all the processed data available 15 to 30 minutes after landing. Really? This seemed too good to be true, so I set off on this investigation to see how well Drone Deploy's system worked.

 

What is Drone Deploy?
I first came across Drone Deploy in May 2014 at the sUSB Expo in San Francisco, where they presented their Co-Pilot hardware and software post-processing system. The premise is that Drone Deploy gathers your images as they are taken and uploads them to the cloud, where they are processed on extremely fast servers optimized for this application. The results are then presented via the cloud to the user on any device with a web browser. In this way the cloud processing can achieve image post-processing times unobtainable with a normal system at home or in the office.
So I hear you ask, how do you get the images from the aircraft to the cloud? Drone Deploy have addressed this with a cellular module, the Co-Pilot (thus the blog title), which runs at LTE data rates to push images to the cloud as the camera takes them. You start to realize how much work Drone Deploy have put into the Co-Pilot and the associated cloud processing. The Co-Pilot itself has the LTE module for image transfer to the cloud, a WiFi module that talks to the survey camera, and a telemetry link that talks to the flight controller. That's pretty cool. Why? Because the Co-Pilot takes the GPS coordinates from the flight controller when the camera is triggered and stamps them onto the image. As such, non-GPS cameras can be used to generate Orthomosaics, which can later be overlaid onto Google Earth.

What's even cooler is that the LTE link is bi-directional. What's the advantage of this? It allows you to talk to the aircraft from your smart device via a web browser, which at first doesn't seem like a major deal. But Drone Deploy have integrated the flight controller mission planning into their own web-browser mission planner. This allows you to plan a mission, fly the mission, and look at the post-processed mapping/survey data on the same device just by using a web browser, with no separate telemetry links. Drone Deploy have also addressed multiple markets and platforms with their web-based mission planning and data viewing app. First, they support both fixed-wing and rotary aircraft, including helicopters and multirotors. Second, they have devised mission planning categories for surveying, agriculture, construction and 3D modeling. Choose the category that best fits your mission.
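As a concrete picture of that geotagging step, here is a minimal sketch in Python, assuming the piexif library, of stamping flight-controller GPS coordinates into a JPEG's EXIF at camera trigger time. The function and the telemetry values are illustrative; Drone Deploy's actual Co-Pilot implementation is not public.

```python
# Minimal sketch of trigger-time geotagging, assuming the piexif library.
# The coordinates are placeholders; the Co-Pilot's real data path is not public.
from fractions import Fraction
import piexif

def to_dms_rationals(deg):
    """Convert decimal degrees to EXIF degree/minute/second rationals."""
    deg = abs(deg)
    d = int(deg)
    m = int((deg - d) * 60)
    s = Fraction((deg - d - m / 60) * 3600).limit_denominator(10000)
    return ((d, 1), (m, 1), (s.numerator, s.denominator))

def geotag(jpeg_path, lat, lon, alt_m):
    """Stamp flight-controller GPS onto an image that has no GPS of its own."""
    exif = piexif.load(jpeg_path)
    exif["GPS"] = {
        piexif.GPSIFD.GPSLatitudeRef: b"N" if lat >= 0 else b"S",
        piexif.GPSIFD.GPSLatitude: to_dms_rationals(lat),
        piexif.GPSIFD.GPSLongitudeRef: b"E" if lon >= 0 else b"W",
        piexif.GPSIFD.GPSLongitude: to_dms_rationals(lon),
        piexif.GPSIFD.GPSAltitudeRef: 0,  # 0 = above sea level
        piexif.GPSIFD.GPSAltitude: (int(alt_m * 100), 100),
    }
    piexif.insert(piexif.dump(exif), jpeg_path)

# e.g. geotag("survey_0001.jpg", 37.7749, -122.4194, 80.0)
```

Once the tags are in the EXIF, any stitching package can place the image, which is exactly why a non-GPS camera like a GoPro becomes usable for mapping.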

So what happens if you don't have cellular coverage, or if you don't live in North America? Well, Drone Deploy are working on both of those points. You can pre-plan a mission and fly without the cellular telemetry link. Once you are back where a cellular signal exists, turn on the UAV and it will upload all the images to the Drone Deploy cloud for processing. In the future Drone Deploy are also looking at adding WiFi/Bluetooth links for in-field communication between your smart device and the UAV.

OK, so what about international customers? Again, Drone Deploy have the foresight to see the UAV community as being global, and as such are rolling out overseas deployment in the coming months. Keep watching.

 

What does a Drone Deploy Mission Look Like?
So here is a test case using a multirotor:
1. Put the multirotor at its takeoff point, turn on your transmitter, turn on the UAV camera and then connect the UAV battery. At this point the UAV will connect to the Drone Deploy cloud server and apply any updates if necessary.
2. Get out your smart device, be it an Android phone, iPhone, laptop or tablet, log on to the Drone Deploy website and enter your account information. You will then be taken to the Dashboard, which holds all the missions you have planned off-line (you can plan missions in the warmth of your office/home) and missions already flown, with their generated maps.
3. You now have the choice to plan a new mission, fly a mission you planned off-line, or re-fly a mission you have already flown.
4. Let's plan a new mission. Pick a category, such as Agriculture.
5. You will be asked which aircraft profile and camera you want to use, and for your present location.
6. Draw a perimeter around your required survey area and define your required ground resolution per pixel or altitude.
7. Drone Deploy will then generate a flight path that optimizes the image overlaps to allow correct stitching. What's also important is that it takes only as many images as necessary; having too many extends processing time. The normal rules are: fly as high as you can, as fast as you can, and get just enough images to allow quality stitching and analysis. Drone Deploy figures all of that out for you (the sketch after this list shows the basic geometry involved). The Co-Pilot then triggers the camera as the aircraft reaches the trigger points computed along the flight path. Note that flight path generation also checks flight duration against aircraft capability, so if the mission exceeds the aircraft's endurance this is flagged as a safety issue.
8. By now the multirotor status will be visible on the webpage and will report connection, updates, GPS-lock and finally FLY NOW.
9. You now click on "FLY NOW" and the Co-Pilot goes through a preset checklist: it connects to the camera, checks the camera battery, checks the multirotor battery, ensures you have GPS-lock, and takes a test image and downloads it from the copter to your browser. If all checks pass you get "Ready to Launch". If any check fails, the copter will not launch; this is an inherent safety feature so you do not take off with low batteries, no GPS lock, etc.
10. Now switch your transmitter to Auto and press "Launch" on your browser screen. A notice will come up stating "Takeoff in 5 Seconds". The countdown begins, and at 0 the copter takes off, climbs to survey height and starts to fly the mission.
11. During the flight, the images are uploaded to the cloud and processed as you fly. The images are also presented on your web browser screen along with aircraft speed, height and battery level. The flight plan is shown, with the aircraft's progress, direction of travel and the direction it's pointing.
12. The next bit is truly impressive. During the flight, as images are processed in the cloud, stitched results are placed in realtime, yes, in realtime, on the flight path in your web browser. For anyone who has used stitching software, this is like a miracle to watch: realtime stitching whilst the copter is still in the air!
13. At the end of the survey, the copter returns to the take-off point (or the point you want it to land) and lands.
14. After landing, large sections of the area will be stitched, or the stitch may even be complete. However, the full results, the stitched Orthomosaic, ENDVI maps, Digital Elevation Model and 3D model, will be viewable in approximately 15 minutes, depending on the mapping area.
15. So now power off and pack up your copter. Then get out your smartphone, view the results as you stand in the field, and use the geo-referenced survey map generated from the mission to go crop ground truthing.
16. On to the next site.
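As promised in step 7, here is a minimal sketch of the basic geometry any mission planner has to solve: the ground footprint of one image from altitude and camera intrinsics, and the trigger spacing that achieves a target forward overlap. This is textbook photogrammetry with illustrative GoPro-class numbers, not Drone Deploy's actual algorithm:

```python
# Minimal photogrammetry sketch: ground footprint and trigger spacing for a
# target forward overlap. Illustrative camera numbers, not Drone Deploy's code.

def ground_footprint_m(altitude_m, focal_mm, sensor_w_mm, sensor_h_mm):
    """Width/height on the ground covered by one image (nadir, flat terrain)."""
    scale = altitude_m / (focal_mm / 1000.0)  # metres on ground per metre of sensor
    return (sensor_w_mm / 1000.0) * scale, (sensor_h_mm / 1000.0) * scale

def trigger_spacing_m(footprint_along_track_m, forward_overlap=0.75):
    """Distance between shutter triggers for the requested overlap."""
    return footprint_along_track_m * (1.0 - forward_overlap)

# Rough GoPro-like numbers (assumed): 1/2.3" sensor ~6.17 x 4.63 mm, ~2.9 mm focal
w, h = ground_footprint_m(80.0, focal_mm=2.9, sensor_w_mm=6.17, sensor_h_mm=4.63)
spacing = trigger_spacing_m(h, forward_overlap=0.75)
print(f"footprint {w:.0f} x {h:.0f} m, trigger every {spacing:.0f} m at 80 m")
```

Fly higher and each footprint grows, so fewer images cover the same area, which is exactly the "fly as high as you can" rule above.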

Note: Safety is paramount, and Drone Deploy uses an extensive three-dimensional geo-fence that monitors aircraft deviation away from the expected flight corridor. Excessive deviation will trigger a return home and land. No-Fly Zones are also supplied during mission planning.
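For a flavor of what such a corridor check involves, here is a minimal sketch of flagging excessive cross-track deviation from a planned leg, using a flat-earth approximation in metres. The 30 m limit and the return-home hook are hypothetical placeholders, not Drone Deploy's actual logic:

```python
import math

# Minimal corridor-deviation sketch (flat-earth approximation over a short leg).
# The 30 m limit and trigger_return_home() are hypothetical placeholders.

def cross_track_m(pos, leg_start, leg_end):
    """Distance (m) from pos to the line through the leg; points are (x, y) metres."""
    (px, py), (ax, ay), (bx, by) = pos, leg_start, leg_end
    dx, dy = bx - ax, by - ay
    leg_len = math.hypot(dx, dy)
    if leg_len == 0.0:
        return math.hypot(px - ax, py - ay)
    # 2D cross product gives signed parallelogram area; divide by leg length
    return abs((px - ax) * dy - (py - ay) * dx) / leg_len

def trigger_return_home():
    print("Deviation exceeded corridor limit: RTL")  # stand-in for the failsafe

def check_corridor(pos, leg_start, leg_end, limit_m=30.0):
    if cross_track_m(pos, leg_start, leg_end) > limit_m:
        trigger_return_home()

# e.g. 40 m off a 500 m survey leg trips the failsafe
check_corridor(pos=(250.0, 40.0), leg_start=(0.0, 0.0), leg_end=(500.0, 0.0))
```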

It really is that simple and fast. Here is a video of the whole process using a preplanned flight; look how little time it takes (apologies for the iPad glare):
http://youtu.be/c28uEXJvW6U

And here is a video of 20 acres being stitched in realtime:
http://youtu.be/d4gqTXSollw

 

Test Bed

For the initial exercise a 3D Robotics Iris+ was used. The reasoning behind this decision was to prove that a small consumer drone could fly a commercial survey mission. Remember my initial point: "the drone is just a tool". I see a lot of UAV manufacturers with excellent products, however the price tags are in the $15k to $30k range. Again, all the drone does is act as a camera platform; it's the images from the camera that are the important bit. If you want quality data, focus on the payload system, and just use the drone as a tool to carry it. Could a $750 consumer drone do a full commercial UAV survey using Drone Deploy? Most farmers and agronomists want an ROI on new technology, so why not give them a tool with which they can get a feel for UAVs in precision agriculture, but without breaking the bank? Only one way to find out.

[Image: Iris+_White]

So next was the choice of camera and gimbal. Obviously this is a tradeoff of resolution, weight, flight time and spectral coverage. The Iris+ can carry a GoPro Hero 3 or 4, either in a hard mount or using a Tarot gimbal. To ensure good data, a gimbal is used so that the camera always points straight down in the NADIR position. This position and the brushless gimbal ensure the images are sharp with the correct overlap. However, it is expected that the hard mount could also be used; that is a future investigation.

[Image: Iris]

Now to the GoPro. Since we want to do vegetation health and stress analysis, we need some type of Near-Infrared or Red Edge camera. This requires taking a stock GoPro, opening it up and removing the normal lens/filter used for RGB daylight photography/video:

http://youtu.be/_tUl-BzZN70

[Image: IR_Pro]

(If you are not confident doing this, please purchase a converted camera from a vendor such as RageCams or IR-Pro instead.)
Once the old lens is removed, fit a new replaceable screw-in NIR-GB lens. A number of lenses exist on the market from IR-Pro http://www.ir-pro.com/ and Peau Productions http://www.peauproductions.com/main.html

[Image: IR_Pro_1]

So what results should you expect from a GoPro NDVI? The most important thing to recognize is that this is not a referenced NDVI camera. To go that route you are looking at much more expensive solutions. Here we are talking about a cheap scouting copter that lets a farmer take the first steps into UAVs with a short ROI, yet can still generate actionable data and identify vegetation and crop issues when correlated with ground data. Speak with anyone with experience in this field (bad pun) and they will tell you that although everyone holds NDVI up as the one true metric, in reality it's about using any combination of cameras to generate UAV images, which are combined with ground data to produce actionable data. Different crops work better with different cameras, processing and camera combinations, e.g. NGB plus FLIR. There is no single camera or formula that works for all situations. Of course resolution, imager size, rolling shutter effects etc. play into this as well, but the question here is: can you use a GoPro for NDVI that allows useful correlation with ground data?

So let's do some comparisons. Below are 2 scenes, one of an open space and one of a plant, each photographed with a normal RGB Canon SX260HS, the GoPro with an IR-Pro InfraBlue22 NDVI lens, a Canon SX260HS with an older-style Event 38 filter, and finally a Canon SX260HS with a new-style Event 38 filter. That's 8 photographs in all. Obviously you cannot judge vegetation health from the RGB, but it's our reference to understand what we are looking at.
RGB Photographs

[Images: RGB_Field, RGB_Plant]

Now look at the GoPro NDVI photographs. You can see the vegetation is red/brown/pink, whilst the manmade objects appear mostly as in our RGB reference. However, you can see a pink tinge on some of the manmade objects.
GoPro Photographs

[Images: GOPR0283, GOPR0296]
Looking at the older Event 38 filter, you can see a similar type of spectral separation, with some overlap between the vegetation and the manmade objects. Here manmade objects such as the road and buildings have a slight pink tinge, but less than with the GoPro. This is due to the red notch roll-off, i.e. the filter corner frequency.
Old Event 38 Photographs

[Images: Old_E38_Field, Old_E38_Plant]

With the newer Event 38 filter, you can see good separation of spectral content, with manmade objects having little or no pink, and the vegetation being much better defined pinks/browns. This is because the new filter has a steeper roll-off, with the corner moved slightly up the spectrum.
New Event 38 Photographs

[Images: New_E38_Field, New_E38_Plant]

As such, you can see how 3 different NIR filters on good imagers, two on the same Canon imager, can generate significantly different results. For today the flights are only concerned with the GoPro InfraBlue22 NDVI lens, but in later blogs I'll be looking at filters for Canon point-and-shoot and the Sony Alpha range of cameras, as well as an Event 38 GoPro lens using the filter shown on the Canon SX260HS above. At that time I'll be looking to do a fly-off of all the different lenses and cameras, showing the relative merits and drawbacks of each.

Now, one curious point about having a GoPro InfraBlue22 NDVI lens is that you can shoot NIR video, and here is a sample. Note the ground color, sparse due to the California drought; the accumulated water from the first rain in months; the vibrant color of the trees; and manmade objects like the path and buildings:
http://youtu.be/hWKzjpjLM4E
Now, one question I have been asked a lot is, "can I fly a GoPro NDVI and look at the video to see crop health? In this way I won't have to do all this image stitching and processing." I'm afraid not. The GoPro InfraBlue22 is not really an NDVI camera; what it does is capture NIR, Green and Blue spectra, which then need to be processed by software on a pixel-by-pixel basis to generate an NDVI image, using an NDVI, ENDVI or DVI formula. Many formulas exist, and each has its own merits for different crops, sun and cloud conditions, filter types etc. To get an NDVI image of a large area, you need to capture a set of overlapped still images in NIR-G-B (or another band combination depending on filters), stitch them together and then process them according to a formula. If anyone has NDVI video processing software let me know, or I've just given you your next startup or Kickstarter idea.
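For the curious, here is a minimal sketch of that per-pixel step in Python with numpy and Pillow, for an InfraBlue-style NGB still where the NIR signal lands in the red channel. The ENDVI formula used is ((NIR + G) - 2B) / ((NIR + G) + 2B); treating the red channel as pure NIR is a simplification of what a converted GoPro really records:

```python
import numpy as np
from PIL import Image

# Per-pixel ENDVI from an InfraBlue-style NGB still. NIR is assumed to land
# in the red channel, a simplification of the real filter response.

def endvi(path):
    img = np.asarray(Image.open(path).convert("RGB"), dtype=np.float64)
    nir, g, b = img[..., 0], img[..., 1], img[..., 2]
    return ((nir + g) - 2.0 * b) / ((nir + g) + 2.0 * b + 1e-9)  # epsilon avoids /0

def to_quicklook_png(index, out_path):
    """Scale the [-1, 1] index to an 8-bit grayscale map for a quick look."""
    scaled = np.clip((index + 1.0) * 127.5, 0, 255).astype(np.uint8)
    Image.fromarray(scaled, mode="L").save(out_path)

# e.g. to_quicklook_png(endvi("GOPR0283.JPG"), "endvi_quicklook.png")
```

This is exactly why video alone doesn't get you there: every frame would still need this band arithmetic, plus stitching, before it means anything agronomically.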

 

Test Cases
Three test cases were flown. The flights were preplanned away from the field, then flown using just an Apple iPad with an Internet connection whilst at the field. Two surveys covered 20 acres at an altitude of 80m and took approximately 9 minutes to fly, whilst the last survey covered 5 acres at 50m with an elapsed time of 5 minutes. The flights were flown with a converted GoPro 3 Black using the IR-Pro InfraBlue22 NDVI lens, pointing down in the NADIR position in a Tarot gimbal, on a 3D Robotics Iris+ multirotor. The test sites were chosen because they comprise a number of plant habitats, including wild and mown meadow, manmade objects, and seeded lawn under groundskeeper care. As a reference, the images from each survey were also processed in Pix4D Pro. Due to the nature of the test sites, stitching was not easy, and this was intended to test the efficiency of the algorithms.

 

Test Case 1
Description

Wild meadow in drought, with rolling hills and a general declining elevation away from the takeoff point. A mixture of wild meadow, mowed meadow and brush, with fences, paths and cattle. The flight covered 20 acres at an altitude of 80m and took 9 minutes to fly, with 37% battery left.

Drone Deploy Flight Path

[Image: Flightpath_Meadow]

Drone Deploy Orthomosaic

[Image: Mosaic_Meadow1]

Drone Deploy NDVI

[Image: NDVI_Meadow1]

Drone Deploy Digital Elevation Model

[Image: DEM_Meadow1]

Drone Deploy 3D Model

[Image: 3D_Meadow]

Pix4D Ray Cloud

[Image: Pix4D_Ray_Meadow]

Pix4D Orthomosaic

[Image: Pix4D_Moasic_Meadow]

Pix4D NDVI

[Image: Pix4D_NDVI_Meadow]

Ground Photographs

Picture 1 – Looking South over the survey site

[Image: IMG_0634]

Picture 2 – Ground vegetation and brush

[Image: IMG_0644]

Picture 3 – The hole in the ground and the bump

[Image: IMG_0652]

Picture 4 – The path up the left side of the survey area

[Image: IMG_0696]

Picture 5 – The road and banking to the right of the survey area

[Image: IMG_0835]

Ground Truthing
Picture 1 shows a view looking over the survey site. You can see the rolling nature of the meadow, which is made up of the grass, vegetation and brush shown in Picture 2. This is common grazing meadow. Interestingly, if you look towards the center of the photograph at the next hill, just to the left you will see an off-road vehicle track bending around the hill. Now look at the right side of the Orthomosaic and NDVI and you will see a blue area that bends; this is the same feature. But why blue? Look further to the right, towards the road, and compare with Picture 5. You'll see a bank, and on the NDVI the bank is again a blue area. Now look at the middle of the Orthomosaic, NDVI, DEM and 3D model and you will again see a blue area; from Picture 3 this can be seen to be a big hole in the ground. So what is causing the blue? The survey was taken in the afternoon in late December, with the Sun low on the horizon, so the blue is shadow. Now look at the left side of the NDVI and you see another blue area; this is shadow from the dead brush shown in Picture 2.
The rest of the NDVI shows varying degrees of green, which correlate with the different densities of grass throughout the meadow. In Picture 4 you can see a path to the left of the hole, which is easily seen on the Orthomosaic, NDVI, DEM and 3D Model. And that is what's interesting: the amount of data produced, from different vegetation densities to elevation changes, depressions and water drainage areas, fence lines and paths. It's data-rich information from a single 9-minute UAV flight over 20 acres.

 

Test Case 2
Description
Meadow in drought with wild and mowed areas, and buildings and a concrete path in the survey area; see the 3D model. Takeoff was from the top of a hill at the left center of the survey site, with the land dropping away in a rolling manner. Rain over the previous days is highlighted by areas of accumulated water in low-lying basins. A 9 min 10 sec flight at 80m, with 41% battery left at landing.

Drone Deploy Flight Path

[Image: Flightpath_Meadow]

Drone Deploy Orthomosaic

[Image: Mosaic_Bike1]

Drone Deploy NDVI

[Image: NDVI_Bike1]

Drone Deploy Digital Elevation Model

[Image: DEM_Bike1]

Drone Deploy 3D Model

[Image: 3D Model_Bike]

Pix4D Ray Cloud

[Image: Pix4D_Ray_Bike]

Pix4D Orthomosaic

[Image: Pix4D_Mosaic_Bike]

Pix4D NDVI

[Image: Pix4D_NDVI_Bike]

Ground Photographs

Picture 1 – Looking South of the take-off point

[Image: IMG_0610]

Picture 2 – Looking North of the take-off point

[Image: IMG_0612]

Picture 3 – Depression in the ground South of take-off, filled with water

[Image: IMG_0619]

Picture 4 – Looking West of the take-off point, towards the mowed area, treeline and Sun

[Image: IMG_0620]

Ground Truthing
The DEM and 3D Model are good representations of the survey site, with the DEM showing the marked elevation changes, which can be confirmed by looking at the ground photographs. The buildings can be seen in Picture 2, and are visible at the top of the mosaic and NDVI. The tarmac bike paths can also be clearly seen.
At the middle left of the NDVI image a blue area can be seen. Closer examination shows this to be the shadow of the trees; look closely at the shadow's shape. It can also be seen in the Orthomosaic. Looking around further, and comparing with the DEM, you can see that the significant areas of blue all lie on the right side of elevation slopes in the DEM. These are again shadows, caused by elevation changes and plants/bushes, which falls in line with Test Case 1. This survey was run in late December at around 3pm, when the Sun casts long shadows. Another cause of blue is water: look at Picture 3, with the depression filled with water, then look at the same location in the NDVI and Orthomosaic. So here you have similar colors for different reasons; another point to watch out for.
Different plant and vegetation densities show up as different shades or colors. In this case, denser areas showed up as a stronger green, whilst mowed areas and paths were a darker green. You can see the differences by comparing Picture 4, which shows unmowed and mowed areas, with the Orthomosaic and NDVI imagery.

 

Test Case 3
Description
A soccer field cared for by groundskeeping staff, covering approx. 5 acres. Flight time of 5 minutes at 80m, with 62% battery left after landing.

Drone Deploy Flight Path

[Image: Flightpath_Soccer]

Drone Deploy Orthomosaic

[Image: Mosaic_Soccer]

Drone Deploy NDVI

[Image: NDVI_Soccer]

Drone Deploy Digital Elevation Model

[Image: DEM_Soccer]

Drone Deploy 3D Model

[Image: 3D_Soccer]

Pix4D Ray Cloud

[Image: Pix4D_Ray_Soccer]

Pix4D Orthomosaic

[Image: Pix4D_Mosaic_Soccer]

Pix4D NDVI

[Image: Pix4D_NDVI_Soccer]

Ground Photographs

[Image: Soccer_Field]

Ground Truthing
Looking at the Orthomosaic and NDVI from both Drone Deploy and Pix4D, the question is: what's with the circular patterns? That's a good question. Just looking at the ground photograph, it's hard to even see the pattern unless you already know it's there. So what is it? To be truthful, I can say what the difference is, but not why. The area within the circles appears to have a different density of seeding, with more densely packed growth outside the circles than inside. As to why a circular pattern, I have no idea; there are no sprinklers that would generate such a well-defined pattern. I'm going to approach the field owners to ask some questions.

 

Drone Deploy Observations
So let’s go over my observations from this investigation of Drone Deploy:
1. UAV data needs to be ground truthed. Without correlating ground and aerial data, you have no means to interpret what you are seeing. UAV images, and in particular NDVI imagery, can however highlight issue areas, and with experience seasoned UAV operators can draw on past knowledge to make educated assessments.
2. Drone Deploy is fast and simple. Its GUI is intuitive, and actionable data is delivered in a simple-to-understand format. Drone Deploy is so easy to use via a tablet that a user with no RC experience can easily fly a survey and get quality results in a very short time.
3. Stitching on the go: it's unheard of, and it's not a gimmick. Having the ability to see maps generated as you fly means spotting issues with images whilst in flight, or issues in the field straight away.
4. Actionable data: it's not just a saying. Drone Deploy generates an Orthomosaic overlaid on Google Earth with great alignment. As you zoom in, it's amazing to see the quality of detail in the Drone Deploy area compared to the surrounding Google Earth imagery.
5. You get 4 sets of data in one go: Orthomosaic, NDVI, Digital Elevation Model and 3D Model. All the data can also be exported and shared.
6. The GoPro does not have GPS, and for this exercise I didn't try to match the GoPro clock to the Pixhawk clock so that I could geotag using the Iris+ logs. Instead, the Pix4D images were stitched without geotags. I was quite amazed at the speed and ability of Pix4D to stitch non-geotagged images from a GoPro; at no time did Pix4D fail to stitch the images from any survey flight I flew.
7. Most parameters on the GoPro are controllable, except one: shutter speed. My expectation, and one raised by Agribotix in their blog in the past, was that the lack of shutter speed control at UAV flight speeds would generate blurring. In fact this was not the case; you'd be unable to stitch the images if it were (the first sketch after this list shows why the smear stays small).
8. Drone Deploy and Pix4D data correlate well, showing that both companies have done their homework on how to generate quality data that is correct and actionable.
9. You can fly 20 acres with a consumer drone such as an Iris+, capturing NGB images to generate NDVI data whilst the aircraft is still in the air. Given the battery capacity left, and flying at 107m, it is feasible to cover 25 acres with such a setup (the second sketch after this list gives a rough coverage estimate).
10. NDVI data is not infallible; it is a function of camera quality, filters, Sun and lighting conditions. Be aware of the limitations of the technology you are using to get the best data available.
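On point 7, a quick back-of-the-envelope check shows why the fixed shutter didn't blur the images. This is generic photogrammetry arithmetic; the GoPro sensor numbers are my assumptions, not measured values:

```python
# Why the missing shutter control didn't blur the images: ground smear during
# exposure, in pixels. Camera numbers are assumed GoPro-class values.

def gsd_m(altitude_m, focal_mm=2.9, pixel_pitch_um=1.55):
    """Ground sample distance (metres per pixel) at nadir."""
    return altitude_m * (pixel_pitch_um * 1e-6) / (focal_mm * 1e-3)

def blur_pixels(speed_ms, exposure_s, altitude_m):
    """Ground distance travelled during the exposure, expressed in pixels."""
    return speed_ms * exposure_s / gsd_m(altitude_m)

# e.g. 4 m/s at 80 m with a sunny-day 1/500 s exposure
print(f"{blur_pixels(4.0, 1/500, 80.0):.2f} px of smear")  # well under one pixel
```

And on point 9, here is an equally rough sketch of achievable coverage from speed, usable flight time and survey line spacing, ignoring turns and climb. The inputs are illustrative assumptions chosen to land near the 20-acre flights above, not Iris+ specifications:

```python
# Back-of-the-envelope survey coverage for a lawnmower pattern, ignoring
# turns and climb. All inputs are illustrative assumptions.

def coverage_acres(speed_ms, usable_minutes, line_spacing_m):
    """Acres swept: distance flown times the spacing between survey lines."""
    strip_area_m2 = speed_ms * usable_minutes * 60.0 * line_spacing_m
    return strip_area_m2 / 4046.86  # square metres per acre

# e.g. 4 m/s ground speed, 9 usable minutes, 35 m between lines
print(f"{coverage_acres(4.0, 9.0, 35.0):.0f} acres")  # ~19 acres, near the test flights
```

Fly higher and the line spacing grows with the footprint, which is where the 25 acres at 107m figure comes from.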

 

Drone Deploy Conclusions
Overall I'm very impressed with the Drone Deploy system. This is a disruptive technology in the commercial UAV space, where the UAV is viewed as a tool rather than a pretty piece of sexy technology. In this space the UAV is just a part of the whole system, an important part, but the end goal is actionable data. Using a consumer drone from 3D Robotics, the Iris+, flying an NDVI-converted camera, Drone Deploy facilitated a number of 20-acre NDVI crop scouting flights, with actionable data that correlated with ground data available in minutes. On top of that, Drone Deploy have made a system that a worker in the field with no previous RC flight experience can be trained on and fly with a smartphone or tablet. Nothing else exists today that can do this. I believe Drone Deploy has a very bright future.
If you are interested in learning more about this system, drop me an email at iain.butler@kextrel.com

 

Acknowledgements

I'd just like to personally thank the following people for making this investigation a success: Mike Winn, for initially reaching out to me to discuss the vision of Drone Deploy; Jono Millin and Nicholas Pilkington, for business and mapping support; and Jeremy Eastwood, Chase Gray and Manu Sharma, for technical assistance. Best of luck, and I'm sure you will do well. Also, congratulations on hiring Gretchen West.

In my next blog I aim to cover flying a number of different cameras on a 3D Robotics X8M platform, with and without Drone Deploy. The cameras will range from the GoPro, through point-and-shoot, to high-end consumer.

Regards

@theUAVguy

http://www.twitter.com/theUAVguy

http://www.kextrel.com

