MacGyvering cameras

I wanted to do some cover analysis of the places my feeders were hanging, to see how “risky” birds might perceive them to be.

One part of this was to use a camera with a wide-angle lens to take a photo of the canopy above the feeder, a technique I’d read about in some papers looking at cover structure. The lab had a GoPro that seemed perfect for the job. However, I soon discovered that a GoPro does not easily fit on a normal tripod. Not without buying expensive accessories!

“Fine,” I said, “I’ll kitbash something.”

I took the “display tripod” the GoPro had come on: a cheap plastic version of a GoPro mount with a sticky base, attached to the cardboard box that holds the manual and wires. I then attached this to one of our home-made bird tables, which attach to our tripods. However, I quickly learned that attaching it in this fashion meant I couldn’t tilt the camera far enough backwards to get it pointing directly upwards. So I started adding layers of card under the sticky base to tilt THAT backwards. Eventually I ended up with this:

Hopefully I used enough duct tape

The orange box pictured here is also part of my cover analysis, in this case to help measure horizontal cover. I’m sure I’ll write more about this in the future.

I also played with video camera set-ups:

I’ve been helping out with video tracking on some cricket projects. In this case, I wanted to mount cameras in such a way that they COULD NOT MOVE (this makes video tracking easier). My solution was to get hold of a threaded rod of exactly the right diameter to fit the hole a tripod screw normally goes into on the bottom of the camera. Then I drilled a hole in the beam above the test arenas. I’d like to see those cameras try to change their fields of view now!


Translation Issues

Where on earth did the time go? One minute I was looking at ice sculptures and wading through snow drifts and now the snow is gone (mostly) and various wildlife has emerged from hiding.

Look, beavers!

I’ve also managed to get out on the water myself. And eat bacon while doing it. Canadian bacon is Different.

River bacon!

Perhaps the main reason I’ve lost track of time in the last few weeks is that whether I’ve been awake or asleep, this keeps on flashing in front of my eyes.


I haven’t been able to escape it. My office desktop has looked like this on and off for the whole month. I hope to later show what this has resulted in, but in the meantime I’m going to moan about the amount of pain it’s caused me.

The main source of trouble was that this code was originally written in Matlab. I decided to save us from having to acquire a Matlab licence by translating the many scripts that make up the code into R.

Initially this was tedious. There are enough syntax differences (not to mention differently named functions) between R and Matlab that this required me to go through the code line by line. Then, even when I’d done this, a number of errors arose simply due to the differing ways the two programs handle data. I’ll post a guide based on what I learnt in a separate post, featuring fewer pictures of beavers.
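
As a flavour of the kind of differences involved (these are my own toy snippets, not lines from the actual scripts), compare a few common Matlab idioms with their R equivalents:

# Matlab: A = zeros(3, 5);           R needs an explicit matrix() call
A <- matrix(0, nrow = 3, ncol = 5)

# Matlab: row = A(2, :);             stays a 1x5 row vector in Matlab
# In R the same subset silently drops to a plain vector unless told otherwise
row_vec <- A[2, ]                  # numeric vector of length 5
row_mat <- A[2, , drop = FALSE]    # 1x5 matrix, closer to Matlab's behaviour

# Matlab: n = size(A, 1);            in R use nrow(), not length()
n <- nrow(A)

# Matlab: B = A .* A;                in R, * is already element-wise,
# and %*% is the matrix product: the reverse of what Matlab habits suggest
B <- A * A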

Much hair-pulling later I got the code running, fed it my data and got a result. These results were consistent with some previous findings obtained using simpler methods. So far so good.

I then decided that instead of feeding my data to the code all in one go, it would be useful to give it one day at a time and then collate the results. “Fine,” I thought. “Just modify my overarching processing code, no trouble.”

I was wrong.

(Found via googling “evil Matlab”)

Once again, the way in which the two programs handle data required me to make a lot of modifications to the various scripts. Cue more hair-pulling. I should also mention that I’ve written this code to be run in parallel, utilising all of my computer’s cores to increase speed, which means R’s normal debugging tools don’t work.
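
For anyone curious, the parallel set-up is roughly along these lines (a minimal sketch using R’s parallel package, with a toy per-day worker standing in for my actual code):

library(parallel)

# Toy stand-in for my per-day data: a list with one numeric vector per day
split_by_day <- split(rnorm(300), rep(1:30, each = 10))

# Hypothetical worker: process one day of data and return a result
process_day <- function(day_data) {
  mean(day_data)    # stand-in for several hours of real work
}

cl <- makeCluster(detectCores() - 1)    # leave one core free for the OS
results <- parLapply(cl, split_by_day, process_day)
stopCluster(cl)

The workers are separate R processes, which is why browser(), debug() and friends can’t drop you into them when something goes wrong mid-run.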

Finally I got the code to run again and got a result. However, something had changed. A previously suggested relationship had completely reversed in direction. Was this simply due to the new way of feeding the data in? Or was it due to a bug in my code? Or due to me deleting some faulty data? I ran the code again using the original way of processing the data.

Even using parallel processing, this code can take anything from several hours to all night to run. This meant that getting results was a slow process. So, after waiting several hours for the code to run again using the original data processing, once again I got results.

The relationship had flipped direction in these results too.


This suggested that the changes I’d made to accommodate the new data processing method had resulted in a COMPLETELY DIFFERENT RESULT. On the one hand, this was good. It meant that the biologically unrealistic result was due to my error rather than a fundamental problem with the methods. On the other hand, this is the sort of thing that can wake a scientist up at night screaming. A series of small changes in the way data was analysed leading to completely misleading results. In this case we’d caught it before we went too far, but if we’d approached this naively it might have been very easy to miss.

So, now I needed to work out which of my changes had caused this. Luckily I save all my working files in Dropbox, which keeps a backup of all previous versions. I found a Word document containing graphs I’d made to show my supervisor before I’d made the changes, and reverted all my code to a date before then. Then, one by one, I reinstated my changes.

In the end I pinned it down to one file. In that file, one line of code.

as.matrix(Y)

One line of code had resulted in huge, significant changes to my final result. As I said, the stuff of nightmares.
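
I won’t reproduce the actual bug here, but as an illustration of how much mischief that innocent-looking call can cause, here is one classic way as.matrix() silently changes your data (a toy example, not my data):

Y <- data.frame(count = c(1, 10, 2), site = c("A", "B", "C"))

M <- as.matrix(Y)
# Because one column is character, as.matrix() coerces EVERY column to
# character, and the numbers even get format()-padded (" 1", "10", " 2").
# Anything downstream that expected a numeric matrix is now quietly
# operating on something quite different.

M2 <- data.matrix(Y)
# data.matrix() keeps the numbers numeric instead (character and factor
# columns become integer codes), which may or may not be what you want.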

In the end I stripped out all the changes I’d made and carefully rewrote the scripts to deal with the new method of data processing. So my tale of woe has a happy ending: the code now works, and perhaps I’ll even have some results soon. For everyone who made it this far, here is a view of Gatineau Park:


Gemeinsame Nahrungssuche bei Krähenscharben

Or, my new paper:

Social foraging European shags: GPS tracking reveals birds from neighbouring colonies have shared foraging grounds

has been published.

From the German translation of my title and abstract, I have learnt that the German for shag is Krähenscharben. This paper, based on my first year of fieldwork and data from the FAME project, is now available from the Journal of Ornithology! The paper looks at the movement and behaviour of shags foraging in the Isles of Scilly using high-resolution GPS data, showing that most individuals forage in the same areas within the islands.

After the tale of woe that was my first year of fieldwork, it’s nice to see the data collected finally producing something tangible. The paper also uses some of my rafting dataset, a more detailed study of which I hope to release at some point in the future.

A full-text version will be available on ResearchGate at some point in the future.

As stated at the bottom of the article, many people helped with this: a big thank you to Richard Bufton, David Evans, Liz Mackley and the volunteers who assisted with data collection. Thanks to Vicky Heaney for population data and advice! Finally, thanks to everyone at the Isles of Scilly Wildlife Trust for their invaluable assistance and for permission to work on the islands!

EDIT: Link to journal now fixed and full text now available on ResearchGate.

More virtual birds

I have returned to my simulated rafts of shags. As well as general code refinements, I have also completely overhauled the way they dive and surface.

The dive rules have been in place for a while. They are as follows:


1. The virtual bird will dive wherever and whenever it wants, without paying any attention to what other birds are doing. Given that my hypothesis is that birds are using the diving behaviour of others to make their own diving decisions, this is the null model.

2. Birds are more likely to dive if another bird dived or surfaced recently. How likely they are to dive depends on how close the other bird was in space and time (there’s a rough sketch of this just after the list).

3. Similar to 2, but birds only have a limited cone of vision. They will not be aware of a dive going on behind them, for example.
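
And the promised rough sketch of rule 2. This is purely illustrative: the functional form, parameter names and numbers here are placeholders, not the fitted model.

# Weight given to another bird's dive/surface event a distance d metres away
# that happened dt seconds ago. Closer events, in space and time, count more.
dive_weight <- function(d, dt, d_scale = 5, t_scale = 10) {
  exp(-d / d_scale) * exp(-dt / t_scale)
}

base_rate <- 0.01    # rule 1: some chance of diving regardless of anyone else
p_dive <- base_rate + (1 - base_rate) * 0.5 * dive_weight(d = 3, dt = 2)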

 

Previously I only had one rule for surfacing: birds would surface at a random time after they had dived, within a circle of random radius centred around where they initially dived. The distance they were allowed to travel was constrained by how long they’d been underwater. They were also more likely to surface in proximity to other birds, so as to maintain the cohesion we see in real birds.

Just before ISBE, my supervisor suggested that we probably want to try out a variety of surfacing rules, to cover a few different scenarios. This required a complete rewrite of the simulation, but it was also a good opportunity to merge all my various dive rules into one piece of code. Though there was quite a bit of hair-pulling involved, I eventually ended up with a simple unified function.

The surfacing rules birds can now obey are as follows:


1. Surface completely randomly in time and space. Virtual birds can reappear at any time within a circle of random radius centred around their dive, with the maximum radius depending on how long a bird was underwater. They don’t care whether they are near another bird on the surface or not (there’s a small sketch of this rule just after the list).

2. Same as rule 1, but birds are more likely to surface closer to another bird. The time they stay underwater is still random, which still controls the maximum possible distance they can travel underwater.

3. Similar to 2, with an important addition: a bird is more likely to surface in close proximity (in time and space) to a bird it dived in close proximity to. This is an attempt to simulate birds following each other more closely underwater.
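
Here is that small sketch of surfacing rule 1. Again, the names and numbers are illustrative placeholders rather than my actual code.

# Rule 1: resurface after a random interval, somewhere within a circle whose
# maximum radius is capped by time spent underwater. Illustrative values only.
surface_rule1 <- function(dive_x, dive_y, max_dive_time = 60, speed = 1.5) {
  t_under <- runif(1, 5, max_dive_time)      # random time underwater (seconds)
  r       <- runif(1, 0, speed * t_under)    # distance capped by swimming speed
  theta   <- runif(1, 0, 2 * pi)             # random direction
  c(x = dive_x + r * cos(theta),
    y = dive_y + r * sin(theta),
    t = t_under)
}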

Of course, rewriting my code so that I can switch between the rules a simulation uses just by changing a number was somewhat tricky, but eventually I hammered out all the bugs. Unifying all my code like this will make the selection of a best-fitting model much simpler.
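
In spirit, that unified function ends up looking something like this (a skeleton only: surface_rule2, surface_rule3 and the bird and others objects are stand-ins for the real, rather more involved implementations):

# One entry point for all surfacing behaviour; which rule is used is just a
# number passed in from the simulation settings.
choose_surface <- function(bird, others, rule = 1) {
  switch(rule,
         surface_rule1(bird$x, bird$y),                        # rule 1
         surface_rule2(bird$x, bird$y, others),                # rule 2
         surface_rule3(bird$x, bird$y, others, bird$partner))  # rule 3
}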

Back to Reality

When I eventually got back into the office after my trip to the USA, I turned on my computer and stared at it for a while, trying to remember how it worked.

Once the basics of how to operate a computer came back to me, I took a long hard look at where I am in my PhD. My basic thought process was something like this:

  • I have some stuff.
  • I need to write about this stuff.
  • Most of this stuff is going to require the doing of additional stuff before it is in a state where I can write about this stuff.
  • I REALLY need to write about this stuff.
  • I wonder where I have put all this stuff?

By which I mean I need to consolidate a few years’ worth of data that has, in the past, been exported rather haphazardly to various folders in my Dropbox. I also probably need to redo some statistics to include additional data collected this year. I then need to try and hammer all this out into writing that people other than myself can understand and find interesting.

Let the gathering of Excel files commence.


Arg.

 

Pre-Manhattan Madness

Only two days before I have to try and achieve the necessary escape velocity to leave Cornwall. I am still scrambling to pull everything together for my trip to ISBE. I am still running my simulations, trying to find the best models. I might be up until the very last minute, when I’ll have to stop and add the final results to the talk.


Still optimising..

With the (rather important!) exception of the final results, my presentation is more or less done. I could surely benefit from practising it a few (many) times. I also need to generate a PDF version of my slides, just in case of technical problems. I very much hope there are no technical problems, as my talk has a fair few videos, and animations to make my equations friendly..


pleaseworkpleaseworkpleasework..

On the plus side, my virtual birds are behaving a lot better!

Virtual Scillies

It is an inevitability of fieldwork that all the time spent gallivanting about in the outdoors will result in a far greater period of time spent in an office staring at a computer screen.

For me this involves sorting videos and inputting data. Then, using a combination of the bearings I recorded and the video data, I transpose the raft locations onto a map. This is what my set-up for doing this looks like:

I’ve constructed a 3D representation of the Scillies using terrain elevation data and high-quality bathymetry data, which contains all the little rocks and ledges you find in the channels within the islands. Onto this I add my observation points, which I recorded using a GPS, and then plot the bearings from these points. Whenever I recorded a bearing, I zoomed the camera out so as to be able to match the view up with the 3D imagery and estimate the placement of the raft along the bearing.
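
The core of the bearing-plotting step is just projecting points out from an observation location along the recorded bearing. Something like this sketch, using the geosphere package (the coordinates, bearing and distances are made up for illustration):

library(geosphere)

obs_point <- c(-6.32, 49.95)    # lon/lat of a (made-up) observation point
bearing   <- 215                # compass bearing in degrees

# Points every 100 m along the bearing, out to 3 km, to draw the sight line
sight_line <- destPoint(p = obs_point, b = bearing, d = seq(100, 3000, by = 100))

The raft’s position is then estimated by sliding along this line until the zoomed-out video frame lines up with the 3D terrain model.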

Unfortunately, the university computer I have is the opposite of powerful. As such, rotating the camera in the 3D space takes ages. There is also a significant delay between clicking to add a point on the map and the actual appearance of that point.

The time I spend in the virtual Isles of Scilly may almost be as long as the time I spend in the real Scillies.

Though hopefully not.