Translation Issues

Where on earth did the time go? One minute I was looking at ice sculptures and wading through snow drifts and now the snow is gone (mostly) and various wildlife has emerged from hiding.

Look, beavers!

I’ve also managed to get out on the water myself. And eat bacon while doing it. Canadian bacon is Different.

River bacon!

Perhaps the main reason I’ve lost track of time in the last few weeks is that whether I’ve been awake or asleep, this keeps on flashing in front of my eyes.


I haven’t been able to escape it. My office desktop has looked like this on and off for the whole month. I hope to later show what this has resulted in, but in the meantime I’m going to moan about the amount of pain it’s caused me.

The main source of trouble was that this code was originally written in Matlab. I decided to save us from having to acquire a Matlab license by translating the many scripts that make up the code into R.

Initially this was tedious. There are enough syntax differences (not to mention differently named functions) between R and Matlab that I had to go through the code line by line. Then, even when I had done this, a number of errors arose simply from the differing ways the two programs handle data. I’ll post a guide based on what I learnt in a separate post, featuring fewer pictures of beavers.
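Two of the differences that caught me out can be sketched in a few lines of R (these snippets are illustrative examples of the general pitfalls, not taken from the actual analysis code):

```r
m <- matrix(1:6, nrow = 2)  # a 2 x 3 matrix

# 1. In Matlab, m(1, :) is still a 1 x 3 matrix. In R, subsetting a
#    single row silently drops the result to a plain vector unless
#    you ask it not to:
row_vec <- m[1, ]                # numeric vector; dim(row_vec) is NULL
row_mat <- m[1, , drop = FALSE]  # stays a 1 x 3 matrix, like Matlab

# 2. R recycles shorter vectors in arithmetic, where Matlab would
#    throw a dimension-mismatch error. The silence can hide bugs:
x <- 1:6
y <- c(10, 20)
z <- x + y  # y is recycled: c(11, 22, 13, 24, 15, 26), no warning
```

The `drop = FALSE` one is particularly nasty, because code that works on multi-row data quietly breaks the first time a subset happens to contain a single row.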

Much hair pulling later I got the code running, fed it my data and got a result. These results were consistent with some previous findings obtained using simpler methods. So far so good.

I then decided that instead of feeding my data to the code all in one go, it would be useful to give it one day at a time and then collate the results. “Fine” I thought. “Just modify my overarching processing code, no trouble”.

I was wrong.

(Found via googling "evil Matlab")

Once again the way in which the two programs handle data required me to make a lot of modifications to the various scripts. Cue more hair pulling. I should also mention that I’ve written this code to be run in parallel, utilising all of my computer’s cores to increase speed, which means R’s normal debugging tools don’t work.
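The parallel set-up looks roughly like the sketch below, built on the `parallel` package. Here `analyse_day` and its trivial body are placeholders for the real per-day model code. Because each worker runs in a separate R process, `browser()` and `traceback()` inside the worker never reach your console; one workaround is to catch errors inside the worker and return them as values:

```r
library(parallel)

# Placeholder for the real per-day analysis
analyse_day <- function(day) {
  day^2
}

# Return errors as values so failures inside workers are visible,
# instead of vanishing into a dead process
safe_analyse <- function(day) {
  tryCatch(analyse_day(day), error = function(e) conditionMessage(e))
}

cl <- makeCluster(max(1, detectCores() - 1))
clusterExport(cl, "analyse_day")  # workers start with empty workspaces
results <- parLapply(cl, 1:10, safe_analyse)
stopCluster(cl)
```

The `clusterExport` line is easy to forget: each worker is a fresh R session, so any function or object the workers need has to be shipped to them explicitly.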

Finally I got the code to run again and got a result. However, something had changed. A previously suggested relationship had completely reversed in direction. Was this simply due to the new way of feeding the data in? Or was it due to a bug in my code? Or due to me deleting some faulty data? I ran the code again using the original way of processing the data.

Even using parallel processing, this code can take anything from several hours, to all night to run. This meant that getting results was a slow process. So after waiting several hours for the code to run again using the original data processing, once again I got results.

The relationship had flipped direction in these results too.


This suggested that the changes I’d made to accommodate the new data processing method had resulted in a COMPLETELY DIFFERENT RESULT. On the one hand, this was good. It meant that the biologically unrealistic result was due to my error rather than a fundamental problem with the methods. On the other hand, this is the sort of thing that can wake a scientist up at night screaming. A series of small changes in the way data was analysed leading to completely misleading results. In this case we’d caught it before we went too far, but if we’d approached this naively it might have been very easy to miss.

So, now I needed to work out which of my changes had caused this. Luckily I keep all my working files in Dropbox, which retains backups of all previous versions. I found a Word document containing graphs I’d made to show my supervisor before I’d made the changes, and reverted all my code to a date before then. Then, one by one, I reinstated my changes.

In the end I pinned it down to one file. In that file, one line of code.


One line of code had resulted in huge, significant changes to my final result. As I said, the stuff of nightmares.

In the end I stripped out all the changes I’d made and carefully rewrote the scripts to deal with the new method of data processing. So my tale of woe has a happy ending: the code now works, and perhaps I’ll even have some results soon. For everyone who made it this far, here is a view of Gatineau Park:


Gemeinsame Nahrungssuche bei Krähenscharben (communal foraging in European shags)

Or, my new paper:

Social foraging European shags: GPS tracking reveals birds from neighbouring colonies have shared foraging grounds

has been published.

From the German translation of my title and abstract I have learnt that the German for shag is Krähenscharbe. This paper, based on my first year of fieldwork and data from the FAME project, is now available from the Journal of Ornithology! The paper looks at the movement and behaviour of shags foraging in the Isles of Scilly using high-resolution GPS data, showing that most individuals forage in the same areas within the islands.

After the tale of woe that was my first year of fieldwork, it’s nice to see the data collected finally producing something tangible. The paper also uses some of my rafting dataset, which I hope to publish a more detailed study of at some point in the future.

A full-text version will be available on ResearchGate at some point in the future.

As stated at the bottom of the article, many people helped with this: a big thank you to Richard Bufton, David Evans, Liz Mackley and the volunteers who assisted with data collection. Thanks to Vicky Heaney for population data and advice! Finally, thanks to everyone at the Isles of Scilly Wildlife Trust for their invaluable assistance and for permission to work on the islands!

EDIT: Link to journal now fixed and full text now available on ResearchGate.

More virtual birds

I have returned to my simulated rafts of shags. As well as general code refinements, I have also completely overhauled the way they dive and surface.

The dive rules have been in place for a while. They are as follows:


1. The virtual bird will dive wherever and whenever it wants, without paying any attention to what other birds are doing. Given that my hypothesis is that birds are using the diving behaviour of others to make their own diving decisions, this is the null model.

2. Birds are more likely to dive if another bird dived or surfaced recently. How likely they are to dive depends on how close the other bird was in space and time.

3. Similar to 2, but birds only have a limited cone of vision. They will not be aware of a dive going on behind them, for example.
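As a toy sketch, the three dive rules can be expressed as a single probability function, where the social term decays with how far away (in space and time) the other birds’ dives were. The function name, the exponential decay, and all the parameters here are illustrative assumptions of mine, not the actual simulation code:

```r
# rule: 1 = null, 2 = social, 3 = social with a limited cone of vision
# dists, times_since: distance to, and time since, each other dive event
# angles: bearing of each dive relative to the focal bird's heading
dive_probability <- function(rule, base_p, dists, times_since,
                             angles = NULL, fov = pi / 2) {
  if (rule == 1) return(base_p)  # null model: ignore the other birds
  # Closer dives, in space and time, count for more
  w <- exp(-dists) * exp(-times_since)
  if (rule == 3) w[abs(angles) > fov] <- 0  # can't see behind you
  min(1, base_p * (1 + sum(w)))
}
```

Under rule 3, a dive happening directly behind the focal bird contributes nothing, so a bird surrounded by diving neighbours it cannot see behaves exactly like the null model.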


Previously I only had one rule for surfacing: birds would surface at a random time after they had dived, within a circle of a random radius centred around where they initially dived. The distance they were allowed to travel was constrained by how long they’d been underwater. They were also more likely to surface in proximity to other birds, so as to maintain the cohesion we see in real birds.

Just before ISBE, my supervisor suggested that we probably want to try out a variety of surfacing rules, to cover a few different scenarios. This required a complete rewrite of the simulation, but it was also a good opportunity to merge all my various dive rules into one piece of code. Though there was quite a bit of hair pulling involved, I eventually ended up with a simple unified function.

The surfacing rules birds can now obey are as follows:


1. Surface completely randomly in time and space. Virtual birds can reappear at any time within a circle of random radius centred around their dive, with the maximum radius depending on how long a bird was underwater. They don’t care if they are near another bird on the surface or not.

2. Same as rule 1, but birds are more likely to surface closer to another bird. The time they stay underwater is still random, which still controls the maximum possible distance they can travel underwater.

3. Similar to 2. with an important addition: A bird is more likely to surface in close proximity (in time and space) to a bird it dived in close proximity to (in time and space). This is to attempt to simulate birds following each other more closely underwater.

Of course, rewriting my code so that I can switch between rules just by changing a single number in my simulation settings was somewhat tricky, but eventually I hammered out all the bugs. Unifying all my code like this will make the selection of a best-fitting model much simpler.
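The rule-switching itself is the kind of thing R’s `switch()` handles neatly, since it can dispatch on an integer. The sketch below is a simplified stand-in for the unified function: all the names, the swim-speed constant, and the crude “draw candidates, keep the nearest” social bias are my illustrative assumptions, not the real code:

```r
# Crude social bias: draw several candidate surfacing points and keep
# the one closest to any other bird already on the surface
nearest_weighted <- function(candidate, birds, n_try = 20) {
  pts <- replicate(n_try, candidate())  # 2 x n_try matrix of points
  d <- apply(pts, 2, function(p) min(sqrt(colSums((t(birds) - p)^2))))
  pts[, which.min(d)]
}

# rule 1: random; rule 2: bias towards other birds; rule 3: also bias
# towards the birds this one dived alongside
surface_bird <- function(rule, dive_pos, time_under,
                         others = NULL, partners = NULL) {
  max_r <- 0.5 * time_under  # underwater speed caps travel distance
  candidate <- function() {
    r <- runif(1, 0, max_r)
    theta <- runif(1, 0, 2 * pi)
    dive_pos + r * c(cos(theta), sin(theta))
  }
  switch(rule,
         candidate(),
         nearest_weighted(candidate, others),
         nearest_weighted(candidate, rbind(others, partners)))
}
```

Because `switch()` only evaluates the chosen branch, the simulation can carry all three rules in one function and select between them with a single integer parameter.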

Back to Reality

When I eventually got back into the office after my trip to the USA, I turned on my computer and stared at it for a while, trying to remember how it worked.

Once the basics of how to operate a computer came back to me, I took a long hard look at where I am in my PhD. My basic thought process was something like this:

  • I have some stuff.
  • I need to write about this stuff.
  • Most of this stuff is going to require the doing of additional stuff before it is in a state where I can write about this stuff.
  • I REALLY need to write about this stuff.
  • I wonder where I have put all this stuff?

By which I mean I need to consolidate a few years’ worth of data that has, in the past, been exported rather haphazardly to various folders in my Dropbox. I also probably need to redo some statistics to include additional data collected this year. I then need to try and hammer this out into writing that people other than myself can understand and find interesting.

Let the gathering of Excel files commence.
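As a first step, base R can at least build an inventory of where everything is hiding. A minimal sketch, assuming the files sit somewhere under a Dropbox folder (the path and function name are illustrative):

```r
# Recursively find spreadsheet-like files and summarise them
find_data_files <- function(root) {
  paths <- list.files(root, pattern = "\\.(xlsx?|csv)$",
                      recursive = TRUE, full.names = TRUE,
                      ignore.case = TRUE)
  data.frame(path     = paths,
             modified = file.mtime(paths),
             size_kb  = round(file.size(paths) / 1024, 1),
             stringsAsFactors = FALSE)
}

# inventory <- find_data_files("~/Dropbox/PhD")
# inventory[order(inventory$modified), ]  # oldest exports first
```

Sorting by modification date is a quick way to spot which haphazard exports are stale duplicates of each other.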




Pre-Manhattan Madness

Only two days before I have to try and achieve the necessary escape velocity to leave Cornwall, I am still scrambling to pull everything together for my trip to ISBE. I am still running my simulations, trying to find the best models. I might be up until the very last minute, when I have to stop and add whatever results I have to the talk.


Still optimising…

With the (rather important!) exception of the final results, my presentation is more or less done. I could surely benefit by practising it a few (many) times. I also need to generate a PDF version of my slides, just in case of technical problems. I very much hope there are no technical problems, as my talk has a fair few videos and animations to make my equations friendly.



On the plus side, my virtual birds are behaving a lot better!

Virtual Scillies

It is an inevitability of fieldwork that all the time spent gallivanting about in the outdoors will result in a far greater period of time spent in an office staring at a computer screen.

For me this involves sorting videos and inputting data. Then, using a combination of the bearings I recorded and the video data, I transpose the raft locations onto a map. This is what my set-up for doing this looks like:

I’ve constructed a 3D representation of the Scillies, using terrain elevation data and high-quality bathymetry data, which contains all the little rocks and ledges you find in the channels within the islands. Onto this I add my observation points, which I recorded using a GPS, and then plot the bearings from these points. Whenever I recorded a bearing, I zoomed the camera out so as to be able to match it up with the 3D imagery and estimate the placement of the raft along the bearing.
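Placing a raft along a bearing boils down to the standard destination-point calculation on a sphere: given the observation point, a compass bearing, and an estimated distance, compute where the raft sits. A minimal sketch (the function name is mine; the formula and the 6371 km Earth radius are the usual conventions, and this is an illustration rather than my actual workflow):

```r
# Destination point given a start lat/lon (degrees), a compass bearing
# (degrees clockwise from north) and a distance (km)
destination <- function(lat, lon, bearing_deg, dist_km, radius_km = 6371) {
  to_rad <- pi / 180
  lat1 <- lat * to_rad
  lon1 <- lon * to_rad
  brng <- bearing_deg * to_rad
  d <- dist_km / radius_km  # angular distance
  lat2 <- asin(sin(lat1) * cos(d) + cos(lat1) * sin(d) * cos(brng))
  lon2 <- lon1 + atan2(sin(brng) * sin(d) * cos(lat1),
                       cos(d) - sin(lat1) * sin(lat2))
  c(lat = lat2 / to_rad, lon = lon2 / to_rad)
}

# e.g. a raft roughly 2 km due east of a (hypothetical) observation
# point in the Scillies:
# destination(49.95, -6.35, 90, 2)
```

At the scale of the islands a flat-earth approximation would also do, but the spherical formula costs nothing extra and matches what GPS-derived coordinates assume.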

Unfortunately the university computer I have is the opposite of powerful. As such, rotating the camera in the 3d space takes ages. There is also a significant delay between clicking to add a point on the map and the actual appearance of that point.

The time I spend in the virtual Isles of Scilly may almost be as long as the time I spend in the real Scillies.

Though hopefully not.

Introduction to Finality

I am now in the final year of my PhD. That means I am accelerating uncontrollably towards the end, a point which once seemed like a vague area of looming dread on the horizon. That point is now somewhat more tangible.

This means I actually need to:

a) Finish things

b) Write those things down.

c) ???

d) Profit

Therefore I actually have to start the process of turning all the things I’m excited about investigating into actual science writing. Chapters! In a thesis!

So far I feel like I have a few potential data chapters, mainly the individual tracking data, the rafting data and collective behaviour data.

I haven’t talked about the collective behaviour stuff much yet. I’ll probably devote a blog post to it when it comes together more (at the moment it exists as some ideas in my brain, and some particular video footage on a hard drive). I’ve been inspired by a few really good talks I’ve seen at ASAB meetings from both Dr Christos Ioannou and Professor Ian Couzin who both do very cool stuff in the field of collective behaviour. Collective behaviour, put simply, looks into the dynamics that control movement and behaviour within large groups of animals. Which seems perfect for analysing my foraging rafts!

As I said though, this exists mainly as ideas at the moment. Actually doing this kind of stuff requires some serious programming and mathematical work. Dr Christos threw some useful papers my way after this year’s Easter ASAB meeting, which helped when it came to data collection, so those will probably inform the methods I use.

Aside from the data chapters I have my modelling chapter and a literature review. These feel in many ways like they have been dragging on since the first year (because they have!), so I need to make a concerted effort to actually finish them over the next few months.

Of course, I also need to analyse and write up all the data chapters too. As well as bring the collective behaviour chapter into being.

This has been a wordy post, full of plans and hand-wringing. Here is a nice picture.


Representing the sunset of my PhD aaaarg.