“I’m sorry Julian, I can’t do that.”
So in a previous post I mentioned that I had to calculate a 7-dimensional matrix. I also mentioned that this might be enjoyable for the computer.
It really, very much was not.
I’ll skip the bit where I had to work out how to code the forward iteration. It took me a while of staring at equations over the course of my short visit back to Wales at Christmas (the examples in the relevant books don’t deal with anywhere near as many state variables as I use), but I got there in the end: I understood what I needed to calculate at each step.
The problem came when, as mentioned before, I had to get the computer to calculate a 7-dimensional transition matrix, which gives the probabilities of how a bird’s state variables might evolve. My first problem was that this is FAR too many numbers: my computer simply can’t hold them all in memory! This at least was more or less fixed by switching to cell arrays of sparse matrices. A sparse matrix stores only the locations and values of its non-zero entries, and a cell array is a convenient way of holding a collection of them without venturing into the realm of ’orrible multidimensional matrices.
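The post doesn’t name its language, though cell arrays of sparse matrices suggest MATLAB. As a rough Python sketch of the same storage idea (a dict standing in for the cell array, and a dict of `(row, col) -> value` standing in for each sparse matrix; the sizes are made up):

```python
# Hypothetical sizes for the 7 state variables (not from the post).
dims = (10, 10, 10, 10, 10, 10, 10)

# "Cell array": keyed by the first five state indices. Each entry is a
# "sparse matrix" over the last two dimensions, holding only non-zeros,
# so the full 10**7-entry array never has to exist at once.
transition = {}

def set_prob(state, p):
    """Record a non-zero transition probability for a 7-tuple state."""
    outer, inner = state[:5], state[5:]
    transition.setdefault(outer, {})[inner] = p

set_prob((0, 0, 0, 0, 0, 3, 7), 0.25)  # only non-zero entries take up space
```

Looking up a probability then defaults to zero for any entry that was never stored, which is exactly the economy a sparse representation buys.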
My next problem was how to implement the calculations. My rather naive approach was to use loops, which step through every possible value, one at a time. To cover every dimension, I had 7 loops, all nested inside each other.
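In Python the naive version might look something like this (tiny made-up dimensions and a placeholder probability rule, just to show the shape of the seven-deep nest):

```python
import numpy as np

dims = (3, 3, 3, 3, 3, 3, 3)  # hypothetical tiny state-space sizes

def prob(state):
    # Placeholder: a uniform probability. The real model's transition
    # rule would go here.
    return 1.0 / np.prod(dims)

P = np.zeros(dims)
# Seven nested loops, one per state variable — this is the slow version.
for a in range(dims[0]):
    for b in range(dims[1]):
        for c in range(dims[2]):
            for d in range(dims[3]):
                for e in range(dims[4]):
                    for f in range(dims[5]):
                        for g in range(dims[6]):
                            P[a, b, c, d, e, f, g] = prob((a, b, c, d, e, f, g))
```

Each cell is visited individually, so the cost grows as the product of all seven dimension sizes.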
This was very, very, very slow indeed.
In fact, I did a quick calculation and worked out that to run the forward iteration, once, with one set of initial parameters, would take about a month. That clearly was not going to be acceptable.
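A back-of-envelope of the kind described, with entirely made-up numbers (the post gives neither grid sizes nor fill rates), might go:

```python
# All figures below are hypothetical, chosen only to illustrate the
# arithmetic — the post does not report them.
cells = 10 ** 7        # entries in the 7-D transition structure
timesteps = 60         # length of one forward-iteration run
rate = 250             # cells the looped code fills per second

seconds = cells * timesteps / rate
days = seconds / 86400  # comes out at roughly a month
```

With numbers of this order, one full run of the loop-based code lands around the month mark, which matches the post’s conclusion that loops were hopeless.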
So, I had to optimise my code. After some standard code-related hair-pulling, I managed to vectorise it. Vectorising basically means that rather than the computer checking one number, and filling one cell of the 7-dimensional matrix, per pass through the loops, it performs a whole batch of checks and assignments at once, cutting down the number of loops.
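Continuing the earlier Python sketch (same made-up dimensions and placeholder probabilities), the vectorised idea is to replace the two innermost loops with a single whole-array assignment:

```python
import numpy as np

dims = (3, 3, 3, 3, 3, 3, 3)  # hypothetical tiny state-space sizes

P = np.zeros(dims)
# Placeholder probabilities for one slab over the last two dimensions;
# the real model would compute these from the bird's state.
block = np.full(dims[5:], 1.0 / np.prod(dims))

# Five loops instead of seven: np.ndindex walks the first five
# dimensions, and each assignment fills an entire 2-D slab at once.
for idx in np.ndindex(*dims[:5]):
    P[idx] = block
```

The speedup comes from pushing the innermost work into one array operation, where the numerical library handles the per-element loop internally rather than the interpreter doing it cell by cell.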
This worked fantastically. A calculation that previously took over a day now takes only a few minutes.
However, I still hit errors when the model runs for more than 60 timesteps. A curiously inflexible barrier: 59 timesteps, fine; 60, an inevitable out-of-memory error.