We’ve all been there. You’ve just run an expensive computation in your Jupyter Notebook and are about to draw the conclusions that will prove your theories were right all along (until you find the sixteen bugs in your code which render them invalid, but that’s an issue for a different time). Then, at the critical moment, your flatmate begins streaming their Lord of the Rings marathon in 4K, and your already temperamental Wi-Fi severs your connection to the department servers in protest, crashing your Jupyter Notebook and leaving your hopes and dreams in tatters.
Converting Miles to Kilometres – An inefficient but neat method
Picture this: You’re a zealous acolyte of the metric system, with a rare affliction that makes multiplying decimal numbers impossible. You’re on holiday in the UK, where road signs give distances in miles. Heathens! How can you efficiently estimate the number of kilometres without multiplying by approximately 1.60934?
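The excerpt stops before the trick itself, but one well-known multiplication-free approach (a sketch of the general idea, not necessarily the method in the full post) leans on the fact that consecutive Fibonacci numbers grow by roughly the golden ratio, 1.618, which is conveniently close to the miles-to-kilometres factor of 1.609: write the distance as a sum of Fibonacci numbers, then replace each term with the next Fibonacci number up.

```python
def miles_to_km(miles):
    """Estimate miles -> km with no decimal multiplication.

    Write the distance as a greedy sum of Fibonacci numbers, then shift
    each term to the next Fibonacci number; since consecutive Fibonacci
    numbers have a ratio near 1.618 (close to 1.609), the shifted sum is
    a decent estimate of the distance in kilometres.
    """
    fibs = [1, 2]
    while fibs[-1] < miles:
        fibs.append(fibs[-1] + fibs[-2])
    fibs.append(fibs[-1] + fibs[-2])  # one spare, so every term has a successor

    km, remaining = 0, round(miles)
    for i in range(len(fibs) - 2, -1, -1):  # greedy, largest Fibonacci first
        if fibs[i] <= remaining:
            remaining -= fibs[i]
            km += fibs[i + 1]
    return km

print(miles_to_km(50))  # 81, versus the exact 80.5 km
```

Hopelessly inefficient next to a single multiplication, certainly, but it can be done in your head at the roadside.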
How to be a Bayesian – ft. a completely ridiculous example
Most of the stats we are exposed to in our formative years as statisticians are viewed through a frequentist lens. Bayesian methods are often regarded with scepticism, perhaps due in part to a lack of understanding of how to specify a prior distribution, and perhaps due to uncertainty about what to do with the posterior once we’ve got it.
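For anyone who has not seen the mechanics before, here is a tiny illustration of the prior-to-posterior workflow (the coin-flip setting and the numbers are my own, not taken from the post): with a Beta prior on a probability and Binomial data, the posterior is again a Beta distribution, obtained simply by adding the observed successes and failures to the prior’s parameters.

```python
from scipy import stats

# Hypothetical example: estimate the probability p that a coin lands heads.
a_prior, b_prior = 2, 2   # Beta(2, 2) prior: p is probably somewhere near 0.5
heads, tails = 7, 3       # observed data: 7 heads in 10 flips

# Conjugacy: Beta prior + Binomial likelihood gives a Beta posterior
posterior = stats.beta(a_prior + heads, b_prior + tails)

print(posterior.mean())          # posterior mean for p, roughly 0.64
print(posterior.interval(0.95))  # a 95% credible interval for p
```

Once you have the posterior you can report its mean, quote credible intervals, or feed it into whatever decision you actually care about, which is the “what do we do with it” part.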
A Gentle Introduction to the GPyOpt Module
Manually tuning hyperparameters in a neural network is slow and boring. Using Bayesian Optimisation to do it for you is slightly less slow, and you can go and do other things whilst it’s running. Susan recently highlighted some of the resources available for getting to grips with GPyOpt. Below is a copy of a Jupyter Notebook in which we walk through a couple of simple examples and hopefully shed a little bit of light on how the algorithm works.
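As a taste of what the notebook covers, here is a minimal sketch of driving GPyOpt on a toy one-dimensional objective (the objective, bounds, and settings are my own illustrative choices, not the examples from the notebook itself):

```python
import numpy as np
import GPyOpt

def objective(x):
    # GPyOpt passes a 2-D array with one evaluation point per row
    return ((x[:, 0] - 2.0) ** 2 + np.sin(5 * x[:, 0])).reshape(-1, 1)

domain = [{'name': 'x', 'type': 'continuous', 'domain': (0.0, 4.0)}]

optimiser = GPyOpt.methods.BayesianOptimization(
    f=objective,
    domain=domain,
    acquisition_type='EI',      # expected improvement acquisition function
    initial_design_numdata=5,   # a few random evaluations before the GP takes over
)
optimiser.run_optimization(max_iter=15)

print(optimiser.x_opt, optimiser.fx_opt)  # best input found and its objective value
```

Tuning a real network works the same way: each row of x becomes a hyperparameter setting, and the objective trains the model and returns the validation loss.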