Valentine’s Day, Deep Learning and more!

Hello everyone!

It’s the month of February! I hope everyone is out of the exam fever and looking forward to the new activities starting on campus. Well, I am trying to balance my research with some fun activities, and I was thinking of trying out some new SPORTICIPATE activities like women’s yoga and fitness. It is worth having a look at the timetable to see which activities you would like to try.

In addition, there was a Faculty Welcome event for new PGRs on the 29th of January. I did not have the opportunity to attend the one in September, so I decided to attend this one. I was surprised to see such a huge gathering of PhD students from all the schools in Manchester. Even though I have done my Masters here at Manchester, I find that by attending these events I learn so much more about the University and the facilities and activities available to us. There were talks from PhD students at different stages of their studies with tips on how to SURVIVE our PhD, as well as from the Education Officer of the Students’ Union and Faculty Representatives. It was a warm and welcoming event, and quite informative on the ways we can develop ourselves beyond just doing our PhD.

EPS Welcome Event, Renold Building – photo credits @epsgrads

Apart from that, I have found new ways to procrastinate. For instance, I read an interesting email from the Careers Service which talks about February being the month of “LOVE” and asks whether we find ourselves falling out of love with our PhD. Well, I think it’s too early to say, since I have only been in a five-month relationship with my PhD 😛 and things are in the exciting phase at the moment. 😀


Speaking of exciting things, I am currently looking into Caffe, a deep learning framework. I have started to work on Convolutional Neural Networks (CNNs) and I am looking into the Basic Linear Algebra Subprograms (BLAS), a library for performing matrix and vector operations. Basically, the convolution, which is the primary source of computation in CNNs, is implemented as one big matrix multiplication in Caffe. So, why am I doing this? Well, it’s the first step to understanding what’s happening in the code and seeing if there are ways in which I can do things better. I also came across a very helpful blog post by University of Manchester alumnus Pete Warden called “Why GEMM is at the heart of deep learning”.
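To make the convolution-as-matrix-multiplication idea concrete, here is a tiny NumPy sketch of the trick (often called “im2col”): each receptive field of the input is unrolled into a column, and the whole convolution then becomes a single GEMM. This is a toy, single-channel, single-filter illustration of the general idea, not Caffe’s actual code.

```python
import numpy as np

def im2col(image, k):
    """Unroll every k x k patch of a 2-D image into one column of a matrix."""
    h, w = image.shape
    out_h, out_w = h - k + 1, w - k + 1
    cols = np.empty((k * k, out_h * out_w))
    for i in range(out_h):
        for j in range(out_w):
            cols[:, i * out_w + j] = image[i:i + k, j:j + k].ravel()
    return cols

image = np.arange(16, dtype=float).reshape(4, 4)   # toy 4x4 input
kernel = np.ones((3, 3))                           # toy 3x3 filter

# Convolution (cross-correlation, as in most DL frameworks) as one GEMM:
# flattened filter (1 x 9) times patch matrix (9 x 4) gives the 2x2 output.
out = kernel.ravel() @ im2col(image, 3)
print(out.reshape(2, 2))   # [[45. 54.] [81. 90.]]
```

With many filters and channels, the flattened filters stack into the rows of one matrix, which is exactly why a fast GEMM routine from a BLAS library dominates CNN runtime.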

Last month, I also had a meeting with a group of people in Manchester using deep learning in their research and powering these algorithms with GPUs. I was quite happy to interact with colleagues from Manchester Institute of Biotechnology and the School of Electrical and Electronic Engineering.

Some of them were working on a variety of problems, like using deep learning for video classification and classifying proteins, if I recollect correctly. Anyway, I reckon it’s not yet official, but we have a new mailing list and we intend to have regular meetings. In a month’s time we are going to discuss the paper “Practical Recommendations for Gradient-Based Training of Deep Architectures” by Yoshua Bengio.

On a more fun note, I am currently planning to attend my first conference (drum roll….), the GPU Technology Conference in Silicon Valley in April. 🙂 I am quite excited, as it has a lot to offer in terms of keynote speeches and talks on deep learning technologies, the Internet of Things, self-driving cars and more. I hope to write a blog post about my experience at the conference.

Finally, classes have started for the undergraduates and postgraduates, which means TA work has begun for me. This month I have an undergraduate module, COMP14112: Fundamentals of Artificial Intelligence, which involves probability and its application to robot localization. It definitely sounds exciting, and I’ll be spending the next week preparing for the lab sessions!
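For a flavour of how probability meets robot localization (the exact lab material is my guess, so take this as a generic textbook example rather than the course’s code): a 1-D histogram Bayes filter. The robot lives on a ring of cells, some marked “door”; it alternately senses door/wall (noisily) and moves right (noisily), and each step updates a belief distribution over where it is.

```python
import numpy as np

world = np.array([1, 0, 0, 1, 0])   # assumed map: 1 = door, 0 = wall
belief = np.full(5, 0.2)            # start completely uncertain

def sense(belief, measurement, p_hit=0.8):
    """Bayes update: up-weight cells whose map value matches the sensor."""
    likelihood = np.where(world == measurement, p_hit, 1 - p_hit)
    posterior = belief * likelihood
    return posterior / posterior.sum()   # renormalize to a distribution

def move(belief, p_exact=0.9):
    """Motion update: shift right one cell, with some chance of staying put."""
    return p_exact * np.roll(belief, 1) + (1 - p_exact) * belief

belief = sense(belief, 1)   # the robot sees a door
belief = move(belief)       # then moves one cell to the right
belief = sense(belief, 1)   # and sees a door again
print(belief.round(3))      # door cells are now the most probable locations
```

The same sense/move loop, with a continuous state and Gaussian noise, is essentially a Kalman filter, which is presumably where the module’s probability content leads.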

You can catch me on twitter @crefeda