My Masters
After thoroughly enjoying the research-based aspects of my undergraduate engineering degree, I chose to further my studies with a Master of Engineering. I have since completed my ME thesis, titled A Semi-autonomous Wheelchair Navigation System. The title should give you a fairly clear indication of what the project was about. Using a systems-engineering-based approach, the key aspects of the system I developed were:
A lot of work! At my university, in addition to the thesis, another requirement of the Master of Engineering is to complete four courses. Since I had already taken many of the interesting and relevant courses in my Mechatronics undergraduate degree, three of the four courses I took were ones for which I was missing two years of prerequisite courses each. Sure, I could have taken 'easier' courses, but I'm glad I took ones that challenged me and broadened my skill set. The courses I took were machine learning, advanced algorithms, image reconstruction, and advanced embedded systems.
Believe it or not, the scope for developing a semi-autonomous wheelchair navigation system is large and, to be honest, quite ambitious for a Masters. I say this not to gloat but to share my hindsight. Your Masters project is, and in my opinion always should be, entirely propelled by yourself. Of course it is easier if others help you along the way (supervisors, technicians, colleagues, etc.), but ultimately, and most likely, you are the only one who can overcome particular challenges and problems with your project. Even part way through your thesis, chances are that you are more of an expert in that area than your supervisor(s).
Since I did not want to concentrate solely on one area, in order to deliver on the large scope of the project I utilised many existing open-source frameworks. Without a doubt, bolting existing software together greatly speeds up development and allows you to focus your attention on the higher-level aspects of the project. However, in my experience, there is no such thing as a smooth, hassle-free install of an open-source project. The problem is dependencies and the differences between your system and the developer's. Installing and using open-source software is great, but be aware that it takes a lot more effort and work than others will probably give you credit for (though the more you install, the easier it gets). The main open-source framework I used was The Player Project, and to make installing it easier, I've devoted a page on this website to getting you up and running quicker.
Over the duration of my Masters, I also added a couple of programming languages to my repertoire. I learnt how to use Python quickly after jumping in at the deep end (threads, event handlers, GUI design, and interfacing to code I had written in C/C++). I wanted to use Python for the front-end interface to the system, i.e., the GUI. I also learnt how to write documents in LaTeX. Like any other programming language, expect to have to get over the initial learning hurdle. In addition, I became more fluent in operating Linux, particularly through the command line, and better at installing open-source projects.
Most of my Masters was software related. However, in order to evaluate sensors and test out the system, I also had to modify electronics, design circuit boards, and instrument a wheelchair with sensors. Dealing with hardware can be just as frustrating as software. A lot of time is taken up by testing, and it can be difficult even to find a safe environment in which to test your system.
Lastly, and a key component of the Masters, is the write-up of the project in the form of a thesis (sometimes referred to as a dissertation). My thesis ended up being a 127-page document with approximately 40,000 words and 122 references. A fair chunk of my thesis was devoted to presenting and discussing others' research. Literature reviews can be quite time consuming - unless you are already clued up, you end up re-reading dissertations, articles, journals, and papers several times before you can start synthesising and writing about their work in your own words.
Here are several videos relating to my Masters. The first two clips involve experiments using a physical robot (i.e., the instrumented electric wheelchair) and demonstrate the mapping module's ability to map a simple office environment. The first video uses a Hokuyo URG-04 laser rangefinder and also shows the GUI that I developed for the system. Note that the laser data overlay is often out of sync with the local map. This is because the laser data being displayed is current, whereas the map data is shown at a much lower rate for performance. The frame rate of the onboard video has also been reduced, again for performance reasons.
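To give a feel for what a mapping module like this does with each laser scan, here is a minimal occupancy-grid update in Python. This is an illustrative sketch, not code from my project: the function name, the dict-based grid, and the increment values are all my own simplifications of the standard log-odds update.

```python
import math

def update_grid(grid, pose, ranges, angles, res=0.1, max_range=4.0):
    """Update a dict-based occupancy grid with one laser scan.

    grid          : dict mapping (ix, iy) cell -> log-odds value
    pose          : (x, y, theta) robot pose in metres/radians
    ranges/angles : per-beam range (metres) and bearing (radians)
    res           : grid cell size in metres
    """
    x, y, th = pose
    for r, a in zip(ranges, angles):
        hit = r < max_range              # out-of-range beams have no endpoint
        r = min(r, max_range)
        # Walk along the beam one cell at a time, marking cells as free
        for i in range(int(r / res)):
            d = i * res
            cx = int((x + d * math.cos(th + a)) / res)
            cy = int((y + d * math.sin(th + a)) / res)
            grid[(cx, cy)] = grid.get((cx, cy), 0.0) - 0.4   # more likely free
        if hit:                          # mark the beam endpoint as occupied
            ex = int((x + r * math.cos(th + a)) / res)
            ey = int((y + r * math.sin(th + a)) / res)
            grid[(ex, ey)] = grid.get((ex, ey), 0.0) + 0.9   # more likely occupied
    return grid
```

Cells with positive log-odds are drawn as obstacles, negative as free space, and never-touched cells as unknown - which is also why out-of-range readings can leave grey (unknown) areas in a map, as mentioned below.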
In the second video, a comparison is made between the three different laser scanners installed on the robot. The leftmost map was created with the Hokuyo URG-04, a typical rangefinder used in robotics. The resulting map often has grey areas in the middle of the robot's path as a result of out-of-range readings; this could be fixed by changing the interpretation of those points. The middle map was from a sensor pulled from a robotic vacuum cleaner. Unfortunately, this sensor was somehow damaged and was not working properly at the time it was used for mapping. The rightmost map was made with a sensor derived from an Xbox Kinect. Essentially, the Kinect outputs distances to points in 3-D, and to emulate a 2-D laser scanner you need to project those points onto the X-Y plane (while taking into account the mounting geometry of the device).
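The projection step just described can be sketched in a few lines of Python. This is a hypothetical illustration rather than the code used in the project - the horizontal-band filtering and the frame convention (x forward, y left, z up, already corrected for the mount) are my assumptions:

```python
import math

def kinect_to_scan(points, band=0.05):
    """Emulate a 2-D laser scan from Kinect-style 3-D points.

    points : iterable of (x, y, z) in metres, in a frame with x forward,
             y left, and z up at the desired scan height (i.e., mounting
             geometry already compensated for).
    Keeps only points within a thin horizontal band, then projects each
    onto the X-Y plane as a (range, bearing) pair.
    """
    scan = []
    for x, y, z in points:
        if abs(z) > band:                # discard points outside the slice
            continue
        scan.append((math.hypot(x, y), math.atan2(y, x)))
    return sorted(scan, key=lambda rb: rb[1])   # order beams by bearing
```

A real implementation would also bin points into fixed angular steps and take the minimum range per bin, so the output matches the beam structure a planar rangefinder produces.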
The last two videos use the Stage robotics simulator to emulate not only the environment, but also the robot and its sensors. The video below shows the map being created while the robot is driven around via the keyboard. Note that the interface is the same between the actual robot and the simulated one; the two setups differ only by a few high-level configuration settings. This is one of the huge benefits of a modern robotics framework such as Player.
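To illustrate how small that configuration difference can be, here are two hypothetical Player config fragments. The driver and interface names follow Player's conventions, but the specific values (device path, world file, model name) are made up for this example:

```
# hardware.cfg - serve laser:0 from a real Hokuyo URG over USB
driver
(
  name "urglaser"
  provides ["laser:0"]
  port "/dev/ttyACM0"
)

# simulation.cfg - serve the same laser:0 interface from Stage
driver
(
  name "stage"
  plugin "stageplugin"
  provides ["simulation:0"]
  worldfile "office.world"
)
driver
(
  name "stage"
  provides ["laser:0" "position2d:0"]
  model "wheelchair"
)
```

Client code subscribes to `laser:0` either way, so nothing above the driver layer has to change.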
The final video demonstrates the path planning and autonomous navigation functionality of the system. Once the environment had been adequately mapped, a goal point was specified and the path planner searched for the best route to it. Provided a path was found, at the click of a button the robot would then drive by itself to the goal point while avoiding dynamic obstacles (obstacles that were not present in the map).
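For the curious, grid-based path planning of this kind can be illustrated with a simple breadth-first search over the occupancy grid. This toy sketch (hypothetical names throughout) is a stand-in for a real planner, which would typically use a wavefront or A* search with costs that keep the robot away from obstacles:

```python
from collections import deque

def plan_path(grid, start, goal):
    """Breadth-first path search on a 2-D occupancy grid.

    grid        : list of rows; 0 = free cell, 1 = obstacle
    start, goal : (row, col) tuples
    Returns the shortest list of cells from start to goal, or None.
    """
    rows, cols = len(grid), len(grid[0])
    parent = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []                    # walk parents back to the start
            while cell is not None:
                path.append(cell)
                cell = parent[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in parent):
                parent[(nr, nc)] = cell
                queue.append((nr, nc))
    return None                          # goal unreachable
```

Avoiding dynamic obstacles is a separate, reactive layer on top of this: the global plan gives the route, while a local controller deviates around obstacles the map does not contain.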
You can access an online copy via the UC Research Repository here.
My project had a brief, light-hearted bit of media coverage. It was shown on CTV (Christchurch television) and published online. You can view the short two-minute clip about my project, and learn a little about myself, below:
After going through my Masters, I have a few words for those of you considering postgraduate studies in an engineering field.