The goal of my project was to design and implement a mobile robot that can be controlled wirelessly using speech as a user interface. This was my first mobile robot, which I built by borrowing concepts from here and there. I tried to follow open standards and use open-source hardware and software components wherever possible.
I chose a rover/tank-like two-wheeled mobile robot design. Using Blender and some freely available Blender models, I modeled the mobile robot and made some animations (see the blend file).
After comparing many speech recognition hardware modules, I chose EasyVR, a speaker-independent, isolated-word speech recognition hardware module. I wanted to use a Raspberry Pi, but it was not available at the time, so I used an Arduino microcontroller board, which is an open-source, low-cost, and easy-to-program microcontroller platform. For wireless communication I used a pair of ZigBee/IEEE 802.15.4 based radio modules, XBee Series 1.
Other main hardware parts I used were:
The whole system has three parts: the mobile robot, the base-station, and the Monitor program running on a PC for debugging and visual feedback.
The handheld base-station is used by the human user to control the mobile robot using speech as an interface. The base-station is also built on an Arduino microcontroller board; it connects to the speech recognition module and talks to the Monitor program running on a PC/Mac over USB at 9600 baud.
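To make the base-station's role concrete, here is a minimal sketch of how a recognized speech command might be mapped to a single command byte for the XBee link. The command names, byte values, and recognition-index ordering are all illustrative assumptions; the actual values depend on how the EasyVR wordset was trained and what protocol the robot expects.

```cpp
#include <cstdint>

// Hypothetical single-byte commands sent over the XBee serial link.
// The real project's protocol may differ.
enum Command : uint8_t {
    CMD_STOP     = 'S',
    CMD_FORWARD  = 'F',
    CMD_BACKWARD = 'B',
    CMD_LEFT     = 'L',
    CMD_RIGHT    = 'R',
    CMD_UNKNOWN  = '?'
};

// Map an EasyVR-style recognition result index to a command byte.
// The index-to-word ordering here is an assumption.
uint8_t commandForIndex(int index) {
    switch (index) {
        case 0:  return CMD_STOP;
        case 1:  return CMD_FORWARD;
        case 2:  return CMD_BACKWARD;
        case 3:  return CMD_LEFT;
        case 4:  return CMD_RIGHT;
        default: return CMD_UNKNOWN;  // recognition failed or out of range
    }
}
```

On the Arduino side, the resulting byte would simply be written to the serial port wired to the XBee, e.g. with `Serial.write(commandForIndex(i))`.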
For the Monitor’s visualization I forked the Processing code from this tutorial.
For obstacle avoidance, I implemented a naive algorithm: the mobile robot stops if an obstacle is detected closer than a given threshold distance, then sweeps its sensor to search for an obstacle-free direction and moves that way.
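The sweep-and-search step can be sketched as a simple "pick the clearest heading" decision. This is a minimal desktop-testable version of the idea, assuming range readings in centimeters collected during the sweep; the function name and the 30 cm threshold are illustrative, not the project's actual values.

```cpp
#include <cstddef>
#include <vector>

// Illustrative stop threshold in centimeters (assumed value).
const double THRESHOLD_CM = 30.0;

// Given range readings taken while sweeping the sensor across
// candidate headings, return the index of the clearest heading
// whose reading exceeds the threshold, or -1 if all are blocked.
int pickClearHeading(const std::vector<double>& distances) {
    int best = -1;
    double bestDist = THRESHOLD_CM;
    for (std::size_t i = 0; i < distances.size(); ++i) {
        if (distances[i] > bestDist) {
            bestDist = distances[i];
            best = static_cast<int>(i);
        }
    }
    return best;
}
```

In the robot's loop, a -1 result would mean "stay stopped and sweep again"; otherwise the robot turns toward the chosen heading and resumes moving.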
© Rohit Yadav 2009-2012 | Report bug or fork source | Last updated on 30 Nov 2012