I am currently working on a mobile robot that will use vision-based navigation to automatically find a charging spot when its battery is low and dock itself to it.
For this task, I first broke the work down into the following steps:
- Take a number of reference images of the object to be detected. (To test my code, I wrote "POW" (power) on a notebook and tracked those characters. Depending on the location of "POW" in the image, I generate the control command for the robot, i.e. whether it should go left or right and by how much, using a PID controller; see the steering sketch after this list.)
- Detect and extract SURF features (a faster alternative to SIFT) from the reference images, and build a feature vector by stacking the features from the different reference images. In my case I took only the single reference image shown below, but you can take multiple images and stack their features.
- Detect and extract SURF features from the frames captured by the camera in real time.
- Match those features. In MATLAB there is only one feature-matching function, which is very robust; if you are using OpenCV, you can go for the Brute-Force matcher or the FLANN-based matcher. I have done the same thing in OpenCV; if anyone wants the OpenCV code, ask in the comments section below. A minimal MATLAB sketch of these three steps follows this list.
- Based on the matched features, compute the centroid of the "POW" symbol; if the centroid lies in the right half of the image, send a right-turn command to the motor, otherwise a left-turn command.
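To make the detection, extraction, and matching steps concrete, here is a minimal sketch. It assumes the Computer Vision Toolbox (detectSURFFeatures, extractFeatures, matchFeatures) and the USB Webcam support package; the file name 'pow_ref.jpg' is a placeholder for your own reference image, not my actual file.

```matlab
% Minimal sketch of the SURF pipeline (Computer Vision Toolbox assumed).
% 'pow_ref.jpg' is a placeholder reference image of the "POW" symbol.
refImg = rgb2gray(imread('pow_ref.jpg'));
refPts = detectSURFFeatures(refImg);                 % detect SURF points
[refFeat, refPts] = extractFeatures(refImg, refPts); % extract descriptors

cam    = webcam;                       % needs the USB Webcam support package
frame  = rgb2gray(snapshot(cam));      % grab one live frame
scnPts = detectSURFFeatures(frame);
[scnFeat, scnPts] = extractFeatures(frame, scnPts);

% Match the reference descriptors against the live frame.
idxPairs   = matchFeatures(refFeat, scnFeat);
matchedScn = scnPts(idxPairs(:, 2));   % matched points in the live frame
```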
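And here is a condensed sketch of the centroid and steering logic. The steerPID helper, its gains Kp/Ki/Kd, the ~30 fps assumption, and the printed commands are placeholders you would tune and replace with your own motor interface; the exact code on my robot is in the embedded code below.

```matlab
% Centroid of the matched points -> horizontal error -> steering command.
centroid = mean(matchedScn.Location, 1);   % [x y] centroid of the matches
err  = centroid(1) - size(frame, 2)/2;     % +ve when "POW" is in the right half
turn = steerPID(err, 1/30);                % assuming roughly 30 fps
if turn > 0
    fprintf('RIGHT by %.2f\n', turn);      % replace with your motor command
else
    fprintf('LEFT by %.2f\n', -turn);
end

function turn = steerPID(err, dt)
% PID on the horizontal pixel error; +ve turn means steer right.
% Kp, Ki, Kd are placeholder gains to tune on the actual robot.
persistent integ prevErr
if isempty(integ), integ = 0; prevErr = err; end
Kp = 0.01; Ki = 0.001; Kd = 0.005;
integ   = integ + err*dt;                  % integral term
deriv   = (err - prevErr)/dt;              % derivative term
prevErr = err;
turn = Kp*err + Ki*integ + Kd*deriv;
end
```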
The complete MATLAB code is embedded below.
Here is a working demo of my code:
For any suggestions, comment below. Thanks!