A multi-sensor system for a mobile robot for determining object location

Nathan R Lavallee, University of Rhode Island

Abstract

This thesis studies the problem of autonomous mobile robot control within a workspace, utilizing multiple sensory techniques to determine the position and orientation of both the robot and specified objects of interest within an environment. The goal of this project was to design and develop a method through which a centralized computer can access sensory information from multiple sources in order to command a mobile robot to analyze a specified target within a workspace. Specifically, the robot is directed to investigate a rectangular target supported by a variety of different support structures and to relay information to the central computer, where the structural details can be analyzed.

In this study, an overhead camera observes the workspace and relays captured images to a central computer. Captured images are analyzed through pixel recognition, and the robot's position and heading, as well as the target location, are calculated. Using this information, the ideal path the robot should follow is determined. After the environment is mapped, the robot is commanded to navigate through the work area. The robot, an iRobot® Create™, is controlled through several manufacturer-programmed commands in which velocity and turning radius are specified. Communication between the robot and the central computer is accomplished via a Bluetooth link.

The robot is directed to analyze a user-specified object. To accomplish this, one of two types of sensors was mounted to the robot: the Parallax PING)))™ Range Finder, which measures distance from the time of flight of ultrasonic pulses, and the SHARP GP2Y0A02YK Long Distance Measuring Sensor, which measures distance by infrared triangulation. Using either sensor, the distance is calculated by a PIC16F690 microcontroller and relayed to the central computer via a RoboTech EasyBT Bluetooth module from Parallax Inc. The central computer then uses this data to create an "occupancy grid" showing the surfaces of the target, or of any objects under the target, that may be hidden from the overhead camera.

The results of this study show that the camera-based feedback system for robot motion control offers a significant improvement over the encoder-based feedback programmed into the iRobot Create. In addition, this study shows that the ultrasonic PING))) Range Finder offers greater dependability and generates a more accurate visual depiction of specified objects. The pulse generated by the PING))) Range Finder is emitted in a ±15° cone in front of the sensor, a hardware limitation that causes the sensor to detect the beginning of a surface before the sensor is directly in front of it. This results in error artifacts in the occupancy grids, which appear as "wings" on the outside of surfaces. Two data-processing methods were developed to filter out this error. The first, a simple measurement-clustering filter, adequately removes most large errors but did not eliminate the aforementioned artifacts in all cases. The second, an angular error correction filter, uses the data from the distance-clustering filter and automatically accounts for the ±15° pulse angle. Using this filter, the system was able to generate occupancy grids accurately representing (with a maximum error of 0.87 inches) various support structures that may have been impossible to capture using the overhead camera alone. These results are subject to some system limitations. The robot's speed was set to 40 mm/s, because faster speeds cause inaccuracies in the robot positioning system and in occupancy grid generation. Furthermore, the robot should be as close to the target object as possible while recording distance readings; otherwise, small errors in the calculated robot heading produce large errors in the occupancy grid.
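To make the camera-and-drive loop described above concrete, the following Python sketch shows one plausible form of the host-side computation. The two-marker pose scheme, the pixel-to-inch scale factor, and the helper names are assumptions for illustration only; the Drive packet layout (opcode 137 with signed 16-bit velocity and turning radius) follows the published iRobot Create Open Interface, which the thesis uses via its manufacturer-programmed commands.

```python
import math
import struct

def robot_pose(front_px, rear_px, inches_per_pixel):
    """Estimate the robot's centre position and heading from two markers
    located by pixel recognition in the overhead image.
    (The marker scheme is an assumption; the abstract does not specify it.)"""
    cx = (front_px[0] + rear_px[0]) / 2.0 * inches_per_pixel
    cy = (front_px[1] + rear_px[1]) / 2.0 * inches_per_pixel
    heading = math.atan2(front_px[1] - rear_px[1], front_px[0] - rear_px[0])
    return (cx, cy), heading

def drive_packet(velocity_mm_s, radius_mm):
    """Build the iRobot Create Open Interface 'Drive' packet: opcode 137,
    then signed 16-bit velocity (mm/s) and turning radius (mm), big-endian."""
    return struct.pack('>Bhh', 137, velocity_mm_s, radius_mm)

# Example: creep forward at the 40 mm/s used in the study;
# radius 32767 is the Open Interface special case for "drive straight".
pkt = drive_packet(40, 32767)
```

In the actual system this packet would be written to the Bluetooth serial link between the central computer and the Create.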
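The range-to-grid step can likewise be sketched. The microsecond-per-inch constant is the figure quoted in the Parallax PING))) documentation, and the sensor mounting angle and grid representation (a dictionary keyed by cell index) are assumptions standing in for details the abstract does not give.

```python
import math

US_PER_INCH_ROUND_TRIP = 73.746   # PING))) datasheet value; assumed here

def ping_distance_inches(echo_pulse_us):
    """Convert the PING))) echo pulse width (microseconds) to a one-way
    distance in inches, as the PIC16F690 firmware would compute it."""
    return echo_pulse_us / US_PER_INCH_ROUND_TRIP

def mark_occupied(grid, cell_size_in, robot_xy, robot_heading,
                  sensor_mount_angle, range_in):
    """Project one range reading into the occupancy grid by placing a point
    at the reported distance along the sensor's line of sight.
    sensor_mount_angle is the sensor's offset from the robot heading
    (an assumption; the mounting is not described in the abstract)."""
    theta = robot_heading + sensor_mount_angle
    x = robot_xy[0] + range_in * math.cos(theta)
    y = robot_xy[1] + range_in * math.sin(theta)
    grid[(int(x // cell_size_in), int(y // cell_size_in))] = 1
    return grid

# Example: a 12.3 in reading taken while the robot faces along +x,
# with the sensor mounted 90° to the left of the heading.
grid = mark_occupied({}, 0.5, (24.0, 10.0), 0.0, math.pi / 2, 12.3)
```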
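Finally, the abstract names but does not define the angular error correction filter. A minimal sketch of one plausible reading of it is given below: because the ±15° cone lets the sensor clip the edge of a surface up to roughly range·tan(15°) before and after the robot is actually abeam of it, readings that close to either end of a clustered surface are treated as "wing" artifacts and discarded. This is an illustrative reconstruction under that assumption, not the thesis's exact algorithm.

```python
import math

HALF_CONE_RAD = math.radians(15.0)   # PING))) beam half-angle from the abstract

def trim_wings(cluster):
    """Remove likely 'wing' readings from one clustered surface.

    cluster: list of (along_track_position_in, range_in) tuples, ordered by
    the robot's travel and all assigned to one surface by the clustering
    filter.  A reading within range*tan(15 deg) of either end of the cluster
    could have been produced by the cone clipping the surface edge, so it is
    dropped; the remaining readings define the corrected surface extent."""
    if not cluster:
        return cluster
    start, end = cluster[0][0], cluster[-1][0]
    kept = []
    for s, r in cluster:
        margin = r * math.tan(HALF_CONE_RAD)
        if (s - start) >= margin and (end - s) >= margin:
            kept.append((s, r))
    return kept
```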

Subject Area

Engineering, Automotive; Engineering, Robotics

Recommended Citation

Nathan R Lavallee, "A multi-sensor system for a mobile robot for determining object location" (2012). Dissertations and Master's Theses (Campus Access). Paper AAI1509524.
http://digitalcommons.uri.edu/dissertations/AAI1509524
