This paper describes the development effort involved in prototyping the first indigenous autonomous mobile robot (AMR) equipped with a manipulator for carrying out manufacturing tasks. The objective is to design and develop a vehicle that can navigate autonomously and transport jobs and tools in a manufacturing environment. Proprioceptive and exteroceptive sensors are mounted on the AMR for navigation. Among the exteroceptive sensors, a stereovision camera mounted at the front of the AMR provides the mobile robot's perception of the environment. Using the widely supported JPEG image file format, full high-resolution color images are transmitted frame by frame from the mobile robot to multiple viewers located within the robot work area, where fast reconstruction of these images enables remote viewing. A CMOS camera mounted on the manipulator identifies jobs for pick-and-place operations. A variant of the correlation-based adaptive predictive search (CAPS) method, a fast search algorithm for template matching, is used for job identification. Rather than performing a consecutive point-to-point search, CAPS selects among a set of search step sizes, which speeds up job identification. The search step, i.e., coarse or fine, is selected by calculating the correlation coefficient between the template and the image. Adaptive thresholding is used to segment the image and compute the object parameters required for proper gripping. Communication with the external world, which allows remote operation, is maintained through wireless connectivity. It is shown that autonomous navigation requires synchronization of different processes in a distributed architecture while concurrently maintaining the integrity of the network.
Keywords: Mobile robot · Autonomous navigation · Manufacturing environment · Distributed architecture · AMR
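The abstract does not include the CAPS implementation, so the following is only a rough sketch of the coarse-to-fine idea it describes: scan the image on a coarse grid, and switch to a fine point-to-point search only where the correlation coefficient with the template exceeds a threshold. All function names, step sizes, and the threshold value here are illustrative assumptions, not the authors' code.

```python
import numpy as np

def ncc(patch, template):
    """Normalized cross-correlation coefficient between a patch and the template."""
    p = patch - patch.mean()
    t = template - template.mean()
    denom = np.sqrt((p * p).sum() * (t * t).sum())
    return (p * t).sum() / denom if denom > 0 else 0.0

def caps_search(image, template, coarse_step=8, threshold=0.5):
    """Coarse-to-fine template search (illustrative CAPS-like sketch).

    The image is scanned on a grid of `coarse_step` pixels; wherever the
    correlation exceeds `threshold`, a fine point-to-point search is run
    in that neighbourhood. Returns the best (row, col) and its score.
    """
    th, tw = template.shape
    ih, iw = image.shape
    best_score, best_pos = -1.0, (0, 0)
    for y in range(0, ih - th + 1, coarse_step):
        for x in range(0, iw - tw + 1, coarse_step):
            if ncc(image[y:y + th, x:x + tw], template) > threshold:
                # fine (point-to-point) search around the promising cell
                for fy in range(max(0, y - coarse_step), min(ih - th, y + coarse_step) + 1):
                    for fx in range(max(0, x - coarse_step), min(iw - tw, x + coarse_step) + 1):
                        s = ncc(image[fy:fy + th, fx:fx + tw], template)
                        if s > best_score:
                            best_score, best_pos = s, (fy, fx)
    return best_pos, best_score
```

Skipping the fine search over most of the image is what makes this faster than exhaustive point-to-point matching, at the cost of possibly missing a match whose coarse-grid correlation falls below the threshold.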
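The adaptive-thresholding step used for segmentation is likewise not spelled out in the abstract. A minimal sketch of one common variant, mean-based adaptive thresholding, is shown below: each pixel is compared against the mean of its local neighbourhood, so uneven illumination across the work area does not break the segmentation. The block size, offset, and the rule "foreground if the pixel exceeds its local mean by `offset`" are my assumptions for illustration.

```python
import numpy as np

def adaptive_threshold(image, block=15, offset=5):
    """Mean-based adaptive thresholding (illustrative sketch).

    A pixel is marked foreground (255) when it exceeds the mean of its
    `block` x `block` neighbourhood by more than `offset`; an integral
    image makes each local mean an O(1) lookup.
    """
    pad = block // 2
    padded = np.pad(image.astype(float), pad, mode="edge")
    # integral image: integ[a, b] = sum of padded[:a, :b]
    integ = np.cumsum(np.cumsum(padded, axis=0), axis=1)
    integ = np.pad(integ, ((1, 0), (1, 0)))
    h, w = image.shape
    out = np.zeros((h, w), dtype=np.uint8)
    for y in range(h):
        for x in range(w):
            y1, x1 = y + block, x + block
            s = integ[y1, x1] - integ[y, x1] - integ[y1, x] + integ[y, x]
            mean = s / (block * block)
            out[y, x] = 255 if image[y, x] > mean + offset else 0
    return out
```

From the resulting binary mask, object parameters such as centroid and orientation can then be computed for gripping, as the abstract indicates.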