Single-Chip Bio-inspired Motion Sensors for Autonomous Navigation
P. Abshire, S. Humbert
This project was sponsored by NSF.
Bioinspired, optic flow, wide field integration, low power, adaptive integrated circuits
There is increasing interest in adapting strategies from insect vision for visually guided navigation, particularly for micro/nano air vehicles, which require lightweight sensors that are incompatible with conventional image processing approaches. Flies rely heavily and successfully on optic flow for navigation and flight stabilization. As an insect moves through its environment, the patterns that form on the retina are time dependent and are a function of its relative motion and proximity to objects. The rate and direction of these local image shifts form the optic flow field. Cells in the early portion of the insect's visual pathway are thought to compute local estimates of optic flow, which are subsequently combined, in patterns extending over much of the visual field, by wide-field motion-sensitive cells to estimate motion cues and to generate compensatory commands used for stabilization and navigation.
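The dependence of the flow field on relative motion and proximity can be illustrated with a standard planar optic-flow model (a sketch only; this is not the project's exact formulation, and the corridor geometry and speeds below are illustrative values):

```python
import numpy as np

# Minimal sketch of a standard planar optic-flow model (illustrative, not
# the project's exact formulation). For a vehicle translating at forward
# speed v and rotating at yaw rate omega, the tangential flow seen at
# body-relative viewing angle gamma is
#     Q(gamma) = -omega + (v / d(gamma)) * sin(gamma),
# where d(gamma) is the distance to the nearest surface along that ray:
# flow grows with relative speed and with proximity to objects.

def optic_flow(gamma, v, omega, d):
    """Tangential optic flow (rad/s) at viewing angles gamma (rad)."""
    return -omega + (v / d) * np.sin(gamma)

# Example: a vehicle centered in a corridor of half-width a, heading
# parallel to the walls (a and v are assumed values for illustration).
a = 0.6                                      # corridor half-width, m
gamma = np.linspace(0.1, np.pi - 0.1, 19)    # 19 rays, one per motion detector
d = a / np.abs(np.sin(gamma))                # ray length to a side wall
Q = optic_flow(gamma, v=0.3, omega=0.0, d=d)
```

For a centered, wall-parallel run the resulting flow pattern is left/right symmetric; any lateral offset or yaw skews it, which is what the wide-field stage exploits.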
Dr. Abshire and her students have developed the first integrated, single-chip solution for autonomous navigation inspired by the insect visuomotor system. The chip design comprises: (1) an array of Elementary Motion Detectors (EMDs) to derive local estimates of optic flow, (2) a novel mismatch compensation approach to handle dissimilarities in local motion detector units, and (3) on-chip programmable optic flow pattern weighting (Wide-Field Integration, WFI) to extract relative speed and proximity with respect to the surrounding environment. Computations are performed in the analog domain and in parallel, providing outputs at 1 kHz while consuming only 42.6 µW. In collaboration with Dr. Humbert, the resulting sensor was integrated with a ground vehicle and navigation of corridor-like environments was demonstrated.
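The EMD array in item (1) follows the classic Hassenstein-Reichardt delay-and-correlate scheme. The chip realizes this in analog circuits; the digital sketch below only illustrates the principle, and the grating parameters are assumed values:

```python
import numpy as np

# Sketch of a Hassenstein-Reichardt elementary motion detector, the classic
# delay-and-correlate model behind EMDs (the chip implements this in analog
# hardware; this digital version is illustrative only).

def hr_emd(left, right, delay):
    """Opponent HR correlator output for two neighboring photoreceptors."""
    d_left = np.roll(left, delay)     # delayed copy of the left channel
    d_right = np.roll(right, delay)   # delayed copy of the right channel
    # Correlate each delayed channel with its undelayed neighbor and
    # subtract the mirror term to obtain a direction-selective signal.
    return d_left * right - left * d_right

# Example: a sinusoidal grating drifting from left to right reaches the
# right photoreceptor with a phase lag, giving a positive mean output;
# reversing the motion flips the sign.
t = np.arange(200)
phase = 2 * np.pi * 0.02 * t          # period of 50 samples
left = np.sin(phase)
right = np.sin(phase - 0.5)           # right receptor sees the pattern later
rightward = hr_emd(left, right, delay=5)
leftward = hr_emd(right, left, delay=5)
```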
The objectives of this project are:
1. Sense local optic flow and integrate patterns of those signals across the visual field to estimate control signals for autonomous navigation.
2. Perform offset cancellation so that component mismatch does not degrade the estimated control signals.
3. Demonstrate use of the sensor in a variety of ground- and air-based robots and applications.
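Objective 1 amounts to projecting the sensed flow pattern onto spatial weighting functions and reading the inner products out as control signals. A sketch, using low-order Fourier modes as the weights (a common choice in the WFI literature; the chip's four programmable weightings are not specified here):

```python
import numpy as np

# Sketch of Wide-Field Integration: project the measured optic-flow pattern
# onto spatial weighting functions. Low-order Fourier modes are assumed here
# as a common choice from the WFI literature; the chip's actual programmable
# weights may differ.

gamma = np.linspace(0.1, np.pi - 0.1, 19)    # 19 viewing directions
dgamma = gamma[1] - gamma[0]

weights = {
    "mean": np.ones_like(gamma),   # overall flow magnitude ~ speed/clearance
    "cos": np.cos(gamma),          # left/right asymmetry ~ lateral offset
    "sin": np.sin(gamma),          # forward-motion-weighted component
}

def wfi(Q, F):
    """Discrete inner product of flow pattern Q with weighting function F."""
    return np.sum(Q * F) * dgamma

# A centered, wall-parallel run gives a left/right symmetric flow pattern,
# so the antisymmetric (cosine) output is near zero -- a usable
# "stay centered" error signal for the controller.
Q_centered = np.sin(gamma) ** 2              # symmetric pattern (illustrative)
outputs = {name: wfi(Q_centered, F) for name, F in weights.items()}
```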
Overview of approach
WFI Motion Sensor. The WFI motion sensor implements on-chip optic flow computation and a programmable current matrix for spatial filtering. Insect-inspired motion sensors for navigation have been built previously, but typically EMD responses are spatially averaged across the visual field to extract only global velocity information. Because the detailed spatial structure of the optic flow pattern is required to extract dynamic and kinematic parameters of the robot's state for navigation, variations within the chip arising from the fabrication process can pose a serious problem. The key technical feature that enables this motion sensor is the use of on-chip adaptation for both offset cancellation and programmable filtering. The adaptation mechanism employs nonvolatile charge storage on the gates of floating-gate MOS transistors, and the circuits are configured such that programming is achieved accurately and automatically.
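Functionally, the offset cancellation amounts to recording each element's response to a null stimulus and subtracting that correction from every subsequent reading. A software analogue (the stored array stands in for the nonvolatile floating-gate charge; the mismatch statistics are assumed, not measured values):

```python
import numpy as np

# Software analogue of per-element offset cancellation. On the chip the
# correction is stored as nonvolatile charge on floating-gate transistors;
# here a calibration array stands in for that storage. Mismatch statistics
# are illustrative, not measured values.

rng = np.random.default_rng(0)
N_EMD = 19
offsets = rng.normal(0.0, 0.05, size=N_EMD)  # fixed fabrication mismatch

def emd_read(flow):
    """Raw EMD outputs corrupted by fixed per-element offsets."""
    return flow + offsets

# Calibration phase: record each element's response to a null (zero-flow)
# stimulus and store it as the per-element correction.
stored_correction = emd_read(np.zeros(N_EMD))

# Normal operation: subtract the stored correction from every reading, so
# mismatch no longer distorts the spatial structure of the flow pattern.
true_flow = np.full(N_EMD, 0.2)              # identical stimulus to all EMDs
corrected = emd_read(true_flow) - stored_correction
```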
Experimental Results. The single-chip WFI motion sensor was integrated with a ground vehicle and used to guide navigation through corridor-like environments. Four spatial weighting functions were implemented. Closed-loop navigation experiments were conducted in a bent corridor environment with the inner walls covered by natural scenery. The corridor was approximately 12 feet long and 4 feet wide. Multiple runs were conducted for three initial conditions: (1) centered, (2) initial position displacement from center, and (3) initial angular displacement from center. The standard deviations of the position displacement along the resulting trajectories were 6.8, 6.6, and 9.6 cm respectively.
The positional accuracy is limited by the number of elements and the field of view. The present system is a first-time proof-of-principle system with only 19 EMDs, each separated by 5° of visual angle, covering just over 80° of visual angle. The results correspond to an average error (standard deviation) of ~0.1° per second. With more elements the accuracy should improve. We estimate that the WFI sensor package will be <1 cm³ in size, <1 g in weight, and <200 µW in power consumption.
Dr. Pamela Abshire
Department of Electrical and Computer Engineering and Institute for Systems Research
2211 A. V. Williams Bldg.
University of Maryland
College Park, MD 20742
Dr. Sean Humbert
Department of Aerospace Engineering
3182 Glenn L. Martin Hall
University of Maryland
College Park, MD 20742
Phone: (301) 405-0328