The Autonomous Systems Lab (ASL) is involved in several areas of research. A primary focus of the lab is Cooperative Simultaneous Localization and Mapping (SLAM) using a "network" of robots. We work with Inertial Navigation Units (INUs), odometry-based sensors, and vision systems. Sensor fusion, mobile mesh networking, and image processing are all active areas of research within the lab.
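As a flavor of what sensor fusion involves, the sketch below shows a minimal one-dimensional Kalman filter that blends odometry increments (the prediction step) with noisy position fixes (the update step). This is a generic textbook illustration, not the lab's actual pipeline; the noise variances and measurement values are invented for the example.

```python
def kalman_step(x, p, u, z, q=0.1, r=0.5):
    """One predict/update cycle of a 1-D Kalman filter.

    x, p : prior state estimate and its variance
    u    : control input (e.g., an odometry displacement)
    z    : measurement (e.g., an external position fix)
    q, r : process and measurement noise variances (assumed values)
    """
    # Predict: propagate the state with the odometry increment;
    # uncertainty grows by the process noise.
    x_pred = x + u
    p_pred = p + q
    # Update: blend in the measurement, weighted by the Kalman gain.
    k = p_pred / (p_pred + r)
    x_new = x_pred + k * (z - x_pred)
    p_new = (1 - k) * p_pred
    return x_new, p_new

# Fuse three odometry steps of ~1 unit with slightly noisy position fixes.
x, p = 0.0, 1.0
for u, z in [(1.0, 1.1), (1.0, 2.05), (1.0, 2.9)]:
    x, p = kalman_step(x, p, u, z)
```

After three steps the estimate settles near the true position of 3 units, and the variance shrinks well below its initial value, which is the essential behavior any fusion scheme for odometry and inertial data relies on.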
A key project in the ASL involves the development of a software framework for the control and management of robots. A screenshot of the framework processing INU data is shown at right. The lab has eight identical robots that were built in-house from off-the-shelf components. These robots have been used for mapping projects, mobile networking, autonomous racing, and as test beds for other research projects in the lab.
In addition to the research carried out in the lab, the ASL supports the ECE Capstone Design Course ENEE408I, Autonomous Robotics. The ASL team participates in a range of demonstration activities for the College and the ECE Department.
The ASL also supports a variety of undergraduate research projects, including the development of an autonomous helicopter, the design of a balance bot for an undergraduate controls lab, and a voice-controlled autonomous assistant for first responders.