KEYWORDS: Robots, Field programmable gate arrays, Image processing, Sensors, LabVIEW, Cameras, Control systems, Distributed computing, Data processing, Global Positioning System
This paper presents the application of a distributed systems architecture to an autonomous ground vehicle, Q,
that participates in both the autonomous and navigation challenges of the Intelligent Ground Vehicle Competition.
In the autonomous challenge, the vehicle is required to follow a course while avoiding obstacles and
staying within the course boundaries, which are marked by white lines. For the navigation challenge, the vehicle
is required to reach a set of target destinations, known as waypoints, with given GPS coordinates and
avoid obstacles that it encounters in the process. Previously, the vehicle utilized a single laptop to execute all
processing activities, including image processing, sensor interfacing and data processing, path planning and navigation
algorithms, and motor control. National Instruments' (NI) LabVIEW served as the programming language
for software implementation. As an upgrade to last year's design, an NI Compact Reconfigurable Input/Output
(cRIO) system was incorporated into the system architecture. The cRIO is NI's rapid prototyping solution,
equipped with a real-time processor, an FPGA, and modular input/output. Under the current system,
the real-time processor handles the path planning and navigation algorithms, while the FPGA gathers and processes
sensor data. This setup leaves the laptop free to focus on running the image processing algorithm. Image processing,
as previously presented by Nepal et al., is a multi-step line extraction algorithm and constitutes the largest
processor load. This distributed approach results in a faster image processing algorithm, which was previously
Q's bottleneck. Additionally, the path planning and navigation algorithms are executed more reliably on the real-time
processor due to the deterministic nature of its operation. The implementation of this architecture required
exploration of various inter-system communication techniques. After testing various options, data transfer between
the laptop and the real-time processor using UDP packets was established as the most reliable protocol.
The system can be improved further by migrating more algorithms to the hardware-based FPGA to speed up
the operations of the vehicle.
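For context, the sketch below illustrates the kind of UDP packet exchange described above between the laptop and the cRIO's real-time processor. It is written in Python rather than LabVIEW purely for readability; the addresses, port numbers, and message layout are illustrative assumptions, not values taken from the paper.

```python
# Minimal sketch of a laptop-to-real-time-target UDP exchange (Python stand-in
# for the LabVIEW implementation described above; IP address, ports, and the
# packed message layout are illustrative assumptions, not the paper's values).
import socket
import struct

CRIO_ADDR = ("192.168.1.10", 5005)   # assumed address/port of the cRIO target

def send_heading(sock: socket.socket, heading_deg: float, speed: float) -> None:
    """Pack a heading/speed command into a fixed-size datagram and send it."""
    payload = struct.pack("!ff", heading_deg, speed)  # network byte order
    sock.sendto(payload, CRIO_ADDR)

def receive_status(sock: socket.socket) -> tuple[float, float]:
    """Wait for one status datagram (e.g., a GPS-derived position) from the cRIO."""
    data, _ = sock.recvfrom(1024)
    lat, lon = struct.unpack("!dd", data[:16])
    return lat, lon

if __name__ == "__main__":
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.bind(("", 5006))          # local port for replies from the cRIO
        sock.settimeout(1.0)           # avoid blocking forever if no reply arrives
        send_heading(sock, 90.0, 0.5)  # request a turn toward 90 degrees at half speed
        print(receive_status(sock))
```

UDP's connectionless, low-overhead datagrams suit this kind of periodic command/status traffic, which is consistent with the paper's finding that it was the most reliable option tested for laptop-to-cRIO transfer.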
Q is an unmanned ground vehicle designed to compete in the Autonomous and Navigation Challenges of the AUVSI
Intelligent Ground Vehicle Competition (IGVC). Built on a base platform of a modified PerMobil Trax off-road wheelchair
frame, and running off a Dell Inspiron D820 laptop with an Intel T7400 Core 2 Duo processor, Q gathers
information from a SICK laser range finder (LRF), video cameras, a differential GPS, and a digital compass to localize
itself and map out its navigational path. This behavior is handled by intelligent closed-loop speed control and robust
sensor data processing algorithms. In the Autonomous Challenge, data taken from two IEEE 1394 cameras and the LRF
are integrated and plotted on a custom-defined occupancy grid and converted into a histogram, which is analyzed for
openings between obstacles. The image processing algorithm consists of a series of steps involving plane extraction,
normalization of the image histogram for effective dynamic thresholding, texture and morphological analysis, and
particle filtering, to allow optimum operation under varying ambient conditions. In the Navigation Challenge, a modified
Vector Field Histogram (VFH) algorithm is combined with an auto-regressive path planning model for obstacle
avoidance and better localization. Q also features Joint Architecture for Unmanned Systems (JAUS) Level 3
compliance. All algorithms are developed and implemented using National Instruments (NI) hardware and LabVIEW
software. The paper focuses on explaining the various algorithms that make up Q's intelligence and the different ways
in which they are implemented.
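As a rough illustration of the dynamic thresholding step mentioned above, the following Python/OpenCV sketch normalizes the intensity histogram of a camera frame, derives a threshold from it to isolate bright, line-like pixels, and applies a simple morphological cleanup. The library choice, function names, and kernel size are assumptions for illustration; the actual multi-step LabVIEW algorithm, including plane extraction, texture analysis, and particle filtering, is not reproduced here.

```python
# Illustrative sketch of histogram-normalized dynamic thresholding for white-line
# extraction (one step of the multi-step algorithm summarized above; OpenCV/NumPy
# usage and parameter values are assumptions, not the paper's LabVIEW code).
import cv2
import numpy as np

def extract_white_lines(frame_bgr: np.ndarray) -> np.ndarray:
    """Return a binary mask of bright, line-like pixels in a camera frame."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    # Normalize the intensity histogram so the threshold adapts to ambient light.
    equalized = cv2.equalizeHist(gray)
    # Otsu's method picks a threshold from the (normalized) histogram itself.
    _, mask = cv2.threshold(equalized, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    # Morphological opening removes speckle too small to be a course line.
    kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (5, 5))
    return cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
```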
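Similarly, the sketch below shows one minimal way an occupancy grid can be collapsed into a polar histogram and scanned for openings between obstacles, in the spirit of the VFH approach summarized above. The grid layout, sector count, distance weighting, and threshold are illustrative assumptions and do not reflect Q's modified VFH implementation.

```python
# Minimal VFH-style sketch: collapse an occupancy grid into a polar histogram and
# look for openings between obstacles (grid layout, sector width, and threshold
# are illustrative assumptions; this is not the modified VFH used on Q).
import numpy as np

def polar_histogram(grid: np.ndarray, cell_size: float, n_sectors: int = 72) -> np.ndarray:
    """Sum obstacle weights of an occupancy grid into angular sectors around the robot.

    The robot is assumed to sit at the center of the bottom row of `grid`,
    with cell values in [0, 1] giving the certainty that a cell is occupied.
    """
    rows, cols = grid.shape
    robot_row, robot_col = rows - 1, cols // 2
    hist = np.zeros(n_sectors)
    for r in range(rows):
        for c in range(cols):
            if grid[r, c] <= 0.0:
                continue
            dy, dx = robot_row - r, c - robot_col           # forward is toward row 0
            angle = np.degrees(np.arctan2(dx, dy)) % 360.0  # 0 deg = straight ahead
            dist = np.hypot(dx, dy) * cell_size
            sector = int(angle // (360.0 / n_sectors))
            hist[sector] += grid[r, c] / max(dist, 1e-3)    # nearer obstacles weigh more
    return hist

def find_openings(hist: np.ndarray, threshold: float = 0.5) -> list[tuple[int, int]]:
    """Return (start_sector, end_sector) runs whose obstacle density is below threshold."""
    openings, start = [], None
    for i, density in enumerate(hist):
        if density < threshold and start is None:
            start = i
        elif density >= threshold and start is not None:
            openings.append((start, i - 1))
            start = None
    if start is not None:
        openings.append((start, len(hist) - 1))
    return openings
```

In a VFH-style planner, the widest (or goal-nearest) opening returned by such a search would then be handed to the steering controller as the candidate heading.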