This work presents the sensing device and control system of an autonomous vehicle intended for navigation and precise load/unload tasks in industrial environments. The control system can perform turns, line following, and the following of arbitrary curves specified as splines. It is based on a multivariable design using pole placement in state space. The control system uses the results of parameter-estimation modules to adapt to the changing response of the traction motors when loaded or unloaded; these estimators are Kalman filters that recover the vehicle motion parameters from measurements made by the
positioning sensor. Several steering configurations are possible, since the control system outputs a turn radius: differential drive, tricycle drive, or Ackerman steering can be achieved by transforming this radius into motor commands according to the vehicle's geometry. The only sensor the system relies on is a laser-based local positioning system consisting of a rotating laser and retro-reflectors. Robust algorithms for signal analysis and position/orientation estimation have been developed. The sensor can detect reflectors 25 meters away in daylight or in dusty
industrial environments using a low-cost 1 mW laser. The system has been tested on two mobile bases, using differential drive and
tricycle drive.
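The abstract states that the controller outputs a turn radius which is then transformed into motor commands according to the vehicle's steering geometry. A minimal sketch of that transformation for the differential-drive case, assuming a standard rigid-body kinematic model; the function name and the track-width parameter are illustrative assumptions, not taken from the paper:

```python
def diff_drive_speeds(v, radius, track_width):
    """Convert a forward speed v and commanded turn radius into
    (left, right) wheel speeds for a differential-drive vehicle.

    radius > 0 turns left, radius < 0 turns right,
    radius = float('inf') means straight-line motion.
    track_width is the distance between the two drive wheels.
    """
    if radius == float('inf'):
        # No rotation: both wheels move at the forward speed.
        return v, v
    omega = v / radius  # angular rate of the vehicle about the turn center
    # Each wheel follows an arc whose radius differs by half the track width.
    v_left = omega * (radius - track_width / 2)
    v_right = omega * (radius + track_width / 2)
    return v_left, v_right
```

A tricycle or Ackerman vehicle would instead map the same radius to a steering-wheel angle, e.g. via atan(wheelbase / radius), which is what makes the radius a geometry-independent controller output.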
The work presented in this paper is part of a system developed for fruit sorting. The machine vision unit is part of a distributed control system in which several machine vision modules can be integrated with a control module and a user interface unit. The control module takes care of the distributed control of the conveyor belt, weight units, and fruit output units. The user interface is a front end through which the user can watch and control any part of the distributed system. The machine vision units are connected to the user interface through a LAN, and to the control unit through a CAN bus, in order to send and receive real-time information during the on-line sorting process. Information that does not require real-time communication is sent over the LAN using an Ethernet protocol.
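The routing rule the abstract describes, real-time sorting data over the CAN bus and non-real-time data over the Ethernet LAN, can be sketched as a simple dispatch function. This is an illustrative assumption about the architecture, not code from the system; the message fields and function name are hypothetical:

```python
def choose_link(message):
    """Route a message dict to the appropriate bus.

    Messages flagged as real-time (e.g. per-fruit grading results needed
    during the on-line sorting process) go over the CAN bus; everything
    else (e.g. statistics, configuration) goes over the Ethernet LAN.
    """
    return "CAN" if message.get("real_time") else "LAN"
```

Splitting traffic this way keeps the deterministic, low-bandwidth CAN bus free of bulk data while still giving the user interface full visibility over the LAN.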
This paper presents an approach to the vision tasks to be performed in a vehicle navigation application in crop fields. The objective is to automate chemical spraying through autonomous navigation and machine vision. A camera is used as the sensing device, and a bar of spraying nozzles is provided to perform the spraying. The proposed solution consists of recovering maps of the environment from the image sequence and exploring them to locate the path to follow and the nozzles that have to be switched on. The motion parameters of the vehicle, used to place the images in the map, are computed with a feature-tracking method. The plants and weeds are identified through segmentation, and the features to be tracked are computed from the contours of the plants. Results with real image sequences are presented for all the steps involved.
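The abstract identifies plants via segmentation but does not name the method. A common first step for vegetation segmentation in field imagery is an excess-green index followed by a threshold; the sketch below is an illustrative assumption of that kind of step, not the paper's actual algorithm, and the threshold value is arbitrary:

```python
def excess_green_mask(pixels, threshold=20):
    """Classify RGB pixels as vegetation using the excess-green index.

    pixels: iterable of (r, g, b) integer tuples.
    Returns a list of booleans, True where 2*G - R - B exceeds the
    threshold, i.e. where the pixel is likely plant material. Plant
    contours for feature tracking could then be extracted from this mask.
    """
    return [(2 * g - r - b) > threshold for (r, g, b) in pixels]
```

On soil pixels the three channels are roughly balanced, so the index stays near zero, while green vegetation pushes it well above the threshold.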