Archival Information

Archival Projects

Period      Project                         Funder
1993-1996   Mobile Autonomous Systems       NUTEK
1996-1999   Mobile Autonomous Systems       NUTEK
1998-2001   Complex Technological Systems   VINNOVA
1999-2001   Complex Technological Systems   VINNOVA

Research and Results

We pursue research on a continuous-time optimal and adaptive control scheme for robots or vehicles moving in six degrees of freedom. The control scheme is an extension of the algorithm of Johansson [12]. The algorithm is optimal in the sense that it minimizes the state errors and the forces that contribute to the vehicle's kinetic energy spent in correcting these errors. The performance measure also contains a term that penalizes the quadratic tracking errors in proportion to the rate of energy dissipated from the system due to damping.
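As an illustrative sketch (our notation here, not necessarily the exact functional of [12]): with state error x~, applied generalized force tau, and square weighting matrices Q, R, S as design parameters, such a performance measure combines quadratic error and force penalties with a tracking-error term weighted by the dissipation rate E_d:

```latex
J = \frac{1}{2} \int_{0}^{\infty}
    \Big( \tilde{x}^{\mathsf{T}} Q\, \tilde{x}
        + \tau^{\mathsf{T}} R\, \tau
        + \dot{E}_{d}\, \tilde{x}^{\mathsf{T}} S\, \tilde{x} \Big)\, \mathrm{d}t
```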

Quadratic Optimization of Impedance Control

Algorithms for continuous-time quadratic optimization of impedance control have been developed. Explicit solutions to the Hamilton-Jacobi equation for optimal control of rigid-body motion are found by solving an algebraic matrix equation. System stability is investigated according to Lyapunov function theory, and it is shown that global asymptotic stability holds. The solution results in design parameters in the form of square weighting matrices or impedance matrices as known from linear quadratic optimal control. The proposed optimal control is useful both for motion control and force control.
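The weighting-matrix design parallels standard LQ optimal control. As an illustrative numerical sketch (not the exact algebraic matrix equation of the work described above), the Riccati-type equation for one translational degree of freedom of a rigid body, modeled as a double integrator, can be solved as follows; the resulting feedback gains play the role of impedance parameters (stiffness and damping):

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Double-integrator model of one translational degree of freedom:
# states x = [position error, velocity error], input u = force.
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[0.0],
              [1.0]])

# Square weighting matrices (design parameters), as in LQ optimal control.
Q = np.diag([10.0, 1.0])   # penalize position and velocity errors
R = np.array([[0.1]])      # penalize control effort (force)

# Solve the algebraic Riccati equation A'P + PA - PB R^{-1} B'P + Q = 0.
P = solve_continuous_are(A, B, Q, R)

# Optimal state feedback u = -Kx; here K = [stiffness gain, damping gain],
# i.e. an impedance for this mechanical example.
K = np.linalg.inv(R) @ B.T @ P
print(K)
```

For this simple model the gains can be checked analytically: the stiffness gain is sqrt(q1/r) and the damping gain is sqrt(q2/r + 2*sqrt(q1/r)).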

Feedback-supported Application Programming

One direction of research represents an enhanced use of sensor information in robotics, as described and exemplified in [10]. The environment of a robot is dynamic and must be observed by perceptual equipment. To adapt task realizations to the environment, the robot control system must be able to absorb and react to the observed information. An event-based robot control system with these capabilities is described; it operates from a model description of the world and the task. The world model represents all objects in the robot work cell that are significant for the task, and it is generated during a visual, task-oriented programming session. The task realization is managed by the control system in small parts, or executable events. An executable event is fired and realized when its preconditions are fulfilled. Changes in the work cell are detected by sensors, and the information is used to update the world model on which all events are founded. Sensor information influences both planning and control of motion and application processes. The main goal of this approach is the ability to autonomously manage task realizations in a flexible environment [10].
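The event mechanism can be sketched as follows (class and method names are our own illustration, not the actual system's API): an executable event fires once its preconditions, evaluated against the sensor-updated world model, are fulfilled.

```python
# Minimal sketch of event-based task realization: events fire when their
# preconditions hold in a world model that sensors keep up to date.
class WorldModel:
    def __init__(self):
        self.state = {}                # object name -> observed state

    def update(self, obj, value):      # called when a sensor detects a change
        self.state[obj] = value

class ExecutableEvent:
    def __init__(self, name, preconditions, action):
        self.name = name
        self.preconditions = preconditions  # list of (object, required state)
        self.action = action
        self.fired = False

    def ready(self, world):
        return all(world.state.get(obj) == req
                   for obj, req in self.preconditions)

    def fire(self, world):
        self.fired = True
        self.action(world)

def run(world, events):
    """Realize the task in small parts: repeatedly fire every event whose
    preconditions are fulfilled, until no further event is ready."""
    progress = True
    while progress:
        progress = False
        for ev in events:
            if not ev.fired and ev.ready(world):
                ev.fire(world)
                progress = True

# Example: a grasp event becomes executable only after vision has
# located the part in the work cell.
world = WorldModel()
events = [ExecutableEvent("grasp",
                          [("part", "located")],
                          lambda w: w.update("part", "grasped"))]
world.update("part", "located")        # sensor observation arrives
run(world, events)
print(world.state["part"])             # -> grasped
```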

Another focus is a task-oriented robot programming method and a discussion of the associated control system. The purpose of creating a new programming environment is to make the robot program code reusable, maintainable and reliable, all key factors in efficient programming. The whole system is centered on objects corresponding to physical objects and processes in the environment. In the task-oriented programming system, tasks are described as states of objects and their dependencies. The assisting control system is event-driven and operates with the objects as the base for realization of events [8].

A Robot Playing SCRABBLE Using Visual Feedback
Johan Bengtsson (1), Anders Ahlstrand (2), Klas Nilsson (3), Anders Robertsson (1), Magnus Olsson (4), Anders Heyden (2), Rolf Johansson (1)

1 Department of Automatic Control
2 Centre for Mathematical Sciences
3 Department of Computer Science
4 Division of Robotics

Today most industrial robot systems use dedicated and rather limited sensors, and available control systems provide limited support for feedback control. Aiming towards more autonomous robot systems, we want to improve flexibility. The game Scrabble is used as a test problem capturing these aspects. Our approach is to incorporate visual servoing and a conventional powerful off-line programming (OLP) system into the real-time control system, providing task specification and visual debugging. We use the OLP tool Envision from Deneb and an ABB robot with reconfigured control system, where the control system has an Open Robot Control architecture (ORC). The vision system is connected to a host computer and the camera is attached to the robot gripper. By extending the control system, we have designed and implemented both the vision system and the application for the Scrabble game. Our system implementation shows that ORC constitutes a necessary support for incorporation of real-time visual feedback and that OLP may effectively be used with real-time feedback of sensor data.

Vision-supported Force-controlled Manipulation

Tomas Olsson, Johan Bengtsson, Mathias Haage, Henrik Malm, Rolf Johansson

Vision-supported Gripping
Michail Bourmpos (1,2), Johan Bengtsson (1), Mathias Haage (3), Rolf Johansson (1)

1 Department of Automatic Control, Lund University
2 Imperial College, London, UK
3 Department of Computer Science, Lund University

Real-time vision allows the direct steering of a robotic manipulator. In this thesis we use a stereo rig mounted on an industrial robot (ABB IRB-6/2), which gives us information about the end-effector position of a second robot (ABB IRB-2000/3). This second robotic manipulator performs the task of tracking a moving object, which in our case is a rolling ball.

The first concern of this project is to deal with the different kinds of problems occurring in such a system. After measuring the time-delays in our system, we can establish the degree to which they affect it. We can then use these results in the actual control loop. More specifically, a Kalman predictor scheme is implemented to produce estimates of coordinates at requested times, based on the time-delay analysis. This Kalman predictor also helps us when we have lost track of either the robotic manipulator or the rolling ball.
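A predictor of this kind can be sketched as follows (the constant-velocity model, the sample rate, and the delay value here are illustrative, not the measured values of the project): a Kalman filter estimates position and velocity from the vision measurements, then extrapolates the state ahead by the measured time-delay.

```python
import numpy as np

def kalman_predictor(zs, dt, delay, q=1.0, r=0.01):
    """Constant-velocity Kalman filter in one coordinate that, after each
    measurement update, predicts the position `delay` seconds ahead to
    compensate the time-delay of the vision loop."""
    F = np.array([[1.0, dt], [0.0, 1.0]])        # state transition
    H = np.array([[1.0, 0.0]])                   # we measure position only
    Q = q * np.array([[dt**3 / 3, dt**2 / 2],
                      [dt**2 / 2, dt]])          # process noise covariance
    R = np.array([[r]])                          # measurement noise covariance
    Fd = np.array([[1.0, delay], [0.0, 1.0]])    # extrapolate ahead by delay
    x = np.array([[zs[0]], [0.0]])
    P = np.eye(2)
    preds = []
    for z in zs:
        # time update
        x = F @ x
        P = F @ P @ F.T + Q
        # measurement update
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
        x = x + K @ (np.array([[z]]) - H @ x)
        P = (np.eye(2) - K @ H) @ P
        # predicted position at the time the command takes effect
        preds.append(float((Fd @ x)[0, 0]))
    return preds

# Ball rolling at 0.5 m/s, sampled at 50 Hz; predict 60 ms ahead.
t = np.arange(0.0, 1.0, 0.02)
measured = 0.5 * t
ahead = kalman_predictor(measured, dt=0.02, delay=0.06)
```

Once the velocity estimate has converged, each prediction leads the latest measurement by roughly velocity times delay, which is what lets the control loop act on where the ball will be rather than where it was seen.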

Finally we perform the actual experiment of tracking the rolling ball.

The experimental setup consists of the two robots and the stereo rig mentioned above, as well as a metal ball rolling on a board. The stereo rig is mounted on the IRB-6, which is set in a pre-decided position so that it can observe the movement of the rolling ball throughout its whole course. The other robotic manipulator performs the task of tracking the rolling ball, using as feedback the data received from the stereo rig.

Ultrasonic Platform Results
To make a suitable platform for sensor experiments, it is desirable to use a modular hardware and software structure. Our hardware design provides a VME-bus interface. This is important because several robot manufacturers today produce open systems whose control systems have an open hardware structure based on VME-bus computers. The results from this project may then be fitted directly into a commercial system. The hardware for this platform is currently being completed and tested. The interface to the VME bus consists of a prototype board equipped with a piggyback interface board that connects to the local ultrasonic system interface bus.

The sampling unit from the previous project is reused without modification. It samples at 2.5 MHz on 4 channels simultaneously. A preamplifier for electrostatic sensors has been developed; it is designed to be placed in the measuring device. To generate transmission pulses for the 200 kHz transducer, an improved pulse generator has been developed (PZPLS). A module has been developed that generates a 300 V DC source and supplies an amplifier capable of amplifying a 0-10 V signal to a 0-300 V signal with a bandwidth of 100 kHz (HVDRV). A module that can be programmed to produce arbitrary output functions of selectable length, with 16 functions stored in PROM and 32 functions programmed in RAM, is currently being built and tested (FGEN). A further description of these modules is found in Lindstedt [20].

The software platform developed at the Automatic Control Department is used for control of the system. Program modules are written in the Modula-2 language, using a real-time kernel developed within the department. Modules for data exchange with MATLAB are also used. Programs interfacing the hardware to the system have been written. The last hardware module (FGEN) is designed and is currently being built and tested. Identification experiments using a frequency sweep will then be made. It seems feasible that this method can improve the quality of the identification, and temperature compensation can conveniently be implemented in this way. Extended identification aiming at three-dimensional object localization will then be studied.
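The planned frequency-sweep excitation could, as a sketch, be a linear chirp stored as one of the programmed output functions in the FGEN memory; the sweep range below is an assumption chosen around the 200 kHz transducer, and the sample rate matches the 2.5 MHz sampling unit.

```python
import numpy as np

def linear_chirp(f0, f1, duration, fs):
    """Linear frequency sweep from f0 to f1 Hz over `duration` seconds,
    sampled at fs Hz - a candidate excitation waveform for identification."""
    n = int(round(duration * fs))
    t = np.arange(n) / fs
    # instantaneous phase of a linear chirp:
    # 2*pi*(f0*t + (f1 - f0)*t^2 / (2*duration))
    phase = 2.0 * np.pi * (f0 * t + (f1 - f0) * t**2 / (2.0 * duration))
    return np.sin(phase)

# Sweep 150-250 kHz over 1 ms, sampled at 2.5 MHz.
sweep = linear_chirp(150e3, 250e3, 1e-3, 2.5e6)
print(len(sweep))  # 2500 samples
```

Because the sweep excites the whole band of interest in a single burst, the frequency response of the transducer and propagation path can be identified from one recorded echo, which is what makes convenient temperature compensation plausible.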

Object identification based on ultrasonic echoes by means of system identification and other advanced signal processing is reported in [16], [33], which include original developments in subspace-based identification for continuous-time systems. The new approach has proved effective in identification and modeling of ultrasonic echo applications.

© Lund Institute of Technology 2004.

Last update: Friday, 04-Jun-2004 15:16:13 CEST