
A framework based on real-time OS and multi-agents for intelligent autonomous robot competitions

SERNANI, PAOLO;
2016-01-01

Abstract

Robots interacting with human beings are widespread in modern environments, and those performing intelligent tasks without human supervision need to take potential criticalities into account. Making robots compete enables their evaluation with respect to navigation, mapping, object recognition, tracking, and manipulation capabilities. Robot competitions date back to the early '80s, proving useful for educational and research purposes. Several competitions focus on human-robot interaction, even though they rarely produce robots capable of seamlessly interacting with human beings. The main reasons for this are the lack of understanding of human intentions and the failure to react rapidly to human actions. In other words, an ideal robot must be able to communicate and coordinate with humans or with other robots, to act autonomously, and to react under real-time constraints. This paper proposes a new framework to simplify the development of intelligent robots, testing them in a real robot competition. The framework combines (i) a multi-agent system to interact with humans and other robots and to perform object identification and pathfinding, and (ii) a real-time motion controller deployed on the Erika RTOS, to move the robot and react in a timely fashion to changes in the environment. In the considered competition scenario, the robot is required to identify and collect common objects in a bounded arena with dynamic obstacles in a limited amount of time, receiving commands from humans and competing with other robots. This approach confirms the powerful combination of multi-agent systems, computer vision, and real-time systems.
2016
Files in this record:

A framework based on real-Time OS and multi-Agents for intelligent autonomous robot competitions.pdf
Access: authorized users only
License: publisher's copyright
Size: 1.07 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11393/302330
Warning: the displayed data have not been validated by the university.

Citations
  • PMC: n/a
  • Scopus: 13
  • Web of Science: 8