Abstract
Robots are deployed in increasingly complex environments and are expected to adapt to changes and unknown situations. The easiest and quickest way to adapt is to change the control system of the robot, but for increasingly complex environments one should also change the body of the robot – its morphology – to better fit the task at hand. The theory of Embodied Cognition states that control is not the only source of cognition; the body, the environment, and the interactions between these and the mind all contribute as cognitive resources. Taking advantage of these concepts could lead to improved adaptivity, robustness, and versatility; however, realizing these concepts on real-world robots places additional requirements on the hardware and poses several challenges compared to learning control alone. In contrast to the majority of work in Evolutionary Robotics, Eiben argues for real-world experiments in his 'Grand Challenges for Evolutionary Robotics'. This requires robust hardware platforms that are capable of repeated experiments and, at the same time, flexible enough to accommodate unforeseen demands. In this paper, we introduce our unique robot platform with self-adaptive morphology. We discuss the challenges we faced when designing it, and the lessons learned from real-world testing and learning.
URL
https://arxiv.org/abs/1905.05626