Post-Graduate Conference on Robotics and Development of Cognition: Proceedings
Editor: Joanna Szufnarowska

A synchrony based approach for human robot interaction
Syed Khursheed Hasnain, Philippe Gaussier, Ghiles Mostafaoui
https://biecoll.ub.uni-bielefeld.de/index.php/robotdoc/article/view/348
Since psychologists consider synchrony an important parameter for social interaction, we hypothesize that, during social interaction, people focus their attention on regions of interest where the visual stimuli are synchronized with their inner dynamics. We then assume that a mechanism able to detect synchrony between the internal dynamics of a robot and external visual stimuli can be used as a starting point for human-robot interaction. Inspired by human psychological and neurobiological data, we propose a synchrony-based neural network architecture capable of selecting the robot's interaction partner and of locating the focus of attention.

Active Learning in a Computational Model of Word Learning
Maarten Versteegh, Christina Bergmann, Louis ten Bosch, Lou Boves
https://biecoll.ub.uni-bielefeld.de/index.php/robotdoc/article/view/349

An Embodied View on the Development of Symbolic Capabilities and Abstract Concepts
Marek Rucinski, Francesca Stramandinoli
https://biecoll.ub.uni-bielefeld.de/index.php/robotdoc/article/view/350

Edge and plane classification with a biomimetic iCub fingertip sensor
Uriel Martinez-Hernandez, Nathan F. Lepora, Hector Barron-Gonzalez, Tony Dodd, Tony J. Prescott
https://biecoll.ub.uni-bielefeld.de/index.php/robotdoc/article/view/351
The exploration of and interaction with the environment through tactile sensing is an important task for humanoid robots on the way to truly autonomous agents. Recently, much research has focused on the development of new technologies for tactile sensors and new methods for tactile exploration. Edge detection is one of the tasks robots and humanoids require to explore and recognise objects. In this work we propose a method for edge and plane classification with a biomimetic iCub fingertip using a probabilistic approach. The iCub fingertip, mounted on an xy-table robot, taps and collects data from the surface and edge of a plastic wall. Using a maximum likelihood classifier, the xy-table knows when the iCub fingertip has reached the edge of the object. The study presented here is also biologically inspired by the tactile exploration performed by animals.
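As an illustration of the kind of probabilistic decision rule mentioned in the edge-and-plane classification abstract above, here is a minimal sketch of a maximum likelihood classifier over tap features. The feature choice, the Gaussian class-conditional model and all names are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

# Hypothetical setup: each tap yields a feature vector (e.g. mean and peak
# pressure across the fingertip taxels). One Gaussian is fitted per class
# ("plane", "edge") from labelled training taps; new taps are classified by
# maximum likelihood.

def fit_gaussian(samples):
    """Return (mean, covariance) of a set of tap feature vectors."""
    samples = np.asarray(samples, dtype=float)
    mean = samples.mean(axis=0)
    cov = np.cov(samples, rowvar=False) + 1e-6 * np.eye(samples.shape[1])
    return mean, cov

def log_likelihood(x, mean, cov):
    """Log of the multivariate Gaussian density at x."""
    d = x - mean
    inv = np.linalg.inv(cov)
    _, logdet = np.linalg.slogdet(cov)
    return -0.5 * (d @ inv @ d + logdet + len(x) * np.log(2 * np.pi))

def classify(x, models):
    """Pick the class whose fitted Gaussian gives the tap the highest likelihood."""
    return max(models, key=lambda c: log_likelihood(x, *models[c]))

# Toy training data (illustrative values only).
plane_taps = np.random.normal([1.0, 0.2], 0.05, size=(50, 2))
edge_taps = np.random.normal([0.6, 0.5], 0.05, size=(50, 2))
models = {"plane": fit_gaussian(plane_taps), "edge": fit_gaussian(edge_taps)}

print(classify(np.array([0.62, 0.48]), models))  # expected: "edge"
```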
Emergence of Leadership in a Group of Autonomous Robots
Francesco Pugliese, Alberto Acerbi, Orazio Miglino, Davide Marocco
https://biecoll.ub.uni-bielefeld.de/index.php/robotdoc/article/view/352
In modern biology and ethology, the emergence of leader-follower patterns in groups of living organisms is attributed to the need for social coordination. In this paper we examine factors contributing to the emergence of leadership, trying to understand the relation between the leader role and behavioral capabilities. To achieve this goal, we use a simulation technique in which a group of foraging robots has to choose between two identical food zones. The robots must therefore coordinate in some way in order to select the same food zone and collectively gather food. Behavioral and quantitative analyses indicate that a form of leadership emerges and that its emergence is related to high levels of fitness. Moreover, we show that the more skilled individuals in a group tend to assume the leader role, in agreement with the literature.

Emergent Spontaneous Movements Based on Embodiment: Toward a General Principle for Early Development
Yasunori Yamada, Yasuo Kuniyoshi
https://biecoll.ub.uni-bielefeld.de/index.php/robotdoc/article/view/353
We investigate whether spontaneous movements, which initiate and guide early development in animals, can be accounted for by the properties underlying embodiment. We constructed computer and robotic models of several biological species with biologically plausible musculoskeletal bodies and nervous systems, and extracted the embodied and motor networks based on inter-muscle connectivities. In computer simulations and robot experiments, we found that the embodied and motor networks had similar global and local topologies, suggesting a key role of embodiment in generating spontaneous movements in animals.
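The abstract above reports that the extracted embodied and motor networks share global and local topology. As a purely illustrative aside, a minimal sketch of how two such networks could be compared using the networkx library; the toy connectivity matrices, threshold and choice of metrics are assumptions, not the authors' analysis.

```python
import networkx as nx
import numpy as np

def graph_from_connectivity(weights, threshold=0.5):
    """Build an undirected graph from an inter-muscle connectivity matrix,
    keeping only connections above a threshold (illustrative choice)."""
    n = weights.shape[0]
    g = nx.Graph()
    g.add_nodes_from(range(n))
    for i in range(n):
        for j in range(i + 1, n):
            if weights[i, j] > threshold:
                g.add_edge(i, j)
    return g

def topology_summary(g):
    """A few global and local topology measures."""
    giant = g.subgraph(max(nx.connected_components(g), key=len))
    return {
        "clustering": nx.average_clustering(g),                 # local structure
        "path_length": nx.average_shortest_path_length(giant),  # global structure
        "degrees": sorted(d for _, d in g.degree()),
    }

# Toy symmetric matrices standing in for the embodied and motor networks.
rng = np.random.default_rng(0)
a = rng.random((10, 10)); a = (a + a.T) / 2
b = rng.random((10, 10)); b = (b + b.T) / 2

print(topology_summary(graph_from_connectivity(a)))
print(topology_summary(graph_from_connectivity(b)))
```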
From Low Level Motor Control to High Level Interaction Skills
David Bailly, Pierre Andry, Philippe Gaussier
https://biecoll.ub.uni-bielefeld.de/index.php/robotdoc/article/view/354
The goal of this research is to create a non-verbal system able to interact safely and naturally with humans. The main hypothesis is that mechanisms of high-level interaction, such as cooperation and understanding intentions, can be obtained from well-designed low-level systems. For example, an effector device instrumented to detect the force constraints applied by others makes it easy to obtain the direction of that force (opposing vs. facilitating) and, at a higher level of interpretation, the intention of others concerning the device's movement. This is one of the reasons we preferred hydraulic technology, which offers inherent physical compliance. Moreover, pressure control in the pistons is closer to muscle control than electric motors are. For the control architecture, we are interested in modeling the layers of motor command: low-level force control and multimodal inputs (especially vision) leading to prediction and anticipation capabilities. To this end, the research includes the design of a bio-inspired neural network able to provide force control of the hardware and to merge inputs from different kinds of sensors, including vision and proprioception. The control has to be as close as possible to the hardware, with as few layers as possible. It is based on control by activation of agonist and antagonist muscles. Position and torque sensors, as well as short-range proximity sensors, are used to learn simple movements and their sensory outcomes. Vision is provided by a robotic eye mounted on a fast pan-tilt system that allows movement at human speed. A high-definition camera provides a video stream that can be used to analyze the scene. The neural network we designed allows the system to analyze the scene using points of interest. By extracting local features around those points, it is possible to build a library of visual features. Using this library, objects can be recognized by learning simple associations between these local features and the sensory context, including supervision signals. Actions can then be associated with the context or with the presence of an object. Moreover, sequences of simple actions can be learned through cognitive maps. For example, the robot can learn from a human teacher to grasp, move and release an object. From there, with object recognition, the robot is able to learn tasks such as sorting objects by their visual characteristics. As we build this controller, we hope to improve our knowledge of brain structures such as the motor cortex, the prefrontal cortex, the striatum and the cerebellum. Models of all these structures, and others, are used in the architecture developed here. This research aims in particular to better understand the influence of each structure on the global behavior of the robot, as well as the synergies that emerge from their cooperation, and to create a new type of humanoid robot in which every part, from the technology through the low-level control to the high-level control, is designed with realistic interaction with humans in mind.

From Penguins to Parakeets: a Developmental Approach to Modelling Conceptual Prototypes
Joachim de Greeff, Paul Baxter, Rachel Wood, Tony Belpaeme
https://biecoll.ub.uni-bielefeld.de/index.php/robotdoc/article/view/355
The use of concepts is a fundamental capacity underlying complex, human-level cognition. A number of theories have explored the means of concept representation and their links to lower-level features, one notable example being the Conceptual Spaces theory. While these provide an account of such essential functional processes as prototypes and typicality, it is not entirely clear how these aspects of human cognition can arise in a system undergoing continuous development, which the developmental systems perspective postulates to be a necessity. This paper seeks to establish the foundation of an approach to this question by showing that a distributed, associative and continuously developing mechanism, founded on principles of biological memory, can achieve classification performance comparable to the Conceptual Spaces model. We show how qualitatively similar prototypes are formed by both systems when exposed to the same dataset, which illustrates how both models can account for the development of conceptual primitives.
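To make the notion of a prototype in a conceptual space concrete, a minimal illustrative sketch follows: prototypes as centroids of exemplars in a quality-dimension space, with nearest-prototype classification. The dimensions, data and distance metric are assumptions for illustration only, not the models compared in the paper.

```python
import numpy as np

# Hypothetical conceptual space with two quality dimensions (e.g. size and
# flight ability) in which each exemplar of a category is a point.
exemplars = {
    "penguin": np.array([[0.70, 0.20], [0.65, 0.25], [0.80, 0.15]]),
    "parakeet": np.array([[0.20, 0.90], [0.25, 0.85], [0.15, 0.95]]),
}

# A prototype is taken here as the centroid of a category's exemplars.
prototypes = {name: pts.mean(axis=0) for name, pts in exemplars.items()}

def classify(point):
    """Assign a new observation to the category with the nearest prototype."""
    return min(prototypes, key=lambda c: np.linalg.norm(point - prototypes[c]))

print(prototypes)
print(classify(np.array([0.3, 0.8])))  # expected: "parakeet"
```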
From typical neurocognitive development to neurorehabilitation of autistic children using mobile toy robots
Irini Giannopulu
https://biecoll.ub.uni-bielefeld.de/index.php/robotdoc/article/view/356

Hebb-like Learning for the Grounding of High-Level Symbols in Sensorimotor Trajectories
Martin F. Stoelen, Davide Marocco, Angelo Cangelosi, Fabio Bonsignorio, Carlos Balaguer
https://biecoll.ub.uni-bielefeld.de/index.php/robotdoc/article/view/357

Infants' perception from the physical relations between objects
Lauriane Rat-Fischer, Kevin J. O'Regan, Jacqueline Fagard
https://biecoll.ub.uni-bielefeld.de/index.php/robotdoc/article/view/358

Innate Neonatal Face Preference - An Embodied Phenomenon?
Nick Wilkinson, Francesco Rea, Katrin Lohan, Giorgio Metta, Gustaf Gredeback
https://biecoll.ub.uni-bielefeld.de/index.php/robotdoc/article/view/359

Integrated Model for Generating Non Verbal Body Behavior Based on Psycholinguistic Analysis in Human-Robot Interaction
Amir Aly, Adriana Tapus
https://biecoll.ub.uni-bielefeld.de/index.php/robotdoc/article/view/360

Morphology Dependent Distributed Controller for Locomotion in Modular Robots
Avinash Ranganath, Juan Gonzalez-Gomez, Luis Moreno Lorente
https://biecoll.ub.uni-bielefeld.de/index.php/robotdoc/article/view/361
Stigmergy is defined as a mechanism of coordination through indirect communication among agents, which can be commonly observed in social insects such as ants. In this work we investigate the emergence of coordination for locomotion in modular robots through indirect communication among modules.
We demonstrate how the intra-configuration forces that exist between physically connected modules can be used for self-organization in modular robots, and how the emerging global behavior is a result of the morphology of the robotic configuration.

On a design of a torque sensor for the iCub humanoid robot
Wiktor Sieklicki, Francesco Becchi
https://biecoll.ub.uni-bielefeld.de/index.php/robotdoc/article/view/362
The paper presents the evaluation of a first version of the one-axis torque sensor designed for the iCub humanoid robot. The newly designed strain-gauge sensor was found to show significant readout hysteresis, so several tests were run to determine the cause of the hysteresis. Some of the design issues encountered while testing the new sensor are discussed, including analyses of the screw connections and of the relative rigidity of the sensor's elements. Verification of the assembly procedure is also included. The tests revealed several problems in both the design and the operation of the sensor. Possible solutions to the encountered problems are proposed.

Online Language Learning to Perform and Describe Actions for Human-Robot Interaction
Xavier Hinaut, Maxime Petit, Peter F. Dominey
https://biecoll.ub.uni-bielefeld.de/index.php/robotdoc/article/view/363

Reachable by walking: inappropriate integration of near and far space may lead to distance errors
Beata J. Grzyb, Vicente Castello, Angel P. del Pobil
https://biecoll.ub.uni-bielefeld.de/index.php/robotdoc/article/view/365
Our experimental results show that infants, while learning to walk, try to reach for unreachable objects. These distance errors may result from inappropriate integration of reaching and locomotor actions, attention control and near/far visual space. During their first months, infants are fairly immobile, and their attention and actions are constrained to near (reachable) space. Walking, in contrast, lures attention to distal displays and provides the information needed to disambiguate far space. In this paper, we make use of reward-mediated learning to mimic the development of absolute distance perception. The results obtained with the NAO robot further support our hypothesis that the representation of near space changes after the onset of walking, which may cause the occurrence of distance errors.
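As a purely illustrative aside on the reward-mediated learning mentioned above, here is a minimal sketch in which an agent updates an estimate of reach success as a function of perceived distance from binary reward feedback. The discretisation, learning rate and simulated data are assumptions for illustration, not the model used in the paper.

```python
import numpy as np

# Perceived distance is discretised into bins; each bin holds the agent's
# current estimate of the probability that a reach attempt will succeed.
bins = np.linspace(0.0, 2.0, 11)          # distances from 0 m to 2 m
estimates = np.full(len(bins) - 1, 0.5)   # start undecided
alpha = 0.1                               # learning rate

def update(distance, reward):
    """Reward-mediated (delta-rule) update of the reach-success estimate."""
    i = int(np.clip(np.digitize(distance, bins) - 1, 0, len(estimates) - 1))
    estimates[i] += alpha * (reward - estimates[i])

# Simulated experience: reaches succeed only within arm's length (~0.5 m here).
rng = np.random.default_rng(1)
for _ in range(2000):
    d = rng.uniform(0.0, 2.0)
    update(d, reward=1.0 if d < 0.5 else 0.0)

print(np.round(estimates, 2))  # high values in near bins, low in far bins
```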
Towards Modular Spatio-temporal Perception for Task-adapting Robots
Zoltan-Csaba Marton, Florian Seidel, Michael Beetz
https://biecoll.ub.uni-bielefeld.de/index.php/robotdoc/article/view/366
In perception systems for object recognition, the advantages of using multiple modalities, combining approaches, and exploiting several views are emphasized, as they improve accuracy. However, there is great variance among implementations, suggesting that there is no consensus yet on how to approach this problem. Nonetheless, we can identify some common features of these methods and propose a flexible system in which existing and future approaches can be tested, compared and combined. We present a modular system to which perception routines can easily be added, and we define the logic of making them work together based on the lessons learned from different experiments.

Towards Spatial Perception: Learning to Locate Objects From Vision
Jürgen Leitner, Simon Harding, Mikhail Frank, Alexander Förster, Jürgen Schmidhuber
https://biecoll.ub.uni-bielefeld.de/index.php/robotdoc/article/view/367
Our humanoid robot learns to provide position estimates of objects placed on a table, even while the robot is moving its torso, head and eyes (cm-range accuracy). These estimates are provided by trained artificial neural networks (ANN) and a genetic programming (GP) method, based solely on the inputs from the two cameras and the joint encoder positions. No prior camera calibration or kinematic model is used. We find that ANN and GP are both able to localise objects robustly regardless of the robot's pose and without an explicit kinematic model or camera calibration. These approaches yield an accuracy comparable to current techniques used on the iCub.

Whom Will an Intrinsically Motivated Robot Learner Choose to Imitate from?
Sao Mai Nguyen, Pierre-Yves Oudeyer
https://biecoll.ub.uni-bielefeld.de/index.php/robotdoc/article/view/368
This paper studies an interactive learning system that couples internally guided learning and social interaction in the case where the learner can interact with several teachers. Socially Guided Intrinsic Motivation with Interactive learning at the Meta level (SGIM-IM) is an algorithm for robot learning of motor skills in high-dimensional, continuous and non-preset environments, with two levels of active learning: at the meta level, SGIM-IM actively decides when and whom to ask for help; at the lower level, it actively chooses goals during autonomous exploration. We illustrate through an air hockey game that SGIM-IM efficiently chooses the best strategy.
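As an illustration of the kind of meta-level choice described in the SGIM-IM abstract above (when to explore autonomously and which teacher to ask), here is a minimal sketch that picks the strategy with the highest recent competence progress. The progress measure, window size and epsilon-greedy choice are assumptions for illustration, not the SGIM-IM algorithm itself.

```python
import random
from collections import defaultdict, deque

# One "strategy" is autonomous exploration; the others are the available teachers.
STRATEGIES = ["autonomous", "teacher_A", "teacher_B"]
history = defaultdict(lambda: deque(maxlen=20))  # recent competence per strategy

def progress(strategy):
    """Competence progress: mean of the recent half of the window minus the older half."""
    h = list(history[strategy])
    if len(h) < 4:
        return float("inf")  # force each strategy to be tried a few times first
    mid = len(h) // 2
    return sum(h[mid:]) / len(h[mid:]) - sum(h[:mid]) / mid

def choose_strategy(epsilon=0.1):
    """Meta-level choice: mostly exploit the strategy with the highest progress."""
    if random.random() < epsilon:
        return random.choice(STRATEGIES)
    return max(STRATEGIES, key=progress)

def report(strategy, competence):
    """Record the competence reached after one episode with this strategy."""
    history[strategy].append(competence)

# Usage: in a learning loop, call choose_strategy(), run an episode with the
# chosen strategy, then report() the measured competence on the attempted goal.
```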
Proceedings of the Post-Graduate Conference on Robotics and Development of Cognition, 10-12 September 2012, Lausanne, Switzerland
Joanna Szufnarowska
https://biecoll.ub.uni-bielefeld.de/index.php/robotdoc/article/view/364
The aim of the Post-Graduate Conference on Robotics and Development of Cognition (RobotDoC-PhD) is to bring together young scientists working on developmental cognitive robotics and its core disciplines. The conference aims to give their research both feedback and greater visibility through lively and stimulating discussion among participating PhD students and senior researchers. The conference is open to all PhD students and post-doctoral researchers in the field. The RobotDoC-PhD conference is an initiative within the Marie Curie Actions ITN RobotDoC and is organized as a satellite event of the 22nd International Conference on Artificial Neural Networks (ICANN 2012).