
Neuromorphic cognitive robots

We will integrate the neuromorphic VLSI device ROLLS (256 spiking neurons with realistic neuronal dynamics, 128K synapses, including synapses with on-chip plasticity circuits) with a neuromorphic vision sensor (embedded Dynamic Vision Sensor, eDVS) and with the sensors and motors of robotic vehicles (2x "Pushbot", 1x "Omnibot") using a miniature PC, the "Parallella". We will use this setup to implement cognitive architectures using dynamic neural fields (realised by soft winner-take-all architectures of spiking neurons in hardware) to enable vision-based navigation, map building, and sequence learning with the robots.

New addition: We will also have the SpiNNaker system available and will integrate it with the robots. You will be able to try your networks on two different platforms, or build larger, more complex ones (up to almost 200K neurons and 50M synapses) if you want.


Timetable

Day Time Location
Tue, 26.04.2016 16:00 - 17:00 Disco
Thu, 28.04.2016 16:00 - 17:00 Disco

Obstacle avoidance with a Pushbot robot based on eDVS vision and a simple neuromorphic controller using ROLLS chip

This project aims to develop a simple closed-loop controller for the Pushbot to enable obstacle avoidance based on DVS vision only. The solution is based on a simple heuristic: the robot turns away from the direction (left or right) from which more events arrive in the lower half of the DVS pixel array. In the first version of the controller, demonstrated in the first week, we created a histogram of the DVS events for each column of the DVS array, collecting events from the lower half of the array. The preprocessing of the sensory stream included dropping 80% of events, which improved the signal-to-noise ratio. The histogram bins from the left and right halves voted for increasing the speed of the right and left wheels, respectively. This led to fairly robust obstacle avoidance. If the robot faced a large object, it performed a "freeing" manoeuvre.
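The column-histogram heuristic above can be sketched in a few lines of Python. The event format (a list of (x, y) pixel coordinates on a 128x128 array) and all constants are illustrative assumptions, not the actual project code:

```python
import numpy as np

WIDTH, HEIGHT = 128, 128
DROP_RATE = 0.8      # drop 80% of events, as in the text, to improve SNR
BASE_SPEED = 0.3     # nominal forward speed (arbitrary units)
GAIN = 0.01          # how strongly event counts modulate wheel speed

_rng = np.random.default_rng(0)

def wheel_speeds(events):
    """Turn away from the side producing more events in the lower image half."""
    # Randomly drop most events (simple subsampling for noise reduction).
    kept = [e for e in events if _rng.random() >= DROP_RATE]

    # Histogram events per column, lower half of the array only
    # (assuming y grows downward, so the lower half is y >= HEIGHT // 2).
    hist = np.zeros(WIDTH)
    for x, y in kept:
        if y >= HEIGHT // 2:
            hist[x] += 1

    left_votes = hist[: WIDTH // 2].sum()
    right_votes = hist[WIDTH // 2 :].sum()

    # Left-half events vote for increasing the right-wheel speed and
    # vice versa, following the voting scheme described above.
    left_wheel = BASE_SPEED + GAIN * right_votes
    right_wheel = BASE_SPEED + GAIN * left_votes
    return left_wheel, right_wheel
```

Events in the upper half of the image (e.g. distant or overhanging structure) leave both wheel speeds at the base value, so the robot keeps driving straight.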

Here's a snapshot from the video showing the Pushbot driving in the lab, avoiding obstacles:

Figure: Pushbot driving in the lab, avoiding obstacles (IMG_7670.JPG)

The following figures show the histograms that we used for the first version of the obstacle avoidance system, for an object on the left and on the right side of the image, as well as objects on both sides and an object in the upper part of the image only:

left-stimulus.png

left-side-histogram.png

both-sides-histogram.png

upper-events-no-histogram.png

In the second week, we set up two populations of spiking neurons on the ROLLS chip, representing turning left and turning right, respectively. We created interfaces to stimulate these populations with events from the eDVS and tuned the populations to be activated by events in the right and left halves of the image, respectively. After many hours of setting biases and tuning the coupling parameters, we arrived at a robust neuromorphic controller for obstacle avoidance on the Pushbot.

This figure shows the output of the ROLLS chip, with spikes from neurons of the "left" population coloured blue and spikes from neurons of the "right" population coloured red. One can see the transition from the "left" to the "right" state as an object moves from left to right in front of the robot's DVS:

Figure: ROLLS output (pushbot_right.png)
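As a rough illustration of the controller idea (not the chip implementation or its tuned biases), the two mutually inhibiting populations can be approximated by two leaky rate units; all names and parameters here are assumptions:

```python
TAU = 0.9    # leak factor per time step
W_INH = 0.5  # mutual inhibition strength (soft winner-take-all)
W_IN = 1.0   # input drive from eDVS events

def step(state, events_left, events_right):
    """One update of the two 'populations' (rate approximation)."""
    turn_left, turn_right = state
    # The "turn left" population is driven by events in the RIGHT image
    # half (and vice versa), so the robot turns away from the obstacle.
    new_tl = max(0.0, TAU * turn_left + W_IN * events_right - W_INH * turn_right)
    new_tr = max(0.0, TAU * turn_right + W_IN * events_left - W_INH * turn_left)
    return (new_tl, new_tr)

def turn_command(state):
    """Read out the winning population as a motor command."""
    turn_left, turn_right = state
    if turn_left > turn_right:
        return "turn_left"
    if turn_right > turn_left:
        return "turn_right"
    return "straight"
```

The mutual inhibition is what produces the clean state transition visible in the figure: once one population wins, it suppresses the other until the input moves to the opposite half of the image.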

The overall goal is to implement on the robot a sequence learning algorithm that has already been implemented on the ROLLS chip in the past. Its purpose is to demonstrate that the robotic agent is capable not only of reactive tasks but also of on-line learning.

Here we present the output of our neural network, developed by student Raphaela Kreiser, which successfully learns different sequences (ABA, AAC).

Figure: Sequence learning of ABA (weights_ABA.png) and of AAC (AAC_weights.png)
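To convey the flavour of this kind of on-line sequence learning, here is a toy Hebbian transition learner: weights from each item to its successor are strengthened as the sequence is presented, and the sequence can then be replayed by following the strongest transitions. This is only an illustrative sketch, not the spiking network that ran on the ROLLS chip:

```python
ITEMS = ["A", "B", "C"]
ETA = 0.5  # learning rate (illustrative)

def learn(sequence, weights=None):
    """Strengthen the weight from each item to its successor (Hebbian-like)."""
    if weights is None:
        weights = {(i, j): 0.0 for i in ITEMS for j in ITEMS}
    for prev, nxt in zip(sequence, sequence[1:]):
        # Saturating update: weights approach 1.0 with repeated pairings.
        weights[(prev, nxt)] += ETA * (1.0 - weights[(prev, nxt)])
    return weights

def recall(weights, start, length):
    """Replay a sequence by following the strongest learned transitions."""
    out = [start]
    for _ in range(length - 1):
        out.append(max(ITEMS, key=lambda j: weights[(out[-1], j)]))
    return out
```

After presenting "ABA" once, the A-to-B and B-to-A weights dominate and the sequence can be replayed from its first item.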

Looming detector

The goal of this project was to realise Claire Rind's looming detector in a neural network and implement it in hardware. The detector was successfully implemented in the PyNN spiking neural network simulator; the next step is porting it onto SpiNNaker hardware as well as the cxQuad chip. We have collected eDVS data with a looming stimulus approaching the robot and will use these data to fine-tune and test the detector. A sketch of the developed architecture is shown in the following diagram:

looming.png
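The core cue the detector exploits can be illustrated with a minimal rate-based sketch: excitation from the current frame's events competes with delayed inhibition from the previous frame. An approaching (looming) object produces accelerating event counts that outrun the delayed inhibition, while steady translating motion is suppressed. All constants here are illustrative assumptions, not the parameters of the PyNN model:

```python
def lgmd_response(frame_counts, w_inh=0.8):
    """Net excitation per frame: current event count minus delayed
    inhibition from the previous frame (the delayed inhibition of
    LGMD-style looming models)."""
    responses = []
    prev = 0.0
    for count in frame_counts:
        responses.append(max(0.0, count - w_inh * prev))
        prev = count
    return responses
```

Feeding in an exponentially growing event count (a looming edge) yields a steadily growing response, whereas a constant event rate (whole-field translation) is largely cancelled after the first frame.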


The list of all our planned projects looked like this:

Figure: Project list (IMG_7611.JPG)

And here are some sessions that we've held: 

  • sEMD
  • Looming Detector
  • ROLLS 
  • Parallella
  • Collect data from eDVS
  • Keyboard controller


Leaders

Alexander Rast
Yulia Sandamirskaya
Dora Sumislawska

Members

Alessandro Aimar
Richard George
Arren Glover
Giacomo Indiveri
Aleksandar Kodzhabashev
Alejandro Linares-Barranco
Bragi Lovetrue
Moritz Milde
Florian Mirus
Felix Neumärker
Guido Novati
Christian Pehle
Gary Pineda-Garcia
Francesca Puppo
Ole Richter
Yulia Sandamirskaya
Nikolaos Vasileiadis
Borys Wrobel
Qi Xu
Yexin Yan
Marion Betizeau