Episode 153 – Origami-inspired robots
Claire chatted to Chenying Liu from the University of Oxford about how a robot’s physical form can actively contribute to sensing, processing, decision-making, and movement.
Episode 154 – Visual navigation in insects and robots – Claire Asher
Claire chatted to Andrew Philippides from the University of Sussex about what we can learn from ants and bees to improve robot navigation.

Andrew Philippides is a Professor of Biorobotics at the University of Sussex, where he co-directs the Centre for Computational Neuroscience and Robotics and the be.AI Leverhulme Doctoral Centre for Biomimetic Embodied AI. His research combines biological experiments with robotics, modelling, and machine learning to understand how intelligent behaviour emerges from the interaction of body and brain acting in an environment. Focusing on visual navigation, he aims to understand the navigation and learning abilities of ants and bees in order to develop novel AI and biorobotic algorithms.

This episode is powered by the Advanced Research + Invention Agency’s Robot Dexterity programme, which aims to transform robotic capabilities and unlock a step-change in human productivity.
Tagged as: bee, ant, artificial intelligence, brain, machine learning, behaviour, perception, navigation, SLAM.