Robotics has come a long way since its first industrial applications in the 1980s. Symbolic AI and computing power have advanced to the point that computers can generate articles on a given topic and even provide expert advice in specific areas. Yet, despite many advances in AI, control theory, sensing, and actuation, things begin to fall apart when physical contact is involved. For instance, we cannot yet trust a robot to hold a live hamster. We cannot trust a robot to physically examine a patient. Even the most advanced robots find it hard to walk on uncalibrated outdoor terrain.

It appears we are missing something fundamental about real-time computation for perception and action, which living beings manage with noisy sensors and slow communication fibres in their neural circuits. One well-supported theory is that the brain uses internal models to predict the consequences of motor commands, so that it can run local simulations to optimise a command before it is sent down to the muscles. This, however, requires prior experience with the task. A largely unexplored phenomenon is that the brain also tunes the kinematic and mechanical parameters of the body so that the body itself can solve dynamic problems efficiently. For instance, a bicycle rider stands up with bent knees when riding on bumpy terrain. Someone asked to estimate the weight of an object will bob it up and down with slightly varying levels of elbow stiffness before making an estimate. A physician asked to locate the edge of a patient's liver by manual palpation will palpate with varying levels of finger stiffness and varying hand shapes. Why does the brain tune the body while commanding movements and force control actions, even in perception tasks such as palpation and weight estimation? Can it be that the physical body itself has a computational role, and that its function can be regulated by tuning its physical mechanics?
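
To make the internal-model idea concrete, here is a minimal sketch in Python. It assumes a toy point-mass reaching task, and a hand-coded simulator stands in for what would, in the brain, be a forward model learned from prior experience; all numbers are purely illustrative.

```python
import numpy as np

# Toy illustration of command optimisation with an internal forward model.
# A hand-coded point-mass simulation stands in for a forward model that the
# brain would have learned from experience: candidate motor commands are rolled
# through it, and the command whose predicted outcome best matches the goal is
# selected before anything is "sent to the muscles".

def forward_model(force, mass=1.0, dt=0.01, steps=50):
    """Predict the final position of a point mass pushed by a constant force."""
    pos, vel = 0.0, 0.0
    for _ in range(steps):
        vel += (force / mass) * dt
        pos += vel * dt
    return pos

target = 0.1                                   # desired endpoint (m)
candidates = np.linspace(0.0, 10.0, 101)       # candidate motor commands (N)
predicted = np.array([forward_model(f) for f in candidates])
best = candidates[np.argmin(np.abs(predicted - target))]
print(f"selected command: {best:.2f} N, predicted endpoint: {forward_model(best):.3f} m")
```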

Some clues can be found in the cochlea of the ear. It is a tapered tube filled with liquid. When pressure waves cause the liquid to vibrate, the tapered shape of the tube separates different frequency components to known locations along its length. Hair cells along the cochlea then pick up these frequency components and feed them to the brain. Mapping the pressure signal from the time domain to the frequency domain is therefore a computation implemented purely by the physics of liquid vibrating in a tapered tube. It also means that when the viscosity of the cochlear fluid changes, the computing function changes with it. This phenomenon of computation and physical dynamics coming together is not limited to hearing. We believe that conditioning the body during dynamic interaction tasks is a controlled expression of the same phenomenon. We suspect that many body parts, such as cam-like profiles in joints and networks of joints and muscles, serve to bring physical dynamics and computation together. Some are static, such as the ear's frequency analysis, while others can be dynamically conditioned to suit the context.
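
As a rough illustration of this place-based frequency analysis, the toy sketch below treats positions along the tube as independent damped mass-spring resonators with graded natural frequencies; this is an assumption for illustration only, and every parameter is made up. Driving all resonators with the same two-tone pressure signal, each location responds most strongly to the component near its own resonance, so frequency content is mapped onto place without any explicit Fourier transform.

```python
import numpy as np

# Toy model of the cochlea as a physical frequency analyser (illustrative only).
# Positions along the tapered tube are modelled as independent damped resonators
# whose natural frequency falls from base to apex. All of them are driven by the
# same pressure signal; each responds most strongly near its own resonance, so
# frequency is mapped onto place by the physics alone.

fs = 20000.0                                   # sample rate (Hz)
t = np.arange(0.0, 0.2, 1.0 / fs)
pressure = np.sin(2 * np.pi * 440 * t) + 0.5 * np.sin(2 * np.pi * 2000 * t)

places = np.linspace(0.0, 1.0, 30)             # normalised position along the tube
nat_freqs = 4000.0 * np.exp(-2.5 * places)     # graded resonant frequencies (Hz)
zeta = 0.05                                    # damping ratio (assumed)

responses = []
for f0 in nat_freqs:
    w0 = 2 * np.pi * f0
    x, v, energy = 0.0, 0.0, 0.0
    for p in pressure:                         # semi-implicit Euler integration
        a = p - 2 * zeta * w0 * v - w0**2 * x
        v += a / fs
        x += v / fs
        energy += x * x
    responses.append(energy)

# The place responding most strongly to the dominant 440 Hz tone lies towards the apex.
print(f"strongest response at normalised position {places[int(np.argmax(responses))]:.2f}")
```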

There are at least three advantages seen across physical systems that exhibit such computing functions.

  1. Real-time computing: Such systems use parallel physical pathways in the mechanics of the body to solve complex computational problems within the tight deadlines the environment imposes for survival. The cochlea, for instance, provides fast frequency separation of pressure signals as they arrive at the ear, enabling real-time hearing. We have recently found that the mountain goat hoof also has a rapid, passive slip-resistance action.
  2. Entropy reduction: Such systems use contracting regions in the nonlinear dynamics of the body to reduce the entropy (the uncertainty of a measurement) of interactions with the world, making computation more efficient. For instance, when a bicycle rider comes to bumpy terrain, the rider comes off the seat and rides with bent knees. The visco-elastic dynamics at the knees act as a mechanical filter that reduces the uncertainty of the world, so that the brain deals with a simplified dynamic task (see the sketch after this list). We have seen this in our recent work on the mountain goat hoof and on joints with angle-dependent damping profiles.
  3. Relevance: Such systems can condition their physical dynamics to change the state space involved in control to suit the situation. For instance, when an octopus throws an arm to catch prey, it exploits the passive dynamics of a hyper-redundant system, unrolling the arm along a straight line to minimise resistance. Once the prey is caught, the arm forms two quasi-rigid links with a joint in the middle to bring the prey to the mouth, because that simplifies the control of a fetching movement with a soft arm.

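A minimal sketch of the entropy-reduction point in item 2: bumpy terrain is modelled as white noise, the bent knee as a spring-damper with assumed (not measured) stiffness and damping, and uncertainty as the differential entropy of a Gaussian fit to each signal. The filtered body motion carries markedly less entropy than the raw terrain the brain would otherwise have to deal with.

```python
import numpy as np

# Toy model of entropy reduction by visco-elastic filtering (illustrative only).
# Bumpy terrain is white noise; the bent knee is a spring to the ground plus a
# damper referenced to the inertial frame (a sky-hook simplification). The body
# motion is a low-pass filtered version of the terrain, so its variance, and
# hence the entropy of a Gaussian fit to it, is lower than that of the raw input.

rng = np.random.default_rng(0)
fs = 1000.0                                # simulation rate (Hz)
n = 5000
terrain = rng.normal(0.0, 0.02, n)         # ground height (m), white noise

m, k, c = 60.0, 5000.0, 400.0              # body mass, knee stiffness, damping (assumed)
x, v = 0.0, 0.0
body = np.empty(n)
for i, ground in enumerate(terrain):       # semi-implicit Euler integration
    a = (k * (ground - x) - c * v) / m
    v += a / fs
    x += v / fs
    body[i] = x

def gaussian_entropy(samples):
    """Differential entropy (nats) of a Gaussian fitted to the samples."""
    return 0.5 * np.log(2 * np.pi * np.e * np.var(samples))

print(f"terrain entropy: {gaussian_entropy(terrain):.2f} nats")
print(f"body entropy:    {gaussian_entropy(body[1000:]):.2f} nats")  # skip transient
```
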
In the Morph Lab, we try to understand the principles behind such real-time conditioning of the physical body to improve both perception and action. Since it is difficult to disentangle the parallel pathways of motor commands in a living being, we take a soft robotics approach to test questions about embodied intelligence. We enjoy this approach because it not only allows us to understand principles applicable to biological beings but also helps us build useful soft robots.

Here is a recent public talk I gave covering these topics:

https://www.youtube.com/watch?v=_jkwiLtv7cQ

Research areas

Contact the PI

Professor Thrishantha Nanayakkara
RCS1 M229, Dyson Building
25 Exhibition Road
South Kensington, SW7 2DB

Email: t.nanayakkara@imperial.ac.uk