Andy Pye visits the Hanover Fair and catches up on developments in bionic learning and how engineers are deploying concepts gleaned from the animal kingdom.
In 50 years’ time, we may look back with horror and consider that we burnt oil in industrial combustion engines, just to get around on the surface of our planet, when arguably we should have been using that resource to make plastics and medicines.
Yet for many years, mankind has been seeking inspiration from the animal kingdom for new types of propulsion systems and control concepts. Both flying and gripping have a long tradition in Festo’s Bionic Learning Network, which has explored projects ranging from artificial jellyfish to simulating the motion of birds in flight. “We are constantly seeking out new or relatively unknown motion and drive concepts,” explained Dr Frontzek, Head of Corporate Communication back in 2012. “Our engineers are working with universities, institutes and development companies in order to transfer mathematical and scientific principles to industrial applications.”
The “SmartInversion Future Concept” was first seen at the 2012 Hanover Trade Fair. It is not a bird or a plane, but an unusual flying object that propels itself by flipping inside out. Created by engineers at Festo’s HQ in Esslingen, Germany, the helium-filled floating band takes on different shapes as it expands and contracts, generating thrust to move through the air.
The design is based on the invertible cube discovered by the late inventor and mathematician Paul Schatz. Schatz found that a third basic type of motion, inversion, is possible in addition to the more familiar rotation (rotary motion) and translation (linear motion). He divided a cube into two star-shaped solids and an invertible cubic belt. The cubic belt is a six-membered jointed ring, which separates from the two interlocking parts at the corners; it can be continuously inverted and thus take on different shapes.
The flying object itself is made up of six identical helium-filled prisms held together by a carbon-fibre framework. The combination of lightweight construction, electric drives and open- and closed-loop control makes continuous, rhythmically pulsating inversion in the air possible. Three motors, coordinated by a tiny lithium-battery-powered ARM processor, power the shape-shifting mechanism. Flight control is handled remotely via an iPhone app, so that a person on the ground can guide the object around a room using a smartphone.
Festo has yet to identify a specific use for inversion-driven propulsion, although one application of the underlying kinematics is already established: solving difficult mixing problems, such as the homogeneous blending of particulate solids – especially those of greatly differing specific weights. Unlike most mixing devices, the inversion kinematics are very gentle, generating almost no shear forces, which is why they are often used in industry to mix explosives. The mixing motion of the Canadian-made 88 MIXERS (manufactured by Inversion Machines Ltd) is based on the Schatz inversion kinematics. One of these mixers only needs to be seen in operation to appreciate that you are looking at a truly unusual phenomenon.
Moving to 2016, Festo’s latest indoor flying object is described as an autonomously airborne assistance system with infinite degrees of freedom. FreeMotionHandling consists of an ultralight carbon-fibre ring, with eight adaptive propellers, around a rotatable helium ball with an integrated gripping element. Thanks to the intelligent onboard electronics and indoor GPS, the ball can autonomously manoeuvre in any desired direction, pick up objects and put them down in a suitable place. No pilot is needed to control FreeMotionHandling, but the human operator can safely and easily interact with the flight object at all times.
This opens up new perspectives for the workspace of the future: spheres such as these could serve humans as airborne assistance systems – for example in overhead operations, at dizzying heights or in large storage facilities. In realising FreeMotionHandling, engineers made use of two existing developments from the Bionic Learning Network: the gripping mechanism was based on the universally applicable FlexShapeGripper, whose working principle is derived from the chameleon’s tongue; and the airborne helium ball itself.
The 3D Cocooner, also from Festo, takes its inspiration from the impressive thread structures that spiders and caterpillars build in nature, using a robotic spinneret to create complex lattice structures. The device spins filigree figures and customised lightweight structures from fibreglass thread. Precisely controlled by a handling system, the sticky fibreglass threads are laminated with UV-hardening resin and joined together to form complex structures. Unlike conventional additive 3D-printing processes, however, these structures do not arise in layers on a surface, but are created freely in three-dimensional space.
The handling system for the 3D Cocooner is a horizontally mounted EXT-45 type tripod. Its three-arm parallel kinematics can be precisely and rapidly controlled in three-dimensional space; its agility makes the system ideally suited to these tasks. With the 3D Cocooner, the virtual design program directly conveys the manufacturing instructions to the machine level. An object can thus directly pass along the entire digital chain from the initial concept up to the finished product without having to proceed through the usual channels of sales, production and logistics.
Meanwhile, Siemens is using robot spiders in a different way: even though you can now fabricate objects out of anything from precious metals to biomaterials with 3D printers, the size of the machine’s build space remains a huge limiter on creativity and engineering prowess.
One tactic is to scale up. But Siemens researchers in Princeton, New Jersey, have developed eight-legged 3D-printing robots, called SiSpis, capable of extruding polylactic acid (PLA), a bioplastic derived from corn starch and sugar cane that is used in some 3D-printing applications. It makes perfect sense, as arachnids have in effect been 3D printing for millions of years.
The researchers have integrated an algorithm into the 3D printing spiders to allow them to tackle any complex 3D-printing job, big or small. They deploy a swarm of tiny robot spiders, each tasked with manufacturing only a small portion of a workpiece.
The spider uses an on-board camera and laser scanner to interpret its environment, and then autonomously calculates how much of the object it can handle. The other spiders then decide which vertical boxes they will cover, using an algorithm developed by Siemens’ Product Design, Modeling and Simulation Research group. This ensures that even with complex geometries, no area is missed.
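Siemens has not published the partitioning algorithm itself, but the idea of carving a workpiece into vertical boxes and letting each robot claim the ones it can reach can be sketched in a few lines. The function name, the one-dimensional grid and the balancing rule below are all illustrative assumptions, not the actual Siemens implementation.

```python
def partition_workpiece(boxes, robots, reach):
    """Assign each vertical box to a nearby robot so no area is missed.

    boxes  -- positions of the sub-volumes to be printed
    robots -- parked positions of the printing spiders
    reach  -- how far from its position a robot can extrude
    """
    assignment = {}
    for box in boxes:
        # Candidates: robots close enough to print this box.
        candidates = [r for r in robots if abs(r - box) <= reach]
        if not candidates:
            continue  # left for a robot that repositions later
        # Prefer the least-loaded candidate, breaking ties by distance,
        # so the workload stays balanced across the swarm.
        best = min(
            candidates,
            key=lambda r: (sum(1 for owner in assignment.values() if owner == r),
                           abs(r - box)),
        )
        assignment[box] = best
    return assignment

# Example: ten boxes along one axis, three spiders parked at 1, 5 and 8,
# each able to reach two units either side of its position.
plan = partition_workpiece(boxes=range(10), robots=[1, 5, 8], reach=2)
print(plan)
```

Every box ends up assigned to a robot within its reach, which is the property the article describes: complex geometry or not, no area is missed.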
The technology is still in its infancy, and PLA is the only material the spiders currently employ, but the team is looking into concrete and other materials.
What do artificial jellyfish have to do with process automation? Festo’s AquaJellies are artificial, autonomous jellyfish with integral electric drive and an intelligent, adaptive mechanical system. They consist of a translucent hemisphere and eight tentacles which provide forward thrust. A watertight, laser-sintered pressure hull is located in the middle of the AquaJelly. This encloses a central electric drive, two rechargeable lithium-ion-polymer batteries of the same size as smartphone batteries, a charging controller and servomotors for the swash plate. Each tentacle is formed as a structure which demonstrates what is known as the Fin Ray Effect.
The Fin Ray Effect is a construction principle derived from the functional anatomy of a fish’s fin. The AquaJelly’s peristaltic drive, which is based upon the recoil principle and contracts in a wave-like fashion, is similar to that of its biological model and gives the AquaJelly the ability to move around.
The swimming replicas, whose natural role models consist of 99% water, function as self-controlled systems in the water with pronounced collective behaviour. Furthermore, the autonomous jellyfish are capable of managing their own power supply, which they accomplish through continuous communication with the charging station. When an artificial jellyfish reaches a charging station located above the surface of the water, it is drawn into place by suction and supplied with electrical power. At the surface, the AquaJelly can use energy-saving ZigBee short-range radio to exchange status data with the charging station and to inform other AquaJellies that the station is occupied. In addition to the environmental sensors that prevent the jellyfish from colliding with one another, the AquaJellies are also equipped with internal sensor technology for monitoring their energy status.
The status of each individual jellyfish can be monitored with the help of an Android app. This visualisation of real-time diagnostics allows, amongst other things, the current battery charge level to be queried. Each artificial jellyfish decides autonomously which action to take on the basis of its own status, which depends upon, for example, its charge level and drive position, as well as its proximity to other jellyfish.
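The decision logic described above can be sketched as a simple rule set. The state names, thresholds and field names below are invented for illustration; the real AquaJelly firmware is not public.

```python
def decide_action(status):
    """Choose an action from the jellyfish's own status and surroundings.

    status -- dict with assumed keys: 'battery' (0..1 charge fraction),
              'station_occupied' (bool, learned via ZigBee), and
              'neighbour_distance' (metres, from environmental sensors).
    """
    if status["battery"] < 0.2:
        # Low on energy: head for the charging station if it is free,
        # otherwise wait until the occupying jellyfish leaves.
        if status["station_occupied"]:
            return "wait_near_surface"
        return "dock_and_charge"
    if status["neighbour_distance"] < 0.5:
        # Collision avoidance based on environmental sensors.
        return "steer_away"
    return "swim_freely"

# A jellyfish with a nearly empty battery and a free charging station:
print(decide_action({"battery": 0.15,
                     "station_occupied": False,
                     "neighbour_distance": 2.0}))  # dock_and_charge
```

The point of the sketch is that each jellyfish acts on purely local information – its own charge level plus what it hears over the radio – yet sensible collective behaviour emerges, which is exactly the principle the next paragraph transfers to factory automation.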
Transferring this principle to the field of automation makes it possible for many autonomous or semi-autonomous intelligent systems to work together, executing large tasks with small systems that act together in a targeted fashion. Applications of this principle may be found, for example, in the factory of the future – in the form of self-controlling, decentralised, complete mechatronic systems, which act autonomously and are networked with each other. The peristaltic drive used by the jellyfish, with its so-called Fin Ray Effect, can just as readily be transferred into practice: the DHDG adaptive gripper has already established itself in the market as a series product. Ultramodern materials, processed by laser sintering, have also been used to produce the AquaJelly’s housing.
Self-driving truck acts like an animal
Researchers at Chalmers University of Technology are taking inspiration from biological organisms in the development of a driverless truck. The first public demonstration of the vehicle took place on a Dutch motorway on 28th May.
The truck, a Volvo FH16, is being developed in the newly launched Chalmers Revere laboratory. Researcher Ola Benderius explains that the traditional – and clearly dominant – way of developing vehicles is to base progress on earlier vehicle models and gradually add new functions. This method, he says, might not work when developing the autonomous vehicles of the future. “Traditionally, the aim has been to try to separate and differentiate all conceivable problems and tackle them using dedicated functions, which means that the system must cover a large number of scenarios. You can cover a large number of different cases, but sooner or later the unexpected occurs, and that’s when an accident could happen.”
Instead, they have chosen to regard the self-driving vehicle as more like a biological organism than a technical system. “Biological systems are the best autonomous systems we know of. They absorb information from their surroundings via their senses and react directly and safely, like an antelope running within its herd, or a hawk pouncing on its prey on the ground,” says Benderius.
All the information that the truck gathers from its sensors and cameras is converted into a format that resembles the way in which humans and animals interpret the world via their senses. This means the ability to adapt to unexpected situations is built into the truck’s basic design.
Instead of just one large program with dedicated functions for all conceivable situations, the team is working on small and general behavioural blocks that aim to make the truck react to various stimuli, just like an animal does. The truck is programmed to constantly keep all stimuli within reasonable levels, and it will even continuously learn to do this as efficiently as possible. This makes the framework extremely flexible and good at managing sudden and new dangers.
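The “behavioural block” idea can be illustrated with a toy control loop: each block watches one stimulus and outputs a corrective nudge whenever that stimulus drifts outside its comfort band. The class, the block names and the numeric bands below are assumptions made for the example; OpenDLV’s real architecture is considerably more elaborate.

```python
class BehaviouralBlock:
    """One small, general behaviour: keep a single stimulus in range."""

    def __init__(self, name, low, high, gain=0.5):
        self.name, self.low, self.high, self.gain = name, low, high, gain

    def react(self, stimulus):
        """Return a correction that pushes the stimulus back into [low, high]."""
        if stimulus < self.low:
            return self.gain * (self.low - stimulus)   # push up
        if stimulus > self.high:
            return self.gain * (self.high - stimulus)  # push down (negative)
        return 0.0                                     # comfortable: do nothing


# Hypothetical stimuli: time gap to the vehicle ahead, and lateral lane offset.
blocks = {
    "headway": BehaviouralBlock("headway", low=2.0, high=6.0),          # seconds
    "lane_offset": BehaviouralBlock("lane_offset", low=-0.3, high=0.3), # metres
}

# One control tick: sensors report stimuli; each block reacts independently,
# just as an animal's reflexes respond to individual senses.
stimuli = {"headway": 1.2, "lane_offset": 0.1}
corrections = {name: blocks[name].react(value) for name, value in stimuli.items()}
print(corrections)  # headway too short -> positive correction; lane offset fine
```

Because each block is small and self-contained, new behaviours can be added without rewriting a monolithic program – which is what makes the framework flexible when sudden, unforeseen dangers appear.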
The software, OpenDLV (which stands for driverless vehicle), is being developed as open-source code and is freely available on the internet. Benderius hopes that other researchers around the world will join the project by running and developing the software in their own vehicles.
Bats’ flight technique could lead to better drones
Long-eared bats are assisted in flight by their ears and body. A study at Lund University in Sweden has improved understanding of the bats’ flying technique and could be significant for the future development of drones.
The experiments were conducted in a wind tunnel in which trained bats flew through thin smoke to reach a stick with food on it. Meanwhile the researchers aimed a laser beam at the smoke behind the bats and took pictures of the illuminated smoke particles. The researchers measured how the smoke moved to calculate the forces generated by each beat of the bats’ wings.
Contrary to what researchers previously assumed, Christoffer Johansson Westheim and his colleagues have shown that long-eared bats are helped in flight by their large ears. “We show how the air behind the body of a long-eared bat accelerates downwards, which means that the body and ears provide lift. This distinguishes the long-eared bats from other species that have been studied and indicates that the large ears do not merely create strong resistance, but also assist the animal in staying aloft”, he asserts.
The findings entail a greater understanding of the flight technique of bats. They also highlight the evolutionary conflict between flying as efficiently as possible and echo-locating – discovering objects by sending out soundwaves and perceiving the resulting echoes.
Another discovery made during the experiments and never previously described in research is how the bats generate forward motion when flying slowly. The forward motion is generated when the wings are held high and away from the body at the end of each beat.
“This specific way of generating power could lead to new aerodynamic control mechanisms for drones in the future, inspired by flying animals”, Westheim says.
Bee model breakthrough for robot development
Scientists at the University of Sheffield have created a computer model of how bees avoid hitting walls – which could be a breakthrough in the development of autonomous robots.
Researchers from the Department of Computer Science built their computer model to look at how bees use vision to detect the movement of the world around them and avoid crashes.
Bees control their flight using the speed of motion – or optic flow – of the visual world around them, but it is not known how they do this. The only neural circuits so far found in the insect brain can tell the direction of motion, not the speed.
This study suggests how motion-direction detecting circuits could be wired together to also detect motion-speed, which is crucial for controlling bees’ flight.
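A toy illustration (not the Sheffield model itself) shows how correlating the signals of neighbouring photoreceptors can yield speed as well as direction: delay one receptor’s trace by various amounts, correlate it against its neighbour, and the best-matching delay gives the time the image took to travel between them. The function name and synthetic stimulus are assumptions for the sketch.

```python
import numpy as np

def estimate_speed(signal_a, signal_b, spacing, dt, max_delay=20):
    """Estimate optic-flow speed from two neighbouring photoreceptor traces.

    spacing -- distance between the two receptors
    dt      -- sampling interval of the traces
    """
    best_delay, best_score = 1, -np.inf
    for d in range(1, max_delay):
        # Reichardt-style correlation: delayed signal A against current B.
        score = np.sum(signal_a[:-d] * signal_b[d:])
        if score > best_score:
            best_delay, best_score = d, score
    # Speed = receptor spacing / time the image took to cross it.
    return spacing / (best_delay * dt)

# Synthetic stimulus: a bright feature crosses receptor A, then receptor B
# five samples later, i.e. a true speed of spacing / (5 * dt).
t = np.arange(100)
signal_a = np.exp(-0.5 * ((t - 40) / 3.0) ** 2)
signal_b = np.exp(-0.5 * ((t - 45) / 3.0) ** 2)
print(estimate_speed(signal_a, signal_b, spacing=1.0, dt=0.01))
```

With spacing 1.0 and dt 0.01, the best-matching delay of five samples corresponds to a speed of 20 units per second – recovered purely by combining delayed correlations, the kind of wiring the study proposes for direction-detecting circuits.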
“Honeybees are excellent navigators and explorers, using vision extensively in these tasks, despite having a brain of only one million neurons,” said lead researcher Dr Alex Cope. “Understanding how bees avoid walls, and what information they can use to navigate, moves us closer to the development of efficient algorithms for navigation and routing – which would greatly enhance the performance of autonomous flying robotics.”
Professor James Marshall added: “This is the reason why bees are confused by windows – since they are transparent they generate hardly any optic flow as bees approach them.”