A mind-controlled game, a machine that writes original music, and a mechanical assistant based on the dexterity and flexibility of an elephant’s trunk are among Festo’s recent bionic developments.
Festo’s work on CogniGame has practical applications for the factory of the future by addressing the question of how people and machines can interact more efficiently in the face of constantly changing technologies.
Even in the factory of tomorrow, not all work sequences will be fully automated. New operating concepts are needed to enable people to communicate more quickly, more directly and more easily with the technology: from joystick solutions and voice input through to controlling partial sequences by thought.
Festo has developed new concepts for cooperation between people and machines: from human-machine interaction for gamers in the “CogniGame” or music lovers in the “Sound Machines 2.0” through to real-life use of the “Bionic Handling Assistant 3.0” in industry.
CogniGame (pictured below) is a reinterpretation of a well-known 1970s video game. In the original, based on table tennis, players used a joystick to move a paddle up and down the screen and return the ball to their opponent.
For CogniGame, the developers at Festo implemented the virtual game on a real court built using Festo components.
Two linear axes, whose drives travel left and right along the baselines, move the racquets to return the ball and keep it in play.
One player controls his racquet by thought alone via a brain-computer interface (BCI), which measures voltage fluctuations on the surface of the head via attached electrodes, as in electroencephalography (EEG).
Opposite that person, a second player moves his or her racquet by pressing a lever using muscular power. For the game, Festo worked with CogniWare to develop a software solution for controlling a racquet using thoughts and bio signals.
It establishes a communication channel between the brain and the hardware without the user needing voice input or conventional input devices.
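The actual signal processing in the CogniWare software is not published, but the basic idea of turning EEG activity into a control signal can be sketched roughly. In this hypothetical Python illustration, the mean power of one signal window is compared against a calibrated per-user baseline to pick a paddle command; the power measure, thresholds and commands are all invented for the sketch:

```python
# Minimal sketch of a BCI-style paddle controller (illustrative only).
# The power proxy and thresholds here are hypothetical; a real BCI
# pipeline involves filtering, artifact rejection and calibration.

def band_power(samples):
    """Crude proxy for EEG band power: mean squared amplitude of a window."""
    return sum(s * s for s in samples) / len(samples)

def paddle_command(window, baseline, margin=0.2):
    """Map the power of one EEG window to a paddle command.

    Power well above the calibrated baseline -> move up,
    well below -> move down, otherwise hold position.
    """
    p = band_power(window)
    if p > baseline * (1 + margin):
        return "up"
    if p < baseline * (1 - margin):
        return "down"
    return "hold"

# Hypothetical calibrated baseline and two sample windows
baseline = 1.0
print(paddle_command([1.5, -1.4, 1.6, -1.5], baseline))  # strong signal: up
print(paddle_command([0.1, -0.1, 0.2, -0.2], baseline))  # weak signal: down
```

The deadband around the baseline (the `margin` parameter) keeps the racquet steady when the signal is ambiguous, which matters when the control channel is as noisy as scalp EEG.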
Artistic human-machine interaction
As well as demonstrating human-machine interaction in the CogniGame, Festo has also developed an artistic interpretation of the factory of the future with the Sound Machines 2.0.
This is an intelligent, robot-controlled sound installation consisting of five self-playing musical instruments that records a melody, uses it to compose a new piece of music and plays it live.
The instruments – two violins, a viola, a cello and a double bass – are suspended freely to guarantee a high level of sound quality.
They work like proper stringed instruments, except that each has only one string. The quintet is operated by electric drives and pneumatic cylinders controlled by the CPX automation platform.
The instruments react to any melody played to them according to stored composition rules and reinterpret it. The individual sound robots are networked together so that they can “listen” to each other.
New variations are continuously being produced that differ from the original theme, but retain the essence of the composition.
A person sets the starting point for the interaction by entering a melody, thereby initiating the subsequent process.
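Festo’s stored composition rules are not published. As a minimal sketch of the general idea, the classical transformations below vary a theme while preserving its interval structure, which is one way a variation can differ from the original yet retain its essence; the MIDI encoding and function names are assumptions made for this illustration:

```python
# Illustrative sketch only: the installation's real composition rules
# are not public. Two textbook transformations are shown instead.

def intervals(melody):
    """Interval sequence (in semitones) between successive MIDI notes."""
    return [b - a for a, b in zip(melody, melody[1:])]

def transpose(melody, semitones):
    """Variation 1: the same intervals, shifted to a new starting pitch."""
    return [note + semitones for note in melody]

def retrograde(melody):
    """Variation 2: the theme played backwards."""
    return melody[::-1]

theme = [60, 62, 64, 65, 67]     # a short melody as MIDI note numbers
var1 = transpose(theme, 5)       # [65, 67, 69, 70, 72]
var2 = retrograde(theme)         # [67, 65, 64, 62, 60]

# A transposition keeps the theme's interval content exactly
assert intervals(var1) == intervals(theme)
```

In the installation the instruments apply such rules to each other’s output in a loop, which is how variations keep drifting away from the entered theme without losing its character.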
Future music: the factory of tomorrow
Self-controlling instruments demonstrate system optimisation in modern plant concepts for factory and process automation using mechatronics.
In the factory of the future, Festo pursues the idea of networking decentralised components into self-controlling, mechatronic overall systems that, in keeping with evolutionary principles, develop autonomous group behaviour and make decisions without needing direct input from a human operator.
The Sound Machines 2.0 installation (pictured alongside), realised in cooperation with the artist Roland Olbeter, shows how this could work: networked as a group, the instruments develop their own autonomous behaviour patterns and make decisions themselves, without having to be directly influenced by people.
Control using voice and image recognition
The Bionic Handling Assistant 3.0 shows how structural resilience and new control concepts in the factory of the future will enable people to interact easily and above all reliably with machines and be actively supported in their work.
The Bionic Handling Assistant (pictured alongside) is a flexible assistance system based on an elephant’s trunk. It won the German Future Award in 2010.
The structural resilience of the assistance system permits safe and direct contact between a person and machine and points the way towards new methods of interaction between people and technology.
This is because the system does not pose any danger in the event of a collision with a person, and neither does it have to be carefully cordoned off like conventional factory robots.
With eleven degrees of freedom, the Bionic Handling Assistant can be moved freely in all directions.
The assistance system can now grip objects independently and without the need for any programming or manual operation. A miniaturised camera in the gripper module registers the working space, detects target objects, follows them and initiates the command to grip at the right time.
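The sequence described above (register the workspace, follow the target, trigger the grip at the right moment) can be sketched as a simple control loop. The tracking model, tolerance and step rule below are hypothetical and stand in for the real camera and kinematics:

```python
# Hypothetical sketch of the grip decision loop: the camera tracking
# and grip criterion here are invented for illustration, not Festo's
# actual implementation.

def should_grip(target, gripper, tolerance=0.01):
    """Trigger the grip once the tracked target sits within tolerance
    (in metres) of the gripper position on every axis."""
    return all(abs(t - g) <= tolerance for t, g in zip(target, gripper))

# Follow a detected object: close in until the grip condition is met
gripper = [0.0, 0.0, 0.0]
target = [0.05, 0.02, 0.0]   # position reported by the camera module
steps = 0
while not should_grip(target, gripper):
    # move the gripper half the remaining distance each control cycle
    gripper = [g + 0.5 * (t - g) for t, g in zip(target, gripper)]
    steps += 1
print(steps)  # control cycles before the grip command fires
```

Separating the grip criterion from the motion rule mirrors the article’s description: the camera only decides *when* to grip, while the drives handle *how* the gripper gets there.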
The Festo engineers have also implemented voice control via an appropriate interface: using a defined collection of commands, the system reaches for, grips and moves objects – easily and safely.
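The article does not specify the command set, so the dispatch table below is a hypothetical sketch of how a fixed vocabulary of voice commands might map to actions, with unknown utterances ignored for safety:

```python
# Hypothetical command dispatch: the commands "grip" and "release"
# are invented examples, not the system's documented vocabulary.

ACTIONS = {
    "grip":    lambda state: {**state, "holding": True},
    "release": lambda state: {**state, "holding": False},
}

def handle(command, state):
    """Apply a recognised voice command; anything outside the defined
    vocabulary leaves the state untouched, so an unrecognised utterance
    can never trigger an unintended motion."""
    action = ACTIONS.get(command)
    return action(state) if action else state

state = {"holding": False}
state = handle("grip", state)    # recognised: the gripper closes
state = handle("dance", state)   # unrecognised: safely ignored
print(state["holding"])  # True
```

Restricting the system to a closed command set is what makes voice control “easy and safe” in the sense the article describes: the recogniser only has to distinguish a handful of known words.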