Researchers at the Bristol Interaction Group have developed prototypes of realistic ‘Skin-On Interfaces’, which replicate the responsiveness of human skin for use in electronic devices.
What is a Skin-On Interface?
Researchers have been looking at how we interact with electronics; as technology develops, the capacity for human-machine interaction is changing rapidly. As tech has moved from manual controls to remote instruction, then to timer controls and now to voice- and gesture-activated functions, Skin-On may be the next big leap.
The Skin-On project, led by Marc Teyssier, has been investigating artificial skin materials that could replace the ‘cold interface’ of existing technology as a surface material, allowing for more intuitive and natural responsiveness.
How does Skin-On work?
The concept here is that skin is the perfect medium for humans to use when communicating instructions. It would enable users to incorporate natural gestures such as tickling, twisting and scratching to carry out functions, and Teyssier’s research paper indicates that this would create enhanced ‘user expressiveness for mediated communication’.
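As a rough illustration of the idea, a gesture recogniser could hand off each detected gesture to a device function. This is only a sketch: the gesture names echo those mentioned above, but the handlers and the dispatch table are hypothetical, not the authors’ implementation.

```python
# Hypothetical sketch: dispatching recognised skin gestures to device
# actions. The gesture names come from the article; the handler
# functions and dispatch table are illustrative only.

def scroll_page():
    return "scrolling"

def zoom_view():
    return "zooming"

def send_playful_emoji():
    return "emoji sent"

# Map each recognised gesture to a device function.
GESTURE_ACTIONS = {
    "scratch": scroll_page,
    "twist": zoom_view,
    "tickle": send_playful_emoji,
}

def handle_gesture(gesture: str) -> str:
    """Look up and run the action bound to a detected gesture."""
    action = GESTURE_ACTIONS.get(gesture)
    if action is None:
        return "unrecognised gesture"
    return action()
```

A table-driven dispatch like this keeps the mapping between gestures and functions easy to change, which matters when the ‘vocabulary’ of natural gestures is still being explored.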
How does this augment the functionality of electronic devices?
The focus is on creating the perfect interaction interface, where users can perform natural gestures. The idea that human skin is the ideal interface is not a new one, with papers published on ScienceDirect as far back as 2014 exploring the potential of ‘e-skin’ for use in robotics, mechanics and medicine.
It is the responsiveness and sensitivity of skin to touch, movement and pressure which makes it such a diverse material. Imagine your phone or smart watch being able to ‘feel’ what you want to do, rather than needing to be programmed.
A paper published in the Journal of Materials Chemistry B looks at the implications for wearable tech in healthcare, and at how artificial skin could be used to monitor conditions and even deliver medicines automatically.
How is Skin-On created?
Artificial skin is developed to mimic all the qualities of human skin: to offer the same sensory ability, to look and feel like skin, and to understand and respond to natural gestures.
Skin-On is created in layers, each with a different composition, using materials such as silicone, skin-coloured pigment and Ecoflex Gel, with a grid of conductive thread and electrodes acting as the nerves of the material.
The material is designed and fabricated to mimic both the qualities and look of human skin, to ‘increase anthropomorphism’.
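To make the ‘nerves’ analogy concrete, here is a minimal sketch of how readings from a grid of electrodes could be scanned for touch points. The grid size, pressure values and threshold are assumptions made for illustration; the actual sensing electronics in the prototypes differ.

```python
# Illustrative sketch of scanning a row/column electrode grid for touch.
# The 4x4 grid, pressure values and threshold are hypothetical; the
# real Skin-On prototypes use their own sensing hardware.

THRESHOLD = 0.5  # normalised pressure above which a cell counts as touched

def find_touches(grid):
    """Return (row, col, pressure) for every cell pressed above threshold."""
    touches = []
    for r, row in enumerate(grid):
        for c, pressure in enumerate(row):
            if pressure > THRESHOLD:
                touches.append((r, c, pressure))
    return touches

# Example frame: one firm press near the centre of a 4x4 grid.
frame = [
    [0.0, 0.1, 0.0, 0.0],
    [0.0, 0.8, 0.2, 0.0],
    [0.0, 0.1, 0.0, 0.0],
    [0.0, 0.0, 0.0, 0.0],
]
```

Scanning frame by frame like this is what lets a soft surface report not just where it is touched, but how hard, which is the raw signal a gesture recogniser would build on.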
What does this mean for the future of technological interactions?
The Skin-On Interfaces team, led by Marc Teyssier, has looked at how the long-speculated properties of e-skin, or artificial skin, can be practically incorporated into modern technology, and how we use it.
How this translates to user experience is yet to be seen, but it seems certain that the existing virtual buttons and keyboards will be superseded by technological innovations in the near future.
It is hard to say how soon Skin-On might be brought to market, as it will require investment from one of the big brands to develop mass-production potential and to work out how to pitch the new technology to consumers. Whether it might be developed into a more ‘commercial’ aesthetic remains to be seen, but the prototype material has been created specifically to look just like human skin.

Skin-On prototypes have been tested on mobile devices, smart watches and laptop keypads. Image: Marc Teyssier
I can already hear the cries of ‘the machines are taking over’ as soon as a mass-market product is released which looks – slightly disturbingly – human, so there is a possibility that this might be manufactured with a less life-like appearance to suit public perceptions of how tech should look. How it behaves, however, is nothing short of profound.
What is clear is that robotics is developing at light speed, and finding new ways of interacting with technology is leading us down the path of creating devices which look, feel, and respond more and more like we do.