When we hear the terms “haptic devices” or “haptic displays”, we tend to think of the screens of our portable devices, of ATMs, of ticket machines, and of the many other touch screens we use in our daily lives. Haptic interfaces, however, are not just simple touch screens. They offer a great deal of potential, which we are going to analyze and explore in this article.

In recent years, with the increase in computational power, new and more advanced ways of human–computer interaction have emerged. Haptic devices are one of the main means of human–computer interaction, with interesting applications in heterogeneous research areas such as robotics, telemedicine, computer graphics, physiology, neuroscience, virtual reality and many more. The evolution of the technologies with which haptic devices are designed and manufactured has led to the development of applications that simulate the sense of touch with ever greater accuracy.

Haptic devices are based on the sense of touch, which is crucial for perceiving and understanding the environment and constitutes a sensory channel parallel to vision and hearing; it can therefore be used to give the user additional information. The haptic experience is decisive when a sense of presence and skillful manipulation are required in a remote or virtual environment. A haptic interface does not simply let the user sense that an object is there; it also conveys the object’s features, such as its shape and texture.

Haptic interaction generally refers to the physical contact that develops in order to perceive or manipulate objects. These interactions can occur between a) a human hand and a real object, b) a robotic arm and a real object, c) a human hand and a virtual object (through haptic interfaces), or between various other combinations of human or mechanical parts and real, remote or virtual objects.

Think of the impression that computer-generated images made on people decades ago; nowadays, people are just as impressed when they feel their first virtual objects through touch.

Haptic rendering describes the procedure by which the computer produces the stimuli needed to deliver information about a virtual object to the user. In a simple case of haptic rendering, this information concerns the physical properties of the object, such as shape, elasticity, texture and mass. Just as an object is rendered differently in graphics systems depending on the technique used, in haptic rendering the feeling of interaction differs depending on the chosen method.

In the real world, we feel contact forces when we touch the objects around us. These forces depend both on the surface and properties of the object and on the location, speed and direction with which we touch it. Haptic rendering can thus be defined as the process of computing the appropriate forces to give the user the illusion of touching virtual objects. The sense of touch can be divided into two main categories, depending on the sensory organs involved. The first is the cutaneous (skin) sensation, associated with the sensory organs of the skin, especially of the hand. The second is the kinesthetic sensation, associated with sensory organs in the joints, muscles and tendons.
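To make the idea of computing forces concrete, here is a minimal sketch of one common approach (often called penalty-based rendering, not named in this article): when the probe penetrates a virtual surface, a spring-like force proportional to the penetration depth pushes it back out along the surface normal. The sphere, stiffness value and function name are illustrative assumptions, not a real API:

```python
# Minimal sketch of penalty-based haptic rendering for a virtual sphere.
# Assumption: a simple spring law F = k * penetration_depth, directed along
# the outward surface normal. All names and constants are illustrative.
import math

def render_force(probe_pos, center, radius, stiffness=800.0):
    """Return the 3-D feedback force for a probe touching a virtual sphere."""
    # Vector from the sphere centre to the probe tip
    dx = [p - c for p, c in zip(probe_pos, center)]
    dist = math.sqrt(sum(d * d for d in dx))
    penetration = radius - dist
    if penetration <= 0 or dist == 0:
        return (0.0, 0.0, 0.0)           # no contact: no force
    normal = [d / dist for d in dx]       # outward surface normal
    # Hooke's law: force grows with how deep the probe has sunk in
    return tuple(stiffness * penetration * n for n in normal)

# Probe 1 mm inside a 5 cm sphere: force of about 0.8 N pushes it back out
print(render_force((0.049, 0.0, 0.0), (0.0, 0.0, 0.0), 0.05))
```

In a real device this computation runs in a very fast loop (typically around 1 kHz) so that the force feels continuous to the hand.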


Basic types of haptic devices

Haptic devices are built from haptic/tactile sensors and actuators that recreate the sense of touch by delivering a combination of force, vibration and a feeling of movement to the user. Beyond this, haptic technologies use force feedback to respond to the user’s movements, which goes well beyond a simple vibration notification.

Haptic devices are commonly divided into (a) tactile displays and (b) haptic (force-feedback) devices. The difference is that in the first case, simulating the touch and feel of the object requires contact between the skin and the device, while in the second case the interaction takes place through a controller or stylus. Simulating interactions via a controller or stylus requires a device with force feedback. The interaction is based on the assumption that there is a haptic copy of the stylus tip, which moves in the virtual environment.
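The “haptic copy” of the stylus tip is often implemented as a proxy (sometimes called a god-object): the real stylus may sink into the virtual object, but its copy is constrained to stay on the surface, and the feedback force is a spring pulling the stylus toward its copy. A minimal sketch, under the simplifying assumption of a flat virtual floor at z = 0 (the plane, stiffness and names are illustrative):

```python
# Minimal proxy ("god-object") sketch: the copy of the stylus tip is
# constrained to stay on or above a virtual floor at z = 0; the feedback
# force is a spring between the copy and the real stylus tip.
def proxy_and_force(stylus_pos, stiffness=500.0):
    x, y, z = stylus_pos
    # The proxy follows the stylus but cannot sink below the floor
    proxy = (x, y, max(z, 0.0))
    # Spring force pulls the stylus toward its proxy (zero when above floor)
    force = tuple(stiffness * (p - s) for p, s in zip(proxy, stylus_pos))
    return proxy, force

# Stylus 2 mm below the floor: the copy sits on the surface and an
# upward force of roughly 1 N is felt through the stylus
proxy, force = proxy_and_force((0.1, 0.2, -0.002))
print(proxy, force)
```

The user thus sees and feels the copy resting on the surface even while the tracked stylus position is inside the object, which is what makes rigid virtual surfaces feel solid.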

Simulating direct contact with objects is much more difficult, since the tactile display must distribute the forces properly across the entire contact area between the object and the skin. There are also devices that can be worn on the body, such as tactile gloves or exoskeletal devices, that can simulate the kinesthetic function of the human hand with considerable precision!

In a future article we will talk about the Tactile Internet (TI), which will allow the transmission of touch and movement in real time, as well as how it can contribute to solving the challenges our societies face.

To be continued…

Irene Myrgioti, Business Development Consultant, OTS