Improved touch screens will let users feel the texture of different surfaces
Researchers at Texas A&M University have begun developing a new touch-interface technology capable of transmitting tactile sensations. The aim is to add another sense to the way we interact with our gadgets. As Dr. Cynthia Hipwell argues, today we rely only on sight and hearing when working with electronics, and both have long been overloaded, so it is time to bring touch into play.
The ultimate goal of the project is to teach the machine to imitate the feel of touching physical objects with different types of surfaces. This would significantly enrich application controls, open new possibilities in virtual reality, and, most importantly, fundamentally change virtual shopping. If customers can "touch" goods remotely and "feel" their texture, online shopping will become far more attractive and effective.
The technical basis for such an interface lies in multiphysics: several different physical fields act at the same point in space, in this case the point where a person's finger touches the screen. Modeling it requires accounting for electrowetting and electrostatic effects, the geometry of the screen surface, and the properties of its coating, as well as fluid movement and charge transport, the mechanics of finger contact with the screen, and what happens within the finger itself.
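As a rough illustration of one of these effects: electrostatic actuation modulates the friction a sliding finger feels, because a voltage applied across the screen's dielectric coating adds an attractive normal force on the fingertip. In the simple parallel-plate approximation that force is F_e = ε₀·ε_r·A·V²/(2d²). The sketch below uses that textbook formula only; the function names and all parameter values are illustrative assumptions, not figures from the Texas A&M project:

```python
# Sketch: electrostatic friction modulation on a touch surface.
# Parallel-plate approximation: F_e = eps0 * eps_r * A * V^2 / (2 * d^2)
# Friction felt by the sliding finger: F_f = mu * (F_normal + F_e)
# All parameter values are illustrative assumptions.

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def electrostatic_force(voltage, area=1e-4, eps_r=3.0, gap=50e-6):
    """Attractive force (N) on a fingertip with contact area `area` (m^2),
    separated from the electrode by an effective dielectric layer of
    relative permittivity `eps_r` and thickness `gap` (m), at the given
    applied voltage (V). The gap includes the skin's outer layer."""
    return EPS0 * eps_r * area * voltage ** 2 / (2 * gap ** 2)

def friction_force(voltage, f_normal=0.5, mu=0.7, **kw):
    """Friction (N) on the sliding finger: the applied voltage adds an
    electrostatic term to the finger's own pressing force `f_normal`."""
    return mu * (f_normal + electrostatic_force(voltage, **kw))

if __name__ == "__main__":
    # Higher drive voltage -> stronger pull -> more perceived friction.
    for v in (0, 50, 100, 200):
        print(f"{v:>4} V -> friction {friction_force(v):.4f} N")
```

Varying the voltage as the finger moves across the screen changes the perceived friction point by point, which is one way a flat glass surface can mimic different textures.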
At the moment, touch technology is built entirely around the screen: all effort goes into making it work correctly and recognize touches accurately. Now it is time to move in the opposite direction, to focus on the user and give them the ability to feel not just the fact of a touch but its different forms. Working prototypes already exist, and the first such interfaces should appear within the next five years.