The Touchscreen Technology Revolution

Touchscreens are fascinating because they bring an inherently human element to advanced technology. No expertise is needed to interact with the interface; it is intuitive by design. This democratizes technology, giving even children access to more computing power than the spacecraft that carried astronauts to the moon.

The precursor to this technology, the mouse, eliminated the need to memorize complex sequences of commands to interact with a computer. That interface was one of the cornerstones of Steve Jobs’ fascination with the human/tech interface, and it drove him to develop innovation after innovation, culminating in the first wide-scale touchscreen application: the iPhone, in 2007.

The Touchscreen Evolution

Like all technical revolutions, though, this one began much earlier. The first touchscreen was invented in 1965: a capacitive design that used a glass panel as an insulator, coated with indium tin oxide (ITO) as a conductor. A user’s finger would change the electrical charge flowing through the screen, a change the computer would detect and interpret as a command. The technology evolved and branched into three other major types: resistive touchscreens, infrared sensors and surface acoustic wave (SAW). These innovations came full circle with the original iPhone, which used an evolved version of the capacitive touchscreen.

In a capacitive touchscreen, the ITO coating is etched along the X and Y axes, and contact from a human finger changes the electrical charge. That change is measured at the four corners of the screen, then interpreted into a command by a controller. Resistive touchscreens are made of a flexible top layer of polyethylene film and a rigid bottom layer of glass, separated by insulating dots. They’re cheaper than capacitive screens but less sensitive, and the top layer is more susceptible to being pierced or scratched by sharp objects.
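To make the four-corner measurement concrete, here’s a minimal Python sketch of how a controller might interpolate a touch position from the signal measured at each corner. The corner signal values and the simple linear model are illustrative assumptions, not real controller firmware.

```python
def locate_touch(ul, ur, ll, lr):
    """Estimate normalized (x, y) touch coordinates from the signal
    measured at the four corners of a capacitive screen.

    ul/ur/ll/lr: current drawn at the upper-left, upper-right,
    lower-left and lower-right electrodes (arbitrary units).
    A touch pulls more current through the corners nearest to it.
    """
    total = ul + ur + ll + lr
    if total == 0:
        return None  # no touch detected
    # Right-side corners dominate when the finger is to the right;
    # top corners dominate when it is near the top. (Simplified
    # linear model; real controllers apply calibration curves.)
    x = (ur + lr) / total
    y = (ul + ur) / total
    return x, y

# Example: a touch near the upper-right corner.
print(locate_touch(ul=2.0, ur=5.0, ll=1.0, lr=4.0))  # -> (0.75, ~0.58)
```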

Infrared touchscreens employ IR LEDs arranged in an array along the X and Y axes. Photodetectors on the opposite edges sense when a finger interrupts the light emitted by the LEDs, pinpointing its position and movement so the controller can translate the motion into a command. Finally, SAW technology leverages two transducers placed on the X and Y axes that interact with a series of reflectors. When a touch interrupts the surface, part of the wave is absorbed, indicating where the touch occurred.
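As a rough illustration of the infrared grid, here’s a short Python sketch that turns blocked beams into a coordinate; the beam indices and centroid averaging are illustrative assumptions.

```python
def ir_touch_position(blocked_x, blocked_y):
    """Locate a touch on an infrared-grid screen.

    blocked_x / blocked_y: indices of the vertical and horizontal
    light beams a finger is currently interrupting. A fingertip
    usually blocks a few adjacent beams, so we take the centroid.
    Returns (x, y) beam coordinates, or None if nothing is blocked.
    """
    if not blocked_x or not blocked_y:
        return None
    x = sum(blocked_x) / len(blocked_x)
    y = sum(blocked_y) / len(blocked_y)
    return x, y

# Example: a finger shadowing beams 14-16 horizontally and 7-8 vertically.
print(ir_touch_position([14, 15, 16], [7, 8]))  # -> (15.0, 7.5)
```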

As touch-sensing technology developed, a parallel evolution occurred in the number of points of contact a computer could detect. The earliest touchscreens supported a single point of contact, where a stylus or finger simply replaced the mouse. This developed into dual-touch capability, where a user could employ two fingers for more complex commands, such as pinching or stretching to manipulate images and windows.
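For a sense of how a dual-touch gesture works in practice, here’s a minimal Python sketch of pinch/stretch detection from two tracked contact points; the frame format is an illustrative assumption.

```python
import math

def pinch_scale(prev_points, curr_points):
    """Compute the zoom factor implied by a two-finger gesture.

    prev_points / curr_points: [(x, y), (x, y)] positions of the two
    fingers in the previous and current frames. A factor > 1 means the
    fingers moved apart (stretch / zoom in); < 1 means a pinch (zoom out).
    """
    def distance(points):
        (x1, y1), (x2, y2) = points
        return math.hypot(x2 - x1, y2 - y1)

    prev_d = distance(prev_points)
    if prev_d == 0:
        return 1.0  # degenerate frame; report no change
    return distance(curr_points) / prev_d

# Example: fingers move from 100 px apart to 150 px apart -> 1.5x zoom.
print(pinch_scale([(100, 200), (200, 200)], [(75, 200), (225, 200)]))
```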

Today, commercially available touchscreens support 10 points of contact, letting users employ all ten fingers as they interact with the image on the screen. This enables everything from virtual keyboards to commands involving complex motions, such as closing a window by grasping it with five fingers and drawing them together. With all ten fingers in play, we seem to have reached the limit of what touchscreens can do for a single user in a personal computing environment. That raises the question: where does the technology go from here?

The Next Stages of Development

Three primary development pathways are being explored simultaneously. The first is the evolution of optical sensors that extend a computer’s sensing abilities beyond mere touch to include hand and wrist motions. Touch remains part of this tech, but the user’s interaction has begun to involve a more full-bodied experience, expanding beyond just the fingertips. Another pathway is screens that accommodate multiple users simultaneously; picture a large wall-mounted or desktop display that drastically expands a team’s ability to make rapid progress on a project. Finally, we’re seeing more public-facing applications, both indoor and outdoor, where physical controls such as keys and knobs are becoming less common as machines such as ATMs transition to touchscreen control.

Each of these introduces individual challenges. For example, ITO coating has a resistance of 100 ohms per square, which is perfectly acceptable for small, lightweight designs with relatively thin (0.5 – 1.0 mm) glass. As the size of a display increases, however, the thickness of the glass interface must increase to effectively deal with the stress of constant use, harsh weather, vandalism and the sheer weight of the display surface itself. The largest displays feature glass that’s up to 25 mm (or nearly one inch) thick, and ITO coatings cease to be a viable option in that scenario.

Some manufacturers are replacing indium tin oxide with nearly invisible copper tracks inlaid on the rear surface of a glass touchscreen; since copper has an electrical resistance of 5 ohms per square, this approach is viable for screens up to 90 inches. Although this substantially expands the potential applications of touchscreens, it isn’t an all-encompassing solution. Touch sensing depends on detecting small disruptions in electrical or acoustic signals, so the largest screens, as well as those that must contend with outdoor elements, must also tolerate ambient vibration, ushering in a new set of factors that must be resolved.
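To see why sheet resistance matters at scale, here’s a back-of-the-envelope Python sketch comparing an ITO trace with a copper one using the figures above (resistance = sheet resistance × length/width); the trace dimensions are illustrative assumptions.

```python
def trace_resistance(sheet_ohms_per_sq, length_mm, width_mm):
    """Resistance of a conductive trace on a touchscreen.

    Sheet resistance is specified in ohms per square, so total
    resistance scales with the number of 'squares' along the trace:
    R = Rs * (length / width).
    """
    return sheet_ohms_per_sq * (length_mm / width_mm)

# A hypothetical 2 m sensing trace, 1 mm wide, on a very large display:
ito = trace_resistance(100, length_mm=2000, width_mm=1.0)   # ITO: 100 ohm/sq
copper = trace_resistance(5, length_mm=2000, width_mm=1.0)  # Cu:    5 ohm/sq
print(f"ITO: {ito:,.0f} ohms, copper: {copper:,.0f} ohms")
# ITO: 200,000 ohms vs. copper: 10,000 ohms -- lower trace resistance
# keeps the signal delay of a large panel low enough to read reliably.
```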


From Haptics to Ultrahaptics

While touchscreens were the peak of technological innovation as recently as a few years ago, they’ve since become the foundation of a larger world. Haptic technology is the field that deals with tactile interfaces, and it has become the launching point for ultrahaptics, which expands the interface from touch alone into a combination of senses.

One exciting application was developed by Immersion, a French visual simulation company. They created a tabletop display that let individuals interact with a 3D cityscape projection. Users donned polarized glasses wired to position and orientation sensors that tracked where each individual was looking. The glasses used an active shutter system synchronized with the display, so the screen could transmit images matched to each viewer’s perspective. Arguably the most exciting aspect of this innovation was that several individuals could interact with the table simultaneously, and the image remained true to each person’s perspective regardless of where they stood, even while someone in a different position manipulated the landscape.

Interacting with the environment involved actual screen touching as well as optical interaction with images that were virtually projected above the screen itself. Infrared sensors detected finger movement where a 3D image was perceived to be and signaled the computer to react to the motion before an individual’s hands broke the visual plane.

Other applications involve leveraging ultrasound to generate a haptic response directly into a user’s bare hands. The next generation of touchscreen technology will employ the air above the screen as a tactile interface, leveraging focused points of acoustic radiation generated by a phased array of ultrasonic emitters. These are essentially miniature ultrasound speakers that generate steerable focal points of ultrasonic energy powerful enough to be felt by your skin.
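As a simplified illustration of the phased-array idea, here’s a short Python sketch that computes the per-emitter delays needed to make ultrasonic waves arrive in phase at a chosen focal point; the emitter layout and focal point are illustrative assumptions.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air

def focusing_delays(emitters, focus):
    """Per-emitter trigger delays (seconds) that focus a phased
    array at one point in the air above the screen.

    Waves from farther emitters need a head start, so each emitter
    fires early by its extra travel time relative to the farthest one;
    all wavefronts then arrive at the focal point simultaneously and
    add up to a pressure spot strong enough to feel on bare skin.
    """
    def dist(e):
        return math.dist(e, focus)

    farthest = max(dist(e) for e in emitters)
    return [(farthest - dist(e)) / SPEED_OF_SOUND for e in emitters]

# A hypothetical 3-emitter strip (coordinates in meters), focusing
# 10 cm above the middle of the array:
array = [(0.00, 0.0, 0.0), (0.02, 0.0, 0.0), (0.04, 0.0, 0.0)]
for i, d in enumerate(focusing_delays(array, (0.02, 0.0, 0.10))):
    print(f"emitter {i}: fire {d * 1e6:.2f} us early")
```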

The potential benefits of this technology are virtually endless. In a world where the COVID-19 pandemic has made people extremely cautious about touching surfaces that could be contaminated with viral particles, the ability to manipulate controls via ultrasound waves addresses myriad public health concerns. It also expands the realm of sensory control to individuals who are deaf or blind, drastically increasing their digital access in a world that’s becoming increasingly virtual.
