In The Beginning...
In 1971, Sam Hurst, an instructor at the University of Kentucky, invented the first touch sensor, called the "Elograph." This early physical interface more closely resembled the touch pads on some older laptops: it was dark and opaque, unlike modern touch screens. Hurst patented his sensor in 1972 and made it the focal point of a new business, Elographics. In 1977, Siemens Corporation offered financial backing for an Elographics effort to produce the first curved glass touch sensor interface, later called a "touchscreen."
(Image: the first ever touchscreen)
How It Works On The Inside
- The touch sensor is a panel with a touch-responsive surface. Systems are built around different types of sensors: resistive (the most common), surface acoustic wave, and capacitive (used in most smartphones). In general, a sensor has an electrical current running through it, and touching the screen causes a voltage change. That voltage change signals the location of the touch.
- The controller is the hardware that converts the voltage changes on the sensor into signals the computer or other device can receive.
- Software interprets the information coming from the controller and tells the computer, smartphone, or game device what is happening on the sensor: who is touching what, and where. This allows the device to react accordingly.
Of course, the technology works in combination with a computer, smartphone, or other type of device.
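The three layers above can be sketched in code. The following is a minimal, hypothetical Python illustration, not the firmware of any real device: it assumes a resistive panel whose touch position arrives as raw 12-bit analog-to-digital (ADC) readings, a "controller" step that maps those readings to pixel coordinates, and a "software" step that turns a coordinate into an event the application can react to. All names, resolutions, and value ranges are illustrative.

```python
# Sketch of the sensor -> controller -> software pipeline described above.
# Assumption: a resistive panel reports touch position as two raw 12-bit
# ADC values (0-4095), one per axis. All constants here are illustrative.

SCREEN_W, SCREEN_H = 480, 320   # hypothetical display resolution in pixels
ADC_MAX = 4095                  # maximum reading of a 12-bit ADC

def controller_to_pixels(adc_x: int, adc_y: int) -> tuple[int, int]:
    """Controller layer: convert raw voltage readings into screen coordinates."""
    px = round(adc_x / ADC_MAX * (SCREEN_W - 1))
    py = round(adc_y / ADC_MAX * (SCREEN_H - 1))
    return px, py

def handle_touch(adc_x: int, adc_y: int) -> str:
    """Software layer: turn a coordinate into an event the application can use."""
    x, y = controller_to_pixels(adc_x, adc_y)
    return f"touch at ({x}, {y})"

# A reading near mid-scale on X and quarter-scale on Y maps to roughly
# the middle of the screen horizontally and a quarter of the way down.
print(handle_touch(2048, 1024))  # -> touch at (240, 80)
```

Real controllers also debounce noisy readings and apply a calibration step (touching the corners of the screen to correct for panel misalignment), but the core job is this same voltage-to-coordinate translation.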
(Image: the inside of Apple's latest iPod Touch)
Existing touch screen technology has incredible potential to reduce the size and weight of interactive hardware like computers and televisions, and to make it more portable. Paired with other technologies such as robotics, it lets scientists perform complex physical interactions in harsh or remote locations. As touch screen technology inevitably advances, it could have an ever-increasing impact on our daily lives.