A touchscreen is an electronic visual display that the user can control through simple or multi-touch gestures by touching the screen with one or more fingers. Some touchscreens can also detect objects such as a stylus
or ordinary or specially coated gloves. The user can use the
touchscreen to react to what is displayed and to control how it is
displayed (for example, by zooming to increase the text size).
The touchscreen enables the user to interact directly with what is displayed, rather than using a mouse, touchpad, or any other intermediate device (other than a stylus, which is optional for most modern touchscreens).
Touchscreens are common in devices such as game consoles, all-in-one computers, tablet computers, and smartphones.
They can also be attached to computers or, as terminals, to networks.
They also play a prominent role in the design of digital appliances such
as personal digital assistants (PDAs), satellite navigation devices, mobile phones, and video games.
The popularity of smartphones, tablets, and many types of information appliances
is driving the demand for, and acceptance of, touchscreens in
portable and functional electronics. Touchscreens are popular in the
medical field and in heavy industry, as well as in kiosks such as museum displays or room automation, where keyboard and mouse systems do not allow a suitably intuitive, rapid, or accurate interaction by the user with the display's content.
Historically, the touchscreen sensor and its accompanying
controller-based firmware have been made available by a wide array of
after-market system integrators,
and not by display, chip, or motherboard manufacturers. Display
manufacturers and chip manufacturers worldwide have acknowledged the
trend toward acceptance of touchscreens as a highly desirable user interface component and have begun to integrate touchscreens into the fundamental design of their products.
History
Image: the prototype[1] x-y mutual capacitance touchscreen developed at CERN[2][3] in 1977 by Bent Stumpe, a Danish electronics engineer, for the control room of CERN's SPS (Super Proton Synchrotron) accelerator, shown alongside the earlier self-capacitance screen, also developed by Stumpe at CERN[4] in 1972.
E.A. Johnson described his work on capacitive touch screens in a short article published in 1965[5] and then more fully—along with photographs and diagrams—in an article published in 1967.[6]
The applicability of touch technology to air traffic control was described in an article published in 1968.[7] Bent Stumpe, with the aid of Frank Beck, both engineers from CERN, developed a transparent touchscreen in the early 1970s; it was manufactured by CERN and put to use in 1973.[8]
This touchscreen was based on Bent Stumpe's work at a television
factory in the early 1960s. A resistive touchscreen was developed by American inventor G. Samuel Hurst, and the first version was produced in 1982.[9]
From 1979 to 1985, the Fairlight CMI
(and Fairlight CMI IIx) was a high-end musical sampling and
re-synthesis workstation that utilized light pen technology, with which
the user could allocate and manipulate sample and synthesis data, as
well as access different menus within its OS by touching the screen with
the light pen. The later Fairlight series IIT models used a graphics
tablet in place of the light pen. The HP-150
from 1983 was one of the world's earliest commercial touchscreen
computers. Similar to the PLATO IV system, its touch technology employed infrared transmitters and receivers mounted around the bezel of its 9" Sony cathode ray tube (CRT), which detected the position of any non-transparent object on the screen.
Image: General Motors' ECC (Electronic Control Center), released in 1985 as the first touchscreen included as standard equipment in a production automobile. The CRT-based ECC first debuted on the 1986 Buick Riviera as the primary interface used to operate and monitor the vehicle's climate and stereo systems.
In the early 1980s, General Motors tasked its Delco Electronics division with a project aimed at moving an automobile's non-essential functions (i.e. other than throttle, transmission, braking and steering) from mechanical or electro-mechanical systems to solid-state alternatives wherever possible. The finished device was dubbed the ECC, for "Electronic Control Center": a digital computer and software control system hardwired to various peripheral sensors, servos, solenoids, an antenna and a monochrome CRT touchscreen that functioned both as display and sole method of input.[10] The ECC replaced the traditional mechanical stereo, fan, heater and air conditioner controls and displays, and was capable of providing very detailed and specific information about the vehicle's cumulative and current operating status in real time. The ECC was standard equipment on the 1985-1989 Buick Riviera and later the 1988-89 Buick Reatta, but it was unpopular with consumers, partly due to technophobia on the part of some traditional Buick customers, but mostly because of costly-to-repair technical problems suffered by the ECC's touchscreen, which, being the sole access method, would render climate control or stereo operation impossible when it failed.[11]
In 1986 the first graphical point of sale software was demonstrated on the 16-bit Atari 520ST color computer. It featured a color touchscreen widget-driven interface.[12] The ViewTouch[13]
point of sale software was first shown by its developer, Gene Mosher, at Fall Comdex 1986 in Las Vegas, Nevada, to visitors at the Atari Computer demonstration area, and was the first commercially available POS system with a widget-driven color graphic touchscreen interface.[14]
Sears et al. (1990) [15] gave a review of academic research on single and multi-touch human–computer interaction
of the time, describing gestures such as rotating knobs, swiping the
screen to activate a switch (or a U-shaped gesture for a toggle switch),
and touchscreen keyboards (including a study that showed that users
could type at 25 wpm for a touchscreen keyboard compared with 58 wpm for
a standard keyboard); multitouch gestures such as selecting a range of a
line, connecting objects, and a "tap-click" gesture to select while
maintaining location with another finger are also described.
An early attempt at a handheld game console with touchscreen controls was Sega's intended successor to the Game Gear,
though the device was ultimately shelved and never released due to the high cost of touchscreen technology in the early 1990s.
Touchscreens would not be popularly used for video games until the
release of the Nintendo DS in 2004.[16]
Until recently, most consumer touchscreens could only sense one point
of contact at a time, and few could sense how hard the user was pressing. This has changed with the commercialization of multi-touch technology.
Technologies
There are a variety of touchscreen technologies that have different methods of sensing touch.
Resistive
Main article: Resistive touchscreen
A resistive
touchscreen panel comprises several layers, the most important of which
are two thin, transparent electrically-resistive layers separated by a
thin air gap. These layers face each other. The top layer (the surface that is touched) has a resistive coating on its underside; just beneath it is a similar resistive layer on
top of its substrate. One layer has conductive connections along its
sides, the other along top and bottom. A voltage is applied to one
layer, and sensed by the other. When an object, such as a fingertip or
stylus tip, presses down on the outer surface, the two layers touch to
become connected at that point. The panel then behaves as a pair of voltage dividers, one axis at a time. By rapidly switching between the layers, the controller can read the position of the pressure on the screen.
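As a rough illustration of this readout, the following Python sketch simulates the two measurement phases: each axis is read as a voltage divider, and the raw reading is scaled to screen coordinates. The 12-bit scale, the screen dimensions, and the simulated ADC are assumptions made for the example, not any particular controller's interface.

```python
# Minimal, self-contained sketch of how a 4-wire resistive controller turns two
# voltage-divider readings into screen coordinates. The "ADC" is simulated here;
# the 12-bit scale and screen size are illustrative assumptions.

ADC_MAX = 4095                     # 12-bit ADC full scale
SCREEN_W, SCREEN_H = 320, 240      # target display resolution in pixels

def simulated_adc(touch_fraction):
    """Pretend ADC reading: the touched point taps the driven layer's voltage
    gradient, so the sensed voltage is proportional to the touch position."""
    return round(touch_fraction * ADC_MAX)

def read_position(touch_x_frac, touch_y_frac):
    # Phase 1: drive the X layer, sense through the Y layer -> X coordinate.
    x_raw = simulated_adc(touch_x_frac)
    # Phase 2: drive the Y layer, sense through the X layer -> Y coordinate.
    y_raw = simulated_adc(touch_y_frac)
    # Scale raw counts to pixel coordinates.
    return x_raw * SCREEN_W // (ADC_MAX + 1), y_raw * SCREEN_H // (ADC_MAX + 1)

print(read_position(0.5, 0.25))    # touch at 50% across, 25% down -> (160, 60)
```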
Resistive touch is used in restaurants, factories and hospitals due
to its high resistance to liquids and contaminants. A major benefit of
resistive touch technology is its low cost. Disadvantages include the
need to press down, and a risk of damage by sharp objects. Resistive
touchscreens also suffer from poorer contrast, due to having additional
reflections from the extra layer of material placed over the screen.[citation needed]
Surface acoustic wave
Main article: Surface acoustic wave
Surface acoustic wave (SAW) technology uses ultrasonic
waves that pass over the touchscreen panel. When the panel is touched, a
portion of the wave is absorbed. This change in the ultrasonic waves
registers the position of the touch event and sends this information to
the controller
for processing. Surface wave touchscreen panels can be damaged by
outside elements. Contaminants on the surface can also interfere with
the functionality of the touchscreen.[17]
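The position readout along one axis can be sketched as follows: the touch absorbs part of the wave, and the time offset of the resulting dip in the received signal maps to a coordinate along that axis. This Python sketch is only illustrative; the sampled waveforms, the threshold, and the linear mapping are assumptions, and a real controller repeats the measurement for the second axis.

```python
# Hedged sketch of one axis of surface-acoustic-wave sensing: a touch absorbs
# wave energy at a time offset proportional to its position along the axis.
# The echo profiles, threshold and panel length are invented for the example.

def saw_axis_position(received, baseline, axis_length_mm, threshold=0.2):
    """received/baseline: lists of echo amplitudes sampled at equal time steps.
    Returns the touch position in mm along this axis, or None if no dip."""
    dips = [i for i, (r, b) in enumerate(zip(received, baseline)) if b - r > threshold]
    if not dips:
        return None
    centre = sum(dips) / len(dips)                 # centre of the absorbed region
    return centre / (len(baseline) - 1) * axis_length_mm

baseline = [1.0] * 11                              # undisturbed echo profile
received = [1.0, 1.0, 1.0, 0.4, 0.3, 0.4, 1.0, 1.0, 1.0, 1.0, 1.0]
print(saw_axis_position(received, baseline, axis_length_mm=200))   # 80.0 mm
```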
Capacitive
Main article: Capacitive sensing
A capacitive touchscreen panel consists of an insulator such as glass, coated with a transparent conductor such as indium tin oxide (ITO).[18][19] As the human body is also an electrical conductor, touching the surface of the screen results in a distortion of the screen's electrostatic field, measurable as a change in capacitance. Different technologies may be used to determine the location of the touch. The location is then sent to the controller for processing.
Unlike a resistive touchscreen,
one cannot use a capacitive touchscreen through most types of
electrically insulating material, such as gloves. This disadvantage
especially affects usability in consumer electronics, such as touch
tablet PCs and capacitive smartphones in cold weather. It can be
overcome with a special capacitive stylus, or a special-application
glove with an embroidered patch of conductive thread passing through it
and contacting the user's fingertip.
The largest capacitive display manufacturers continue to develop thinner and more accurate touchscreens. Touchscreens for mobile devices are now being produced with 'in-cell' technology, as in Samsung's Super AMOLED screens, which eliminates a layer by building the capacitors inside the display itself. This
type of touchscreen reduces the visible distance (within millimetres)
between the user's finger and what the user is touching on the screen,
creating a more direct contact with the content displayed and enabling taps and gestures to be even more responsive.
Surface capacitance
In this basic technology, only one side of the insulator is coated with a conductive layer. A small voltage is applied to the layer, resulting in a uniform electrostatic field. When a conductor, such as a human finger, touches the uncoated surface, a capacitor is dynamically formed. The sensor's controller can determine the location of the touch indirectly from the change in the capacitance
as measured from the four corners of the panel. As it has no moving
parts, it is moderately durable but has limited resolution, is prone to
false signals from parasitic capacitive coupling, and needs calibration during manufacture. It is therefore most often used in simple applications such as industrial controls and kiosks.[20]
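The four-corner measurement can be illustrated with a short Python sketch. The linear weighting below is an idealized approximation of what a controller does after calibration, and the corner-current values are invented for the example.

```python
# Hedged sketch: estimating touch position on a surface-capacitance panel from
# the four corner current measurements. The linear weighting is an idealized
# approximation; real controllers also apply calibration curves.

def surface_cap_position(i_ul, i_ur, i_ll, i_lr):
    """Corner currents (upper-left, upper-right, lower-left, lower-right)
    are larger the closer the touch is to that corner."""
    total = i_ul + i_ur + i_ll + i_lr
    # Normalized coordinates in [0, 1]: weight each axis by the share of
    # current drawn through the corners on that side.
    x = (i_ur + i_lr) / total      # fraction of current on the right side
    y = (i_ll + i_lr) / total      # fraction of current on the bottom side
    return x, y

# A touch near the lower-right corner draws most of its current there.
print(surface_cap_position(i_ul=1.0, i_ur=2.0, i_ll=2.0, i_lr=5.0))  # ~ (0.7, 0.7)
```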
Projected capacitance
Projected Capacitive Touch (PCT; also PCAP) technology is a variant
of capacitive touch technology. All PCT touch screens are made up of a
matrix of rows and columns of conductive material, layered on sheets of
glass. This can be done either by etching a single conductive layer to form a grid pattern of electrodes,
or by etching two separate, perpendicular layers of conductive material
with parallel lines or tracks to form a grid. Voltage applied to this
grid creates a uniform electrostatic field, which can be measured. When a
conductive object, such as a finger, comes into contact with a PCT
panel, it distorts the local electrostatic field at that point. This is
measurable as a change in capacitance. If a finger bridges the gap
between two of the "tracks," the charge field is further interrupted and
detected by the controller. The capacitance can be changed and measured
at every individual point on the grid (intersection). Therefore, this
system is able to accurately track touches.[21]
Due to the top layer of a PCT being glass, it is a more robust solution
than less costly resistive touch technology. Additionally, unlike
traditional capacitive touch technology, it is possible for a PCT system
to sense a passive stylus or gloved fingers. However, moisture on the
surface of the panel, high humidity, or collected dust can interfere
with the performance of a PCT system. There are two types of PCT: mutual
capacitance and self-capacitance.
Mutual capacitance
This is a common PCT approach, which makes use of the fact that most
conductive objects are able to hold a charge if they are very close
together. In mutual capacitive sensors, there is a capacitor at every intersection of each row and each column. A 16-by-14 array, for example, would have 224 independent capacitors. A voltage
is applied to the rows or columns. Bringing a finger or conductive
stylus close to the surface of the sensor changes the local
electrostatic field which reduces the mutual capacitance. The
capacitance change at every individual point on the grid can be measured
to accurately determine the touch location by measuring the voltage in
the other axis. Mutual capacitance allows multi-touch operation where multiple fingers, palms or styli can be accurately tracked at the same time.
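A minimal Python sketch of such a scan is shown below: every row-column intersection is compared against its untouched baseline, and intersections whose mutual capacitance has dropped by more than a threshold are reported as touches. The readings, grid size, and threshold are illustrative assumptions; a real controller also interpolates between intersections for finer resolution.

```python
# Illustrative sketch of a mutual-capacitance scan: drive each row, measure each
# column, and report grid intersections whose capacitance dropped by more than a
# threshold. The readings and threshold are assumptions, not a real controller API.

def scan_touches(baseline, measured, threshold=5):
    """baseline/measured: 2-D lists of capacitance readings, one per
    row-column intersection. Returns (row, col) cells judged to be touched."""
    touches = []
    for r, (base_row, meas_row) in enumerate(zip(baseline, measured)):
        for c, (base, meas) in enumerate(zip(base_row, meas_row)):
            if base - meas > threshold:     # a finger reduces mutual capacitance
                touches.append((r, c))
    return touches

baseline = [[100] * 4 for _ in range(3)]    # 3 rows x 4 columns, untouched panel
measured = [[100, 100, 100, 100],
            [100,  90, 100,  92],           # two fingers on row 1
            [100, 100, 100, 100]]
print(scan_touches(baseline, measured))     # [(1, 1), (1, 3)] -> two simultaneous touches
```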
Self-capacitance
Self-capacitance sensors can have the same X-Y grid as mutual
capacitance sensors, but the columns and rows operate independently.
With self-capacitance, the capacitive load of a finger is measured on
each column or row electrode by a current meter. This method produces a
stronger signal than mutual capacitance, but it is unable to resolve
accurately more than one finger, which results in "ghosting", or
misplaced location sensing.
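The ghosting problem can be demonstrated with a short Python sketch: two fingers activate two rows and two columns, but because each row and column is read independently, the electronics cannot tell which row pairs with which column, and four candidate points result. The readings and threshold are invented for the example.

```python
# Sketch of why self-capacitance sensing "ghosts" with two fingers: each row and
# column is read as a single independent value, so every active row/column
# pairing is a candidate touch point. Values and threshold are illustrative.

def candidate_points(row_deltas, col_deltas, threshold=5):
    active_rows = [r for r, d in enumerate(row_deltas) if d > threshold]
    active_cols = [c for c, d in enumerate(col_deltas) if d > threshold]
    # The electronics cannot tell which row pairs with which column.
    return [(r, c) for r in active_rows for c in active_cols]

# Fingers at (1, 0) and (3, 2) activate rows 1 and 3, and columns 0 and 2 ...
print(candidate_points(row_deltas=[0, 9, 0, 8], col_deltas=[9, 0, 8]))
# ... but four candidates come back: [(1, 0), (1, 2), (3, 0), (3, 2)] - two are "ghosts".
```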
Infrared grid
An infrared touchscreen uses an array of X-Y infrared LED and photodetector
pairs around the edges of the screen to detect a disruption in the
pattern of LED beams. These LED beams cross each other in vertical and
horizontal patterns. This helps the sensors pick up the exact location
of the touch. A major benefit of such a system is that it can detect
essentially any input including a finger, gloved finger, stylus or pen.
It is generally used in outdoor applications and point of sale systems which cannot rely on a conductor (such as a bare finger) to activate the touchscreen. Unlike capacitive touchscreens,
infrared touchscreens do not require any patterning on the glass which
increases durability and optical clarity of the overall system. Infrared
touchscreens are sensitive to dirt/dust that can interfere with the IR
beams, suffer from parallax on curved surfaces, and can register accidental presses when the user hovers a finger over the screen while searching for the item to be selected.
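The location logic reduces to finding which beams are interrupted on each axis, as in the following Python sketch. The beam counts and blocked-beam flags are illustrative; a real controller would apply further filtering.

```python
# Sketch of infrared-grid location: the touch position is taken from whichever
# horizontal and vertical LED/photodetector beams are blocked.

def ir_grid_position(blocked_x_beams, blocked_y_beams):
    """Each argument is a list of booleans, one per beam along that axis.
    Returns the centre index of the blocked beams on each axis, or None."""
    def centre(blocked):
        hits = [i for i, b in enumerate(blocked) if b]
        return sum(hits) / len(hits) if hits else None

    return centre(blocked_x_beams), centre(blocked_y_beams)

# A fingertip blocking vertical beams 4-5 and horizontal beams 9-10:
x_beams = [False] * 16
y_beams = [False] * 12
x_beams[4] = x_beams[5] = True
y_beams[9] = y_beams[10] = True
print(ir_grid_position(x_beams, y_beams))   # (4.5, 9.5)
```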
Infrared acrylic projection
A translucent acrylic sheet is used as a rear projection screen to
display information. The edges of the acrylic sheet are illuminated by
infrared LEDs, and infrared cameras are focused on the back of the
sheet. Objects placed on the sheet are detectable by the cameras. When
the sheet is touched by the user the deformation results in leakage of
infrared light, which peaks at the points of maximum pressure indicating
the user's touch location. Microsoft's PixelSense tables use this technology.
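On the camera side, touch detection amounts to finding bright blobs in the infrared image, as in this illustrative Python sketch (using NumPy and SciPy). The synthetic frame and the brightness threshold are assumptions; a production system would apply considerably more filtering and calibration.

```python
# Illustrative sketch of the camera side of infrared acrylic projection sensing:
# threshold an infrared camera frame and report bright blobs as touch points.
# The frame below is synthetic, and the threshold is an assumption.

import numpy as np
from scipy import ndimage

def find_touches(frame, threshold=200):
    """frame: 2-D array of IR intensities; bright spots mark pressure points."""
    bright = frame > threshold                       # leaked IR light is bright
    labels, count = ndimage.label(bright)            # group bright pixels into blobs
    # Centre of mass of each blob approximates a touch location (row, col).
    return ndimage.center_of_mass(bright, labels, range(1, count + 1))

frame = np.zeros((120, 160), dtype=np.uint8)
frame[40:44, 30:34] = 255        # synthetic bright spot: one fingertip
frame[80:84, 120:124] = 255      # a second fingertip
print(find_touches(frame))       # [(41.5, 31.5), (81.5, 121.5)]
```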
Optical imaging
Optical touchscreens
are a relatively modern development in touchscreen technology, in which
two or more image sensors are placed around the edges (mostly the
corners) of the screen. Infrared back lights are placed in the camera's
field of view on the other side of the screen. A touch shows up as a
shadow, and each pair of cameras can then be used to pinpoint the location of the touch or even measure the size of the touching object (see visual hull).
This technology is growing in popularity, due to its scalability,
versatility, and affordability, especially for larger units.
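The underlying geometry can be sketched in a few lines of Python: each corner camera reports the angle at which it sees the touch's shadow, and the touch point is the intersection of the two sight lines. The camera placement, angle convention, and screen dimensions are illustrative assumptions.

```python
# Hedged sketch of optical-imaging triangulation with two top-corner cameras.

import math

def triangulate(angle_left, angle_right, screen_width):
    """angle_left: shadow angle (radians) measured at the top-left corner from
    the top edge; angle_right: same, measured at the top-right corner.
    Returns (x, y) with the origin at the top-left corner."""
    # Sight line from the left corner:  y = x * tan(angle_left)
    # Sight line from the right corner: y = (screen_width - x) * tan(angle_right)
    tl, tr = math.tan(angle_left), math.tan(angle_right)
    x = screen_width * tr / (tl + tr)
    y = x * tl
    return x, y

# A touch at the centre of a 1000-unit-wide screen, 300 units down:
print(triangulate(math.atan2(300, 500), math.atan2(300, 500), 1000))  # ~ (500.0, 300.0)
```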
Dispersive signal technology
Introduced in 2002 by 3M, this system uses sensors to detect the piezoelectricity
in the glass that occurs due to a touch. Complex algorithms then
interpret this information and provide the actual location of the touch.[22]
The technology claims to be unaffected by dust and other outside
elements, including scratches. Since there is no need for additional
elements on screen, it also claims to provide excellent optical clarity.
Also, since mechanical vibrations are used to detect a touch event, any
object can be used to generate these events, including fingers and
styli. A downside is that after the initial touch the system cannot
detect a motionless finger.
Acoustic pulse recognition
In this system, introduced by Tyco International's
Elo division in 2006, the key to the invention is that a touch at each
position on the glass generates a unique sound. Four tiny transducers
attached to the edges of the touchscreen glass pick up the sound of the
touch. The sound is then digitized by the controller and compared to a
list of prerecorded sounds for every position on the glass. The cursor
position is instantly updated to the touch location. APR is designed to
ignore extraneous and ambient sounds, since they do not match a stored
sound profile. APR differs from other attempts to recognize the position
of touch with transducers or microphones, in using a simple table
lookup method rather than requiring powerful and expensive signal
processing hardware to attempt to calculate the touch location without
any references.[23]
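The table-lookup idea can be sketched in Python as follows. The prerecorded profiles, the normalized-correlation similarity measure, and the rejection threshold are illustrative assumptions rather than details of Elo's actual implementation.

```python
# Sketch of acoustic pulse recognition as a table lookup: the digitized sound of
# a tap is compared against prerecorded profiles, one per calibrated screen
# position, and the best-matching profile gives the touch location.

def correlation(a, b):
    """Simple normalized dot product of two equal-length sample lists."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = (sum(x * x for x in a) * sum(y * y for y in b)) ** 0.5
    return dot / norm if norm else 0.0

def locate_tap(recorded, profile_table, min_score=0.8):
    """profile_table maps (x, y) -> prerecorded sample list. Sounds that match
    no stored profile well enough (e.g. ambient noise) are ignored."""
    best_pos, best_score = None, 0.0
    for pos, profile in profile_table.items():
        score = correlation(recorded, profile)
        if score > best_score:
            best_pos, best_score = pos, score
    return best_pos if best_score >= min_score else None

profiles = {(10, 10): [0.9, 0.2, -0.4, 0.1],
            (50, 80): [0.1, -0.7, 0.6, 0.3]}
print(locate_tap([0.85, 0.25, -0.35, 0.05], profiles))   # (10, 10)
print(locate_tap([0.0, 0.0, 0.1, 0.0], profiles))        # None (no good match)
```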
The touchscreen itself is made of ordinary glass, giving it good
durability and optical clarity. It is usually able to function with
scratches and dust on the screen with good accuracy. The technology is
also well suited to displays that are physically larger. Similar to the
dispersive signal technology system, after the initial touch, a
motionless finger cannot be detected. However, for the same reason, the
touch recognition is not disrupted by any resting objects.
Construction
There are several principal ways to build a touchscreen. The key
goals are to recognize one or more fingers touching a display, to
interpret the command that this represents, and to communicate the
command to the appropriate application.
In the most popular techniques, the capacitive or resistive approach, there are typically four layers:
- Top polyester coated with a transparent metallic conductive coating on the bottom
- Adhesive spacer
- Glass layer coated with a transparent metallic conductive coating on the top
- Adhesive layer on the backside of the glass for mounting.
When a user touches the surface, the system records the change in the electrical current that flows through the display.
Dispersive-signal technology, which 3M created in 2002, measures the piezoelectric effect (the voltage generated when mechanical force is applied to a material) that occurs when a chemically strengthened glass substrate is touched.
There are two infrared-based approaches. In one, an array of sensors
detects a finger touching or almost touching the display, thereby
interrupting light beams projected over the screen. In the other,
bottom-mounted infrared cameras record screen touches.
In each case, the system determines the intended command based on the
controls showing on the screen at the time and the location of the
touch.
Development
Most touchscreen patents
were filed during the 1970s and 1980s and have expired. Touchscreen
component manufacturing and product design are no longer encumbered by royalties or legalities with regard to patents and the use of touchscreen-enabled displays is widespread.
The development of multipoint touchscreens facilitated the tracking
of more than one finger on the screen; thus, operations that require
more than one finger are possible. These devices also allow multiple
users to interact with the touchscreen simultaneously.
With the growing use of touchscreens, the marginal cost
of touchscreen technology is routinely absorbed into the products that
incorporate it and is nearly eliminated. Touchscreens now have proven
reliability. Thus, touchscreen displays are found today in airplanes,
automobiles, gaming consoles, machine control systems, appliances, and
handheld display devices including the Nintendo DS and multi-touch enabled cellphones; the touchscreen market for mobile devices is projected to produce US$5 billion in 2009.[24]
The ability to accurately point on the screen itself is also advancing with the emerging graphics tablet/screen hybrids.
TapSense, announced in October 2011, allows touchscreens to
distinguish what part of the hand was used for input, such as the
fingertip, knuckle and fingernail. This could be used in a variety of
ways, for example, to copy and paste, to capitalize letters, to activate different drawing modes, and similar.[25][26]
Ergonomics and usage
Fingernail as stylus
The ergonomic issues of direct touch can be bypassed by using a
different technique, provided that the user's fingernails are either
short or sufficiently long. Rather than pressing with the soft skin of
an outstretched fingertip, the finger is curled over, so that the tip of
a fingernail can be used instead. This method does not work on
capacitive touchscreens.
The fingernail's hard, curved surface contacts the touchscreen at one
very small point. Therefore, much less finger pressure is needed, much
greater precision is possible (approaching that of a stylus, with a
little experience), much less skin oil is smeared onto the screen, and
the fingernail can be silently moved across the screen with very little
resistance,[citation needed] allowing for selecting text, moving windows, or drawing lines.
The human fingernail consists of keratin which has a hardness and smoothness similar to the tip of a stylus
(and so will not typically scratch a touchscreen). Alternatively, very
short stylus tips are available, which slip right onto the end of a
finger; this increases visibility of the contact point with the screen.[27]
Fingerprints
Touchscreens can suffer from the problem of fingerprints on the display. This can be mitigated by the use of materials with optical coatings designed to reduce the visible effects of fingerprint oils, or oleophobic coatings as used in the iPhone 3GS,
which lessen the actual amount of oil residue, or by installing a
matte-finish anti-glare screen protector, which creates a slightly
roughened surface that does not easily retain smudges, or by reducing
skin contact by using a fingernail or stylus.
Combined with haptics
Touchscreens are often used with haptic
response systems. An example of this technology would be a system that
caused the device to vibrate when a button on the touchscreen was
tapped. The user experience with touchscreens lacking tactile feedback
or haptics
can be difficult due to latency or other factors. Research from the University of Glasgow, Scotland (Brewster, Chohan, and Brown 2007, and more recently Hogan) demonstrates that, compared with non-haptic touchscreens, users of touchscreens combined with haptics or tactile feedback reduce input errors by 20%, increase input speed by 20%, and lower their cognitive load by 40%.
"Gorilla arm"
The Jargon File dictionary of hacker slang defined "gorilla arm"
as the failure to understand the ergonomics of vertically mounted
touchscreens for prolonged use. By this proposition the human arm held
in an unsupported horizontal position rapidly becomes fatigued and
painful, the so-called "gorilla arm".[28]
It is often cited as a prima facie example of what not to do in
ergonomics. Vertical touchscreens still dominate in applications such as
ATMs and data kiosks in which the usage is too brief to be an ergonomic
problem.[citation needed]
Discomfort might be caused by previous poor posture and by muscular systems atrophied through limited physical exercise.[29]
Screen protectors
Some touchscreens, primarily those employed in smartphones,
use transparent plastic protectors to prevent any scratches that might
be caused by day-to-day use from becoming permanent. Cases, such as
the OtterBox, help protect the smartphone from falls.