Anyone immersed in the literature of contemporary communication theory and practice is well aware of the difficulty of traversing the bridge between human intercommunication and that of the subhuman species. When forms of behavior little used by either for elaborate intercourse, such as those employing somesthesis as a medium, become the focus of attention, perplexities abound, for the simple fact is that the phylogeny of tactile communication, far from having been rounded into a systematic division of knowledge, currently represents a substantial scientific void.
It is not that animals below man do not communicate tactually. Especially among the subhuman primates there is much in grooming and preening, and aggressive, copulatory, and suckling behavior that, at least by a loose definition of "communication," falls in this category. Although the vast majority of signals by which lower organisms impart information of importance to others of their own and related species tend to be auditory, visual, or olfactory, especially if released at a distance, there are clear instances of stroking, nudging, and other contact behavior short of the thigmotropic that might reasonably be regarded as communicative in nature. The difficulty is that records of such observations are scattered throughout a vast literature and are largely incidental to descriptions of more socially prominent (and, doubtless, significant) forms of conduct.
In what follows, therefore, the decision has been to bring together the current facts of human tactile communication, an area of almost continuous and intensive investigation for a half-century or more and one characterized by both persistent observation and ingenious experimentation. Moreover, it is one that bids fair to circumvent some old limitations of communication stemming from the handicaps of blindness and deafness.
WHY TACTILE COMMUNICATION?
The vast majority of messages that flow through the human community are visual or auditory. They are generally designed to make an appeal to the eye if they involve many relational comparisons, if their content is complex or unfamiliar, if fine spatial discriminations are involved, or if they comprise large masses of reference data. Communication is bound to be visual if pictorial representation is demanded, for no other sense can rival vision in dealing with complex spatial patterns. Obviously, also, the medium of communication will typically be visual where there is auditory impairment or where custom or usage leads to the expectation of visual signaling.
Contrariwise, brief, simple, or transitory messages tend to be cast in auditory terms. Where one wishes to transmit, out of a larger context, only immediately relevant information or where rapid conveyance of the message is desired, the auditory channel is to be preferred. Especially where the recipient is preoccupied and we wish to "break in" on his attentional stream we do so by way of the ears. They are always open, so to speak. Moreover, audition is the channel of choice where some flexibility is required in framing a message; variations of emphasis and inflection carry shadings of meaning that can be entrusted to the eye only at the expense of time-consuming and inefficient circumlocution. Auditory messages tend to be employed where vision is overburdened or suffers outright impairment. The visual channel can be degraded through such deleterious influences as enforced mobility of the recipient or a variety of untoward environmental changes: ambient light variability, vibration, g-forces, hypoxia, or other defects arising from similar stressful alterations of the surroundings. And, of course, as with vision, there are situations leading normally, through the operation of usage or custom, to the expectation that auditory signals will be provided (Henneman, 1952).
With such complementariness between the realms of seeing and hearing, how can there be a role for the cutaneous senses in the world of communication? There is, to be sure, the obvious instance in which both major senses are unavailable; the deaf-blind constitute one population crying out for tactile aid. We shall consider some of the approaches to this problem, for it is an ancient one and one supporting an extensive literature. But, normally, as modern life brings on an overload of the ordinary channels of communication, it is also proper to inquire what the remaining sensory systems of the body have to offer by way of relief. Let us call the roll of possibilities.
The cutaneous channels, by physiological and psychological convention, are four in number: touch (pressure), pain, and the two temperature senses, warmth and cold. Additionally, there are the two chemical senses, smell and taste. What are the chances that any of these might substitute for hearing or seeing?
We may dispense with the last two rather quickly. Both gustation and olfaction normally come into operation through the triggering action of chemical stimuli: materials in solution in the case of taste, volatile substances for smell. In both instances there is the necessity of getting the stimuli from the source to the site of stimulation; typically there is a time-consuming transport problem. Odorants must be admitted to the nose by sniffing; they must then pass up to the sensitive epithelial patch high in the nostrils, chiefly by eddy currents swirling around the turbinate bones; and then they must be adsorbed on tiny fibrils immersed in mucus. All this takes time. Moreover, olfactory sensations, once aroused, have a slow subsidence rate. Again, it takes time for the chemical stimulus to be expended. The net result is a ponderous rise and fall of sensation, even though it seems probable that, at the site of transduction in the olfactory epithelium, the interchange of energy for a given smell molecule must be an exceedingly rapid affair. But each sniff involves millions, even trillions, of molecules, and not all follow the same time course through transport and adsorption.
The story for taste is much the same. Food particles and chemical substances, once led to the tongue and palate, must go into solution. It is not entirely clear yet whether the solvent is always simply water or whether enzymes, already present in the lingual tissues, must interact with them. In any case, there is the transport problem again—the process whereby dissolved taste stimuli are brought into juxtaposition with sensitive gustatory receptors—and the further question of how ions are able to release impulses in the cranial nerves subserving the tongue and palate. The whole chain of events is again time consuming and persistent. Neither of the chemical senses is really suitably constituted to relay promptly the messages available to them.
To return, then, to the skin, what of the thermal senses, warmth and cold? Somewhat similar considerations arise here. To be sure, low and high skin temperatures produce radically different effects, cold "flashing out," warmth "welling up." But the two responses have the common characteristic that, at least as related to the physical sources prompting them, they come out of relatively sluggish systems for information transmission. Again, the final word on the nerve-impulse generation mechanism is not yet in—we do not know whether impulses reporting on warmth and cold come from the same or different receptors, whether they are conducted over identical nerve fibers, or whether their central processing differs greatly one from the other—but we do know that the organism's integument is better designed to protect internal organs against surges in temperature than to provide faithful reports on thermal changes in the environment. To learn about heat interchanges in one's surroundings it is better to consult a thermometer than to heed the messages coming from one's own skin.
We are thus left with the prospect of utilizing the pressure and pain senses if cutaneous channels are to become effective in the transmission of messages, and since it will generally be conceded that pain, in the context of human intercommunication, is not a "consummation devoutly to be wish'd," we are, for all practical purposes, reduced to the use of the tactile sense in our search for an eye-ear substitute. At the same time, it should be hastily pointed out that the world of tactile sensation is a rich and versatile one. Of the six classes of physical stimuli that can arouse human senses—photic, acoustic, mechanical, electrical, chemical, and thermal—all but the first, light, can act upon the skin to produce somesthesis. Normally, to be sure, acoustic stimuli are not generated at power levels that will affect the skin appreciably, and the undesirable features of thermal and chemical stimulation for the purposes under discussion will be somewhat apparent from what has been said above about the senses to which these forms of energy are appropriate. Of the remaining two classes of stimuli, the mechanical and the electrical, we shall have much to say, for these provide the really live prospects for cutaneous communication.
The virtues (and some of the defects) of vision and audition as communication channels have been set forth. Does touch have comparable qualities to commend it? There are some features worthy of mention. To begin with, the skin is eminently available. Its area is somewhat over a thousand times that of the retina and is freely supplied with nerve endings, thus providing a nearly continuously sensitive surface. Moreover, it is an area that is relatively traffic-free. There is also a free orientation of the skin toward potential sources of information; bodily orientation toward the source is necessary for vision, less so for audition. Also, the skin is a flexible organ, rarely retaining a deformation for long. It thus has something of an advantage with respect to declining sensitivity from continuous stimulation, sensory adaptation. There are few accidents of stimulation where the skin is concerned; at least, there are rarely sustained accidental stimuli. On the contrary, unwanted lights and sounds often interfere with message reception in vision and audition. Finally, as with audition, there is little redundancy in cutaneous information; visual information, conversely, is commonly highly redundant. Thus, touch shares with hearing the advantage that information can be presented only when needed.
Tactile communication has been spoken of as if the only role it might play is that of a replacement for a missing visual or auditory system. Actually, except for the rare instances in which neither is available, there is no real expectation that somesthesis could furnish the richness of experience normally supplied by either sight or hearing. Broadly speaking, there are two great classes of perceptual discriminations made about things and events in the world; they involve distinctions of space, on the one hand, and of time, on the other. Vision is the great spatial discriminator; audition excels in the realm of time. Touch stands midway between the two major senses in the respect that it is endowed with some of each character.
Touch is better than vision at temporal discriminations and better than audition at spatial discriminations, but by the same token it is spatially inferior to vision and temporally inferior to audition. This means, however, that it can provide an avenue for the avoidance of mental bankruptcy for the deaf-blind, and while some spectacular steps have been taken—witness the Laura Bridgman-Helen Keller phenomenon—the vast potentialities of somesthetic substitution have not yet begun to be realized. As Cassirer has said:
Vocal language has a very great technical advantage over tactile language; but the technical defects of the latter do not destroy its essential use. The free development of symbolic thought and symbolic expression is not obstructed by the use of tactile signs in the place of vocal ones. If the child has succeeded in grasping the meaning of human language, it does not matter in which particular material this meaning is accessible to it. [1944:36]
The truth of this assertion is amply evidenced by the clinical experience that at least a few of the deaf-blind can quite successfully employ the Tadoma, or "speech feeling," method to receive communications from their teachers. In this technique the "reader" places his thumbs on the instructor's lips, the index fingers to the sides of the nose, and the remaining fingers on the cheeks and upper throat. With the faint cues thus provided the total speech pattern of movement is remarkably faithfully interpreted (Kirman, 1973:59).
Currently, it is less as a surrogate than as an ancillary system that tactile communication should be viewed. Except for a few somewhat esoteric situations—to attract attention in emergencies, especially where the monotony of routine has dulled normal perception; to permit intercommunication where darkness and enforced quiet have supervened; to provide warnings of threatening events outside the visual and auditory fields; to preserve secrecy in clandestine operations—a tactile communication system's main prospect for service is as a cooperative supplement to other sense channels. Perhaps because we normally attend so little to what is going on in the skin, any startling stimulation of the integument immediately "cuts through" and comes to our attention. Panic buttons should be wired to tactile stimuli. For much the same reasons any visual and auditory overloading can best be relieved by appeal to the skin.
Cutaneous Channels of Communication
CLASSIFICATION OF TACTILE SYSTEMS
Attempts to deliver messages through the skin appear to have been made from very early times. However, there is no unbroken continuity of effort dating back to the pyramids of Egypt or the Golden Age of Greece, only isolated and sporadic attempts to deal with blindness and deafness at various periods of history. Thus, the Venerable Bede, in late seventh-century England, "master of all knowledge of his time," seems to have been greatly concerned about deaf-mutism and its attendant communication problems. George Dalgarno, Scottish protégé of Charles II, in his "deaf and dumb man's tutor" (1680), developed a whole touch alphabet for delivery by the fingers to another's hand. Jean-Jacques Rousseau incorporated a passage in his well-known educational treatise, Émile (1762), on the possibility of cutaneous communication and suggested that "if our touch were trained to note [natural vibratory] differences, no doubt in time we might become so sensitive as to hear a whole tune by means of our fingers . . . [and these tones] might be used as the elements of speech." To these scattered ideas there must be added those of string writing, a means of communication in which the "sender" tied a succession of distinctive knots, coded to the alphabet, to be felt by the recipient, who passed them through his fingers, and various systems of finger spelling.
In the late eighteenth century Valentin Haüy introduced raised letters embossed on paper (Farrell, 1950). Though extremely difficult to read, this system was the prototype for many similar tangible alphabets, most of which have gone out of use. So-called Moon type, an arrangement of embossed lines somewhat resembling the Roman alphabet, has persisted with some success, especially among those losing their sight late in life. The system of raised dots, now so widespread in use, was initially developed by Louis Braille from a previously existing military code (Bledsoe, 1972). The method of Braille, who was himself bereft of sight at an early age, has become the modern refuge of the blind, at least the more pertinacious of them. The coding of the braille language is an arbitrary one, developed along logical rather than psychological lines, and is very difficult to master. It exists in several forms, the variations stemming chiefly from different practices with respect to the formation of contractions of more elaborate language units. Withal, braille is by far the most widely used language for the blind and currently represents the most successful medium of their education.
It was earlier pointed out that both mechanical and electrical stimuli are capable of providing the signals from which cutaneous language symbols and other forms of communication can be constructed. A possible classification of tactile systems of communication follows almost immediately upon this distinction. One class, clearly the largest, would include all systems employing mechanical stimuli to move the skin, whether with single pulses, trains of them, steady vibration, or elaborately patterned energy fluctuations such as are found in the speech signal. A second class would involve all direct applications to the skin of electrical currents having temporal patterns corresponding to the mechanical ones, whether originating in direct, oscillating, or alternating potentials. Such a classification would have to make room for a third category, the electromechanical, for it is possible by taking advantage of the electrical capacitance properties of tissues to produce mechanical movements of the skin by this means. In such a system the skin becomes one plate of a condenser. With the application of suitable voltages it is possible to bring about substantial mechanical skin displacement.
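The electrostatic mechanism just sketched can be made concrete with the textbook formula for the attraction between the plates of a parallel-plate condenser (an idealization offered for orientation, not a design drawn from the source). With the skin acting as one plate of area $A$, separated from a fixed electrode by an insulating gap $d$, an applied voltage $V$ produces an attractive force

```latex
F = \frac{\varepsilon_0 A V^2}{2 d^2}
```

Since the force varies as $V^2$, an audio signal superposed on a steady polarizing voltage yields a displacement component that follows the signal, which is why suitable voltages can produce substantial mechanical movement of the skin.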
Another way to classify tactile communication systems is in terms of their major mode of operation. Thus, some systems are designed for direct mediation of spoken speech, whether the final link with the body is mechanical, electrical, or electromechanical. Other systems involve pictorial displays. Many simply present a systematic array of discriminable signals that have either been coded to language elements or been given some other symbolic meaning. Still other devices are arranged for cutaneous monitoring or tracking of a continuously variable set of environmental events.
Actually, it is unnecessary to choose between these two modes of classification. The two dimensions of analysis practically force on one an Aristotelian cross-classification (see Table 1). The suspicion is strong that, if we knew more about electromechanical possibilities, all the "pigeonholes" in the table would be filled. As we shall see, this area has, for technological reasons that will become obvious, been greatly neglected. Keeping the provisions of our table in mind as an orientation, let us take a closer look at the several operational problems represented in it.
DIRECT SPEECH MEDIATION: MECHANICAL METHODS
The modern period of research aimed at direct transmission of speech to the skin began with the work of David Katz (1930) and his coworkers in Germany, and of Robert Gault (1926) and his associates in the United States. Both lines of work initially made use of electromagnetic receivers (speaker units) held in contact with the skin while the amplified output of a microphone transmitted voice sounds or music to it. Thus, with fidelity limited only by the generally crude sound system then available and variable back action resulting from damping of the contactor by the skin, moment-to-moment variations of frequency and amplitude of the speech signal were imposed directly on the skin, typically that of the palm or fingertip. In Gault's work there was a precursor, a long, hollow tube held against the palm. The reinforced acoustic signal was thus impressed directly on the skin; with the tube's help (and with hearing masked) it was possible to distinguish among several of a group of simple words.
Table 1. Tactile communication systems and devices.
But the approach for which Gault's laboratory became best known centered on a new instrument, the teletactor, which was developed for the purpose by Bell Telephone Laboratories. This device consisted of a piezoelectric crystal of the type subsequently popular in sound systems (tweeters). It was small in size, little affected by skin damping, and would vibrate over a considerable range of frequencies. It could be held so as to affect the fingertip or it could readily be applied to other body areas. In one application of the teletactor (Gault and Crane, 1928) five of the instruments, dividing up the speech frequency band, were used simultaneously on the fingers of one hand; in effect this anticipated the tactual vocoder (see below).
Many patient experiments were conducted in an effort to transmit speech to the fingertip. Early results were promising. After 14 hours of well-distributed practice, the subjects learned to identify with about 75 percent accuracy which one of ten brief sentences had been spoken into the microphone, and with 30 hours of training behind them they could judge about half the time which one of more than fifty words, isolated from context, had been presented. But it was also found that a change in experimenters or in the rate of speaking by the same experimenter produced a collapse of this apparent ability to "hear through the skin." The subjects had presumably been relying on cues of emphasis and rhythm, in general the prosodic features of speech. The attack by way of the teletactor was eventually dropped, and it must be judged to have been a major disappointment.
Several lines of investigation have approached the problem of transmitting speech to the skin by way of variants of the vocoder, an acoustic analytic device dating from 1936 (Dudley). The prototype of this instrument in its tactual application is to be found in Project Felix of the Massachusetts Institute of Technology (Levine et al., 1949-1951). A development from it, a device designed at the Speech Transmission Laboratory, Stockholm (Rosier, 1957), broke up speech into ten channels having center frequencies ranging from 210 Hz to 7,700 Hz. The energy in each channel modulated a 300-Hz carrier that was common to all ten outputs, one for each finger. The net result of this arrangement was that patterns of vibration, analyzed with respect to the relative amplitudes of their components, could be presented in real time to different skin loci. Unfortunately, in a thorough study of performance with this vocoder (Pickett and Pickett, 1963) too few patterns could be discerned to yield anything but a hope that the device, through future refinements, might one day permit something approximating satisfactory recognition of tactual speech.
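The channel logic of such a vocoder can be sketched compactly. The fragment below is a minimal, hypothetical illustration rather than the Stockholm design: it estimates the energy in each of ten bands with a crude single-bin correlation per 20-ms frame and returns the ten amplitudes that would modulate the common 300-Hz carrier, one per finger. The sample rate, frame length, and the eight interior center frequencies are assumptions; only the 210-Hz and 7,700-Hz endpoints come from the text.

```python
import math

FS = 16000   # assumed sample rate (Hz)
FRAME = 320  # 20-ms analysis frame
# Ten center frequencies; the endpoints are from the text, the rest illustrative.
CENTERS = [210, 400, 700, 1100, 1600, 2300, 3200, 4500, 6200, 7700]

def band_energy(frame, f0):
    """Crude single-bin spectral estimate: correlate the frame with
    a sine and cosine at f0 and take the magnitude."""
    re = sum(s * math.cos(2 * math.pi * f0 * n / FS) for n, s in enumerate(frame))
    im = sum(s * math.sin(2 * math.pi * f0 * n / FS) for n, s in enumerate(frame))
    return math.sqrt(re * re + im * im) / len(frame)

def tactual_vocoder(signal):
    """For each 20-ms frame, return the ten band amplitudes that would
    modulate the ten 300-Hz finger outputs."""
    return [[band_energy(signal[i:i + FRAME], f0) for f0 in CENTERS]
            for i in range(0, len(signal) - FRAME + 1, FRAME)]

# A pure 700-Hz tone should excite mainly the third (700-Hz) channel.
tone = [math.sin(2 * math.pi * 700 * n / FS) for n in range(FRAME)]
amps = tactual_vocoder(tone)[0]
```

In the real device, of course, the analysis was performed by analog filter banks rather than frame-by-frame computation; the sketch only shows the frequency-to-locus mapping that defines the vocoder idea.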
Subsequently there have been other attempts to utilize the vocoder principle. One device (Guelke and Huyssen, 1959), consisting of an array of twenty vibrating reeds contacting the several joints of the fingers at eight locations of one hand, delivers constantly changing patterns of vibration that follow those of the speech signal. The reeds perform a fairly discriminative frequency analysis that, by this arrangement, gets transformed to distinctive skin loci. Though some vowel sounds were easily distinguishable in early tests, as were characteristic temporal patterns of some diphthongs, great difficulty was experienced in discriminating among consonantal sounds. Also, as might have been predicted, there were difficulties in analyzing out separate loci when two or more were stimulated simultaneously.
Another variant of the vocoder is an instrument going under the name of Tactus (Kringlebotn, 1968). Somewhat like the early Gault instrument, it applies the speech signal to five bone conduction receivers, which contact the fingers of one hand. These are also energized simultaneously. Its unique feature is that, through the intermediation of a set of multivibrators, which fractionate the input in different ways, all the fingers receive the basic signal but with different portions stressed. No elaborate claims have been made for this device beyond the suggestion that it may help, in teaching the deaf child to speak, to reinforce the concept of rhythm and to provide some useful ancillary cues. Still other variants of the vocoder are known; the idea has been a persistent one. Kirman, reviewing the vocoder story (1973:59), was led to conclude: "The history of tactile vocoders indicates that simply providing the skin with such frequency-to-locus translators as have been tried does not enable it to comprehend speech."
It is not necessary to lead each frequency band to its own position on the skin. Up to some point of resolution determined by its "funneling" capacity, it is possible for a stretch of skin to distinguish among frequencies on the basis of the location each will seem to occupy when the area in question is broadly stimulated. This is the principle, so influential in auditory theory, on which the Békésy cochlear model operates (Békésy, 1955). An approach made by Keidel and his students at Erlangen, Germany, relies on the Békésy model. Their application has its source in the consideration that tactile response to vibration has much in common with that of the ear, except for three important features: (1) The two senses occupy overlapping, but quite different, frequency ranges. For the ear, the range important for speech extends from about 300 Hz to 3,000 Hz. The skin is most sensitive in the 25 Hz to 400 Hz range. (2) The skin's sensory response to a stimulus is relatively sluggish, requiring about 1.2 sec to reach a peak, while the ear does so in less than one-sixth that time (0.18 sec). (3) Differential frequency discrimination is sharp in the case of the ear, which in its best performance yields a Weber fraction of about 0.2 percent; the skin at its best does about 5 percent and, as frequency goes up, becomes 50 percent or more. Obviously, if spoken speech is to be apprehended by the skin with any fidelity at all, it is necessary to transpose frequencies downward into the region of the skin's optimal response. The Erlangen group went about it this way.
The basic experiment is that of Biber (1961). He first stored samples of speech—three classes of monosyllabic words having high-, low-, and middle-range characteristic frequencies, together with a selection of phonemes—on magnetic tape. The tape was then played back at reduced speed to a Békésy cochlear model on which the subject's forearm rested. Three different ratios of playback-to-recording speeds were tried: 1:8 (which should yield frequencies nearly ideal for the skin), 1:4, and 1:2. Endless tapes with known syllables or words permitted multiple training trials.
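The frequency arithmetic behind these ratios is easy to check. Taking the 300-3,000 Hz speech band and the 25-400 Hz skin band quoted earlier, the sketch below (an illustration, not Biber's apparatus) shows that an eightfold slowing drops the whole band inside the skin's range, while a fourfold slowing leaves its upper end above it.

```python
SPEECH = (300.0, 3000.0)  # speech band important for the ear (Hz), from the text
SKIN = (25.0, 400.0)      # band of best skin sensitivity (Hz), from the text

def transposed(band, factor):
    """Band limits after slowing playback by the given factor."""
    return (band[0] / factor, band[1] / factor)

def fits(band, target):
    """True if the band lies wholly within the target band."""
    return target[0] <= band[0] and band[1] <= target[1]

print(transposed(SPEECH, 8), fits(transposed(SPEECH, 8), SKIN))  # (37.5, 375.0) True
print(transposed(SPEECH, 4), fits(transposed(SPEECH, 4), SKIN))  # (75.0, 750.0) False
```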
The 1:8 speed reduction proved unsatisfactory, mainly because the presentation time exceeded the subject's short-term memory; by the end of a word he would have forgotten how the word had started! (This perplexing problem has been encountered in other tactile performances and is far from solution.) The 1:4 time reduction was satisfactory, however, and with this amount of slowing in force Biber continued his training experiments. Indeed, he found that unknown words could be correctly recognized 83 percent of the time after only 18 hours of training, and performance became practically errorless after 32 hours.
The fault of Biber's system lies, of course, in the fact that the time necessary for the transmission of a given segment of speech has been multiplied by four. Since, in any normal operation of the senses or the response mechanisms triggered by them, there is likely to be a good deal of cooperation involving their acting together in time, artificial slowing of speech has to be put down as a serious disadvantage. Accordingly, the Erlangen laboratory (see Keidel, 1968, 1974) set about to remedy the defect.
The answer came in an ingenious application by Finkenzeller (1973) of still a different physical principle. Divers, working at great depths below the surface, can sustain normal breathing only in a helium atmosphere. But it has the disadvantage that the human voice, because of the high velocity of sound in helium, becomes so distorted as to be unintelligible. The way out of this dilemma has been found, quite recently, to be simple. Speech sounds are highly redundant; removing parts of them and providing continuity to the remainder leaves them quite intelligible. By a suitable computer program it is possible to extract every fourth wave in a speech signal, suppress the rest, and expand each remaining wave to fill up the time period originally occupied by all four. The only problem is that of joining the segments in such a way as to avoid repetitive clicks, and hence unwanted "periodicity pitch" from them. This, too, was accomplished by Finkenzeller by smoothing the connection between segments within a 400-microsecond period, thus obviating the click stimulus, which would otherwise constitute an interfering signal.
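The extract-and-expand operation can be illustrated on a pure tone. The sketch below assumes a known, constant pitch period and omits Finkenzeller's 400-microsecond smoothing at the joins; it keeps every fourth cycle and stretches it fourfold by linear interpolation, so the output fills the original duration at one quarter the frequency.

```python
import math

def expand_every_fourth(signal, period):
    """Keep every fourth pitch period and stretch it to four times its
    length by linear interpolation; total duration is preserved while
    the waveform's frequency is divided by four."""
    out = []
    for start in range(0, len(signal) - period + 1, 4 * period):
        seg = signal[start:start + period]
        n = 4 * period
        for i in range(n):
            pos = i * (period - 1) / (n - 1)  # fractional index into seg
            j = int(pos)
            k = min(j + 1, period - 1)
            frac = pos - j
            out.append(seg[j] * (1 - frac) + seg[k] * frac)
    return out

FS, F0 = 8000, 400                  # assumed sample rate and tone frequency
period = FS // F0                   # 20 samples per cycle
x = [math.sin(2 * math.pi * F0 * n / FS) for n in range(8 * period)]
y = expand_every_fourth(x, period)  # same length as x, but at ~100 Hz
```

Real speech, unlike this tone, has a varying pitch period, so the actual computer program had first to locate the waves before extracting every fourth one.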
The technical difficulties of sound expansion having been overcome, there remains the important question of how well the skin is going to be able to utilize the transposed speech signal. To date one has only the report (Keidel, 1974:31): "The preliminary results are very promising, because (a) it now works in real time, (b) tactile memory is not overloaded, and (c) vibrotactile information can be combined with that from other sense modalities. . . ."
Until now we have been considering only mechanical systems that deliver signals, chiefly those of speech, directly to the skin. There are electrical analogs, however, some that extend a considerable distance back in time. Early suggestions along these lines were made at the turn of the century by MacKendrick (see Lindner, 1936) and only a little later by DuPont (1907), who was impressed by the fact that electrical pulses corresponding to different musical tones from phonograph records yield some discriminably different "feels" and could especially transmit musical rhythms to the skin. Serious attempts to deliver speech signals to dermal electrodes have not been carried out at a frenzied pace, however, especially following the disappointment occasioned by the more or less complete failure of the electrocutaneous form of MIT's Felix (Levine et al., 1949-51). There have been some, however. Mention should be made of the work of Anderson and Munson (1951) at Bell Telephone Laboratories, which did much to establish limits for the avoidance of pain and to specify some conditions of successful discrimination of signals. Significant also are the experiments of Breiner (1968), in Germany. He devised a fourteen-element electrocutaneous vocoder, seven electrodes on each hand, which possessed the virtue that spatiotemporal patterns, marked by vivid perceived movement, could be created with word sounds, thus adding a possibly valuable element for use in cutaneous communication systems.
Perhaps the most persistent and thoroughgoing attempt to mediate speech sounds electrocutaneously was that of Lindner (1936). Indeed, after finding that direct feeding of speech signals into dermal electrodes gave little in the way of discrimination, he made a valiant effort to salvage what he could by devising an interesting, if primitive, vocoder that utilized both mechanical and direct electrical stimuli. The four fingers of the left hand divided up the task, the index and little fingers receiving electrical signals reproducing high and middle frequencies, while the middle and ring fingers got mechanical stimulation by low frequencies. There were no revolutionary findings. This somewhat analytic scheme overestimated the value of the vocoder principle, but it does represent a unique combining of mechanical and electrocutaneous approaches to tactile communication, perhaps the only such hybrid ever attempted.
The impression should not be left that efforts to communicate electrocutaneously have been entirely sterile. On the contrary, much has been learned about the basic psychophysical functions involved (lower and upper limits of intensity and frequency discrimination, etc.), relation to pain and discomfort, preferred electrode arrangements, and reaction time to such stimuli. Incomplete to scanty information has been acquired on such matters as the limits and possibilities of spatial discrimination, generation of movement patterns, masking, and temporal integrations in electrical stimulation. Much more needs to be learned about the conditions for the avoidance of pain in electrocutaneous communication. The reviews by Breiner (1968), Rollman (1974), and Sherrick (1975) point up the problems.
We shall be returning to the topic of electrical approaches, for there have been some interesting attempts to synthesize coded skin languages with this mode of stimulation.
PICTORIAL DISPLAYS IN TACTILE COMMUNICATION
It is not only the auditory environment that may be transmuted to somesthetic sensation; the visual world can be represented as well. Therein lies the possibility of alleviating some of the burden of blindness. Three lively systems, tactile television, the Optacon, and the Elektroftalm, have been devised to this end. Each will be dealt with briefly.
Tactile TV, a development carried out at the Smith-Kettlewell Institute of Visual Sciences, San Francisco, provides an elaborate array of 400 tiny electromagnetic vibrators that presses against a 10" X 10" area of a subject's back (Collins, 1970; White et al., 1970). The vibrator matrix reproduces with sufficiently fine grain the image picked up by a vidicon camera. Thus a vibrating pattern of a two-dimensional image in the camera's field of view is transferred to the subject's back. The subject can manipulate the vidicon, moving it vertically and horizontally to bring the image to any part of the matrix or, indeed, to take it "off screen." This is important, because it may well be that the transit of the image across the border of the field provides the salient cues that permit recognition of objects by way of their cutaneous patterns. That is strongly implied by the experiments of Craig (1974).
It is too early to evaluate tactile TV. The system has undergone several modifications, including departure from the mechanical mode of stimulation; e.g., an electrocutaneous matrix that contacts the abdomen has been substituted. Also, special arrangements with respect to electrode design and application have been introduced (Saunders, 1974). Whatever the outcome for a visual substitution system, it is already apparent that much remains to be learned about the spatial properties and inherent limitations of both mechanical and electrocutaneous displays.
The Optacon (optical-to-tactile conversion), an instrument developed at the Stanford Research Institute, is also intended as a reading aid for the blind (Bliss et al., 1970). The device presents a set of tiny pins, 144 of them in a 24 X 6 array, to a single fingertip. Movement of the pins depends on vibration of piezoelectric Bimorph reeds to which they are rigidly attached, and these, in turn, are controlled by a bank of 24 phototransistors, each controlling a linear array of 6 contactors. As the image of a printed letter, say, passes over the photosensitive surface, corresponding vibrators are set in motion, with the result that a vibratory "image" crosses the fingertip. The moving display, in principle, is exactly like the message that travels around the Allied Chemical Building in Times Square, New York; indeed, this kind of representation is familiarly known as a "New York Times display."
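The "moving-belt" presentation just described can be sketched in a few lines: a glyph wider than the contactor window slides column by column beneath the pin array, so that a vibratory image enters from one side of the fingertip and exits the other. The five-column letterform below is purely illustrative, not the device's actual font, and the window width is taken from the Optacon's six-column array.

```python
# Sketch of the Optacon's scrolling presentation: a glyph bitmap is slid
# column by column beneath a fixed window of contactor columns, so the
# "image" traverses the fingertip. Glyph and window sizes here are
# illustrative assumptions based on the 24 x 6 array described in the text.

GLYPH_L = [          # a 5-column block "L", purely for illustration
    "X....",
    "X....",
    "X....",
    "X....",
    "XXXXX",
]

WINDOW_COLS = 6      # columns of pins presented to the fingertip at once

def scroll_frames(glyph, window=WINDOW_COLS):
    """Yield successive window-wide frames as the glyph enters from the
    right and exits to the left, stepping one column per frame."""
    blank = "." * window
    padded = [blank + row + blank for row in glyph]   # lead-in and lead-out
    total = len(padded[0])
    return [[row[start:start + window] for row in padded]
            for start in range(total - window + 1)]

frames = scroll_frames(GLYPH_L)
# The first frame is blank; mid-scroll frames carry slices of the letter.
```

The point of the exercise is that recognition rests on the spatiotemporal sweep of the pattern, not on any single static frame.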
At least for some purposes, practical reading rates are possible with the Optacon, but as with tactile TV, full evaluation is not possible yet. The device has only relatively recently been made available commercially, but already scores of blind people have received training on it. It has an inherent limitation as a reading device in that it can comprehend only one letter at a time, but this is also true of many less "pictorial" systems that involve, additionally, the learning of a novel code. Moreover, written messages are not the only materials to which the Optacon may be adapted. Tracking experiments have been conducted with its aid, and there have been at least preliminary explorations of the possibility of using it as an environmental form detector (Bliss et al., 1970).
The third of this trio, the Elektroftalm, is a Polish invention (Starkiewicz and Kuliszewski, 1963). It was developed mainly to provide a mobility aid for the blind. Its "field of view," some 30° or more wide, is sensed by a camera, the "film" of which consists of 120 photocells mounted on top of the user's head. With suitable amplifying and switching elements in each circuit, changes in light intensity activate a mosaic made up of 120 small vibrators contacting the forehead. As with tactile TV (the Elektroftalm, first reported in use in 1962, is essentially its technological ancestor) sufficiently large and bright objects on a dark background may be detected. Assuming head-orientation cues to have been learned sufficiently well, the tactual "image" can aid in avoiding obstacles and in recognizing some familiar objects by their highlights. But, as its inventors say (Starkiewicz and Kuliszewski, 1965:32), "appreciation of a distinct plastic image is out of the question."
All three of the pictorial systems described above have embodiments involving direct electrical stimulation of the skin. We have seen that tactile TV has recently taken a turn in that direction with its electrocutaneous abdominal signaling system. In the course of its development, the Optacon in electrocutaneous form was given a brief trial, which proved generally unsatisfactory (Melen and Meindl, 1971), and the Elektroftalm was also converted to direct electrical stimulation of the forehead, but with results that seemed not to warrant continuation of the project.
It would be helpful if a set of crucial experiments could be devised and carried out to settle the long-standing question of the degree to which the human skin simulates the human eye in its capacity to make spatial distinctions. There is no difficulty in supplying a fine-grained mosaic to the skin, but to what extent is it a waste of effort? The skin is not innervated in the same way that the retina is; the suspicion is strong that tactual space is not just a vague replica of visual space. Presumably, pictorial systems of tactile communication stand or fall on the answer.
CODED CUTANEOUS LANGUAGES: MECHANICAL SYSTEMS
All the message systems we have considered thus far have involved imposition on the skin of language symbols developed throughout human history for either visual or auditory conveyance. The thought must have occurred that spoken or graphic stimuli may not be the most congenial for the skin to mediate. There is another way to go about solving the general problem of cutaneous communication. Instead of requiring the tactile mechanism to deal with signals unnatural to it, one may ask, "What discriminations are possible for the skin and its neural attachments?" If a substantial number of highly distinctive and manipulable signals can be discovered, it would then be possible to code them to symbols already in the receiver's possession.
There is one obvious existing code that presents itself as a viable candidate, International Morse. Anyone who has learned the Morse code for use in radio or telegraphy and is therefore able to receive it through audible dots and dashes or, as in shipboard signaling, by flashing lights, is equally capable of interpreting messages by way of mechanical taps or buzzes on the skin. Indeed, cutaneous Morse has benefited from a specialized miniature and portable instrument dubbed Taccon (Dalrymple, 1973), which facilitates coded communication in many situations. Interestingly, there have also been experiments aimed at ascertaining the feasibility of electrocutaneous Morse (Foulke and Brodbeck, 1968). The results of these were not unexpected. Subjects well versed in International Morse who could receive errorlessly at the rate of 20 five-letter words a minute when getting an audible signal were highly variable in performance on the dermal code, receiving at no better than half this speed when fully practiced. The main difficulty was in distinguishing the boundaries between dots and dashes; and adjustments of intensity, rise time, carrier frequency, etc., did not improve the situation. The very factors that bring about a deterioration of performance when electrocutaneous stimulation is substituted for mechanical in the various pictorial systems already considered appear to be responsible for a similarly poor outcome here.
Actually, despite its widespread use for a variety of purposes, chiefly because it is highly resistant to noise degradation, International Morse is really quite inefficient because it is very wasteful of time. With only two building blocks, dots and dashes, to work with, strings of these symbols have to be separated by costly silences if the alphabet, numerals, punctuation marks, and conventional abbreviations are to be accommodated. Proficient Morse operators, those to whom important messages are entrusted, seldom exceed the rate of about 25 five-letter words per minute. Compared with ordinary reading or speaking rates, even with speeds achieved by practiced braille readers, Morse is really a very pedestrian language. Any scheme demanding fewer blocks of silence would be an improvement.
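The wastefulness is easy to put in numbers using the standard Morse timing conventions: a dot lasts one unit, a dash three, the gap within a letter one, the gap between letters three, and the gap between words seven. The conventional benchmark word PARIS then comes to 50 units, and more than half of them are silence:

```python
# Accounting for the time spent in silence in International Morse, under
# the standard timing units: dot = 1, dash = 3, intra-letter gap = 1,
# inter-letter gap = 3, inter-word gap = 7.

MORSE = {"P": ".--.", "A": ".-", "R": ".-.", "I": "..", "S": "..."}

def word_units(word):
    """Return (sounding units, silent units) for one word plus the
    trailing inter-word gap."""
    on = silence = 0
    for i, letter in enumerate(word):
        code = MORSE[letter]
        on += sum(1 if e == "." else 3 for e in code)
        silence += len(code) - 1          # 1-unit gaps inside the letter
        if i < len(word) - 1:
            silence += 3                  # 3-unit gap between letters
    silence += 7                          # 7-unit gap before the next word
    return on, silence

on, silence = word_units("PARIS")
total = on + silence   # 50 units; at 25 wpm each unit lasts 48 ms
```

Of the 50 units, 28 are silence, so well over half of the channel's time carries no signal at all, which is precisely the inefficiency a multidimensional code can avoid.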
An approach to achieving a set of tactile coded languages was made by the author, his students, and colleagues over a span of years at the University of Virginia and, later, at Princeton University. Four systems of cutaneous communication were developed: vibratese, body braille, the optohapt, and polytap. Each has its own set of principles while sharing the common property that each depends on coding.
Vibratese was the outgrowth of a broad attack on the area of cutaneous communication that, reversing the earlier approach of trying to make the skin adapt to some previously existing communication hardware, asked what basic discriminations the tactile system was capable of making. In brief, it asked, "What is the tongue of the skin?" From the outset it was apparent that a dimensional analysis of the skin's discriminatory capacities was needed. Once the decision was made that sustained vibration was the stimulus of choice—superior to transient pressures, pokes, or jabs, which fail to take advantage of the excellent temporal distinctions of which the cutaneous system is capable—the dimensions involved, at least the first-order ones, became obvious. The discriminable dimensions of vibration, taken in relation to bodily stimulation, are: intensity (amplitude or energy), duration, frequency, and locus. There are several higher-order dimensions, such as wave form (simultaneously operating frequencies), movement (change of locus through time), "attack" (change of amplitude through time), and some others, but these are not at the heart of the matter and, for the most part, only provide shadings of perceptual patterns that, by extending coding possibilities, could enrich the vibratese language.
The fundamental experiments leading to vibratese are those of Spector (1954), who did a rough casting up of the roles of both intensity and duration; Howell (1956), who obtained the first important data on locus; and Goff (1967), whose measurements of frequency discrimination revealed why this vibratory dimension could not be readily accommodated. It remained for Howell to engineer a suitable code and to ascertain with what ease it could be learned (1956).
The initial attacks on intensity and duration were strictly psychophysical. Applying vibrators to the chest region—mainly because it provided a relatively traffic-free expanse of skin (we had possible vehicular use in mind)—there were first determined the number of discriminable steps ("just-noticeable differences") that could be felt between a point safely above threshold and one well below the discomfort level. On the average, fifteen steps could be appreciated. Similarly, there were charted perceived differences in vibratory duration between the briefest buzz identified as such and a relatively long-lasting one (2 sec). About twenty-five steps could be discriminated. But just-noticeable differences are not codable units; if a particular intensity or duration were to be presented in isolation it could not be identified without error (Miller, 1956). If code symbols are to be attached to vibratory intensities and durations it is not safe to go above four or five steps of each. Actually, not more than three steps were used in either dimension. More were not needed because, meanwhile, the decision had been reached to use five well-spaced loci on the chest and not to present more than one vibrator at a time. Since, where separate dimensions of the stimulus are combined, the internal relations are multiplicative, a total of forty-five signals (3X3X5) could be derived from the three steps of intensity (strong, moderate, weak), the three of duration (long, medium, short), and the five loci (separated by about 5-6 inches and arranged domino-fashion). Twenty-six specific combinations of intensity, duration, and locus were coded to letters of the alphabet, attention being paid to frequency of usage when assigning them; another ten were coded to digits; and a few were set aside for the most commonly recurring words in English prose (Geldard, 1962).
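The multiplicative code space just described can be counted out directly. The enumeration below reproduces the 3 X 3 X 5 arithmetic; the particular letter-to-signal assignment shown is illustrative only, though it follows the text in ranking letters by frequency of usage.

```python
# The vibratese code space: three intensities x three durations x five
# chest loci combine multiplicatively into 45 unique signals. The mapping
# below is an illustrative sketch; the actual assignments differed.

from itertools import product

INTENSITIES = ("weak", "moderate", "strong")
DURATIONS = ("short", "medium", "long")
LOCI = (1, 2, 3, 4, 5)     # five well-spaced positions, domino-fashion

signals = list(product(INTENSITIES, DURATIONS, LOCI))   # 45 in all

letters = "ETAOINSHRDLCUMWFGYPBVKJXQZ"   # frequency order, illustrative
digits = "0123456789"
code = dict(zip(letters + digits, signals))   # 36 of the 45 signals used
# The remaining signals are free for common short words, as in vibratese.
```

Because every signal is a simultaneous bundle of three dimensions rather than a string of dots and dashes, no inter-element silences are needed within a letter.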
The vibratese language, having available a relatively large number of unique collocations of its three basic dimensions, proved to be far more efficient than simpler schemes, such as the telegraphic code. There are no wasteful silences in it. But could anyone learn it? The answer came quickly. Not only could it be mastered in a short series of training sessions, but two- and three-letter words could be introduced quite early in that learning. One subject acquired vibratese so rapidly that, within a matter of a few weeks, he was receiving almost without error at the rate of 38 five-letter words a minute, a speed that nearly doubles that of proficient Morse reception. When the experiment was discontinued it was not because the learning ceiling had been reached but because, in the precomputer period in which the first work with vibratese was done, signals could not be transmitted any faster!
Much later (Geldard and Sherrick, 1970), the presentation-rate difficulty was overcome by controlling the vibrators with eight-hole tapes punched for the Tally reader, locus being coded on five hole positions, intensity on two, and duration as either a one-column or a three-column pattern. This arrangement made possible speeds hitherto unobtainable. Also, by then, the more rugged Sherrick vibrators were available to replace the modified relays originally used in the Spector and Howell experiments, and easy attachment to the body could be effected with Velcro tapes (Sherrick, 1965). Accordingly, vibrator positions were changed from the chest region, where all but the weakest signals are readily conducted to the cochlea by way of the rib cage, to the fleshy surfaces of the upper and lower arms and to the pit of the stomach. The new placement, if anything, made acquisition of the vibratese alphabet easier than before, but it did not resolve the problem, alluded to previously, of the almost aphasia-like effect in which individual letters could be comprehended with ease but "blocking" intervened to prevent retention of letter order and hence word meaning. We shall encounter this phenomenon again.
Body braille—as it came to be called because it involved presentation of six stimulators widely spread out on the skin: two on the forearm at the wrist, two near the elbows, and two close to the shoulders—was devised less as a practical communication system than as an attempt to manipulate the variable of body locus in basic experiments (Virginia Cutaneous Project: 1948-1962:47). The confining of stimuli to the hand and especially the fingertips has become almost a fetish in tactile communication, with convenience for the experimenter as the chief factor recommending it. Since it had been found with vibratese that not more than seven contactors could be used and still preserve absolute identification of locus, even on the most expansive thorax, a natural conclusion was that intercontactor distance should never be allowed to limit discrimination in a tactile communication system.
In body braille uniformly brief, 60 Hz bursts of vibration were delivered to the six contactors on the arms. For the commonest letters (E, T, A, O, I, N) only a single vibrator was assigned; for all others double, triple, or quadruple contactor combinations were presented simultaneously. It was found that as many as four vibrators could be energized in a single pattern and still preserve perceptual uniqueness, provided only that three of them not be confined to one arm. The latter arrangement produces interesting inhibitory interactions in which wrist and shoulder loci combine to suppress the elbow vibration unless it, in turn, is raised in intensity. Then, indeed, if it is intensified sufficiently, it can inhibit the other two! The body braille alphabet was mastered in a few hours by its constructors (J. F. Hahn and the author), and, within a week of daily sessions, German proverbs were being successfully transmitted and deciphered, albeit a little ponderously.
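The admissible pattern inventory can be enumerated from the rule just stated: six loci, patterns of one to four simultaneous vibrators, excluding any pattern that confines all three of one arm's loci, since those groupings produced the inhibitory interactions described. The locus labels below are illustrative.

```python
# Counting body-braille patterns: six loci (wrist, elbow, shoulder on
# each arm), one to four simultaneous vibrators, with no pattern allowed
# to include all three loci of a single arm.

from itertools import combinations

LEFT = {"Lw", "Le", "Ls"}      # left wrist, elbow, shoulder
RIGHT = {"Rw", "Re", "Rs"}     # right wrist, elbow, shoulder
LOCI = sorted(LEFT | RIGHT)

def admissible(pattern):
    s = set(pattern)
    return not (LEFT <= s or RIGHT <= s)   # no full arm-triple

patterns = [c for size in range(1, 5)
            for c in combinations(LOCI, size)
            if admissible(c)]
# 48 distinct patterns survive the restriction: ample for an alphabet.
```

Forty-eight patterns against twenty-six letters leaves comfortable headroom, which is consistent with the system's quick mastery by its constructors.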
A third communication system characterized by coded signals is that furnished by the optohapt (Geldard, 1966), essentially a tactile reader that utilizes a large portion of the two square yards of skin constituting the human integument. It was originally devised to demonstrate the fundamental principle that spatial information, when cut off from its primary sensor, the eye, should be fed to the only other space receiver the body possesses, the skin. At the same time, the optohapt relies on temporal information as well, so that all signals generated by it are essentially spatiotemporal.
The reading element consists of a bank of nine photocells—it is, indeed, the reader unit of the Battelle optophone—which triggers nine vibrators distributed over the body (two on each arm and leg, one on the upper abdomen, care being taken to avoid corresponding body points). The language concocted for the optohapt was arrived at only after thorough investigation of all available symbols in the IBM library of type faces, for the material to be "read" was typed on paper carried by the platen of a long-carriage accounting typewriter. If uniqueness of perceptual recognition is taken as the criterion, which it was, most of the letters of the English alphabet prove to be poor candidates. Only the letters I, J, and V survived the weeding-out process, which involved a total of 180 characters and was conducted by the pair-comparison technique. The really discriminable signs are made up of punctuation marks, some mathematical and business symbols, and a few more esoteric forms, some found chiefly on branding irons!
Several subjects were trained up to a sufficiently high level of performance to demonstrate the adequacy of this system for general tactile communication (Geldard and Sherrick, 1968). Beginning with familiarization with isolated letters, then random alphabets at the pedestrian rate of 70 characters per minute, two-letter, then three-letter words were presented, until short sentences, containing only short words, could be received with ease. Concomitantly, as the "traffic would stand it," speed was moved up to 100 characters per minute, and for one subject, eventually, 125. This, of course, is not rapid communication (a maximum of about 25 words per minute), but performance left little doubt that more rapid rates would be possible if presentation speed could be increased. This was subsequently accomplished by punching into Tally tape a code having the greatest possible resemblance to that of the optohapt.
Meanwhile, a fourth system, polytap, came under development, chiefly at the hands of Douglas Röhn, working in the author's laboratory. It has existed in several forms derived from the unexpected experimental result that radical changes could be made in vibrator number and position without seriously interfering with the subject's ability to retain and use the code. The question arose as to whether it might be possible to transfer optohapt patterns to the fingertips (for convenience) despite the generally poor discriminatory behavior shown by the fingers in earlier experiments with vibration (Gilson, 1968). Since the difficulty with prolonged vibratory patterns delivered to the fingertips seems to reside in the prodigious amount of stimulus spread within the bony hand, it was judged expedient to reduce the "buzz" to a "tap" and rely largely on locus (but somewhat on duration, supplied by short trains of well-spaced discrete taps) to give uniquely codable signals. The polytap instrument is thus equipped with twelve Bimorph benders, six for each hand, one for each finger, and one for the thenar eminence of the palm. This battery of contactors delivers either single (2-msec) taps or a train of three of them. Suitably coded—the commoner letters all represented by single taps—24 letters of the alphabet are accommodated. The two least-frequently encountered letters are signaled by a whole-hand "blast," Z on the right, Q on the left hand. The code is readily computerized, and was.
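The polytap inventory is easily counted out: twelve contactors, each sounding either a single tap or a train of three, gives twenty-four codable signals, with the two rarest letters assigned to whole-hand blasts. The contactor names and the particular letter assignment below are illustrative assumptions; the text specifies only that the commoner letters received single taps.

```python
# The polytap signal inventory: twelve contactors (five fingers plus the
# thenar eminence on each hand), each delivering a single 2-msec tap or a
# train of three, yields 24 signals; Z and Q are whole-hand "blasts."

from itertools import product

CONTACTORS = [hand + "-" + site
              for hand in ("L", "R")
              for site in ("thumb", "index", "middle",
                           "ring", "little", "thenar")]
TAP_PATTERNS = ("single", "triple")

signals = list(product(CONTACTORS, TAP_PATTERNS))     # 12 x 2 = 24

# 24 letters map onto the locus/pattern signals (assignment illustrative);
# the two rarest letters get the blasts, Z right hand, Q left hand.
alphabet = [c for c in "ABCDEFGHIJKLMNOPQRSTUVWXY" if c not in "QZ"]
code = dict(zip(alphabet, signals))
code["Z"] = ("right-hand blast",)
code["Q"] = ("left-hand blast",)
```

The economy of the scheme, one brief tap or tap-train per letter, is what made computer-driven presentation straightforward.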
Experience with the polytap is very revealing (Geldard and Sherrick, 1972). Since the desire was to ascertain possibilities, not establish population norms for what was essentially an untried system, two subjects were given intensive training for a period of six months. After alphabet familiarization they were rapidly brought up to the handling of familiar four-letter words. When words, as contrasted with scrambled letters, were introduced, tactile reading speed doubled quite abruptly and both subjects felt strongly that, with intensive practice in that direction, speed could have been tripled. But, though reading rate got up to 24 four-letter words per minute, chains of words could not be sustained beyond about six or seven. Blocking occurred whenever there was a failure to assemble the elements of a word in visual imagery. Simple diphthongs could not be handled as units; even the word "the" had to be visualized letter by letter and the letters put together. We saw the same difficulty in Biber's experiment.
It is clear that polytap points up the necessity, in developing tactile communication systems, of placing a high priority on central somesthetic processing capabilities, not just the skin's capacity to discriminate signals. Toward the end of the six-month training period (and with something of a view to rallying flagging interest!) the polytap subjects were given less monotonous materials to read. The first part of Madame Bovary was edited to restrict word and sentence length but, of course, to retain inherent associative values and redundancy. Ceiling performance proved to be at the rate of 28 five-letter words per minute, but it simply could not be held at that speed for long; rate increases from daily practice and familiarity with context were nullified by failures of synthesis and consequent blocking.
It is not only contactor impacts and electrocutaneous substitutes for them that may be coded to form tactile languages. Another approach is to supply local pressures with the gentle gradients furnished by air puffs, as was done in the air-jet technique developed by Bliss and his colleagues (Bliss and Crane, 1969). This method proved to be a precursor to the Optacon, many features of the latter having been worked out with a matrix of 40 air jets (5X8). These, spaced on 1/4" centers (and subsequently packed together with only 1/16" spacing so as to fit the fingertip), emitted brief puffs of air under 3-psi pressure. Air flow was interrupted 200 times a second, and thus an essentially vibratory stimulus was delivered. At first, full block letters were used in the matrix display, but subsequently better performance was realized by simplifying these to a set of characters more nearly like those of the optohapt.
A novel development with the air-jet stimulator was the introduction of translatory movement in the display. Two kinds were tried, a rotation in small circles of the jet tips and a "moving-belt" (Times Square) motion across the display. Both were effective in that they improved letter recognition and thus increased reading rate. Highly motivated blind subjects were the beneficiaries of the training supplied by these experiments; one of them is reported to have achieved a reading speed of forty words a minute.
The air-jet display also served as the medium for the investigation of another important aspect of tactile communication (Bliss et al., 1966). This concerns the amount of information obtained in brief tactile exposures as contrasted with the relatively meager amounts reported in immediate memory span experiments. The 24 interjoint "pads" of the fingers of the two hands (thumbs excluded) were stimulated by from 2 to 12 simultaneous air puffs having 2.5 msec overall durations. The observer was set to note certain specific loci, and his time of report was controlled, as in comparable experiments in visual perception (Estes and Taylor, 1964). While the results showed the absolute inferiority of tactile to visual short-term memory, similar processing operations were judged to be in effect: subjects had much more information in their possession at the time of reporting, provided the report was not delayed more than about 0.8 sec beyond stimulus termination, than their relatively poor tactile immediate memory spans, measured conventionally, would seem to indicate. A model depicting tactile memory tasks was constructed. It calls for a sensory register having a duration of only a few seconds, which, however, has a storage perhaps 50 percent greater than the capacity of the short-term memory store and which decays exponentially with a time constant of 1.3 sec. Short-term capacity is held to differ widely in size for different people and to be limited chiefly by spatial, rather than temporal, resolution properties.
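The model's quantitative claims can be sketched numerically: a register whose capacity is 50 percent greater than the short-term store, decaying exponentially with a 1.3-sec time constant. The short-term capacity of four items below is an illustrative assumption; the model itself held that capacity to vary widely across individuals.

```python
# A numerical sketch of the tactile sensory-register model: capacity
# 1.5 x the short-term store, exponential decay with a 1.3-sec time
# constant. STM_CAPACITY is an assumed illustrative value.

import math

TIME_CONSTANT = 1.3              # seconds, from the model
STM_CAPACITY = 4.0               # illustrative; varies between people
REGISTER_CAPACITY = 1.5 * STM_CAPACITY

def register_contents(t):
    """Expected items still readable from the register t seconds after
    stimulus termination."""
    return REGISTER_CAPACITY * math.exp(-t / TIME_CONSTANT)

# At the roughly 0.8-sec report deadline noted in the text, a little
# over half of the register's initial contents remain available.
remaining = register_contents(0.8)
```

On these assumptions the register still holds more than the conventional span can report at 0.8 sec, which is the sense in which subjects "had much more information in their possession" than their measured spans implied.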
Our inventory of coded tactile mechanical devices would be incomplete without mention of the Visotactor (Smith and Mauch, 1968), developed at Mauch Laboratories, Dayton, Ohio, and designed as a reading aid for the blind. This instrument is not unique in principle, but it involves an interesting adaptation to the reading task. The Visotactor consists of a molded, handheld device that can be laid on a typed page and, guided by a so-called colineator, which insures that the instrument will be kept "on line," transforms typed or printed letters, by way of photocells, into patterns of tactual stimuli delivered through pins to the fingers; the resulting tactile complex is a lively one. We have encountered the principle, of course, in both the optohapt and the Optacon. It is the mobility and adaptability of the Visotactor that makes it different. It will accommodate type from 7 to 36 points in size and can be used by the blind to read unusual materials, such as envelopes, bank checks, and labels on jars, bottles, or cans, to mention a few. Other applications suggest themselves.
CODED CUTANEOUS LANGUAGES: ELECTROCUTANEOUS SYSTEMS
The coded systems based on electrocutaneous stimulation, of which there are several, should not detain us long. By and large, they have not met with much success. The difficulty seems always to be the same, the relatively unclear signal that fails in discreteness coupled with the excessive time required to process it, once received.
Continuous efforts, over a two-and-one-half-year period, to develop an analog of vibratese led to such poor results as to have discouraged further exploration in that direction (Foulke, 1968). Loci at all ten fingertips, two intensities, and two durations were keyed to 26 letters, several punctuation marks, and some common short word endings. Though subjects attained speeds of 20 words per minute or more, it was at the cost of long reaction times to signals, well over a second on the average, and, in most instances, the necessity of analyzing the composition of the signal into its dimensional constituents. Whether the perceptual difficulties were inherent in the use of electrical stimuli or whether the dimension of locus was overloaded in this system is not clear. Such failures of synthesis were not present in vibratese once the learning process had advanced to a high enough level.
The same laboratory that attempted electrocutaneous vibratese also tried out a kind of body braille (Alluisi et al., 1965), with three electrodes on each side of the body, and also an electric braille confined to six fingers (Foulke, 1968). The coding principle of keying only to locus seems to have won out in this competition, and there was subsequently developed a novel system in which the characters to be coded, instead of being drawn from the English alphabet, were those belonging to the Katakana Syllabary (Foulke and Sticht, 1966). These language symbols, together with Hiragana, the script form, total 48 basic characters which, along with two diacritics that bring the number to 73, are well known to the average literate Japanese and are used in teaching reading and writing to children. Employing single-, double-, and triple-locus patterns with exclusively fingertip electrocutaneous stimulation, experiments were carried out with eight Japanese subjects familiar with kana. Early learning was rapid, and combinations of syllables forming words were ultimately mastered so that connected prose could be received successfully. But, as with so many other attempts to synthesize a facile cutaneous language, final word rates were disappointingly low and error rates were high; this promising effort has to be judged as having led to generally unsatisfactory performance.
The path through the forest of tactile language symbols has been lengthy and somewhat tortuous. It has, at every turn, offered choice points between two major approaches, mechanical stimulation of the skin or creation of sensory signals by direct application of electrical stimuli. There can be little doubt that, on the whole, the mechanical techniques have offered better solutions than have the electrocutaneous ones.
But, as was indicated at the beginning of our journey, there is a third possibility, that of creating mechanical motions of some complexity at the skin surface by appeal to the electrostatic properties of tissues. Experience here is not extensive, but some things are known, and there has even been a proposal for a working system that might mediate sophisticated communication.
Early interest in electrostatic stimulation of the skin was displayed in Piéron's Paris laboratory by Chocholle (1948). Some of the essential conditions for successful vibration of the skin were worked out by him, including the requirement that the skin be absolutely dry if the condenser property is to emerge. Moore (1968) showed how thresholds vary from finger to hand to arm and how electrode area is an important variable in such experiments. Recently, there has been a somewhat more elaborate application of the electrostatic principle by Strong and Troxel (1970), who devised a matrix of closely packed metal pins, a 10 X 18 array 1.0" wide and 1.8" long, which may be explored conveniently by the finger. Each of the 180 pins, the flat distal ends of which constitute a condenser "plate," separately completes a circuit by way of finger and hand to a common electrode at the heel of the hand. Bipolar square-wave pulses, repeated at the rate of 200 pps, elicit what is described as a "texture" sensation whenever the finger is used as a probe to explore the array of pins. The "feel" is dependent to some extent on the amplitude of the pulses and partly on the pulse repetition rate. No "texture" is felt unless the finger is moving.
Attempts to assess the possible utility of the texture patterns have included measurement of the absolute threshold and its decrement with increasing electrode area, of the just-discriminable intensity increment (5-10 percent), and of the just-discriminable frequency change (40 percent, with 200 pps as a base), as well as experiments in which the internal spatial features of the matrix were varied. It appears that two points or two lines within the array can be discerned as two if there are as few as three intervening "dead" pins. A point can be discriminated from a line if only two blanks separate them. The measurements were relatively crude by ordinary psychophysical standards, but the result that pattern differences of this degree of discriminability could be found at all holds out a hope that the electrostatic method may one day come into its own.
Tracking and Monitoring
Is it possible to track a moving target with the aid of purely tactile signals? The answer is certainly positive, but the qualifications that have to surround it depend on what, precisely, is meant by "track," "moving," and "target." Specification of one or more of these usually comes when the question is put in the form, "How does tactile tracking compare with visual or auditory tracking?"
Anyone with a predilection for a particular answer could find support for it in the literature. Cutaneous tracking has been shown to be as good as, better than, and worse than visual tracking; it has been demonstrated to be on a par with, better than, and worse than auditory tracking. It all depends on the kind of situation selected for test, how strictly analogous the tasks are in any two modalities being compared, and what aspects of the tracking response (time on target, errors, etc.) are selected as indicators of proficiency. In some situations, tactual tracking is as good as visual but only so long as target speed is low; let the task demand more rapid following and vision promptly exerts its superiority (Howell, 1960).
In one experiment (Hofmann, 1968), subjects steered to avoid going off an erratic path while continuous error feedback was delivered either visually (two white lights, one to the left and one to the right), auditorily (white noise in one ear or the other), or tactually (a weak electrocutaneous pulse to one or the other side of the neck); an overall tracking efficiency score distinctly favored the auditory and tactile modes over the visual. In another experiment (Schori, 1970) in which fifteen subjects tracked by vision, another fifteen by hearing, and a third fifteen by feel, the three tasks being strictly analogous, there proved to be no reliable difference among the group scores. As a practical matter, one channel was as effective as any other. However, when a secondary task was set up, that of monitoring an extraneous signal that had to be responded to even as the tracking performance was continuing, there was an appreciable decrement in the primary task being carried out with tactile cues. Visual and auditory tracking performances were much less affected. The tactile channel apparently exacted much greater attentional effort than did the other two.
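Comparisons of this kind hinge on the proficiency measure chosen. As an illustration, two scores commonly used for a tracking run, time on target and root-mean-square error, can be computed from sampled error values; the sample data and function names below are invented.

```python
import math

# Two common proficiency scores for a tracking run, computed from
# sampled error values.  The run data are invented for illustration.

def time_on_target(errors, tolerance):
    """Fraction of samples in which the error stayed within the
    target tolerance band."""
    on = sum(1 for e in errors if abs(e) <= tolerance)
    return on / len(errors)

def rms_error(errors):
    """Root-mean-square tracking error over the run."""
    return math.sqrt(sum(e * e for e in errors) / len(errors))

run = [0.2, -0.1, 0.8, 0.05, -0.6, 0.0]
print(time_on_target(run, tolerance=0.5))  # 4 of 6 samples in band
print(round(rms_error(run), 3))
```

The two measures need not agree: a run can spend much time within tolerance yet accumulate a large RMS error from a few wide excursions, which is one reason the literature supports conflicting verdicts on modality superiority.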
The role of cutaneous perception in tracking behavior is best approached from the standpoint of cooperation rather than substitution. Except for the predicament in which neither sight nor hearing is available to provide the needed cues, the tactile contribution presumably should be that of supplement rather than surrogate. As a matter of fact, few experiments on tracking behavior have come even close to replicating the complexities of real-life situations. There are a great many modern tasks, both public and private, that put unusual demands on flexibility of attention, rapidity of reaction, soundness of judgment, and coordination of muscular response. As a case in point, let us consider the task of the aircraft pilot. He "tracks" a great deal, even though many of the data dictating an airplane's flight are nowadays processed automatically. The most important thing a pilot can do when flying an airplane is to ensure that he remains current with respect to air traffic conditions in both his near and remote environments. A survey of the circumstances surrounding mid-air collisions in military flying shows that four out of five of them occur in daylight and with normal visual ground contact in force. By and large, it is not material failure or poor visibility or even the fact that air speeds have increased tremendously in recent years that accounts for collisions, as one might suspect.
While this is not the place to attempt a full-scale analysis of the pilot's task, it should be noted that the visual demands on him are prodigious, literally surrounded as he is with flight and engineering instruments, many of which have to be consulted with some frequency. If sensory information other than the visual could relieve some of the monitoring tasks, allowing more freedom for search of the air space, it would be a great boon. Or, quite apart from freeing vision, if other sensory data providing much the same information that only the eyes are privy to could be supplied in a supplementary way, just to provide confirmation and reassurance, there would undoubtedly be a net gain on the side of reduction of observational fatigue and increase of overall efficiency. Tactile communication could be of real help in cutting down the complexities and contributing to reduction of stress.
Some of the tactile tracking experiments have clearly shown how. Durr (1961) demonstrated that a five-vibrator signaling system, arranged domino-fashion on the chest (and, later, with an increase of accuracy, more widely spread out on the body), could furnish a suitable bidimensional display that could give much the same directions as are now supplied by the ILS (instrument landing system, a visual blind-flying guide). Triggs et al. (1974) have proven the efficacy of both mechanical and electrocutaneous displays to yield basic "pitch-and-roll" data. Hill (1970) has shown the relative superiority of a "ripple" type of display (rapidly successive pressures spread out over the skin, analogous to some automobile turn signals), superior at least in the promptness with which the receiver of the message responds to it. And there are a number of simpler things, e.g., provision of a tactual signal to replace or amplify a visual one in warning situations, such as "change gas tanks" and "too much pressure on the brakes" in taxiing operations.
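A display of Durr's general kind can be sketched as a mapping from a signed two-dimensional course deviation to one of five vibrator sites. The quadrant assignment and deadband below are assumptions for illustration, not Durr's actual arrangement.

```python
# Hypothetical mapping from a signed two-dimensional course deviation
# (lateral, vertical) to one of five vibrator sites arranged
# domino-fashion: four corners plus a center "on course" signal.
# The deadband value and quadrant assignment are assumed, not Durr's.

def select_vibrator(lateral, vertical, deadband=0.1):
    if abs(lateral) <= deadband and abs(vertical) <= deadband:
        return "center"                      # on course
    side = "right" if lateral >= 0 else "left"
    level = "upper" if vertical >= 0 else "lower"
    return f"{level}-{side}"

print(select_vibrator(0.02, -0.03))  # center
print(select_vibrator(-0.4, 0.7))    # upper-left
```

Like the visual ILS needles, such a mapping gives the pilot direction of error at a glance, or rather at a touch; Durr's later finding that accuracy improved when the vibrators were spread more widely over the body suggests that site separation, not vibrator count, was the limiting factor.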
It must be obvious that the guidance of aircraft does not constitute the sole area of application of cutaneous tracking in the service of vehicular and mechanical tasks. Cutaneous signaling can be employed successfully wherever amounts, directions, rates, or even forms of relational information, such as are revealed by reference to coordinates, are to be transmitted (Geldard, 1960). This means that the potentialities of tactile communication have not even begun to be realized. There is much on the shelf to be put to use, but precious few are casting glances at the shelf.
REFERENCES
Alluisi, E. A.; Morgan, B. B.; and Hawkes, G. R., 1965. Masking of cutaneous sensations in multiple stimulus presentations. Perceptual and Motor Skills, 20:39-45.
Anderson, A. B., and Munson, W. A., 1951. Electrical excitation of nerves in the skin at audiofrequencies. J. Acoust. Soc. Amer., 23:155-59.
Békésy, G. von, 1955. Human skin perception of traveling waves similar to those on the cochlea. J. Acoust. Soc. Amer., 27:830-41.
Biber, K. W., 1961. Ein neues Verfahren zur Sprachkommunikation über die menschliche Haut. Thesis, Universität Erlangen, Germany.
Bledsoe, W., 1972. Braille: a success story. In: Evaluation of Sensory Aids for the Visually Handicapped. Washington, D.C.: National Academy of Sciences, pp.3-33.
Bliss, J. C., and Crane, H. D., 1969. Tactile perception. In: Research Bulletin of the American Foundation for the Blind, no. 19, L. L. Clark, ed., pp.205-30.
Bliss, J. C.; Crane, H. D.; Mansfield, P. K.; and Townsend, J. T., 1966. Information available in brief tactile presentations. Perception and Psychophysics, 1:273-83.
Bliss, J. C.; Katcher, M. H.; Rogers, C. H.; and Shepard, R. P., 1970. Optical-to-tactile image conversion for the blind. IEEE Trans. Man-Machine Systems, MMS-11:58-65.
Breiner, H. L., 1968. Versuch einer elektrokutanen Sprachvermittlung. Z. exper. angewandte Psychol., 15:1-48.
Cassirer, E., 1944. An Essay on Man. New Haven: Yale University Press. ix+237pp.
Chocholle, R., 1948. Emission d'ondes acoustiques audibles par l'épiderme sous l'action d'un courant électrique alternatif. C. r. Soc. Biol., Paris, 142:469-71.
Collins, C. C., 1970. Tactile television—mechanical and electrical image projection. IEEE Trans. Man-Machine Systems, MMS-11:65-71.
Craig, J. C., 1974. Pictorial and abstract cutaneous displays. In: Cutaneous Communication Systems and Devices, F. A. Geldard, ed. Austin, Texas: The Psychonomic Society, pp.78-83.
Dalgarno, G., 1680. Didascalocophus, or the deaf and dumb man's tutor. In: The Works of George Dalgarno. Reprint ed., Edinburgh: T. Constable, 1834, pp. 113-59.
Dalrymple, G. F., 1973. Development and demonstration of communication systems for the blind and the blind/deaf. In: Final Report of Project I4-P-550I6/1-03. Cambridge: Sensory Aids Evaluation and Development Center, M.I.T., pp. 17-29.
DuPont, M., 1907. Sur des courants alternatifs de périodes variées correspondant à des sons musicaux et dont les périodes sont dans les mêmes rapports que les sons; effets physiologiques de ces courants alternatifs musicaux rythmés. C. r. Acad. Sci., 144:336-37.
Dudley, H. W., 1936. The Vocoder. Bell Lab. Rec., 18:122-26.
Durr, L. B., 1961. The effect of error amplitude information on vibratory tracking. Master's thesis, University of Virginia.
Estes, W. K., and Taylor, H. A., 1964. A detection method and probabilistic models for assessing information processing from brief visual displays. Proc. Nat. Acad. Sci., 52:446-54.
Farrell, G., 1950. Avenues of communication. In: Blindness: Modern Approaches to the Unseen Environment, P. A. Zahl, ed. Princeton: Princeton University Press, pp.313-45.
Finkenzeller, P., 1973. Hypothese zur Schallcodierung des Innenohres. Habilitationsschrift, Erlangen, Germany. 99pp.
Foulke, E., 1968. Communication by electrical stimulation of the skin. In: Research Bulletin of the American Foundation for the Blind, no. 17, L. L. Clark, ed., pp.131-40.
Foulke, E., and Brodbeck, A. A., Jr., 1968. Transmission of Morse code by electrocutaneous stimulation. Psychol. Ree., 18:617-22.
Foulke, E., and Sticht, T. G., 1966. The transmission of the Katakana syllabary by electrical signals applied to the skin. Psychologia, 9:207-209.
Gault, R. H., 1926. Tactual interpretation of speech. Sci. Mo., 22:126-31.
Gault, R. H., and Crane, G. W., 1928. Tactual patterns from certain vowel qualities instrumentally communicated from a speaker to a subject's fingers. J. Gen. Psychol., 1:353-59.
Geldard, F. A., 1960. Some neglected possibilities of communication. Science, 131:1583-88.
Geldard, F. A., 1962. The language of the human skin. Proc. XIV Internat. Congr. Appl. Psychol., 5:26-39.
Geldard, F. A., 1966. Cutaneous coding of optical signals: the optohapt. Perception and Psychophysics, 1:377-81.
Geldard, F. A., 1968. Pattern perception by the skin. In: The Skin Senses, D. R. Kenshalo, ed. Springfield, Ill.: Thomas, pp.304-21.
Geldard, F. A., 1970. Vision, audition, and beyond. In: Contributions to Sensory Physiology, vol. 4., W. D. Neff, ed. New York: Academic Press, pp. 1-17.
Geldard, F. A., and Sherrick, C. E., 1968. Temporal and spatial patterning. Princeton Cutaneous Research Project, Report no. 12, pp.6-8.
Geldard, F. A., and Sherrick, C. E., 1970. Modified vibratese. Princeton Cutaneous Research Project, Report no. 16, pp.2-11.
Geldard, F. A., and Sherrick, C. E., 1972. Discriminability of finger patterns. Princeton Cutaneous Research Project, Report no. 20, pp.2-4.
Gilson, R. D., 1968. Some factors affecting the spatial discrimination of vibrotactile patterns. Perception and Psychophysics, 3:131-36.
Goff, G. D., 1967. Differential discrimination of frequency of cutaneous mechanical vibration. J. exper. Psychol., 74:294-99.
Guelke, R. W., and Huyssen, R. M. J., 1959. Development of apparatus for the analysis of sound by the sense of touch. J. Acoust. Soc. Amer., 31:799-809.
Henneman, R. H., 1952. Vision and audition as sensory channels for communication. Quart. J. Speech, 38:161-66.
Hill, J. W., 1970. A describing function analysis of tracking performance using two tactile displays. IEEE Trans. Man-Machine Systems, MMS-11:92-101.
Hofmann, M. A., 1968. A comparison of visual, auditory, and electrocutaneous displays in a compensatory tracking task. Ph.D. diss., University of South Dakota.
Howell, W. C., 1956. Training on a vibratory communication system. Master's thesis, University of Virginia.
Howell, W. C., 1960. On the potential of tactile displays: an interpretation of recent findings. In: Symposium on Cutaneous Sensitivity,USAMRL Report no. 424, G. R. Hawkes, ed., pp.103-13.
Katz, D., 1930. The vibratory sense and other lectures. Maine Bulletin, 32:90-104.
Keidel, W-D., 1968. Electrophysiology of vibratory perception. In: Contributions to Sensory Physiology, vol. 3, W. D. Neff, ed. New York: Academic Press, pp.1-79 (esp. pp.69-79).
Keidel, W-D., 1974. The cochlear model in skin stimulation. In: Cutaneous Communication Systems and Devices, F. A. Geldard, ed. Austin, Texas: The Psychonomic Society, pp.27-32.
Kirman, J. H., 1973. Tactile communication of speech. Psychol. Bull., 80:54-74.
Kringlebotn, M., 1968. Experiments with some visual and vibrotactile aids for the deaf. Amer. Ann. Deaf, 113:311-17.
Levine, L., et al., 1949-1951. "Felix" (Sensory replacement). Quart. Progress Reports, Research Lab. of Electronics, M.I.T. Cambridge: Massachusetts Institute of Technology.
Lindner, R., 1936. Physiologische Grundlagen zum elektrischen Sprachtasten und ihre Anwendung auf den Taubstummenunterricht. Z. Sinnesphysiol., 67:114-44.
Melen, R. D., and Meindl, J. D., 1971. Electrocutaneous stimulation in a reading aid for the blind. IEEE Trans. Bio-Med. Engin., BME-18:1-3.
Miller, G. A., 1956. The magical number seven, plus or minus two: some limitations on our capacity for processing information. Psychol. Rev., 63:81-99.
Moore, T. J., 1968. Vibratory stimulation of the skin by electrostatic field: effects of size of electrode and site of stimulation on thresholds. Amer. J. Psychol., 81:235-40.
Pickett, J. M., and Pickett, B. H., 1963. Communication of speech sounds by a tactual vocoder. J. Speech Hear. Res., 6:207-22.
Rollman, G. B., 1974. Electrocutaneous stimulation. In: Cutaneous Communication Systems and Devices, F. A. Geldard, ed. Austin, Texas: The Psychonomic Society, pp.38-51.
Rösler, G., 1957. Über die Vibrationsempfindung. Literaturdurchsicht und Untersuchungen im Tonfrequenzbereich. Z. exper. angewandte Psychol., 4:549-602.
Rousseau, J-J., 1762. Émile, ou de l'Éducation. Oeuvres de Jean-Jacques Rousseau, tome VII. Amsterdam: Neaulme.
Saunders, F. A., 1974. Electrocutaneous displays. In: Cutaneous Communication Systems and Devices, F. A. Geldard, ed. Austin, Texas: The Psychonomic Society, pp.20-26.
Schori, T. R., 1970. A comparison of visual, auditory, and cutaneous tracking displays. Ph.D. diss., University of South Dakota.
Sherrick, C. E., 1965. Simple electromechanical vibration transducer. Rev. Sci. Instr., 36:1893-94.
Sherrick, C. E., 1975. The art of tactile communication. Amer. Psychologist, 30:353-60.
Smith, G. C., and Mauch, H. A., 1968. The development of a reading machine for the blind. Summary Report, VA Prosthetic and Sensory Aids Service. iv+70pp.
Spector, P., 1954. Cutaneous communication systems utilizing mechanical vibration. Ph.D. diss., University of Virginia.
Starkiewicz, W., and Kuliszewski, T., 1963. Active energy radiating systems: the 80-channel Elektroftalm. In: Proc. Internat. Congr. Technol. Blindness, vol. I, L. L. Clark, ed. New York: American Foundation for the Blind, pp. 157-66.
Starkiewicz, W., and Kuliszewski, T., 1965. Progress report on the elektroftalm mobility aid. In: Proc. Rotterdam Mobility Research Conference, L. L. Clark, ed. New York: American Foundation for the Blind, pp.27-38.
Strong, R. M., and Troxel, D. E., 1970. An electrotactile display. IEEE Trans. Man-Machine Systems, MMS-11:72-79.
Triggs, T. J.; Levison, W. H.; and Sanneman, R., 1974. Some experience with flight-related electrocutaneous and vibrotactile displays. In: Cutaneous Communication Systems and Devices, F. A. Geldard, ed. Austin, Texas: The Psychonomic Society, pp.57-64.
Virginia Cutaneous Project, 1948-1962, 1962. Ann Arbor, Mich.: University Microfilms (OP 16, 352).
White, B. W.; Saunders, F. A.; Scadden, L.; Bach-y-Rita, P.; and Collins, C. C., 1970. Seeing with the skin. Perception and Psychophysics, 7:23-27.