Deafblind Communities May Be Creating a New Language of Touch | New Yorker | May 12 2022 | Andrew Leland

My lovely friend Jo continues to send me excellent articles on Protactile. This is great. It is copied wholesale from https://www.newyorker.com/culture/annals-of-inquiry/deafblind-communities-may-be-creating-a-new-language-of-touch because I don’t understand how or when their paywall operates, I make no money from this blog (zero views!), it’s two years old, and it’s important.


When John Lee Clark was five years old, in 1983, he entered a small Deaf program within a public school near his home in Eden Prairie, Minnesota. Clark was a dreamy kid who dressed in tucked-in button-downs and pressed slacks. He came from a large Deaf family—his father and brother are DeafBlind, his mother and sister are Deaf and sighted—and the family had communicated in American Sign Language (or A.S.L.) for generations. On Clark’s first day of kindergarten, his mother, worried, followed his school bus in her car. When she surprised him at school to ask if he was O.K., Clark said that he was fine but that the bus driver had forgotten how to speak. His mother laughed and reminded him that the driver didn’t know how to speak: she was hearing! “This is a common story among Deaf families,” Clark told me recently. “The gradual dawning that all those mutes could actually talk with one another, but in a very different way.”

In third grade, Clark began a bilingual Deaf program. Instruction was in A.S.L., but students were grouped on the basis of their ability to read English, a second language that Clark accessed only in print. “My literacy was abysmal,” he said. He still has a workbook from that time, in which he answered questions—“What is your favorite sport?” “Who are the members of your family?”—with drawings instead of in English. But he was gifted in A.S.L., and teachers would ask him for help with tricky words. He sometimes pranked them by inventing ostentatiously elaborate versions. The word “heaven” is difficult for A.S.L. learners, involving a precise looping of the hands; Clark added several gratuitous loops.

At twelve, Clark began attending a residential Deaf school, many of whose students came from Deaf families. But, around this time, he began to go blind. Hundreds of thousands of people in the U.S. have some combined hearing and vision loss, but most are older adults and have spent the bulk of their lives hearing and sighted. A much smaller group—about ten thousand, according to some estimates—become DeafBlind earlier in life; a leading genetic cause is Usher syndrome. Clark, his father, and his brother have Usher, which can cause a person to be born deaf and to gradually go blind. At fourteen, Clark started to lose track of A.S.L. conversations. “I was this boy who always said, ‘Say again?,’ who might collide into you,” Clark told me. “So pathetic.” He began reading in Braille, which his father had encouraged him to learn as a child, and started walking with a white cane.

In high school, Clark stopped trying to follow A.S.L. visually and began using tactile reception, feeling words with his hands. This helped, but miscommunication was common. A.S.L. is a fundamentally visual language. The dominant-hand gestures for the words “stamp” and “fun,” for instance, look very similar, except that “stamp” begins near the mouth, whereas “fun” starts at the nose. Yes-or-no questions are signified with raised eyebrows, and sentences can be negated with a shake of the head. When Clark would reply in A.S.L., he’d have no idea how the person was responding, or whether she was still paying attention at all; he said that it was like “talking to a wall.” He attended Gallaudet, a Deaf university in Washington, D.C., with his future partner, Adrean, a sighted-Deaf artist. “It was really when I got married that I noticed more serious problems,” he told me. He would come home from the store without the items that Adrean had requested, and misunderstood the timing of their appointments: “It’d blow up on me, how that information in ASL had failed to register.”

On September 11, 2001, Clark went to a literature class at the University of Minnesota, where he was working toward his bachelor’s degree. When he arrived, his interpreters made the hand shape for “airplane,” and ran it into a raised finger twice. Clark interpreted this as “an airplane hitting two poles,” and assumed that he was hearing about a local-news story—perhaps about a hobbyist in a prop plane hitting telephone wires. It wasn’t until he got home that he learned how much he must have missed: the tear-streaked faces, the TV footage running on loop. (I heard remarkably similar stories about 9/11 and other cataclysmic news events from several DeafBlind people.)

In 2013, Clark attended a training, in Minneapolis, in Protactile, a new movement that was encouraging DeafBlind people to reject the stigma, in American culture, against touch, which often leaves them cut off from the world around them. According to Protactile’s principles, rather than waiting for an interpreter to tell her about the apples available at the grocery store, a DeafBlind person should plunge her hands into the produce bins. If a sighted friend pulls out her phone in the middle of a conversation to check a weather alert, she should bring her DeafBlind interlocutor’s hand to her pocket as well, to understand where the weather forecast is coming from.

Protactile includes a set of practices to make tactile communication more legible. One of its creators, a DeafBlind woman named Jelica Nuccio, showed Clark how it worked. They sat facing each other, their legs touching, and Nuccio rested Clark’s hand on her knee, explaining that, as she spoke, he should tap to indicate that he understood, like nodding—a practice called back-channelling. Nuccio articulated words into Clark’s hand, but also directly onto his arms, back, chest, and lower thighs. In A.S.L., pronouns are articulated as points in space; you might designate Minneapolis as a spot in the air near your left shoulder, and Seattle as a spot near your right, and then those gestures stand in for the cities. Nuccio showed Clark how to indicate them as points on the body instead: a two-fingered press on each shoulder.

“It didn’t feel like a lightning-bolt moment,” Clark told me. “It was all too natural.” But after the training he noticed changes in his household. He and Adrean began using a Protactile principle called co-presence: if she came into a room, she would brush him to let him know that she was there. Before, they’d sat around the table, and whoever sat next to Clark interpreted what the rest of the family said. Afterward, they began eating in informal clusters, allowing for tactile group conversations.

In the years since, Protactile has spread across the country. Today, most DeafBlind adults have heard of Protactile’s call to place touch at the center of their lives. Clark, who has become a leader in the movement, compared it to the Deaf Pride movement of the nineteen-eighties, when more Deaf people began speaking A.S.L. in public, insisting that hearing people gesture back. A few hundred people use Protactile’s communication practices daily—a very small group. Still, several linguists have come to believe that, among some of its frequent users, Protactile is developing into its own language, with words and grammatical structures that have diverged from those of A.S.L. “I am totally convinced that this is no tweak of A.S.L.,” Diane Brentari, one of the premier linguists of sign language, who teaches at the University of Chicago, told me. “This is a new language.” Clark believes that Protactile has the potential to upend centuries of DeafBlind isolation. “It’s an exciting time to be DeafBlind,” he has written. “The single most important development in DeafBlind history is in full swing.”

This past December, I met Clark in an old stone building that houses some of the University of Chicago’s linguistics labs. Clark is tall, with a youthful face. He lives with his partner and their three children, who are hearing and sighted, in St. Paul. He writes poems that are published regularly in Poetry magazine; he won a National Magazine Award, in 2020, for a piece on tactile art, and has both a poetry collection and a book of essays forthcoming from Norton. When we met, I was struck by the similarity between his presence in person and the way he comes across over e-mail; in both, he is affectionately didactic. I had assumed that he would speak through my interpreter, but he insisted on addressing me directly while she watched and translated, so that I could experience the feel of Protactile.

I have a condition called retinitis pigmentosa, the visual component of Usher syndrome, which is causing me to slowly go blind. (My hearing is unaffected.) As Clark and I faced each other, our white canes leaning in a corner of the room, he kneaded my shoulders, and instantly found my baseball cap, which I use as a sort of cane for my face—it saves me from slamming my head into open cabinet doors. “Lots of people with Usher syndrome and R.P. will use these kinds of caps,” Clark said. He playfully pulled it off my head and pressed it to my chest. “If you want to up your game in Protactile,” he said, “then what you’re going to need to do is get rid of that cap and get your hands busy.”

The title of Clark’s new poetry collection, “How to Communicate,” captures what has always been the central problem for DeafBlind people. DeafBlind children living in linguistic isolation can spontaneously develop home signs that their immediate families understand. Laura Bridgman, who lost her sight and hearing to scarlet fever in eighteen-thirties New Hampshire, had signs for “father” (her hand drawn across her cheeks, describing his whiskers) and “spinning wheel” (a rotating hand). But, without a wider community, home signs can’t grow into full languages. In 1837, the educator Samuel Gridley Howe recruited Bridgman to attend what would later be called Perkins, the first American school for the blind, in Massachusetts. Howe had previously visited Hartford’s American Asylum for the Deaf and Dumb, an incubator for what would soon emerge as American Sign Language, but dismissed signing as little more than pantomime. Instead, he and others at Perkins taught Bridgman to read and write English, using raised letters, which she quickly mastered. He relentlessly publicized this achievement, and Bridgman became an international celebrity.

Forty years later, Helen Keller’s mother read Charles Dickens’s account of meeting with Bridgman, and reached out to Perkins, which sent a recent graduate, Annie Sullivan, to educate Keller. Sullivan finger-spelled English words into Keller’s hands, hoping that she would slowly pick up the language, the way infants pick up spoken language. The story of Keller’s breakthrough, as her teacher placed her hand under a stream of water while finger-spelling W-A-T-E-R into the other, is a canonical scene in American history. There’s a bronze statue of Keller at the water pump in the U.S. Capitol, and the moment was immortalized in the 1962 film “The Miracle Worker.” The “miracle” is Sullivan’s feat of bringing language to a DeafBlind person—someone understood to be, as Howe described Bridgman, consigned to the “darkness and silence of the tomb.”

Clark has no patience for this sacred image of Keller’s DeafBlind epiphany. “There was already a word for water,” he said. Keller had developed dozens of home signs with her family before Sullivan arrived, for words such as “ice cream” (pretending to turn the crank of the freezer, then shivering) and “bread” (“I would imitate the acts of cutting the slices and buttering them,” Keller wrote). “What Helen learned to do was to perform a stunt,” Clark has written. “Annie was attempting the equivalent of forcing Helen Keller to utter a pentasyllabic word . . . whenever she wanted water. If you’re thirsty, say ‘ideology’ or ‘specification’ or ‘liability.’ ”

In Keller’s lifetime, other methods of DeafBlind communication arose. In public, Keller used the Tadoma method, in which she placed a thumb on the throat of her interlocutor and the rest of her fingers across that person’s lips and jaw—a kind of tactile lip-reading. There were several variations on an “alphabet glove,” printed with English letters so that a sighted person could tap out a message, but communicating letter by letter was cumbersome and slow. Today, some DeafBlind people communicate orally, many using hearing aids or cochlear implants, which usually offer only partial access to speech. Tactile sign language is also used, but issues with intelligibility remain. A 1995 study found that DeafBlind people understand as little as sixty per cent of a sentence conveyed through tactile sign language. Various systems have been devised to improve tactile communication. In the nineteen-nineties, Trine Næss, a DeafBlind Norwegian woman, standardized Haptics, a system of touch signals for common words: eleven for colors, eight for drinks. “For a time, some people were, like, ‘Do you support Haptics or P.T.?’ ” Clark told me. “But you really cannot compare the two—it’s not a Pepsi-vs.-Coke situation, but Pepsi vs. Cadillac.”

In 2005, Jelica Nuccio took over as the first DeafBlind director of Seattle’s DeafBlind Service Center (or D.B.S.C.), which offered social services to about a hundred people in the region. Thirty years earlier, a nonprofit called the Seattle Lighthouse for the Blind had established a program that employed DeafBlind people to do industrial work, and, in the decades since, the city had become a kind of DeafBlind mecca. Nuccio is fifty-seven, with long, dark hair and a bright laugh. “I was ready to move to Seattle and start a new chapter,” she said.

Nuccio had come to sign language late. As an adolescent, she attended a school in St. Louis that taught the oralist method, drilling Deaf students in gruelling exercises to learn how to read lips and produce speech. “The nuns said if you signed, you were stupid,” she said. “If you point at something in sign, you look like an animal.” She learned A.S.L. only in college, at the Rochester Institute of Technology, after her Deaf classmates mocked her speech by using a derogatory word for “oralism” in A.S.L., two horizontal forearms coming together like giant lips flapping: blah blah blah. In 1996, as Nuccio was becoming increasingly blind, she went to the Helen Keller National Center, a training facility on Long Island. But she felt that it was run like a prison. In the cafeteria, “they immediately started shovelling food at me,” she said. “They weren’t even communicating with me. I was in a feeding trough. I was, like, ‘I have degrees, people!’ ” (A spokesperson for the center told me that, for fifty-five years, “thousands of DeafBlind individuals have benefitted from HKNC’s programs” and added that its staff is “knowledgeable, helpful and kind.”)

Nuccio was disappointed by what she found at the D.B.S.C. Sighted employees and interpreters dominated life there. Blindness is enormously stigmatized in Deaf culture, and many of the DeafBlind people at the D.B.S.C., whom Nuccio called the “tunnel-vision people,” clung to their dwindling eyesight, continuing to use visual A.S.L. even as it grew difficult. The “tactile people” ate in a separate group at lunch and were treated with pity and condescension. DeafBlind people often use interpreters to interact with the hearing-sighted world, but, in Seattle, they used them in DeafBlind groups, too: each client would speak to her own interpreter, who would repeat the message to other interpreters, who would then relay it to their clients. “I didn’t understand why they would call Seattle ‘the DeafBlind mecca’ when it was run that way,” Nuccio said. “Yes, there are a lot of DeafBlind people here. But so what? Why is that a mecca?”

Nuccio hired aj granda, who is DeafBlind and had worked on and off for the D.B.S.C. The pair recalled that the interpreter program at a nearby community college had posted a sign on the wall that said “ASL Zone”: when you entered the room, you agreed to abide by the rules of Deaf space by “turning off” your voice. They decided to make the D.B.S.C. a DeafBlind-friendly zone, modelled on the same principle. But what were the rules of DeafBlind space?

The first rule that they established came to be called “air space is dead space.” DeafBlind people at the D.B.S.C. were continually left out of A.S.L. conversations among sighted people. Now, whether you were DeafBlind or not, all communication needed to happen in the realm of touch. Granda told me that their conversations with Nuccio had become so adapted to tactile reception that sighted friends could no longer follow them. To make tactile words even more expressive, the pair gradually expanded the canvas of touch to include the back, arms, lower thighs, and upper chest. Back-channelling emerged to capture what A.S.L. speakers communicate through facial expressions—a limp hand laid on the knee could signify exhaustion, and a tense grip might indicate terror. “Everything was kind of clunky, and everyone was awkward with how we were using each other’s bodies,” Nuccio said. “ASL was in the mix, and it was a mess. It was a great, messy start.”

Nuccio and granda called their method Protactile, and, within a few years, they were holding trainings. But, for the most part, the sight-reliant people were set in their ways. They often arrived, found a chair, and sat down, waiting for their interpreters. “I said, ‘If you need to know where anything is, you can ask a DeafBlind person,’ ” Nuccio said. She would take their hands, and together they’d touch the drinks and the snacks. Nuccio and granda encountered tremendous resistance among employees and clients at the D.B.S.C. “DeafBlind people are oppressed by Deaf people in the Deaf community,” granda said. “People who are oppressed tend to oppress others.” Nuccio ended up firing much of her staff, including many of her friends. But she and granda believed that they were developing a new political framework to achieve DeafBlind autonomy.

The pair hadn’t set out to alter the linguistics of A.S.L., but, as DeafBlind people in Seattle took Protactile’s methods home, words began to change in their hands. Granda said, “They realized ASL was no longer their language.” The A.S.L. word “yes,” for instance, is a fist bobbing in space, like a nodding head. But by touch it felt wrong. “We knew what it meant because we knew the ASL word, but it was weird,” Clark told me. “A head rubbing itself against a wall? It did not make natural sense in contact space.” The A.S.L. word “no,” a two-fingered pinch, was similarly off-putting. “It felt like an ostrich trying to pluck some hair off your head,” Clark said. “We never had a meeting to invent any new words. Life went on, and we had to say yes and no a thousand times every day!” In time, the community replaced these A.S.L. words with words that felt more tactilely intuitive: “yes” became an affirmative patting, and “no” felt like a hand swiftly erasing a message from a whiteboard. “Those P.T. words are so simple, duh-worthy, so elegant. And they have absolutely no relation to ASL or the ASL words ‘yes’ and ‘no,’ ” Clark said. “Not a shred in common.”

The A.S.L. word “vehicle” is made with a hand turned on its side so that the thumb is like a driver piloting a craft through the air. By touch, all you can feel is a pinky grazing your leg. Over time, the Protactile word became a flat palm driving across the lower thigh. And speakers developed ways of elaborating on these new words. “Instead of describing the size of a vehicle in terms of how big it looks,” Nuccio and granda wrote, the tactile word can describe a vehicle “in terms of how heavy it is, or how much friction it generates on the road”—the features more relevant to touch. To signify a large vehicle, the speaker presses a flat palm down hard on the receiver’s leg. For a compact car, she’d use a lighter touch.

Some worried that Protactile’s intense tactile immersion could feel inappropriate, including to DeafBlind survivors of sexual or domestic violence, an objection that its creators have had to grapple with. Granda has taught Protactile to numerous DeafBlind people whose prior traumas made them resistant to touch. “We care about survivors and want to make sure that those people feel safe,” granda said. But they argued that anyone can feel comfortable and safe in Protactile. “There is a natural form of appropriate consent built into the language that, with constant conversations, actually can bring about healing,” they said.

By the mid-twenty-tens, Protactile had evolved from a set of communication practices into a national movement. Granda and Nuccio made Braille bumper stickers, released videos, and travelled the country giving workshops and hosting “P.T. happy hours,” where locals could learn the basics. Nuccio and granda eventually drifted apart, and granda has spent time working at the Seattle Lighthouse and teaching Protactile at Seabeck, an annual DeafBlind retreat near the city. In 2014, Nuccio established an organization dedicated to Protactile training called Tactile Communications. Around the same time, Clark joined the Protactile movement, and has led trainings that have reached hundreds of people.

This past December, a half-dozen of Protactile’s most fluent speakers met up at the University of Chicago. They had come at the invitation of Terra Edwards, a linguistic anthropologist who is studying Protactile with her colleague Brentari, the sign-language linguist. By visual standards, the lab had a drab, provisional air: it was empty aside from a haphazard scattering of metal folding chairs and a table pushed against the wall. But, in DeafBlind space, this was a comfortable arrangement, ideal for generating ad-hoc clusters of tactile conversations, with no armrests or conference tables to separate people’s bodies. Nearly all of the DeafBlind people were in stocking feet. “With shoes, everything feels the same,” Hayley Broadway, who had flown in from Austin, said. “I don’t feel the ground. I can’t feel if it’s dirty or if it’s rough.” Earlier that year, Broadway had married her husband, who is also DeafBlind, in a Protactile ceremony. They walked down the aisle in an intertwined cluster of friends. For the exchange of vows, the officiant spoke in Protactile to both Broadway and her husband, forming a three-way conversation. Everyone at the wedding was barefoot, and the couple served sushi. “We just wanted finger food,” she said, “something you can eat with one hand while you could stay in communication with the other.”

Clark walked into the room wearing a burgundy shirt. He had a co-navigator with him, who joined him in interactions with the hearing-sighted world of airline attendants, cabdrivers, and cashiers. But the co-navigator trailed behind as Clark strode into the room, reaching out to explore his environment. He found Nuccio, spoke his Protactile name onto her back—two quick downward strokes—and they hugged. My interpreter put her hands on their backs, signalling her presence. This was, I realized, what it meant to be communicating in contact space: I was sitting a few feet away, but my observation was covert; it was only when I laid my hands on the group that I was actually present with them.

Clark now speaks to his partner and children in Protactile. Jaz Herbers, who retired after fifteen years working in I.T. because of his changing vision, saw early videos explaining Protactile in 2013. “I was, like, ‘That’s it!’ ” he told me. “That’s the answer to my life now.” Today, he leads Protactile trainings around the country. Rhonda Voight-Campbell, a forty-nine-year-old instructor at the Rochester Institute of Technology, attended a residential training in Protactile after she became increasingly blind, and felt the full possibilities of conversation return. “I ate at the dinner table with several DeafBlind peers in the dark,” she said. “Hands and feet patting, groping, and stomping.” Oscar Chacon, who works part time in Edwards’s lab, told me that it annoys him when hearing-sighted people, upon learning about Protactile, say they find it “inspiring.” “We’re human beings,” he said, “using language the way humans use language.”

Since the Protactile conversations that I observed all passed in a flutter of movements that I didn’t understand, Clark took a moment to demonstrate a word—“oppression.” He took two hands and pressed them down onto mine. I tried to repeat it back: Is this “oppression”? “P.T. isn’t a code where two hands pressing down equals oppression. It could also be something like this,” he said, and dragged his hand slowly down my arm. “This person is oppressed,” he said, and gripped my chest, crushing something invisible there. He cycled through a range of other movements. My interpreter became uncharacteristically overwhelmed. “I can’t think of enough English words to equal what he’s giving you,” she said. At first, I interpreted Clark’s demonstration as suggesting that Protactile lacked precision. But each variety of oppression that Clark had shown me—which my interpreter scrambled to translate as “repression,” “suppression,” and so on—intuitively connoted “oppression”: they were all forms of dragging, weighting, gripping. It’s just that they had no direct correspondence with English.

Unlike with spoken language, which can be transcribed or taped, or visual sign language, which can be filmed, there is still no way to make a tactile recording. This means that the only way to communicate in Protactile is in person. At one point, graduate students demonstrated new devices that could send taps and presses from a distance—a kind of primitive haptic FaceTime. But the DeafBlind group was unimpressed by the technology, which could transmit only slow, single taps on a limited patch of the body, and had none of the rich array of squeezes and presses that Protactile deploys. Today, many DeafBlind people stay in touch using a Braille display, which has dots that pop up and down to render text from a computer or phone. (I’m learning to use one, too.) Navigating cluttered Web pages can be nightmarish in Braille, but the DeafBlind world thrives in the plain-text realm of e-mail Listservs. In lieu of “LOL,” Protactile e-mailers type “LOY,” for “Laughing on You,” invoking the Protactile mode of laughter, a spidery tickle. Clark teaches college-level seminars entirely by e-mail. He once wrote, “Before PT came along, I had my most fun, found the most joy, experienced life the most on listservs.”

Clark has considered applying for teaching positions at universities, but told me that he wishes that they hired “environments”—groups of DeafBlind colleagues following the rules of contact space—rather than individuals. In Chicago, I noticed that the DeafBlind people carried their Protactile conversation with them like a miniature weather system as they made their way through the campus. They remained in contact with one another and explored their environment, touching walls, trees, and the raised letters on signs, sharing their impressions. At lunch, they occupied a large communal table at a café on campus. Clark felt his way to one side and ended up with his hands on the back of a hearing-sighted woman at another table. She tapped back on the communal table, trying to signal where he should go, and then continued her conversation with her lunch partner. When Clark made it back to his seat, he announced, “I found two mutes!”

In 2006, just as the Protactile movement was beginning, Terra Edwards, then a graduate student, was at Seabeck, the annual retreat near Seattle. Outside, she saw a DeafBlind person forcefully correcting her interpreter. “This was highly abnormal,” Edwards said. “I could tell that was a shift in the authority structure.” But Edwards was also interested in the correction itself. The interpreter had pointed at something in the air, and the DeafBlind person, with some degree of “angst and irritation,” told her to instead draw a diagram on her palm. “People had pretty strong opinions about whether or not you were doing it right,” Edwards said. “To me, that suggested that there was some kind of system at play.”

Edwards (and, eventually, Brentari) spent the following years filming some of Protactile’s most fluent speakers telling stories and describing objects, and found an increasingly conventionalized system, with an emerging lexicon of its own, organized by new phonological rules. When Edwards shared these rules with DeafBlind people, they knew exactly what she meant, even if they’d never had a reason to spell it out, just as English speakers are able to follow complex grammatical rules without having any idea what an indefinite clause is. By 2014, Edwards believed that, among those who had immersed themselves in Protactile, the practice was evolving into its own language. Other linguists I spoke to agreed. Molly Flaherty, a developmental psychologist and sign-language linguist at Davidson College, told me, “How amazing is it that language is something that’s flexible enough to work in yet another modality?”

In the nineteen-fifties, the linguist Noam Chomsky identified what he came to call the “poverty of the stimulus,” the idea that language learners receive vanishingly few clues for how linguistic systems work. Ann Senghas, a cognitive scientist at Barnard, told me, “Someone gives you a pie, and you have to figure out how to make it.” Chomsky concluded that our brains are endowed from birth with aspects of grammar, allowing us to reproduce language without formal instruction. More recent theories hold that we are simply incredibly good at unconscious statistical analysis of linguistic patterns. Whatever the case, the human brain is a superb language-decoding machine.

In the absence of a shared language, people will create new ones. In the seventeenth century, French colonizers brought enslaved Africans to what would eventually be called Haiti. These Africans brought their languages—Igbo, Fongbe, Bantu, and many others—with them. As they communicated, their languages converged, drawing from the varieties of French that were spoken on the island, and incorporating elements of West African grammars. In the course of the eighteenth century, a new language, today known as Haitian Creole, or Kreyòl, emerged. Michel DeGraff, a linguist at M.I.T., told me, of early speakers of Haitian Creole, “They’re not sitting down and taking language classes. They’re learning and innovating on the go.”

Starting in the seventies, when several new schools for young Deaf children were established in Nicaragua, students arrived with their own sets of home signs. But, within a few years, their signs began to evolve. Senghas told me that the process was “like going through hundreds of years of language change in just a decade.” The word for “rice” began as a pinching motion, showing the grain’s size, followed by a flicking gesture that mimicked the process of removing stones from the rice before cooking it, and another demonstrating how it’s eaten. In the eighties, the word simplified to just the flicking motion, its most distinctive element. “It’s not even the most salient thing about rice,” Senghas said. But “a system survives because it’s learnable.”

Edwards and Brentari believe that Protactile is in the very early stages of such an evolution. They found that much of Protactile’s “archival lexicon” comes from A.S.L., but the rules governing how these words are articulated have changed significantly. Edwards and Brentari have studied gestures that make up Protactile words—the equivalent of phonological units like “puh,” “buh,” “shuh”—and catalogued them: you can trace, grip-wiggle, slap, and so on. There are rules for how these movements can be combined. A single-finger tap followed by a two-finger tap never happens in Protactile, as it can be difficult to distinguish the two, whereas a single-finger tap could easily be followed by a two-finger press. These rules emerged intuitively, without conscious codification. But when they’re broken it doesn’t feel right, just as if an English speaker tried to combine a “P” and a “B” sound without a vowel separating them.

The linguists also observed new words being created. In A.S.L., “king” is made by taking the manual alphabet’s “K” and sliding it down one’s chest, like a royal sash. But the “K” was hard to recognize by touch. A new version emerged during a Protactile training in 2018. Everyone noticed that Jaz Herbers, the former I.T. specialist, had particular preferences. For instance, he positioned himself closest to the air-conditioner on a hot day, and bought a king-size bag of M&M’s. Other DeafBlind people began jokingly giving him a tactile crown. By the end of the training, he’d received his Protactile name: a one-fingered circle followed by a downward movement that evoked the crown and its weight. A few years later, Edwards noticed a group of DeafBlind people talking about getting a fast-food lunch, using the A.S.L. word “burger” followed by Herbers’s P.T. name. “Soon, any time anyone said ‘king,’ that’s the word they were using,” Edwards said. If Protactile continues to spread, there’s a chance that future speakers will trace the etymology of “king” back to Herbers, much as English speakers today owe the name for a lunch of meat between slices of bread to John Montagu, the fourth Earl of Sandwich.

Edwards and Brentari found that Protactile was doing things that other languages couldn’t. Protactile is full of a kind of tactile onomatopoeia, in which a hand resembles the feel of the thing it’s describing. In what the linguists call “proprioceptive constructions,” the speaker recruits the receiver’s body to complete the word, say, by turning her hand into a tree (five fingers as branches) or a lollipop (fist as candy). At one point, I asked Nuccio where she was from, and she told me to make my hand into a fist, which represented the globe. “You and I are in America, over here,” she said, touching my first knuckle. “And this is the ocean.” She traced a finger to my wrist to find the country where she was born, Croatia. She accomplished all of this in a series of movements that Edwards said followed consistent grammatical rules. At another point, Nuccio described how difficult her life had been when she’d worked as a technician in a genetics lab as she went blind. She had me point my finger up, and told me that it was now the flame of the Bunsen burner that she’d used in her lab. She demonstrated how to adjust the flame on one of my knuckles, and how delicate the apparatus was. I was astonished by the precision of this tactile illustration, which felt, in the moment, more vivid than any verbal description could have.

Some linguists remain skeptical that Protactile has yet emerged as an independent language. “I think it’s fascinating what’s happening,” Wendy Sandler, a sign-language linguist at the University of Haifa, told me. “But I have a lot of questions about how it’s going to develop.” She said that many of the functions of A.S.L.—for example, the way that parts of sentences are separated spatially on the body—still hadn’t made it into Protactile’s system. “Most P.T. users in the U.S. already know A.S.L. very well and can mentally ‘fill in the gaps,’ ” Sandler said. “But these gaps do not yet seem to be filled in by P.T. itself.” Another linguist told me that she believed that Protactile is more like a dialect of A.S.L., similar to how there are many dialects of American English. There is no single test for whether a form of communication has emerged as a language, and the debate is ongoing. Senghas compared A.S.L.’s influence on Protactile to the presence of French or Latin in English. “It’s got its seeds in A.S.L. in many ways, but it’s a different language,” she said. “If there’s no English, there’s no Morse code. Whereas, with Protactile, if there’s no A.S.L., there’s still P.T.”

On their last night in Chicago, the Protactile group gathered at a local’s house for a party. I was one of a handful of hearing people there, and one of only a few who didn’t know Protactile. My interpreter wanted to visit with a friend, and as soon as she left the room I felt like Clark’s kindergarten bus driver: I’d forgotten how to speak. In the kitchen, the host was blasting bass-heavy eighties hits, and I felt the vibrations in my chest, which is how many Deaf people listen to music. Elsewhere, it was quieter. I sat on the couch in a room packed with a dozen people all engaged in a silent but lively conversation that I couldn’t understand.

The party was supposed to feature a tactile game called a P.T. Hat Slam. The game never materialized, but the host had cleared out a bedroom and laid out his extensive hat collection for the guests to admire. Clark offered to give me a tour, without my interpreter’s help. As he guided my hands over the hats, I thought that I detected the presence of language: notice how this hat can fold out; watch out for the spikes on the gladiator helmet. I had no way of knowing how much was Protactile and how much was just basic gestural communication; that line is still thinner in Protactile than in established languages. Later, Clark criticized my performance during his tactile hat tour. “You didn’t know how to really feel!” he told me. “Maybe you were looking at them with your eyes. You didn’t go beyond my hand to touch, to explore. That’s one skill that has to be taught.”

Protactile continues to grow. Nuccio and Clark recently received a two-million-dollar grant to expand a Protactile interpreter-training program. There are weeklong retreats on cruise ships and at Florida resorts (“Breezin’ P.T. Weekend”), and experiments in Protactile theatre: in 2018, a Gallaudet professor staged a Protactile version of “Romeo and Juliet.” A handful of Europeans have studied Protactile and taken its techniques back to France and the Netherlands. But some linguists wonder whether Protactile will ever fully develop into its own language. Protactile lacks a dense, in-person DeafBlind community, like the residential Deaf schools that incubated the development of A.S.L. I was surprised to learn that several active members of the Protactile movement live with Deaf spouses and children who resist using Protactile with them.

Most DeafBlind people in the U.S. who have encountered Protactile understand it as a broadly “pro-tactile” philosophy, but haven’t adopted it as a new language. George Stern, a writer in West Texas, told me, “In a lot of my activities, whether it’s ballet dancing or practicing salsa, or cooking, I incorporate touch.” When Stern had a hearing-blind girlfriend, he taught her a series of tactile signals—fingers walking across the back, for example—to coördinate passing each other in their narrow kitchen. But, to Stern, who usually uses hearing aids and communicates orally, the linguistic component of Protactile still feels rarefied and out of reach. “I’m glad that there are people developing P.T. as a language where they are,” he said. “But how is it going to function where I am? I don’t live in a DeafBlind community. I live in a primarily hearing-sighted world, in an American culture that’s generally averse to touch.” Chris Woodfill, the associate executive director of the Helen Keller National Center, told me that, though he focussed on learning tactile modes of communication as his own vision declined, many of his clients communicate orally, using hearing aids and other assistive listening devices. An increasing number have received cochlear implants as children—a practice that remains controversial in the Deaf community—and never learned visual sign language. As a result, he noted, the center doesn’t push tactile communication on its clients: “We lay out the menu, and it’s à la carte.”

Language development is most productive when it’s passed through a new generation, whose infant learners refashion a language as they learn it. But most people with Usher syndrome don’t become blind until early adulthood, so few would be children when they learned Protactile. Many young children who are DeafBlind have other disabilities, such as CHARGE syndrome or congenital rubella syndrome, which can cause cognitive delays that affect communication. And many have hearing-sighted parents who don’t know A.S.L. themselves, let alone Protactile. “They understand that the tactile world is important to a DeafBlind kid,” Deanna Gagne, a researcher who is studying language acquisition in DeafBlind children, told me. “They just don’t know how to implement it.” During the pandemic, Edwards, Brentari, and Gagne received an emergency grant from the National Science Foundation to introduce Protactile to DeafBlind children isolated in their homes. Nuccio played with one DeafBlind boy for about five months. At first, he was reluctant to have his hands touched, but, over time, communication improved: Nuccio ran a toy car up and down his arm, and then used the P.T. word “car” and made the same motion, trying to connect the word with the object. Later, the boy made the same word on her arm.

The richest Protactile environments are still the ones inhabited by the movement’s leaders. At Nuccio’s training center, which she calls P.T. House, visual A.S.L. is forbidden and her dogs respond only to tactile commands. And, in their apartment in St. Paul, Clark’s family has turned the archival A.S.L. vocabulary “way, way down,” to encourage invention. The result has been an efflorescence of new words. During his bedtime ritual with his children, Clark has forced himself to discard the A.S.L. phrases he grew up with, and to come up with Protactile ones instead. To say good night, he places his hands on a child’s shoulders, and brings them together in the center of the child’s chest. “I thought I was ‘gesturing,’ but somehow still conveying the sentiments,” he said. “My ASL mind hadn’t recognized those as actual words.” Recently, he told a Protactile Theory seminar that he conducts over e-mail about these new words. He acknowledged that they may forever remain home signs. But they might seep out into the community, as Clark converses with the hundreds of people he touches every year. “At any rate,” he concluded, “if you want to get rid of ASL words for ‘good night,’ ‘I love you,’ and ‘sweet dreams,’ I have drafts for you!”

Jim Cromwell
Will this ever stop?

I received this request today from a colleague:

Do you know of any research I could quote that evidences the need for native / strong levels of BSL if working with someone who is minimal language / language deprivation / mental health? I’m working with someone who is in hospital and the ex manager is telling people signing isn’t important, she can get by.
— BSL Interpreter

My reply is here for posterity because I do not doubt I will receive the same request again in no time:

This is just disgusting. It’s not for anybody’s manager (particularly ex!) to decide on their behalf what an individual’s linguistic needs might be.

It’s been a long time since I pointed at my chest and said PAY ATTENTION TO ME, but, as a qualified clinical psychologist with 35 years working in BSL with Deaf people, I can say with absolute certainty that:

A person’s requirement for grammatically correct language provision is INVERSELY proportional to their fluency.

That is, a highly fluent BSL user will have the neurocognitive development to be able to decode and perform all the additional processing to work out what sub-optimal communication is provided to them. A less-than-fluent BSL user needs language provision in which the lack of ambiguity is baked in. That is, language based upon grammatical structures that have evolved over thousands of years in vivo. And not, for example, Makaton, lip-reading, Franglais, or bloody Widgit symbols…. Where ambiguity inevitably surfaces in an interaction, the fluent signer/interpreter will be able to do the metalinguistic processing to ensure comprehension, because somebody with minimal language/language deprivation/impaired mental health will be relatively unable to do so.

Inevitably, because we are talking about the Deaf Community, I can’t find anything properly relevant to point at for this. You’d think it would be obvious though, and anyone wanting “proof” of this argument is just trying to knock the argument into the weeds. They should prove THEIR side. #russellsteapot

I’ve included the Convention on the Rights of Persons with Disabilities though cos it’s possibly helpful (and we ARE signed up to it cos the UK signed up prior to Brexit, and Brexit has no mechanism for unsigning as of yet). Deaf stuff is usefully highlighted by me in yellow.

Feel free to quote me and either Bowdlerize or add fruity language as you see fit.


Jim Cromwell
BSL Photography

I’ve been playing with BSL sign photography because for years I’ve really liked the shapes in space and time that some signs make. These are attempts to capture that. They largely fail, but I like them for what they are…

Jim Cromwell
Access to Work

I'm very much on the fringes of the Access to Work debacle because, being a debacle, I try not to get involved. Nevertheless, I'm putting this out there:

It is ridiculous to form a business contract with one person (the deaf client), however formally one chooses to make that, and then to have to chase ATW - with whom we have no contractual agreement - for payment. I'd be fascinated to know if this model occurs in any other profession.

I don't mind who contracts me to work, so long as that person, or the organisation formally represented by that person, remains responsible for adhering to the terms and conditions - my minimum standard for which being that, at some point, without further prompting, that person gives me the money they had agreed.

The injustice of ATW for interpreters, it seems to me, is that we really only have legal recourse to the deaf person who agreed the work, and few if any of us want to resort to threats of court when that agreement is not met. I believe we can't take DWP through small claims because it is a governmental department, and even if we could, we have a case that is scant at best because we have no legally binding contract with them.

I'm sure there is a reason for this - please enlighten me - but why can't the deaf person's employer be the interpreter provider who then contracts it out to interpreters? Our contracts are then with the person responsible for payment. ATW require a few quotes before agreeing providers with the deaf person, which MIGHT be problematic in this context, but I for one would be more than happy to quote exorbitant fees that would cover the pain and heartache of the usual ATW system and that would therefore be astronomically beyond ATW's ability to accept.

Jim Cromwell
Against Access | John Lee Clark

I stole this. I stole it from McSWEENEY’S 64. I’d say it has caused me to dig deep into what I do and how I go about doing it more than anything else I’ve read. It’s beautiful, enlightening, and important. I want to add my thoughts. I want to have things to say but… It’s all here. All I feel I can do is point to it and say

“Look! Look at this!”

John Lee Clark

An autographed game-used baseball—bearing personalized inscriptions by two players on the Minnesota Twins, Chuck Knoblauch and Hall of Famer Kirby Puckett—is the sole surviving physical evidence of a childhood consumed by sports. Although it’s not inaccurate to say I was born DeafBlind, since I have the progressive-blindness condition known as Usher syndrome, it’s often more helpful to say I was born Deaf and gradually became blind, growing into my DeafBlind identity. As a kid, I had tens of thousands of baseball cards that I would strain my eyes to read. My shrines to that time are all gone now, save for this one baseball. I never thought I would entertain the idea of giving it away, but here I am. Should I keep it? It doesn’t take up much space. But what does it mean to me now? It’s like a moon rock, a lonely object out of space and time.

Jim Fuller, a staff writer for the Minneapolis Star Tribune, was the one who, beaming, presented me with the baseball. He first came to our house to interview my father about his efforts to establish a bilingual Deaf charter school. Jim soon discovered that I was a sports fanatic. We got to scribbling notes back and forth about our beloved Twins. Jim said he would have a surprise for me the next time he stopped by.

Although he gave me the baseball in the summer of 1993, it evokes my happiest sports memory, which took place two years earlier, when the Twins played in the greatest World Series ever. When Kirby walloped it out of the park to take us into Game Seven, I could hardly breathe. Then it was John Smoltz and Jack Morris taking turns on the mound. In the bottom of the tenth, with Smoltz out of the game and Alejandro Pena in relief, a Twin stretched a bloop into a double. A few at-bats later, the bases were loaded, and when the next batter made contact, the first tore off. As Dan Gladden, a.k.a. Clinton Daniel Gladden III, a.k.a. the Dazzle Man, flew down the home stretch, the universe, the whole world, my very being rushed toward him. Nothing can do justice to the moment he leaped and landed on home plate except witnessing it with your own eyes. Any attempt to describe it is futile. Description can only serve a roundabout purpose.

It took me a long time to realize this. I continued to follow sports after I could no longer witness events with my eyes. It pleased me to believe I still had access through sports news, box scores, occasionally enlisting someone to sit next to me and relay games, and, above all, reading fine sportswriting. Wasn’t baseball synonymous with literature? It therefore baffled me when I found myself keeping up with sports less and less. I skipped the Super Bowl a few years after the television screen ceased to be legible, breaking a tradition going back as far as memory. I unsubscribed from ESPN: The Magazine even though the list of magazines available in hard copy Braille is short and precious. Then it was down to one sportswriter, Bill Simmons, a most diverting raconteur. I accepted that I now required good writing to maintain my interest in sports. But even the Sports Guy’s columns began to lose their charm after a while. What was going on?

At first, what I read or listened to live through an interpreter teemed with players I had worshipped with my own eyes. I knew their faces, their tics, the way they licked their upper lips or groaned or stared or gasped in horror or with joy. As they faded into retirement, there was less and less poetry in what I gathered, replaced by new and strange and meaningless names. Direct experience goes a long way. It meant that sports did resonate with me for years after my last eloquent encounter. But without direct experience, I learned I couldn’t access the same life.

Disability rights activists have long fought for access, most often in the form of basic and unobtrusive accommodations. Today, billions of dollars are poured into projects seeking to increase inclusion. It’s not that I don’t appreciate it when a restaurant has a Braille menu. A device attached to a streetlight post that vibrates when it’s safe to cross the street is huge for me. Programs created in the name of access make it possible for me to write this essay. Ramps, elevators, wide doorways, flashing lights, railings, benches, assistants, care workers, and myriad technologies make all the difference in the world. But the way those things are lobbied for, funded, designed, implemented, and used revolves around the assumption that there’s only one world and ignores realms of possibility nestled within those same modes.

The question I am asked most frequently by hearing and sighted people is “How can I make my [website, gallery exhibit, film, performance, concert, whatever] accessible to you?” Companies, schools, nonprofits, and state and federal agencies approach me and other DeafBlind people all the time, demanding, “How do we make it more accessible?”

Such a frenzy around access is suffocating. I want to tell them, Listen, I don’t care about your whatever. But the desperation on their breath holds me dumbfounded. The arrogance is astounding. Why is it always about them? Why is it about their including or not including us? Why is it never about us and whether or not we include them?

In my community, we are in the midst of a revolution. We have our first truly tactile language, called Protactile. We insist on doing everything our way, fumbling around, groping along, touching everything and everyone. We are messing with traditional spaces, rearranging them to suit us better, rather than the other way around. The Protactile movement is obsessed with direct experience. As Robert Sirvage, a DeafBlind architect, put it in a recent conversation, the question we begin with is not “How do we make it more accessible?” Instead, we start by asking, “What feels beautiful?” When hearing and sighted people join us, they pick up Protactile and learn how to work and socialize with us in our space. They often find themselves closing their eyes, either literally or by dimming their visual processing, because sight isn’t necessary. Bodies in contact become as normal to them as they are to us.

When the word access comes up, it usually refers to tools or avenues that complement the sensory experience people already enjoy. Captions for movies, TV shows, and videos are excellent examples. They are said to provide access for Deaf people, who, I need to stress, already have a relationship with the images flitting across the screen. When blind people ask for audio descriptions, this accommodation merely supplements what they already hear. For example, the audio description might helpfully note that “the King is waving his sword, his cloak billowing in the wind” when a king shouts, “Follow me, ye good knights!” But then there are the efforts to feed captions into Braille displays so DeafBlind people can have “access” to radio, TV, and film. This isn’t complementary access. It’s a replica, divorced entirely from the original. This is how we frequently find accessibility features—as sorry excuses for what occasioned them in the first place. Access itself is too often all we have, a dead end, leading nowhere: captions without images, lyrics without music, raised lines without color, labels without objects, descriptions without anchors.

In the United States, there are tens of thousands of American Sign Language interpreters, who are trained to facilitate communication in the most accurate and impartial manner possible. You could say they are human captions. The rigor with which they strive to translate between ASL and English, and between various cultural frames of reference, may be a wonderful way for sighted Deaf people to gain access—which is to say, complementary access—to many settings. But ASL interpreters are an atrocity for DeafBlind people, constantly inserting themselves between us and other people in order to facilitate conversations, but instead getting in the way of direct connections. This was one of the things that unwittingly helped give birth to Protactile in 2007.

Protactile took root only when a group of DeafBlind leaders in Seattle decided to conduct meetings and workshops without any interpreters. DeafBlind community members were shocked by how well those events went, with participants communicating directly and rotating from cluster to cluster. This success emboldened us to break many taboos related to touch, including touching one another’s bodies instead of just moving our hands in the air. A grammar soon developed to coordinate all that contact. A new language was born. It’s no accident that this explosion occurred when we took a break from the most prevalent manifestation of access in our midst: ASL interpreters.

My experiences on September 11, 2001, provide an illustration of why, before the Protactile era, it was so frustrating to work with interpreters. I went to my postcolonial literature class at the University of Minnesota without having read any news online earlier that day. I found my two ASL interpreters already there. They immediately asked me, “Did you hear about an airplane hitting two poles?”

I laughed. “No, but that’s funny. So today they’ll be talking about Rudyard Kipling’s Kim. How about we give Kipling this ASL name and Kim this ASL name? To distinguish between the author and the character? Good?”

A long pause.

I repeated, “Good?”

“Yes... that’s fine,” they said. They were acting strange. When the professor arrived, the energy was weird. He asked if everyone was all right. Did anyone have family in New York? Did anyone need to leave class?

It wasn’t until hours later that I read the news and understood what had happened. People must have been upset and crying. All the TV screens running the same footage over and over. And my interpreters had failed—miserably—to convey any of it to me in a meaningful way. Why? Because they were there only to “provide access,” primarily to the spoken English content of the class.

Fast-forward to one of the Protactile movement’s biggest achievements to date: creating a new kind of interpreter. In 2017, we launched the DeafBlind Interpreting National Training and Resource Center and began hosting week-long immersion trainings for interpreters, led by DeafBlind Protactile experts. When my colleagues and I started developing the program, we quickly realized that the point wasn’t just to help interpreters become fluent in Protactile. It had to facilitate a complete reinvention of their role. Instead of providing “accurate and objective information” in a way that unsuccessfully attempts to create a replica of how they’re experiencing the world, Protactile interpreters must be our informants, our partners, our accomplices. Typically, ASL interpreters are system-centered, leashed to a platform or classroom or meeting room or video call, jerked into action every time someone speaks in English. There’s usually a power imbalance, such as between a hearing teacher and a Deaf student, a hearing doctor and a Deaf patient, or a hearing boss and a Deaf employee. With this power imbalance in mind, we can understand why ASL interpreters often “belong” to the hearing party more than to the Deaf party. This is problematic for sighted Deaf people, but it is devastating for DeafBlind people. Protactile interpreters, by contrast, are consumer-centered, firmly aligned with us, following our lead as we figure out how to hack into situations. We recognize that there’s little value for us in most distantist spaces—that is to say, spaces where people are rarely touching but remain visible to one another. The question in working with an interpreter for us then becomes: What do we want to get out of it? What we want is never what hearing and sighted people plan or propose to do, because they never ask us, “What shall we do together? How do you want to do this?” They merely wish to include us.

A story to illustrate what has changed with this new role for interpreters: Early in the COVID-19 pandemic, a DeafBlind friend told me about her recent experience working with a Protactile interpreter I’d helped train. She had a doctor’s appointment, and the Protactile interpreter met her at the entrance to the building.

“Wow,” the interpreter said as they entered the waiting room, “everyone here is tense and talking about COVID. The TV over there: it’s on COVID. Do you want me to relay that, on the TV, or eavesdrop on what the doctor over there is saying to a cluster of people… something about masks?”

My friend dismissed it all with a sweep of her hand across the interpreter’s chest. “Not interested. So how was your trip to—”

“Yes, yes,” he interjected, “we can talk about my trip, but I just want to make sure. Do you know what COVID is?”

“I have no idea.”

“Whoa. Okay, okay, okay. Listen, COVID is earthshaking news.” He grasped her shoulders to mock-shake them for emphasis.

After he explained COVID-19, my friend was awed and now wanted to know what the TV was saying, and had many questions for her doctor.

Here, the Protactile interpreter operated as my friend’s partner, making subjective yet vital contributions. When he found her dismissal of COVID-19 odd, he pressed her on the topic. An ASL interpreter would never have done that, unless they allowed their instincts to overrule their training.

When I teach ASL interpreters that they must share their opinions and assessments, they always protest, “But I don’t want to influence the DeafBlind person!”

“If you’re worried about influencing us,” I reply, “you give yourself too much credit and us too little.”

Another thing ASL interpreters habitually do is describe the whole of things. Upon entering a room, for example, they stop and say, “This is a midsize room with a few tables, here, there, and over there. There are… let’s see, one, two, three, four, five, six, okay, six windows—”

Here I stop them. “Why are you telling me, telling me, telling me things? Your job isn’t to deliver this whole room to me on a silver platter. I don’t want the silver platter. I want to attack this room. I want to own it, just like how the sighted people here own it. Or, if the room isn’t worth owning, then I want to grab whatever I find worth stealing. C’mon, let’s start over. What we’ll do is start to touch things and people here, together, while we provide running commentaries and feedback to each other.”

Although I travel places and enter spaces alone all the time, interacting with the environment and people I encounter according to how things unfold, it’s often nice to have a sighted co-navigator, such as an interpreter. It may mean being able to approach someone who is not standing where I’d typically be exploring, along a wall or from landmark to landmark. If the person doesn’t know Protactile, the interpreter can translate my quick explanation of what I need them to do—put their hand on my hand and give me feedback with their other hand—and why I am using their upper chest or arm or leg to describe something with. I can establish that contact with strangers without an interpreter, but it may take a few false starts before they “get it.” They may forget to give me adequate feedback, so the interpreter will relay to me their reactions to our exchange.

Giving quick reads without getting bogged down in details is an important skill for a spy doing live reconnaissance. But ASL interpreters are at first inhibited by notions of neutrality and objectivity. They start by offering something like “Walking by over there is a tall, thin, light-skinned person with curly dark hair down to here, wearing a white tank top, blue jeans, and brown boots…” They’re pleased that they’ve avoided race and gender.

“No, no, no.” I brush my hands back and forth across their arm in vigorous negation. “That’s not the way to do it. The very same description could be applied to a gorgeous Latina in chic, expensive boots, who oozes money, or to a pasty, rangy white man, hair a mess, boots falling apart, maybe looking angry. Like, they’re the opposite of each other? Yet they share the same sanitized description.”

Because of their fear of bias, I discuss four implicit safety nets to help them feel better about uttering an assessment:

First, we’re not so fragile that saying something wrong will topple us. We know what we are doing. We—not they—are in charge of our missions. Responsibility lies with us, not with them.

Second, I tell them a story about the best interpreter I worked with before the Protactile era. He was a volunteer rather than a professional interpreter, and because of this, his commentary was so unvarnished that I picked up a ton through him. He also happened to be a racist and misogynistic Deaf man, but I was able to separate his bias from the information he gave me. I ask my interpreting students, “Are you an unabashed bigot? No? Then you have that much less to be worried about.” This interpreter wasn’t good at his job because he was bigoted; rather, he was good because he functioned as an open channel of information, and so everything in his brain was revealed, his bias along with it. “You don’t want his bigotry,” I tell my students, “but you want his talent for not thinking twice.”

Third, if they’re so terrified of letting slip their own opinions, I tell them, then they should consider what I call “collective subjectivity.” Suppose a hundred sighted people see someone sauntering into a room. In that Gladwellian blink of an eye, they all come to a hundred slightly different conclusions based on their own life experiences. An interpreter may happen to be a fashion maven and know the person’s expensive-looking boots are knockoffs, for example. But nevertheless, there will be certain cultural signifiers that are recognizable to the majority of those hundred people, however correct they may or may not be. The question is: What is it that is being broadcast to the collective? We don’t have time to listen to a long deposition, the thousand words that a picture is rumored to be worth, for us to reach a reasonable conclusion—if we can even reach such a conclusion, since ours is not a visual world. It’s so helpful to have an aide de camp to tell us whether someone is receptive to us or if our charm is being wasted.

Fourth, there’s the Gladwellian blink of an eye, and then there’s the Gladwellian—or Clarkian!—slide or pat or jiggle of the hand. By bumping into, sniffing, tapping, brushing past, we are gathering intelligence of our own. That’s why we shouldn’t stop while our interpreter attempts to construct a replica but should instead continue picking up important information that may confirm, contradict, or qualify what an interpreter contributes.

After nudging two hundred–plus ASL interpreters through the travails of rebirth as Protactile interpreters, I began to understand why people who work around access cling to the concept of accuracy. This commitment to accuracy, to perfect replication, is a commitment to the status quo. We are expected to leave it untouched, or, if it must be altered, then to do so as little as possible. Access, then, is akin to nonreciprocal assimilation, with its two possible outcomes: death by fitting in or death by failing to fit in. The Protactile movement is the latest pulling away from replication. In Deaf history, generations of hearing educators have tried to use sign language expressly to represent the dominant written language, first by finger spelling letter for letter and, after tiring of this, word for word. It was always shaped around the dominant language and never about what real sign language had to offer. One unintentionally hilarious attempt at accuracy was a system called Signing Exact English. In blind history, reading by touch started with raised lines that followed, exactly, the lines of printed letters. When the lines proved painfully slow to trace with one’s fingers, sighted educators grudgingly allowed for them to be more blocky and a bit easier to feel. Braille—as a different world, a world of dots as opposed to lines—was long from developing at that point, and hasn’t, in fact, been fully embraced as a medium in its own right even to this day, with so many people still concerned about representing print accurately. Of course, the problem isn’t accuracy, per se, but whose accuracy.

In recent years, there has been a rush on the internet to supply image descriptions and to call out those who don’t. This may be an example of community accountability at work, but it’s striking to observe that those doing the most fierce calling out or correcting are sighted people. Such efforts are largely self-defeating. I cannot count the times I’ve stopped reading a video transcript because it started with a dense word picture. Even if a description is short and well done, I often wish there were no description at all. Get to the point, already! How ironic that striving after access can actually create a barrier. When I pointed this out during one of my seminars, a participant made us all laugh by doing a parody: “Mary is wearing a green, blue, and red striped shirt; every fourth stripe also has a purple dot the size of a pea in it, and there are forty-seven stripes—”

“You’re killing me,” I said. “I can’t take any more of that!”

Now serious, she said it was clear to her that none of that stuff about Mary’s clothes mattered, at least if her clothes weren’t the point. What mattered most about the image was that Mary was holding her diploma and smiling. “But,” she wondered, “do I say, Mary has a huge smile on her face as she shows her diploma or Mary has an exuberant smile or showing her teeth in a smile and her eyes are crinkled at the edges?”

It’s simple. Mary has a huge smile on her face is the best one. It’s the don’t-second-guess-yourself option. My thinking around this issue is enriched by the philosopher Brian Massumi’s concept of “esqueness.” He exemplifies it by discussing a kid who plays a tiger:


One look at a tiger, however fleeting and incomplete, whether it be in the zoo or in a book or in a film or video, and presto! the child is tigerized… The perception itself is a vital gesture. The child immediately sets about, not imitating the tiger’s substantial form as he saw it, but rather giving it life, giving it more life. The child plays the tiger in situations in which the child has never seen a tiger. More than that, it plays the tiger in situations no tiger has ever seen, in which no earthly tiger has ever set paw.


Just as the child and an actual tiger are not one bit alike, the words Mary has a huge smile on her face have nothing in common with the picture of Mary holding her diploma. Yet the tiger announces something to the world, its essence, and a kid can become tiger-ized and be tiger-esque, their every act shouting, I am a tiger. The picture of Mary at her graduation is shouting something, and the words Mary has a huge smile on her face are also shouting something. It is at the level beyond each actuality, in the swirl that each stirs up, that the two meet.

We would do well to abandon the pretense that it’s possible to reproduce base things in realms other than those that gave birth to them. Instead, we can leave those things well alone where they belong, or, moved by possibilities, we can transgress, translate, and transform them. We can give foreign things new purposes, which may be slightly or extremely different from their original intent. Take the card game UNO, one of the games widely available in a Braille version. The standard cards have dots at the corner that say things like “Y5” for a yellow card with the number five. In practice, playing the game with the Brailled cards is painfully slow. If Protactile hadn’t given us permission to rip sighted norms into shreds, I would still be fingering those dots like a fool. The way to go is with textured shapes, as in our homemade version of UNO, called Textures and Shapes. In this Protactile version you feel the player ahead of you hesitate and make a joking gesture before depositing a velvet star. Now the attention shifts to you, with some hands feeling yours as you deliberate, while a couple of knees tauntingly jostle your knees. Should you unload your velvet square or your rayon star? But the transformation doesn’t end there. Ideally, there are up to four players, who can feel everything at all times if they want to follow the action, or can chatter in three-way Protactile while the fourth attends to their turn. With four players as the ideal limit, there are somewhat fewer textured shapes than there are UNO cards in a set, and there are further tweaks to the rules. And the “wild card” is a delightful eruption of fabrics! It’s a different game, and one that is naturally more inclusive than UNO could ever be. Our environment has endless potential for life. For centuries, however, much of our vitality was forbidden. We were forced to stick with the effects of the hearing and sighted world. Now, though, we are all in varying stages of flight.

Sighted and hearing people have always had a hard time accepting that we are happy for them. Why have they never been happy for us? They wish only to be happy for themselves through us. Part of the fear many of them feel when encountering DeafBlind people comes from the way we naturally decline so much of what they cherish. They seek relief from this anxiety by insisting that we take in their world. Then they ask us a rhetorical question: “It’s great, isn’t it, this world of ours?” This is the awful function of access: to make others happy at our expense. Until Protactile plunged us into the churning currents of being, we didn’t know what we were giving up by consuming access. And the sighted and hearing didn’t know what they were missing out on by not entering our world.

In April of 2020, I made a small but telling gesture. An online journal wanted a photo of me to go along with three poems it was publishing. I had long wanted to do something about this photo business, even if there were an image description to make it “accessible.” Since I don’t see author images, I’m not immersed in the conventions of that particular species of media. So why should I provide a headshot as if I knew what it conveyed and knew that it was what I wanted to convey? To my surprise, the magazine agreed to my request: No photo! Instead, a few words, a tactile description suggestive of what it’s like to touch me in person. I now tinker with it like I do with my bio. My current line goes something like this: “Short hair of feline softness. Warm and smooth hands. A scent of patchouli. Flutters betray his exhilaration.”

A few months later, I went aflutter when Terra Edwards, a dear hearing-sighted friend and a leading Protactile researcher, told me that she wanted, henceforward, to avoid images as much as possible in the materials we publish related to Protactile. Our research team set up a website called the Protactile Research Network. No pictures, icons, or graphics. Text only! Under “People,” where our bios and CVs reside, there is a “tactile impression” of each member of the team. I love Terra’s: “Strong hands. Heats up in conversation. Frequent and enthusiastic tapping likely.”

And here is Hayley Broadway, a DeafBlind researcher currently working with DeafBlind children on Protactile language acquisition: “Wears fashionable, textured attire. Wiggles her fingers on you when she is deep in thought. When you talk to her, you feel a steady stream of taps and squeezes. Sometimes, when she is really excited, she slaps you. Engage at your own risk.”

As always, Jelica Nuccio—my dearest friend, personal hero, and rock of the Protactile movement—has the last hug: “Her stories are smooth and come with the scent of lavender. She draws you in slowly and then grips. When she laughs on you, you can’t help but laugh too.”


John Lee Clark is a National Magazine Award–winning writer and a 2020–21 Disability Futures Fellow. His first collection of essays is Where I Stand: On the Signing Community and My DeafBlind Experience (Handtype Press, 2014), and he is at work on his second collection. He was a featured writer at the Deaf Way II International Cultural Arts Festival, and has won grants and fellowships from the Minnesota State Arts Board, VSA Minnesota, the Laurent Clerc Cultural Fund, Intermedia Arts Center, and The Loft Literary Center. He was a finalist for the 2016 Split This Rock Freedom Plow Award for Poetry and Activism. His work is included in the anthologies Beauty Is a Verb: The New Poetry of Disability, Deaf American Prose, St. Paul Almanac, and The Nodin Anthology of Poetry. He makes his home in St. Paul, Minnesota, with his family.

Jim Cromwell