It’s been said the eyes are windows to the soul, and most of us can recognize much from looking into another’s eyes. We can sometimes communicate intricate thoughts and feelings through our eyes. Imagine, though, that they were your only window to the world; the only method by which you could communicate with anyone. Further, what if you were in a position where it was nearly impossible for you to initiate a conversation and, therefore, unless you had a way to get someone’s attention, you had to wait for others to anticipate your needs? Worse yet, once anticipated, that other person would have to use a method requiring them to initiate nearly every aspect of the conversation.
This is precisely the situation for thousands of people suffering from ALS (Amyotrophic Lateral Sclerosis, also known as Lou Gehrig’s disease), MS (Multiple Sclerosis), TBI (Traumatic Brain Injury), or any one of several other conditions that result in the inability to move and talk . . . or write . . . or use sign language. All they can do is acknowledge your greetings or answer your questions with a wink, perhaps a nod. How would you communicate?
Recently, I had the opportunity to meet someone in that position. His name is Ismail Tsieprati, and he is one of the longest-surviving sufferers of ALS, having had the disease for thirty years. I first met Ismail’s caregiver and wife, Cheryl, at a local Chamber of Commerce networking breakfast. She had recently left a long-time position in corporate training and was looking to establish a web-based business as a training consultant. She asked if I could help her promote her business through social media, and also suggested she help turn some presentations I had done into webinars. I agreed, and we worked together for a short while. As life would have it, before we could finish she was offered a full-time position in corporate training, and I was coming to the painful realization I had been chasing a pot devoid of gold.
We trekked off on our separate trails, as casual business relationships often do, but we remained in occasional contact, through Facebook and comments Cheryl would make in response to one of my sporadic blog posts. One of the several projects I had been working on was editing and proofreading a book – Age of Context – authored by Shel Israel, a longtime author and tech journalist, and Robert Scoble, an authority on bleeding-edge technology and startups.
Toward the end of the project, I was given the opportunity to invite a number of friends to read an early, limited release of the book. In exchange, I asked them to provide an honest review on Amazon.com. Since Cheryl had been one of the people to occasionally comment on my blog, I thought of asking her to read and review the book. She was an excellent prospect because she is not a techie and readily admits to being somewhat befuddled with technology.
It turned out to be a good choice, in more ways than one. Cheryl was one of the few people, of the many to whom I sent a link to the early release, who took the time to write a review—a thoughtful, useful review at that—not merely a quick stab at providing an obligatory yay or nay. She also was hit between the eyes with some of the stuff she read about. Particularly relevant to her was Google Glass. Shel and Robert had devoted an entire chapter to it, and there were numerous references as well throughout the book. I’ll let her explain, in this excerpt from a chapter of the book about their 30-year journey with ALS she and Ismail are working on:
“The greatest fear for people living with ALS is there will be a day when they become “locked in” – so completely paralyzed in the late stages of the disease that even the simplest of muscle movements have been stolen from them and they will be unable to communicate in any way with the outside world. Those who are “locked in” survive in complete isolation. They can see and hear everything around them, comprehend everything, think complex thoughts, experience joy, sorrow, and pain, and yearn to communicate with their loved ones and caregivers – to express their feelings, their discomfort, and their desires, but they are hopelessly trapped inside their bodies with no way of communicating with anyone. For Ismail and me, as well as for others with ALS and their families, overcoming communication barriers and fighting off the “locked in” syndrome is a lifelong battle; an absolutely essential one. After all, Ismail loves to talk, and he has a lot to say. He’s not going to let ALS or anything else shut him up!”
– Tears, Laughs, and Triumphs: A Thirty-Year Journey with ALS *
* Since this article was first posted, Ismail and Cheryl’s book has been published. Its final title is “One Blink at a Time,” and it can be purchased on Amazon.
What struck Cheryl was the knowledge that Glass could recognize eye movement. She was very excited about the possibility of making eyeglasses with eye gaze technology available to Ismail and people in similar situations. Current eye gaze augmentative communication technology involves much larger and more complex systems, which require careful placement and frequent adjustment. Cheryl wanted to talk about it, and I wanted to know more. We met for coffee and, after a bit of discussion, I told her I’d like to write about their situation. We decided I should meet Ismail to better understand his world and I was invited to their home.
I had mixed emotions about the upcoming meeting. I was to meet Ismail for the very first time, and I was somewhat nervous about the prospect. I had never met someone with ALS, and mostly knew about it through the story of baseball legend Lou Gehrig and one of my favorite theoretical physicists, Stephen Hawking.
Cheryl had invited me to lunch. After meeting Ismail and talking to him and Cheryl about this post, I sat down at the dining room table while Cheryl went into the kitchen briefly to dish up the meal and bring it out for us to enjoy. In doing so, she left me sitting alone with Ismail. The silence was deafening. I felt extremely uncomfortable, as I was struggling with how to approach this unique (for me) situation. I had never communicated in this fashion before and it was awkward.
I hadn’t learned the process Ismail and his caregivers use, which involves what they call a “spelling chart”. Ismail can initiate communication in only two limited ways. One is by grinding his teeth, which ensures getting Cheryl’s or his other caregiver’s attention, provided they’re within earshot. The other method is to modulate his breathing so as to trip the alarm on his ventilator. This works at a greater distance than tooth grinding, but likely is more stressful on Ismail.
Once Ismail has someone’s attention, he communicates by using his spelling chart. Everyone who communicates directly with Ismail has memorized this chart. Since the chart is in everyone’s head, there is nothing to carry around, set up, or adjust. It’s a quite simple and efficient system, but it’s time consuming and requires patience. This is how it works: The alphabet is divided up into six rows. Row number 1 is “A-B-C-D,” row number 2 is “E-F-G-H,” and so on. The person talking to Ismail calls out the number of each row of letters until Ismail blinks to select the number of the line containing the letter he wants to use. The person then calls out each letter in the selected row, until Ismail blinks again to select the letter he wants. The person then starts all over again, calling out numbers of rows, then letters, as Ismail builds words, then sentences, then paragraphs. If Ismail selects number 7, he’s telling the person it is the end of a word or a paragraph. If he selects number 8, he wants to give the person a number or a date. People can also ask him “yes” and “no” questions. One blink means “yes,” and two blinks mean “no.”
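The two-step scanning process above can be sketched in a few lines of code. This is only an illustration of the protocol as described, not any actual software Ismail uses; the exact layout of the last row is an assumption (I’ve put U through Z together so the alphabet fits in six rows).

```python
# Sketch of the spelling-chart protocol: the partner "calls out" row
# numbers, a blink selects a row, then letters in that row are called
# out until a second blink selects one. The chart layout below is an
# assumption based on the description (row 6 is longer, U-Z).

CHART = {
    1: "ABCD",
    2: "EFGH",
    3: "IJKL",
    4: "MNOP",
    5: "QRST",
    6: "UVWXYZ",
}

def spell_one_letter(blink_on_row, blink_on_letter):
    """Simulate selecting a single letter.

    blink_on_row(n)    -> True when the listener blinks at row n
    blink_on_letter(c) -> True when the listener blinks at letter c
    """
    for row_number, letters in CHART.items():   # call out 1, 2, 3...
        if blink_on_row(row_number):            # a blink stops the scan
            for letter in letters:              # call out E, F, G...
                if blink_on_letter(letter):
                    return letter
    return None

# Selecting "G": blink when row 2 is called, then when "G" is called.
print(spell_one_letter(lambda r: r == 2, lambda c: c == "G"))
```

Even in this toy form you can see why the system demands patience: every letter costs two linear scans, so a short sentence requires dozens of call-and-blink exchanges.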
There are other methods of communication available to people who cannot speak or use their hands. For example, years ago Ismail wrote a screenplay by using an infrared switch attached to his glasses that he operated by blinking his eye. The switch triggered the selection of letters and numbers on an alphabet grid displayed on a computer monitor. The design of the chart was similar to the one Ismail and his caregivers use today. Scanning technology highlighted letters one at a time to allow the user to select one. The program had word prediction, which made sentence-building faster, and a voice synthesizer that could speak words he typed or had programmed into the system. The software program and voice synthesizer were similar to those used by Stephen Hawking.
Since Ismail’s eye blink may eventually grow too weak to be a reliable method of communication, eye gaze is the technology that provides hope for the future for him and others like him. He has been practicing with an eye gaze system that is a communication device, speech generating device, Windows XP computer, and environmental control unit, all in one. The system allows Ismail to select letters and numbers by gazing at them for a programmable pre-set number of seconds. He can also select icons or images that trigger a voice synthesizer to speak words and sentences for him. The system also has the capability of accessing Windows and the World Wide Web. But Cheryl says that Ismail becomes tired after practicing with this system for only a few minutes and grows frustrated trying to navigate around its screens. Because it takes time to set up the equipment and reposition it properly every time he moves from one place to another, the bulky device is practical only when he’s sitting in one place for a while. Although it’s possible to attach the unit to his wheelchair with a special bracket, the machine still needs to be moved out of the way during transfers, then readjusted again.
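The “gaze at it for a pre-set number of seconds” mechanism is commonly called dwell selection, and its core logic is simple enough to sketch. Everything here is illustrative: the function name, the 2-second default, and the sampling rate are my assumptions, not details of Ismail’s actual device.

```python
# Minimal sketch of dwell-based selection: a target counts as
# "clicked" only after the gaze rests on it continuously for a
# configurable dwell time. The defaults (2.0 s dwell, 0.1 s between
# gaze samples) are illustrative assumptions.

def dwell_select(gaze_samples, dwell_seconds=2.0, sample_interval=0.1):
    """gaze_samples: sequence of target names, one per sample_interval.
    Returns the first target gazed at long enough, or None."""
    needed = round(dwell_seconds / sample_interval)  # consecutive samples
    current, count = None, 0
    for target in gaze_samples:
        if target == current:
            count += 1                 # gaze is holding on the same target
        else:
            current, count = target, 1 # gaze moved; restart the dwell timer
        if current is not None and count >= needed:
            return current
    return None

# A 2.0 s dwell at 0.1 s sampling means 20 consecutive samples on "A".
samples = ["B"] * 5 + ["A"] * 20 + ["C"] * 3
print(dwell_select(samples))
```

The dwell time is the crucial trade-off Cheryl describes: set it too short and stray glances trigger selections; set it too long and every choice becomes tiring, which is exactly the fatigue problem the current equipment presents.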
Enter Glass. What if it had the same total eye-gaze control capabilities that Ismail’s bulkier, less user-friendly equipment has? What if a complex communication, web-surfing, environmental control system could be worn in a pair of eyeglasses and operated by the movement and gaze of an eye? Since Glass contains a built-in display, there would no longer be a need for the much bulkier external devices that are currently used.
“My eye gaze equipment can be slow and tiring,” Ismail says. “It is difficult and time-consuming to use. I hope there will come a day when technology will improve for many people like me who are paralyzed but want to continue to talk to the world. I hope that day comes soon.”
The question now is not if, but when. It’s also a question of priorities, I suppose. I’m not sure if Glass has the capabilities Ismail and others require, but it surely won’t be long before they’re realized. The other question is, are there developers who have both the skills and the desire to create such an app? Also, Glass is not the only device that can provide the necessary functionality. In addition to wearables like Glass, there are companies working on interpreting brain waves. Emotiv is one of them, and their device, called Epoc, currently provides limited capability. Right now, they are concentrating on game playing and some forms of gross manipulation, but it shouldn’t be long before their system (and others like it) becomes more sophisticated. They also license an SDK for people who wish to write their own systems.
If you have any interest in this kind of thing, there are some golden opportunities out there. Perhaps there isn’t a fortune to be made, but the possibilities of helping tens of thousands of people live more productive and engaged lives despite severe disability are immense and will be, I have no doubt, enormously satisfying.