Brain Braille

Creating a Passive Haptic Learning system to help patients with ALS, who can think but cannot interact with the outside world, learn the BrainBraille alphabet


Project Manager

UX Designer

Hardware Developer


Team of 4 students

Working Under GT Faculty: Thad Starner

January 2020 - May 2020


People with ALS maintain their cognitive abilities but lose the ability to control their muscles: they can think but not communicate. Brain-Computer Interfaces, or BCIs, offer these patients a way to communicate with the outside world by detecting their brain activity alone. Our project, BrainBraille, is one such communication system.

Here’s how it works:

  • A patient lies in a brain scanner, such as an fMRI machine.

  • When the patient forms the neurological intent to move a particular region of the body, the scanner identifies that region by analyzing the resulting brain activity.

  • BrainBraille monitors six regions of the body, which represent the six dots of a Braille cell.

  • By activating combinations of these muscle regions, a patient can spell out letters and entire words.
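The encoding above can be sketched in a few lines: each letter maps to its standard six-dot Braille cell, and each dot is assigned to one body region. Note that the region names below are hypothetical placeholders; the actual region assignment used by the project is not specified here.

```python
# Standard Braille dot numbers (dots 1-6) for a few letters.
BRAILLE_DOTS = {
    "a": {1},
    "b": {1, 2},
    "c": {1, 4},
    "d": {1, 4, 5},
    "e": {1, 5},
}

# Hypothetical dot -> body-region assignment (placeholder names).
DOT_TO_REGION = {
    1: "left_hand",
    2: "left_arm",
    3: "left_leg",
    4: "right_hand",
    5: "right_arm",
    6: "right_leg",
}

def letter_to_regions(letter):
    """Return the body regions a patient would activate to signal `letter`."""
    return sorted(DOT_TO_REGION[d] for d in BRAILLE_DOTS[letter])
```

For example, signaling "b" would mean activating the two regions assigned to dots 1 and 2.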


Our contribution to BrainBraille is a Passive Haptic Learning (PHL) interface that makes it easier for patients to learn the BrainBraille alphabet. Our objective is to build an array of wearable vibration motors that attach to the six BrainBraille muscle regions and play a synchronized pattern of vibrations for each letter. The hope is that the PHL interface will let patients learn the BrainBraille alphabet through muscle memory alone, reducing both the cognitive effort and the length of the learning process.
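One way to picture the PHL trainer is as a scheduler: for each letter in a word, pulse every motor in that letter's dot pattern simultaneously, then pause before the next letter. This is only a sketch under assumed timings (the pulse and gap durations are illustrative, not the project's actual values), and the hardware driver is left out entirely.

```python
PULSE_MS = 400   # how long the motors vibrate per letter (assumed value)
GAP_MS = 600     # silent gap between letters (assumed value)

def schedule_word(word, letter_dots):
    """Build a timeline of (start_ms, duration_ms, dots) vibration events.

    `letter_dots` maps each letter to the set of Braille dots (1-6),
    i.e. the motors, that should pulse together for that letter.
    """
    events, t = [], 0
    for letter in word:
        events.append((t, PULSE_MS, sorted(letter_dots[letter])))
        t += PULSE_MS + GAP_MS
    return events
```

A motor driver would then walk this timeline, switching each listed motor on at `start_ms` and off `duration_ms` later.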

Thank you! :D

©2020 by Shelby Reilly