
Brain Braille

Creating a Passive Haptic Learning system to facilitate BrainBraille learning for patients with ALS who can think but cannot interact with the outside world

THE CHALLENGE

People with ALS maintain their cognitive ability but lose the ability to control their muscles: they can think but not communicate. Brain-Computer Interfaces, or BCIs, offer these patients a way to communicate with the outside world by detecting their brain activity alone. Our project, BrainBraille, is one of these communication systems.

Here’s how it works:

  • A patient lies inside a brain scanner, such as an fMRI machine.

  • When the patient experiences the neurological intent to move a particular region of the body, the scanner can identify the relevant muscle region by analyzing brain patterns.

  • BrainBraille monitors six regions of the body, representing the six dots of the Braille alphabet.

  • By activating a combination of these muscle regions, a patient can spell out letters and entire words (a small sketch of this encoding follows the list).
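To make the encoding concrete, here is a minimal sketch (my own illustration, not code from the study): each letter is the set of Braille dots that are raised, and each dot is assigned to one of the six monitored body regions. The region names are placeholders, not the project's actual mapping; the dot patterns follow standard six-dot Braille.

```python
# Illustrative only: standard six-dot Braille patterns mapped onto six body
# regions. Region names below are placeholders, not BrainBraille's real mapping.
REGIONS = {
    1: "left hand",
    2: "left arm",
    3: "left leg",
    4: "right hand",
    5: "right arm",
    6: "right leg",
}

# Raised dots for a few letters in standard six-dot Braille.
BRAILLE_DOTS = {
    "a": {1},
    "b": {1, 2},
    "c": {1, 4},
    "l": {1, 2, 3},
    "s": {2, 3, 4},
}

def regions_for(letter: str) -> list[str]:
    """Body regions a patient would activate to 'type' the given letter."""
    return [REGIONS[dot] for dot in sorted(BRAILLE_DOTS[letter])]

for letter in "cab":
    print(letter, "->", ", ".join(regions_for(letter)))
```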

MY ROLE

Team of 4 students

Working with GT Faculty: Thad Starner

January 2020 - May 2020

Scope: UX Designer, Hardware Developer

THE GOAL

My contribution to BrainBraille is the development of a Passive Haptic Learning, or PHL, interface that makes it easier for patients to learn the BrainBraille alphabet.

 

My objective is to create an array of wearable vibration motors that attach to the six BrainBraille muscle regions and play a synchronized vibration pattern for each letter.
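As a rough sketch of what this looks like in software (assuming a hypothetical set_motor() driver and placeholder timings, not the project's actual firmware), each letter becomes one synchronized pulse across the motors for its Braille dots:

```python
import time

# Hypothetical driver for one tactor; a real build would toggle a vibration
# motor through GPIO pins or a motor-driver board instead of printing.
def set_motor(dot: int, on: bool) -> None:
    print(f"motor {dot}: {'on' if on else 'off'}")

# A few letters' dot patterns (standard six-dot Braille), for illustration.
PATTERNS = {"a": {1}, "b": {1, 2}, "c": {1, 4}}

def play_letter(letter: str, pulse_s: float = 0.5, gap_s: float = 0.3) -> None:
    """Pulse every motor in the letter's pattern at the same time, then pause."""
    dots = PATTERNS[letter]
    for dot in dots:
        set_motor(dot, True)   # the letter's motors switch on together
    time.sleep(pulse_s)        # hold the synchronized vibration
    for dot in dots:
        set_motor(dot, False)  # then switch off together
    time.sleep(gap_s)          # short silence before the next letter

def play_word(word: str) -> None:
    for letter in word:
        play_letter(letter)

play_word("cab")
```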

 

The PHL interface would allow patients to learn the BrainBraille alphabet through muscle memory alone, reducing both the cognitive effort and the length of the learning process.
