Brain-machine interfaces for people in Los Angeles with ALS

Status
Not open for further replies.

brentg

New member
Joined
Jul 6, 2023
Messages
2
Reason
Learn about ALS
Diagnosis
00/0000
Country
US
Hi, I'm an engineer at UCLA working to create non-invasive brain-machine interfaces as part of the Neural Engineering and Computation Lab. More details below, but the high-level idea is to create devices that let people with paralysis use computers and control robotic arms with only their thoughts!

If you are in the Los Angeles area, have ALS, and think you might be interested in learning more about the research study, please shoot me a DM! Or if you have family or friends with ALS who are interested, please have them reach out as well, or they can directly email the professor running the lab, Jonathan Kao, at [email protected]. Experiments will be done at UCLA, and there's also potential for our team to travel to you.

Our research uses EEG and EMG electrodes on the scalp and arms (the scalp electrodes work through hair - no head shaving; I've worn the EEG myself a lot) to pick up neural and attempted motor activity, and then we decode those signals to control a computer cursor or robotic arm. Important to note: we're doing a research study, not a clinical study, so there won't be any direct benefit to participating - instead, the point of the study is to help advance science so future devices can be created using this technology to directly help people with paralysis. Also, I can say from personal experience it's pretty dang cool to try to control a computer just by thinking about it.

Happy to answer any questions in comments or via DM - and here's the website for our lab if you want to poke around! Neural Engineering and Computation Lab
 

Attachments

  • bci study info.pdf
    105.6 KB · Views: 95
I emailed this to your professor. Hi, my name is Jim. I was diagnosed with ALS in 2015; I'm 54. I'm a former machinist in biomed. I now have a trach and can only move my head slightly. I'm in Torrance, 90501. I don't have the resources to leave my apartment. I'm still using head tracking, and my most-used device is my Android phone with EVA Facial Mouse Pro. I'm not looking forward to eye gaze because it's so unforgiving about device positioning. I'm interested in brain-to-computer interfaces. I have extensive home automation in place. I even wired up a controller for my recliner so I can move myself.
I'm not into social media. But I did make a YouTube video a few years ago.

https://www.youtube.com/watch?v=QV9hAXK4w4I

Lmk if I can help. Thank you for your research.
 
That is such a cool system, thanks for sharing! The system we're working on is designed to allow similar interactivity for as long as possible, but it's at an early stage. With some audio feedback added, we're hopeful there's no limit on how long it could work (i.e., it should keep working even after eye gaze is no longer easy or possible).

We can absolutely travel out to you, and we have been doing so with other ALS participants. Jonathan (the PI) mentioned your email as well, so I know he got it. Hope to meet you soon, and thank you again for reaching out and being open to helping with early-stage research that has big potential to help future generations!
 