
Computers Will Soon Read Your Mind

Technology will help patients suffering from ALS or strokes.


Originally published in The Wall Street Journal on December 14, 2023.

It’s been almost a century since psychiatrist Hans Berger made the first electroencephalogram, providing a glimpse into the electric nature of the human brain. EEG readings have helped countless people struggling to recover from ailments ranging from epilepsy and sleep disorders to head injuries and brain tumors. Technology has come a long way since then, and artificial intelligence may soon give us a new brain technology revolution, with advances in the treatment of ALS, strokes and other conditions.

As a teenager in a mentorship program, I decided to study the brain after watching a neurosurgeon implant an electrode deep into the brain of a patient with Parkinson’s whose tremors were making it impossible for her to hold a pen or drink from a cup. The surgeon implanted the electrode — designed to deliver the right amount of electricity to the exact part of the brain responsible for the tremors — and awoke the patient, her skull still open, to adjust the implant’s settings. A few turns of a dial and the shaking stopped. Her tremors were cured.

While the discovery of EEG signals was revolutionary, they can be noisy and difficult to interpret, requiring expensive equipment and controlled environments. With recent advances in sensor materials, we are approaching the point at which brain signals can be read throughout the day with comfortable, discreet wearable devices, much as a Fitbit or Apple Watch measures our heart rate. Advances in computing and AI mean we could interpret these brain signals in real time.
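To give a concrete sense of what "interpreting brain signals" involves at its simplest, here is a minimal Python sketch (not Arctop's software, and far simpler than any real decoding pipeline): it estimates power in the classic alpha (8-12 Hz) and beta (13-30 Hz) EEG frequency bands, a common first step in distinguishing, say, a relaxed state from a focused one. The signal below is synthetic, and the sampling rate and thresholding rule are illustrative assumptions.

```python
import numpy as np

FS = 256  # sampling rate in Hz; an assumed, typical value for consumer EEG

def band_power(signal, fs, low, high):
    """Estimate signal power within a frequency band via the FFT."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)  # periodogram
    mask = (freqs >= low) & (freqs <= high)
    return psd[mask].sum()

# Synthetic one-second "EEG" trace: a 10 Hz alpha rhythm buried in noise.
t = np.arange(FS) / FS
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(FS)

alpha = band_power(eeg, FS, 8, 12)   # alpha band: relaxed wakefulness
beta = band_power(eeg, FS, 13, 30)   # beta band: active concentration
state = "relaxed" if alpha > beta else "focused"
print(state)
```

Real systems add many layers on top of this kind of feature extraction, including artifact rejection, spatial filtering across many electrodes, and machine-learned classifiers, but band power remains a standard building block.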

The possibilities include thought-to-speech and thought-to-movement assistive technology for ALS or paralysis patients and accelerated, customized recovery protocols for those suffering from strokes, post-traumatic stress disorder and brain trauma. Brain-computer interfaces could also help personalize teaching and training protocols to fit a learner’s cognition and memory processes, eliminate the need for usernames and passwords with a seamless “brain ID,” and enable you or a mental-health professional to monitor your emotional state throughout the day.

I was part of the team that tried to develop a brain-computer interface for the astrophysicist Stephen Hawking, who suffered from ALS. Hawking’s Intel-designed eye-tracking and cheek-click method relied on a level of muscular control that couldn’t be taken for granted given his condition. He participated in the project, as he put it, “to assist in research, encourage investment in this area, and, most importantly, to offer some future hope to people diagnosed with ALS and other neurodegenerative conditions.” He died in 2018.

Today implant-based systems are increasingly powerful, and noninvasive wearables are improving quickly too. Many of us in the field believe we are nearing an inflection point when countless people will see the fruits of decades of research. The stakes are high. Although every new technology carries promises and risks, few are tied so intimately to who we are.

Mr. Furman is a founder and CEO of Arctop, which makes brain-decoding software.


About the author

Dan Furman

Dan Furman's interest in neuroscience began in high school, where he co-developed, with neurosurgeon Christopher Duma, a minimally invasive surgical method that used brain imaging to predict the spread of malignant brain tumors and proactively halt tumor growth using gamma radiation. Dan went on to earn his A.B. in Neurobiology at Harvard. After graduating, he joined a company that worked with physicist Stephen Hawking on an experiment aimed at allowing him, long paralyzed by ALS, to communicate by merely thinking.

Dan then joined a Ph.D. program in Neuroscience at the Technion's Evoked Potentials Laboratory to better understand the hard problem of noninvasive brain-to-computer communication. There, Dan and his advisor Hillel Pratt became the first in the world to demonstrate that noninvasive brain sensor data could be used to control the movement of individual neuroprosthetic fingers. This level of granularity had never been accomplished before, and it showed that brain signals measured from the scalp carry far more information than previously believed. Dan met co-founder Eitan at a startup accelerator hosted by Google, where they decided to join forces to build Arctop's cutting-edge brain-decoding software platform.
