Observations and thoughts on BCIs & Neurotech

Published the day before the first Neuralink announcement

As a neurotech enthusiast, I couldn't fall asleep after seeing Neuralink's tweet. Tomorrow, they are finally going to reveal their secretive research: something that could mark one of the biggest leaps in human cognition.

Having been actively involved in neurotech projects and having helped build a Brain-Computer Interface product (Muse 2, one of the most advanced consumer BCIs), I want to share some thoughts based on my understanding of the field. This essay was originally written for my application to the Jerome Fisher Program in Management and Technology at the University of Pennsylvania.

The prompt: Identify a disruptive technology, one that many consider could drive truly massive economic and societal transformations in the coming years. Argue why the technology may not be as successful as observers think and suggest ways to address the concerns.

(CorTec)

Brain-computer interfaces (BCIs) are devices that communicate with the human brain. They have significantly helped people with disabilities regain vision and motor control. Recently, they have gained wider attention after Elon Musk announced his ambitious goal of launching a consumer-level BCI that enhances cognitive function within the next eight to ten years.

The WaitbutWhy article on Neuralink; I highly recommend it! (WaitbutWhy)

Although I appreciate Musk's aspiration of merging brain and computer to save humanity from competition with Artificial Intelligence, his goal is unrealistic. We are far from high-bandwidth brain-to-computer communication. Neuroprosthetic implants, the most advanced BCIs today, still cannot accurately detect intent. The invasive interfaces Musk envisions will take much longer to realize, not to mention consumers' pervasive fear of brain implants.

BCIs have great potential, but instead of brain implants, we should first focus on non-invasive, passive wearables. These devices use electrodes on the scalp to pick up electrical signals (EEG, electroencephalography). Although those signals are noisier and less precise than ones collected from implants, they can still be used to infer intent and study cognitive function with signal processing and machine learning techniques. More importantly, the technology is non-invasive, safe, and far more mature.
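To make the "signal processing and machine learning" step concrete, here is a minimal sketch of how scalp EEG is often turned into intent predictions: band-power features extracted with Welch's method, fed to a simple classifier. Everything below is an illustrative assumption, not from the original essay: the sampling rate, the band edges, the synthetic "relaxed" vs. "focused" trials, and the nearest-centroid classifier all stand in for real recordings and real models.

```python
import numpy as np
from scipy.signal import welch

FS = 256  # Hz; a typical consumer-EEG sampling rate (illustrative assumption)

def band_powers(trial, fs=FS):
    """Feature vector: mean power in the classic theta, alpha, and beta bands."""
    freqs, psd = welch(trial, fs=fs, nperseg=fs)
    bands = [(4, 7), (8, 12), (13, 30)]  # theta, alpha, beta (Hz)
    return np.array([psd[(freqs >= lo) & (freqs <= hi)].mean() for lo, hi in bands])

# Synthetic training data: "relaxed" trials carry a strong 10 Hz alpha rhythm,
# "focused" trials a 20 Hz beta rhythm, both buried in noise.
rng = np.random.default_rng(0)
t = np.arange(0, 2, 1 / FS)  # 2-second trials

def make_trial(freq):
    return np.sin(2 * np.pi * freq * t) + 0.8 * rng.standard_normal(t.size)

X = np.array([band_powers(make_trial(f)) for f in [10] * 20 + [20] * 20])
y = np.array([0] * 20 + [1] * 20)  # 0 = relaxed, 1 = focused

# Nearest-centroid classifier: about the simplest possible "learning" step.
centroids = np.array([X[y == c].mean(axis=0) for c in (0, 1)])

def predict(trial):
    dists = np.linalg.norm(centroids - band_powers(trial), axis=1)
    return int(np.argmin(dists))

# Classify fresh, unseen trials of each kind.
pred_relaxed = predict(make_trial(10))
pred_focused = predict(make_trial(20))
```

Real pipelines differ mainly in scale, not in kind: more channels, artifact rejection, and stronger classifiers, but the feature-then-classify shape is the same.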

EEG-based wearables are not very present in our daily lives, likely due to a lack of meaningful applications. BCIs like MIT Media Lab's AlterEgo and Facebook's brain-typing project can detect intended speech and provide a faster, more intuitive way to type. (Note: some would argue these are not strictly BCIs, as they mostly rely on EMG signals.) However, consumers are unlikely to switch to such devices unless they offer interactive experiences that cannot be created with existing interfaces. BCIs, by providing an additional layer of information, could enable completely new experiences for everyday use. For example, headphones with electrodes could analyze users' moods and alpha-band activity to recommend songs. Game elements, such as shooting accuracy, could be influenced by players' concentration and become more realistic. Mental states during sleep and focused work could be tracked and improved in real time with electrical or auditory stimulation.

Above all, BCIs have the greatest potential in Virtual Reality (VR) and Augmented Reality (AR). Using fingers and controllers for input can be tiring and awkward in public; neural signals are a much more efficient channel for input and selection. VR and AR headsets are also ideal platforms for BCIs, as they are already in contact with the scalp.

Introducing a consumer-facing BCI is crucial to the rapid development of the technology itself. Since current techniques for intent detection depend on machine learning, further advances in BCIs require large amounts of brain data. Current EEG collection in labs and hospitals is expensive, time-consuming, and inconsistent. A BCI device worn daily by consumers, however, could provide enormous amounts of consistent data from everyday activities at low cost. Although each recording is less accurate, the sheer quantity of data can yield significant insight into users' intent. When cross-referenced with specific activities such as visual stimuli, the data also illuminate human reactions and decision-making, making them valuable for targeted marketing and customization.

Keyboard input through intent detection using the P300 wave (ResearchGate)
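The P300 speller pictured above relies on a classic trick: EEG epochs time-locked to each flash of the on-screen keyboard are averaged, which cancels zero-mean background noise and leaves the evoked response, a positive deflection peaking roughly 300 ms after the flash of the attended symbol. A minimal sketch of that averaging step with fully synthetic data (the amplitudes, noise level, and epoch count are invented for illustration):

```python
import numpy as np

FS = 256                       # Hz, assumed sampling rate
N_EPOCHS, EPOCH_LEN = 30, FS   # 30 flashes, 1-second epochs
t = np.arange(EPOCH_LEN) / FS  # time axis within an epoch, in seconds

rng = np.random.default_rng(1)

# Synthetic single-trial responses: flashes of the attended ("target") symbol
# evoke a positive deflection peaking near 300 ms; other flashes are noise only.
p300 = 2.0 * np.exp(-((t - 0.3) ** 2) / (2 * 0.05 ** 2))
target_epochs = p300 + rng.standard_normal((N_EPOCHS, EPOCH_LEN))
nontarget_epochs = rng.standard_normal((N_EPOCHS, EPOCH_LEN))

# Averaging time-locked epochs suppresses the noise (its standard deviation
# shrinks by 1/sqrt(N_EPOCHS)) while the evoked response survives intact.
avg_target = target_epochs.mean(axis=0)
avg_nontarget = nontarget_epochs.mean(axis=0)

peak_time = t[np.argmax(avg_target)]  # lands near 0.3 s for the target symbol
```

The speller then picks whichever row and column produced the largest averaged deflection, which identifies the intended key without any muscle movement.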

While using the collected data to advance this field, we should take steps to protect users' privacy, because the consequences of a data breach would be severe. By analyzing users' brainwave responses to different digits, for instance, an attacker could steal credit card information. Stewardship of this data also entails insight into how users react to information, so safeguarding it is paramount. Strict industry standards and legislation are needed to oversee the extent of data collection and protect consumer privacy while still allowing the technology to develop.

I aspire to help build a fantastic, rather than dystopian, future. Promoting EEG-based BCIs is a more responsible approach to bringing BCIs to market. We should commercialize them with meaningful applications while ensuring data security, allowing the field to develop healthily. Faster market adoption of EEG-based devices will not only advance the technology itself but also build the regulatory environment and societal acceptance needed for brain implants.

Muse 2, the EEG headband I helped design and develop last summer at Interaxon. It composes music based on the user’s mental state and other biometrics to provide real-time meditation feedback. I believe in turning these technologies into meaningful and beautiful products that have tangible and important use cases. (Muse)

Can’t wait for tomorrow’s livestream and what’s next for this field!

Accelerating Deep Tech | Robotics, Blockchain, Neurotech | EECS @UCBerkeley | Teaching @CalBlockchain, Director @BB_Xcelerator | prev @hax_co, @SOSV, @Interaxon