Why we want to teach the world how to read brain waves

by Geordan King

As our team continues to search for the technological pulse of the future, we spend a lot of time working with new technologies that show promise.

This past year, we got our hands on a few Brain-Computer Interface (BCI) devices and built an app that teaches neuroscience by displaying a user's brainwaves in real time. Why? Because we know that BCIs will be a significant part of the future, and now is a perfect time to get involved and encourage others to start thinking about the possibility of advanced neuro-tools breaking into the mainstream.

Why Neuroscience?

Neuroscience is a hot topic in the public realm, and with good reason: everyone has an interest in their own brain. The brain is the most advanced supercomputer known to man; everyone gets one at birth, and it doesn't come with an instruction manual.

While widespread interest in BCI has had the unfortunate side effect of creating ‘neurohype’ (the marketing of products based on a dubious attachment to neuroscience), humans are now closer than ever to a point where brain interface technology is sufficiently advanced to be useful in everyday life. Evidence for this can be seen in the soaring number of patents for neurotechnology that have come out in recent years. Even Facebook has started a brain-machine interface initiative.

Despite this, neuroscience as a whole remains a mystery to most consumers and software developers. The technology for collecting brain data is more available than ever, but a lack of education limits the scope of innovation and the diversity of contributors. This ultimately slows the progression of the industry as a whole. As such, we thought it would be helpful to teach some of the basic building blocks of neuroscience and brain-computer interfaces. For now, we're focusing our efforts on EEG (electroencephalography) because it is the most accessible way to obtain detailed brain data in real time.

The EEG Ecosystem

EEG is the oldest, cheapest, and most widely available brain monitoring technology. Based on recording small electric fields from the surface of the scalp, EEG is non-invasive and quick to set up. A flood of consumer-focused, affordable EEG headsets have come to market in the last few years (Muse, Epoc, OpenBCI, Aurora, BrainBit, Kokoon). Thus, if a user is willing to wear an EEG device (which soon might be no more intrusive than wearing a pair of reading glasses), it is possible to serve insights about brain activity and create software that responds to predictable events.
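One classic example of a predictable event in EEG is the eye blink, which shows up on frontal channels as a large, brief voltage deflection that dwarfs ordinary brain activity. A minimal sketch of responding to such an event might look like the following (the function name, threshold, and synthetic signal are illustrative assumptions, not part of any real headset's API):

```python
import numpy as np

def detect_blinks(signal, fs, threshold_uv=100.0, refractory_s=0.5):
    """Return sample indices where the absolute amplitude exceeds a
    threshold, skipping crossings within a refractory window so one
    blink is not counted many times."""
    blinks = []
    last = -np.inf
    for i, v in enumerate(np.abs(signal)):
        if v > threshold_uv and (i - last) / fs >= refractory_s:
            blinks.append(i)
            last = i
    return blinks

# Synthetic 2-second frontal-channel trace at 256 Hz with two "blinks"
np.random.seed(0)
fs = 256
eeg = 10 * np.random.randn(2 * fs)   # background noise, ~10 uV scale
eeg[100:115] += 200                  # artificial blink deflection
eeg[400:415] += 200
print(detect_blinks(eeg, fs))        # prints the two blink onsets
```

Real blink detectors are more careful (they account for drift, electrode quality, and individual variation), but even this crude threshold captures the idea of software reacting to a predictable neural event.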

During our initial exploration we were surprised to notice that even with the large number of devices with developer support, there is only a very small software ecosystem around EEG-based neurotechnology. As we investigated the situation locally in Toronto, we found that most developers simply didn’t know where to start: what technology to use, what EEG could be used for, and how to analyse the data. Those we met who knew EEG well, the neuroscientists, were either unaware that these devices existed or simply didn’t know how to leverage their capabilities with code.

With this in mind, our mission became clear: teach the building blocks of the neuroscience behind EEG while giving researchers and developers an open-source platform for implementing their own ideas.

Our app: EEG 101

The app, which we built in collaboration with the NeuroTechX community and with the support of the team at Muse, teaches users the basics of EEG while displaying their own brain data in real time.

With EEG 101, we hoped to give users a taste of what it feels like to do actual neuroscience by taking a look at raw brain data and distilling that into meaningful output. As a former neuroscience major, I was really excited about the opportunity to present neuroscience in a tangible, hands-on way to those who would otherwise never be able to see it. Furthermore, we wanted to share all of our code so that future developers could get a head start on their own projects.

As the user navigates a series of lessons covering topics such as neurobiology, hardware, and signal processing, they are able to see their own brain data streaming from a Muse headband. This allows us to present the technical and somewhat abstract process of interpreting EEG data in a visual, interactive way, showing how each step in the EEG signal processing pipeline affects their data.
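A typical EEG processing pipeline of the kind the lessons walk through takes a raw voltage trace, windows it into short epochs, and converts each epoch to the frequency domain to estimate power in the classic bands (alpha, beta, and so on). A minimal numpy-only sketch of the band-power step (the function and the pure-sine test signal are illustrative assumptions, not code from EEG 101):

```python
import numpy as np

def band_power(epoch, fs, band):
    """Average spectral power within a frequency band for one epoch,
    using a Hann window to reduce spectral leakage."""
    windowed = epoch * np.hanning(epoch.size)
    psd = np.abs(np.fft.rfft(windowed)) ** 2 / epoch.size
    freqs = np.fft.rfftfreq(epoch.size, 1.0 / fs)
    mask = (freqs >= band[0]) & (freqs < band[1])
    return psd[mask].mean()

fs = 256
t = np.arange(fs) / fs                  # one 1-second epoch
epoch = np.sin(2 * np.pi * 10 * t)      # pure 10 Hz tone, i.e. "alpha"
alpha = band_power(epoch, fs, (8, 13))
beta = band_power(epoch, fs, (13, 30))
print(alpha > beta)                     # alpha dominates a 10 Hz signal
```

Production pipelines usually add filtering and artifact rejection before this step and use a proper Welch estimate, but each stage transforms the data in a way that can be visualized, which is exactly what the app exploits.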

Next Steps

Neurotechnology is still in its infancy. In order for it to grow, the world will need developers who are unafraid of exploring unfamiliar terrain and diving into complex, noisy data. Creating tools that streamline onboarding will be key to enabling talented minds to contribute. With open access to good code and proper training in the fundamentals of neuroscience, we hope the next generation of engineers can be empowered to push the boundaries of brain-machine interfaces further and faster.

Our team had a lot of fun building EEG 101 and looks forward to having more fun with it in the future. Continued updates to the app, including an offline mode (that doesn't require a Muse) and an iOS version, are in the works.
