Got a bright idea? Find out how bright, with this lightbulb that detects your mind's focus!
More precisely, we're using a Muse EEG sensor to detect beta-range frequencies in the spectrum of electrical pulses that are constantly emitted by your brain! Then we pipe that through a NodeBots system to turn on a light – even over WiFi. AWESOME.

Muse Setup
First, get familiar with your Muse before attempting this project: the better your connection with the electrodes, and your control over your brainwaves, the better the demo will be.
Install the Muse Monitor app on your phone: https://musemonitor.com
You'll need to enter the streaming settings described in the Arduino section below. For now, play around with the graphs and learn about the different EEG frequency bands (delta, theta, alpha, beta, gamma) and what they mean.
In Alex's case, she practices turning the LED on and off for 5-10 seconds at a time, to convince the skeptics that the brain control is indeed real. One easy way to drive up your beta activity is to simply will the light to turn on. Try not to move too much or clench your muscles. You can also watch the numbers or lines on the screen and will them to increase. If you don't have visual feedback, you can also practice doing difficult math in your head – anything that requires strong focus.
You could also take a more relaxed approach and use the alpha frequency band instead of beta for this. That relates to a calm mental state, and gets a big boost when you close your eyes.

Arduino
First, we built this on an Arduino Pro Mini, late at night in Brooklyn after Maker Faire. We had gotten dessert at a place that serves drinks in lightbulb-shaped cups, and cajoled them into selling us the cups separately.
Moheeb bought a low-power LED string and cut it in half, then stuffed it inside his cup. Voilà, idea lightbulb! It can run off one of the Arduino's output pins. (He soldered jumper wires to it on the table at a fancy tea restaurant.)
Firmata is a special "listener" sketch that runs on the microcontroller and receives commands over a USB serial connection, so a Node.js script on your computer can control the board's pins. You can learn more by searching for NodeBots!
Meanwhile, Alex got the Muse device to stream data with the Muse Monitor iOS app. It gives some pretty great real-time data graphs right on your phone, but can also stream to another IP address on the same WiFi network.
She tested first using the MuseLab application for Mac (http://developer.choosemuse.com/tools/mac-tools/muselab), which can open a UDP receiver port and visualize the EEG data coming in from your brain.
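Muse Monitor streams this data as OSC packets over UDP. If you'd rather peek at the raw packets yourself than use a visualizer, a bare-bones parser is only a few lines of Node. This is a sketch based on the OSC 1.0 layout (strings null-terminated and padded to 4 bytes, float arguments as big-endian 32-bit values) – not the full spec, and a real script would probably reach for an OSC library from npm instead:

```javascript
// Minimal OSC message parser — just enough to pull the address and
// float/int arguments out of a single Muse Monitor UDP packet.

function readPaddedString(buf, offset) {
  let end = offset;
  while (end < buf.length && buf[end] !== 0) end++;
  const str = buf.toString('ascii', offset, end);
  // OSC strings are null-terminated and padded to a 4-byte boundary.
  const next = offset + Math.ceil((end - offset + 1) / 4) * 4;
  return { str, next };
}

function parseOscFloats(buf) {
  const { str: address, next: tagOffset } = readPaddedString(buf, 0);
  const { str: typeTags, next: argOffset } = readPaddedString(buf, tagOffset);
  const args = [];
  let offset = argOffset;
  for (const tag of typeTags.slice(1)) { // skip the leading ','
    if (tag === 'f') {
      args.push(buf.readFloatBE(offset)); // 32-bit big-endian float
      offset += 4;
    } else if (tag === 'i') {
      args.push(buf.readInt32BE(offset));
      offset += 4;
    }
  }
  return { address, args };
}

// In a script you'd feed this from a UDP socket, roughly:
//   require('dgram').createSocket('udp4')
//     .on('message', (buf) => console.log(parseOscFloats(buf)))
//     .bind(5000); // match the port in Muse Monitor's streaming settings
```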
Then, we put the pieces together! Moheeb updated his code to pull in the streaming data and scrape it for the beta frequency channel, then set a threshold of attention at 0.29. The script also prints the current beta level to your terminal console. Above the threshold, the LED turns on; otherwise, it's off.
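In miniature, that decision logic looks something like this – a sketch, where `led` stands in for a johnny-five Led instance (or anything else with on()/off() methods), and 0.29 is the threshold from our script:

```javascript
// Turn the LED on whenever beta power crosses the attention threshold,
// and print the running beta level to the terminal.
const THRESHOLD = 0.29;

function updateLed(led, beta) {
  console.log('beta:', beta.toFixed(3)); // live readout in the console
  if (beta > THRESHOLD) {
    led.on();
    return true;
  }
  led.off();
  return false;
}
```

The real script just calls something like this every time a beta_absolute packet arrives.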
This turned out to be the perfect level for Alex to focus and control the light bulb. Eureka!
And then, it was 5am and Alex headed home. :) Sweet dreams for all!
Running the script
To run the code, open the script directory in your terminal and type ifconfig to find your computer's IP address on the network. With your phone on the same WiFi network, enter that IP in Muse Monitor's streaming settings, and match the streaming port to the one in the index.js script.
Then, type node index.js to launch the script on your computer and have at it :)

Particle
Alex decided to present this at Particle's Spectra conference, so she mashed up Moheeb's code with the Particle platform for NodeBots: https://github.com/rwaldron/particle-io
With some tweaking, this should eventually enable you to carry the Thinker and Blinker parts around wirelessly, while the computer does all the processing from a dark corner.
Replicating the Arduino functionality on Particle involved updating npm and the particle-io module; editing the setup portion of the script to match Rick's example code; and updating the pin to D7 to use the Photon's built-in LED. (And adding comments!)
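The particle-io setup boils down to passing an access token and device ID into its constructor, and reading them from the environment keeps them out of the repo. Here's a hedged sketch – the names PARTICLE_TOKEN and PARTICLE_DEVICE_ID follow particle-io's own examples, which is an assumption; rename them to match whatever your shell exports:

```javascript
// Pull Particle credentials from environment variables so they never
// land in git. The variable names here are assumptions — adjust to
// match what your script actually uses.
function particleConfig(env = process.env) {
  const token = env.PARTICLE_TOKEN;
  const deviceId = env.PARTICLE_DEVICE_ID;
  if (!token || !deviceId) {
    throw new Error('Export PARTICLE_TOKEN and PARTICLE_DEVICE_ID first');
  }
  return { token, deviceId };
}

// Then the board setup becomes, roughly:
//   const Particle = require('particle-io');
//   const five = require('johnny-five');
//   const board = new five.Board({ io: new Particle(particleConfig()) });
//   board.on('ready', function () {
//     const led = new five.Led('D7'); // the Photon's built-in LED
//   });
```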
The environment variables in the code allow us to update it on GitHub without cleaning out personal data first ;) A highly recommended step!

Demo
It takes a few steps to get this demo up and running!
Plug in your microcontroller; in the case of the Photon, that can be any power source.
Connect the Muse headset to your phone over Bluetooth.
For the Particle version, start your phone's hotspot and make sure your Photon and computer are connected to it. I've found this to be more reliable than using other networks, since there's no extra authentication step and you can run it anywhere with cell service! (It surprised Alex that you can use tethering, since she expected the phone to complain about streaming data to other devices over its own hotspot. Neat.)
Then, open the code directory in your terminal and launch the script with node index.js, as before.
(For Particle, if you get an authentication error, re-export the environment variables and you should be good to go.)
I've also boosted the threshold to 4.0 for turning on the lamp. Here's a (somewhat creepy) demo, now that I've attached the Photon to my own lightbulb lamp:
In this demo, I first tried using my hand to indicate when I'm thinking it "on". You can also see the readout in the Muse Monitor app – a little more visual than the streaming data in the terminal, although they're both (in theory) showing the same thing. Note that the 0-100 range of the app doesn't correspond to the "beta_absolute" OSC value being received by the computer.
Afterwards, I did it without moving a couple of times, just to show that the gesture isn't triggering anything.
I'm really happy with how well this project works – especially considering that the data is being streamed over WiFi TWO WAYS between my brain and the bulb!