Taking your first steps to using ‘The Force’

Ever since I saw Yoda raise the X-Wing out of the swamps of Dagobah in The Empire Strikes Back, I’ve been craving an opportunity to use ‘The Force’ in real life. Fortunately, technology is now at the point where anyone can purchase devices that make it possible. Well, pretty much possible. Check out this video:

This blog is going to take you through all of the parts you need to build your own version of a mind-controlled robot. The great thing about the Internet of Things is that you can easily substitute any device or sensor, so if you want to modify this in any way, feel free to!

Connecting your thoughts to the cloud!

Below is a diagram of how all the pieces connect together:

And below is the required hardware and software.

Required hardware

- Emotiv Insight headset
- Sphero BB-8
- Raspberry Pi (with Bluetooth Low Energy support)
- A Windows 8+ or Ubuntu 14+ machine

Required software

- Emotiv SDK (or the SDK Lite if you don’t have the headset yet)
- An IBM Bluemix account with the Internet of Things boilerplate (which includes Node-RED)
- IoT Foundation Java client library (for the machine connected to the Insight)
- IoT Foundation Python library (for the Pi)
- BlueZ, for the Bluetooth Low Energy scan on the Pi
- A BB-8 Python driver (BB_driver and BB8test.py)

If you don’t have a Windows 8+ machine, Ubuntu 14+ will work. Just make sure you have BTLE SMART. Additionally, you don’t have to use a Pi to connect to the BB-8. I used it because it can be battery powered transported easily. Don’t worry if you haven’t got the Insight yet, the Emotiv SDK Lite lets you simulate the actions.

Implementation details

Go to Bluemix and create an Internet of Things app using the boilerplate. Give it a name and let it spin up. This will create an instance of Node-RED that you can access from your browser; we’ll need it later to connect our two devices.

Training the Emotiv Insight

Once you have your Insight and are familiar with how to wear it to get a strong connection (see the Emotiv headset control panel for details), the next step is either to use the emotions recognized straight out of the box (excitement, focus, etc.) or to train your headset. Emotiv has lots of documentation and a forum, so I won’t go over that here. I wanted to train my Insight personally; it took me several hours to get the headset trained to a satisfactory level, but your mileage may vary. I would echo the recommendation that you get one command working before trying to add another. In my demo I trained push and pull, but you can always add more commands; apparently up to 12!

Receiving data from the Emotiv Insight

The next step is to download the SDK that comes with your headset. The SDK offers plenty of options in terms of language choice; I used Java, which means the IoT library will also be Java. There are some quirks to working with the SDK, e.g., making sure your code can find the included DLLs if you’re using the Java bindings (setting java.library.path to the directory containing them is one way).

Following the Emotiv instructions should suffice to get you to the point where you can connect to your headset and receive data from it with no problem. You should be able to recognize trained commands too, though the Emotiv 3.3.0 SDK documentation may also be of help.

Once you have the SDK set up and you’re able to receive data from the Insight in running code, the next step is to get that data to the cloud using IBM’s Internet of Things. You can download the zip for the IoTF Java Client Library Version 0.0.4 and extract the libraries into your project. Then go to IBM’s Internet of Things Foundation to set up the credentials you’ll need to send and receive data in the cloud; you’ll need your application’s Org ID and an API key to register your device.

With the above information, you can create a client within the application you’re building. You can follow the instructions in the IoTF docs to do so. The next step is to recognize when you’re communicating an event and then publish it to your Bluemix instance. You can check that you’re sending the data correctly by going to the address of the IoT boilerplate, found in your Bluemix dashboard.
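My headset client is written in Java, but to keep the code sketches in this post in a single language, here is the equivalent publish step using the IoT Foundation Python device library (ibmiotf). All credential values are placeholders, and the event name and payload simply mirror the push/pull/neutral object described later:

```python
# Sketch of publishing a mental-command reading to IoT Foundation with the
# ibmiotf device client; every credential value below is a placeholder.
import ibmiotf.device

options = {
    "org": "yourorgid",          # Org ID from your IoT Foundation dashboard
    "type": "insight",           # device type you registered
    "id": "insight001",          # device ID you registered
    "auth-method": "token",
    "auth-token": "yourdevicetoken",
}

client = ibmiotf.device.Client(options)
client.connect()

# One reading from the headset: two commands at 0, the active one non-zero.
payload = {"push": 0, "pull": 7, "neutral": 0}
client.publishEvent("mind", "json", payload)

client.disconnect()
```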

To do that, head to [yourappname].mybluemix.net and launch Node-RED. You can learn the basics of Node-RED by perusing the official documentation. You’ll want to drag an IoT node and a Debug node across. Fill in the IoT node details, switching from Quickstart to API Key, and enter the details from IoT Foundation. Wire it up to your Debug node and run your application. Once data is being sent, you will see it appear in your Node-RED instance:

Sending data from Bluemix to the BB-8

Depending on how you structure the data you are publishing from your IoT Foundation client, you may need to use Node-RED to create instructions that will be received by the Pi that’s connected to the BB-8. In my example, I use a JSON object that contains push, pull, and neutral. Two of those will be 0 and one will be an integer that increases with how long the thought is held. That data is then sent to the Pi using another IoT Foundation node.
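For illustration, a ‘pull’ thought held for a few seconds might produce a payload like this (the field names are the ones from my example; the values are made up):

```json
{
  "push": 0,
  "pull": 7,
  "neutral": 0
}
```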

The Pi will need code that connects to the BB-8. You will essentially replicate what you did with your code for the Insight, but instead of publishing data you will be subscribing to it. Before that, you will need to connect to your BB-8. To do this you will need the BlueZ (www.bluez.org) package, which enables you to perform a Bluetooth Low Energy scan. Running the command hcitool lescan will show you the MAC addresses of local BTLE devices; if your BB-8 is active, you will see it listed, most likely as ‘(unknown)’ or, in some cases, as a ‘BB’ device. In the BB_driver file, change line 244 to the MAC address of your BB-8. You will probably want to quickly run BB8test.py to verify that you’ve got a connection (the BB-8 will light up some colors!).
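That smoke test boils down to something like the following sketch; the class and method names here are my assumptions about the BB-8 driver, so check them against your copy of BB_driver:

```python
# Minimal connection check. Sphero, connect, and set_rgb_led are assumed
# names for the BB_driver API and may differ in your copy of the driver.
import BB_driver

bb8 = BB_driver.Sphero()    # uses the MAC address hardcoded at line 244
bb8.connect()
bb8.set_rgb_led(0, 255, 0)  # flash green to confirm the link, as BB8test.py does
```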

You will need to get the IoT Foundation Python library onto the Pi. Once you’ve got the library installed, you will need to create a new Python client that connects to IoT Foundation and imports the BB-8 driver.
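Here is a sketch of that subscriber using the ibmiotf application client; the credentials are placeholders, and the BB-8 driver names are the same assumptions as above:

```python
# Sketch of the Pi-side subscriber: connect to IoT Foundation and print
# incoming headset events. Credential values are placeholders.
import time

import ibmiotf.application
import BB_driver  # the BB-8 driver configured earlier (names assumed)

options = {
    "org": "yourorgid",
    "id": "bb8-pi",
    "auth-method": "apikey",
    "auth-key": "a-yourorgid-xxxxxxxxxx",
    "auth-token": "yourapitoken",
}

client = ibmiotf.application.Client(options)
client.connect()

def on_event(event):
    print("received:", event.data)  # e.g. {"push": 0, "pull": 7, "neutral": 0}

client.deviceEventCallback = on_event
client.subscribeToDeviceEvents()  # defaults to all device types, IDs, and events

while True:
    time.sleep(1)  # keep the process alive while the MQTT client listens
```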

You will need to run a conditional statement against the received data to work out which command is being received, then use the roll() command to move the BB-8. The same roll() command can also stop the BB-8 and change its direction.
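Putting it together, the movement logic could look something like this. The two-argument roll(speed, heading) form and the speed scaling are assumptions about the driver, so adjust them to match yours:

```python
# Map a received push/pull/neutral object onto BB-8 movement.
# roll(speed, heading) and the x10 scaling are assumptions about the driver.
def handle(data, bb8):
    push = data.get("push", 0)
    pull = data.get("pull", 0)
    if push > 0:
        bb8.roll(min(push * 10, 255), 0)    # roll away, heading 0 degrees
    elif pull > 0:
        bb8.roll(min(pull * 10, 255), 180)  # roll back toward you
    else:
        bb8.roll(0, 0)                      # speed 0 stops the BB-8
```

Wired into the subscriber above, the callback becomes handle(event.data, bb8) instead of the print.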

Conclusion

Two pieces of hardware connected with IBM’s IoT Foundation on Bluemix: it’s a pretty neat way to make yourself feel like Yoda! The next step is to connect it to an X-Wing!

More seriously, there are a few possible next steps. For example, the BB-8 can light up in different colors, offering plenty of communication display options. How about Watson’s Speech to Text telling the BB-8 to light up different colors? How about having the BB-8 reflect your current sentiment or mood by using Watson and Twitter data? What about also using the accelerometers in the headset to move the BB-8 left and right? The options are pretty much endless.

Should you need help with this tutorial, I’ll happily field questions in the comments or via @josh_schwaa on Twitter.
