Brain-computer interface – the journey from sci-fi to high street

To most people, control chips embedded in our brains still sound like creepy science fiction. However, Elon Musk recently unveiled Gertrude, a pig with a computer chip implanted in its brain. This might be one of the next big steps in getting brains to communicate with machines, and thus in helping people with neurological conditions control their technical devices.

No longer sci-fi, but already known for some time, are wearables that use electroencephalography (EEG). They create a brain-computer interface by placing electrodes on the scalp to measure tiny voltage changes produced by brain activity.

Most of these wearables work by having users think about a specific action, for example "turn on the lights", and recording the brainwave pattern that occurs alongside that thought. When the device later detects the same pattern again, it can translate it into the corresponding action within a fraction of a second.
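To make the record-then-match idea concrete, here is a minimal sketch in Python. It is purely illustrative, not any vendor's actual algorithm: it assumes each "thought" arrives as a numeric signal window, stores one calibration template per action, and later picks the best-matching action by cosine similarity against those templates. The class and threshold are hypothetical.

```python
import math


def cosine_similarity(a, b):
    """Similarity between two equal-length signal windows, in [-1, 1]."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)


class ThoughtCommandMatcher:
    """Toy EEG command matcher: calibrate templates, then classify new windows."""

    def __init__(self, threshold=0.9):
        self.templates = {}        # action name -> recorded signal window
        self.threshold = threshold  # minimum similarity to trigger an action

    def record(self, action, signal):
        """Calibration step: store the brainwave pattern observed for an action."""
        self.templates[action] = signal

    def detect(self, signal):
        """Return the best-matching action, or None if nothing is similar enough."""
        best_action, best_score = None, self.threshold
        for action, template in self.templates.items():
            score = cosine_similarity(signal, template)
            if score >= best_score:
                best_action, best_score = action, score
        return best_action


# Calibrate with two (fabricated) signal windows, then classify a noisy repeat.
matcher = ThoughtCommandMatcher(threshold=0.9)
matcher.record("turn on the lights", [1.0, 0.0, 0.0, 1.0])
matcher.record("play music", [0.0, 1.0, 1.0, 0.0])
print(matcher.detect([0.9, 0.1, 0.0, 1.0]))  # close to the "lights" template
print(matcher.detect([1.0, 1.0, 1.0, 1.0]))  # ambiguous: matches nothing well
```

Real systems face far messier signals and rely on machine-learning classifiers rather than a single template per command, but the calibrate-then-classify loop is the same.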

The big question is whether, beyond the novelty factor or niche use cases such as gaming, the technology has any practical applications or adds real value. After all, even in today's Alexa- and Siri-enabled world, the vast majority of televisions and lightbulbs are still turned on and off with switches and remote controls.

Using smart technology to improve the quality of life

One area where the mind-controlled smart home could have an immediate impact is in helping people with physical limitations, such as tetraplegia or motor neurone disease, to have control over various aspects of their environment. Just as recent advancements in this space have enabled communication tools, the same EEG technology could allow people with physical limitations to turn on the lights, change the temperature in their home, listen to music or open the window blinds.

“Solving a strong pain point is key for the success of radical innovation,” says Dr. Markus Grote, Head of Connected Life at E.ON Innovation. “Initial solutions show that mind control truly is a game changer for people with physical limitations. We expect substantial advances within the next years rather than decades.”

That's not to imply that mind-controlled interfaces won't become the norm for the rest of the population too. However, like all AI-enabled technology, it's likely to gain traction and usefulness as more and more people start using it, allowing it to act on predictions, feelings and emotions rather than orders.

No matter how advanced or well-integrated it becomes, using mind control may not become the one interface to rule them all. The short-term future is likely a combination of touch, voice, lens and thought. But, if Musk and Zuckerberg's vision of technology-enhanced humans is to come true, then EEG wearables could be the next step in that evolution very soon.
