French startup NextMind SAS turned a few heads at the Consumer Electronics Show in Las Vegas today with the debut of a “real-time brain-computer interface” that translates signals from the visual cortex into digital commands.

The company also announced plans to release a developer kit for the brain-computer interface, which it will ship in the second half of the year.

The device itself is a small disc-shaped object that sits on the back of the user’s head and weighs just 60 grams. It contains eight electrodes that measure brain activity, which the accompanying software then translates into actions.

To use the device, the wearer focuses on a specific object. Doing so evokes a characteristic response in the visual cortex, which generates machine-readable brain-wave patterns. This synchrony between the object and the corresponding brain waves, which NextMind calls “neurosynchrony,” is then used to translate visual attention into computer commands.
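NextMind has not published its decoding algorithm, but the principle described here resembles how other visual BCIs decode attention: each on-screen object is modulated with a distinct pattern, and the system checks which pattern the visual-cortex signal tracks most closely. As a rough illustration only (the signals, frequencies, and names below are invented for the sketch, not taken from NextMind), this could look like:

```python
import numpy as np

def attention_scores(eeg, stimulus_patterns):
    """Correlate one visual-cortex EEG channel with each stimulus pattern.

    eeg: 1-D array of EEG samples.
    stimulus_patterns: dict mapping object name -> 1-D array of that
    object's flicker/modulation pattern, same length as eeg.
    The attended object's pattern should correlate most strongly.
    """
    return {
        name: abs(np.corrcoef(eeg, pattern)[0, 1])
        for name, pattern in stimulus_patterns.items()
    }

# Toy example: two objects flickering at different rates.
fs = 250                                  # sampling rate in Hz
t = np.arange(0, 2, 1 / fs)               # 2 seconds of samples
left = np.sin(2 * np.pi * 12 * t)         # object A: 12 Hz flicker
right = np.sin(2 * np.pi * 15 * t)        # object B: 15 Hz flicker

rng = np.random.default_rng(0)
# Simulated EEG while the user attends the 12 Hz object: its pattern
# plus unrelated background noise.
eeg = 0.8 * left + 0.5 * rng.standard_normal(t.size)

scores = attention_scores(eeg, {"left": left, "right": right})
target = max(scores, key=scores.get)      # "left" wins for this signal
```

This is deliberately simplistic: a real system would filter the signal, combine multiple electrodes, and use a learned per-user model rather than raw correlation.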

“We have pretty good understanding of how the brain works, and especially how visual consciousness, perception, attention works,” NextMind Chief Executive Officer Sid Kouider told VentureBeat. “This is what allowed us to invent this new approach that we call digital neurosynchrony. We use your top-down attention as a controller. So when you focalize differentially toward something, you then generate an attention of doing so. We don’t decode the intention per se, but we decode the output of the intention.”

NextMind’s headset also uses artificial intelligence software that improves with use: the more frequently the device is worn, the more accurate its decoding becomes.

VentureBeat was able to test the headset in a number of demos. First, the software was calibrated for the user: the tester had to concentrate on a recurring green triangle pattern on a screen. After a few minutes, the device generated a neural profile for that user.

In the demos that followed, VentureBeat’s tester was able to operate a television by directing his attention to green triangles. In the second demo, he controlled a simple hopping game using the same principle. In the third demo, he played a modified version of the Nintendo Entertainment System classic “Duck Hunt” and shot the ducks solely by visual focus. The demos weren’t perfect, but it’s clear the technology works, VentureBeat concluded.

Kouider told VentureBeat the biggest limitation right now is on the hardware side. However, he said the company is working to make its device even more precise.

The developer kit will be sent to selected developers and partners this month. After an early access period, a second tranche of hardware is to be sent to developers in the second half of 2020. Those interested can sign up for the waiting list.

With the developer kit, NextMind is pursuing two goals: collecting more data to improve its AI system, and having developers explore new applications for the device. One idea is that manufacturers of self-driving cars could install electrodes in their seats, so that comfort functions of the vehicle could be activated by brain signals alone.

Another application NextMind mentioned was a brain-computer interface for augmented reality glasses that uses eye tracking, gestures, and speech to supplement the brain signals.