Mind control: 3D-patterned sensors allow robots to be controlled by thought
It seems like something from a science fiction movie: donning a specialized electronic headband and using your mind to control a robot.
A new study published in the journal ACS Applied Nano Materials took a step toward making this a reality. The team produced "dry" sensors that can record the brain's electrical activity despite the hair and the bumps and curves of the head by constructing a specific, 3D-patterned structure that does not rely on sticky conductive gels.
University of Technology Sydney (UTS) researchers have developed biosensor technology that will allow you to operate robots and machines entirely by thought control.
The enhanced brain-computer interface was created with the Australian Army and the Defence Innovation Hub by Distinguished Professor Chin-Teng Lin and Professor Francesca Iacopi of the UTS School of Engineering and IT.
In addition to military applications, the technology has tremendous potential in industries such as sophisticated manufacturing, aerospace, and healthcare, for instance by allowing people with disabilities to control wheelchairs or operate prosthetics.

Electroencephalography (EEG) is a technique doctors use to monitor electrical signals from the brain by placing specialized electrodes on the scalp. EEG not only aids in diagnosing neurological problems but may also be used in "brain-machine interfaces," which use brain waves to operate an external object such as a prosthetic limb, robot, or even a video game.
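To make the idea of "using brain waves to operate an object" concrete, here is a minimal, purely illustrative sketch (not the authors' code): synthetic EEG samples are reduced to a single band-power feature, the kind of quantity an interface could threshold or map to a command. The sampling rate and the 8–12 Hz alpha band are standard EEG conventions; the signal itself is made up.

```python
import numpy as np

# Hypothetical sketch: reduce raw EEG samples to a band-power feature.
fs = 256                       # sampling rate in Hz (assumed)
t = np.arange(0, 2, 1 / fs)    # two seconds of samples
rng = np.random.default_rng(0)
# Synthetic "EEG": a 10 Hz alpha rhythm buried in noise
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)

spectrum = np.abs(np.fft.rfft(eeg)) ** 2          # power spectrum
freqs = np.fft.rfftfreq(eeg.size, 1 / fs)         # frequency axis in Hz

# Power in the alpha band (8-12 Hz), a common EEG feature
alpha_power = spectrum[(freqs >= 8) & (freqs <= 12)].sum()
total_power = spectrum[freqs > 0].sum()
print(f"alpha fraction of total power: {alpha_power / total_power:.2f}")
```

A real interface would compute features like this continuously over short sliding windows rather than on one fixed recording.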
Most non-invasive versions employ "wet" sensors that adhere to the scalp with a gloopy conductive gel, which can irritate the skin and occasionally cause allergic responses.
As an alternative, researchers have been working on "dry" sensors that do not require gels, but none have matched the performance of the gold-standard wet kind. Although nanomaterials such as graphene may be a viable choice, their flat and often flaky nature makes them incompatible with the uneven curves of the human skull, especially over long periods.
As a result, Francesca Iacopi and colleagues set out to develop a 3D sensor based on epitaxial graphene that could accurately monitor brain activity without relying on adhesive gels.
The researchers produced numerous 3D graphene-coated structures with varying shapes and patterns, each approximately 10 μm thick.
Of the designs examined, a hexagonal pattern performed best on the curved, hairy surface of the occipital region, the area at the base of the head where the brain's visual cortex is located. Eight of these sensors were combined into an elastic headband that held them against the back of the head.
When used with an augmented reality headset that displayed visual cues, the sensors could detect which cue was being observed, and a computer then translated the signals into commands controlling the movement of a four-legged robot, completely hands-free.
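The decoding step described above follows the steady-state visually evoked potential (SSVEP) paradigm named in the study's abstract: each visual cue flickers at its own frequency, and the occipital EEG echoes the frequency of the cue the user is looking at. The sketch below is an illustration of that principle, not the authors' implementation; the command mapping and flicker frequencies are assumed for the example.

```python
import numpy as np

# Assumed mapping from flicker frequency (Hz) to robot command
COMMANDS = {7.0: "forward", 9.0: "left", 11.0: "right", 13.0: "stop"}

def decode_command(eeg: np.ndarray, fs: float) -> str:
    """Return the command whose flicker frequency dominates the spectrum."""
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2
    freqs = np.fft.rfftfreq(eeg.size, 1 / fs)
    scores = {
        cmd: spectrum[np.abs(freqs - f) < 0.5].sum()  # power near each flicker rate
        for f, cmd in COMMANDS.items()
    }
    return max(scores, key=scores.get)

# Synthetic trial: the user gazes at the 9 Hz ("left") cue
fs = 256
t = np.arange(0, 4, 1 / fs)
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 9.0 * t) + 0.5 * rng.standard_normal(t.size)
print(decode_command(eeg, fs))  # expected: "left"
```

Production SSVEP decoders typically use more robust methods such as canonical correlation analysis across multiple channels, but the frequency-power idea is the same.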
The new electrodes still did not perform quite as well as wet sensors, but the researchers see their study as a first step toward robust, easily deployed dry sensors that will help expand the applications of brain-machine interfaces.
The complete study was published in ACS Applied Nano Materials.
Study abstract
The availability of accurate and reliable dry sensors for electroencephalography (EEG) is vital to enable the large-scale deployment of brain–machine interfaces (BMIs). However, dry sensors invariably show poorer performance compared to the gold standard Ag/AgCl wet sensors. The loss of performance with dry sensors is even more evident when monitoring the signal from hairy and curved areas of the scalp, requiring the use of bulky and uncomfortable acicular sensors. This work demonstrates three-dimensional micropatterned sensors based on a subnanometer-thick epitaxial graphene for detecting the EEG signal from the challenging occipital region of the scalp. The occipital region, corresponding to the visual cortex of the brain, is key to the implementation of BMIs based on the common steady-state visually evoked potential paradigm. The patterned epitaxial graphene sensors show efficient on-skin contact with low impedance and can achieve comparable signal-to-noise ratios against wet sensors. Using these sensors, we have also demonstrated hands-free communication with a quadruped robot through brain activity.
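The abstract's claim of "comparable signal-to-noise ratios against wet sensors" can be made concrete with a common SSVEP metric: the spectral power at the stimulus frequency relative to the mean power of neighbouring frequency bins. The sketch below is a hypothetical illustration with synthetic stand-ins for wet and dry recordings, not data from the study.

```python
import numpy as np

def ssvep_snr_db(eeg, fs, stim_hz, neighbours=10):
    """SNR (dB): power at the stimulus bin vs. mean power of nearby bins."""
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2
    freqs = np.fft.rfftfreq(eeg.size, 1 / fs)
    target = int(np.argmin(np.abs(freqs - stim_hz)))
    lo, hi = max(target - neighbours, 1), target + neighbours + 1
    side = np.r_[spectrum[lo:target], spectrum[target + 1:hi]]
    return 10 * np.log10(spectrum[target] / side.mean())

# Synthetic stand-ins: the "dry" trace simply carries more noise
fs, stim_hz = 256, 10.0
t = np.arange(0, 4, 1 / fs)
rng = np.random.default_rng(1)
wet = np.sin(2 * np.pi * stim_hz * t) + 0.3 * rng.standard_normal(t.size)
dry = np.sin(2 * np.pi * stim_hz * t) + 0.6 * rng.standard_normal(t.size)
print(f"wet-like SNR: {ssvep_snr_db(wet, fs, stim_hz):.1f} dB")
print(f"dry-like SNR: {ssvep_snr_db(dry, fs, stim_hz):.1f} dB")
```

In this toy setup the noisier trace yields a lower SNR; the study's point is that its patterned graphene sensors narrow exactly this gap.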