First-Ever AI Neural Network Made From 2D Materials 'Sees' Handwriting
Researchers have developed the world's first artificial intelligence neural network made from 2D materials, according to a recent study published in the journal Advanced Materials.
First-ever neural network AI made from 2D materials
Two-dimensional materials are only a few nanometers thick, or less, and often consist of a single sheet of atoms. The machine-vision processor built from such materials can capture, store, and recognize more than 1,000 different images, according to a blog post on Harvard University's website.
The first-ever neural network AI made with 2D materials comes from the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS), which collaborated with the Samsung Advanced Institute of Technology.
"This work highlights an unprecedented advance in the functional complexity of 2D electronics, said the Gordon McKay Professor of Electrical Engineering and Applied Physics at SEAS and the paper's senior author Donhee Ham. "We have performed both front-end optical image sensing and back-end image recognition in one, 2D, material platform."
2D material-based transistors still relatively primitive
Since graphene was discovered in 2004, researchers have sought new ways to harness the unique electronic and optoelectronic properties of atom-thin 2D semiconductors as the basis for a wide range of applications.
Transistors made from 2D materials have seen use in simple digital logic circuits and photodetectors, but large-scale integration for complex computing, such as AI, has so far remained out of reach.
As of writing, researchers have successfully integrated roughly 100 transistors made from 2D materials onto a single chip. For comparison, a standard silicon integrated circuit, like the ones in a smartphone, contains billions of transistors.
"Two-dimensional material-based devices exhibit various exciting properties, but low integration level has restricted their functional complexity," said Houk Jang, a SEAS research associate and first author of the recent paper, according to the Harvard blog post. "With 1,000 devices integrated on a single chip, our atomically thin network can perform vision recognition tasks, which is a remarkably advanced functionality of two-dimensional material-based electronics."
Three-atom-thick framework works like a human eye
The team of researchers used a 2D material known as molybdenum disulfide, a semiconductor just three atoms thick that interacts strongly with light. They organized these photosensitive transistors into what's called a crossbar array, a layout inspired by the neuronal connections of the human brain.
This deceptively simple framework lets the device work as both an eye that views an image and a brain that stores and identifies images at a glance.
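To see why a crossbar array lends itself to brain-like computing, consider an idealized model: each cross-point stores a conductance that acts like a synaptic weight, and applying voltages along the rows produces column currents that sum up to a matrix-vector product. The Python sketch below illustrates this principle under those simplifying assumptions; the array sizes and values are hypothetical, not the team's actual device parameters.

```python
import numpy as np

# Idealized model of a crossbar array doing in-memory computing.
# Each cross-point stores a conductance G[i, j] (the "synaptic weight").
# Applying input voltages V along the rows yields, by Ohm's and
# Kirchhoff's laws, column output currents I = G.T @ V.

rng = np.random.default_rng(0)

n_inputs = 64    # hypothetical: pixels of a small input image
n_outputs = 10   # hypothetical: one output line per image class

G = rng.uniform(0.0, 1.0, size=(n_inputs, n_outputs))  # stored conductances
V = rng.uniform(0.0, 1.0, size=n_inputs)               # input voltages from pixels

I = G.T @ V  # each output current sums V[i] * G[i, j] over all rows i
print("output currents per column:", I.round(3))
```

The point of the design is that the multiply-and-accumulate happens in the array itself, where the data is stored, rather than shuttling values to a separate processor.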

2D material-based AI converts images into electrical data
The front end of the instrument uses the crossbar array as an image sensor, taking in an image the way an eye does. The 2D material's photosensitivity enables the device to convert the image into electrical data and store it. On the flip side, the same crossbar array carries out networked computing on that electrical data to identify the image.
To test their method, the researchers exposed the device to 1,000 images of handwritten digits. The internal processor successfully recognized and identified the images with 94% accuracy.
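For intuition about what such a benchmark involves, here is a minimal software analogy in Python: a simple linear classifier trained on a standard handwritten-digit dataset, with roughly 1,000 held-out images for testing. The dataset, model, and split below are illustrative stand-ins, not the researchers' actual data, training procedure, or reported result.

```python
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Software analogy only: the linear weights here stand in for the
# conductance states the device stores in its crossbar array.
X, y = load_digits(return_X_y=True)          # 8x8 handwritten digits, 10 classes
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=1000, random_state=0)    # hold out ~1,000 images for testing

clf = LogisticRegression(max_iter=5000).fit(X_train, y_train)
acc = accuracy_score(y_test, clf.predict(X_test))
print(f"recognition accuracy: {acc:.1%}")
```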
"Through capturing of optical images into electrical data like the eye and optic nerve, and subsequent recognition of this data like the brain via in-memory computing, our optoelectronic processor emulates the two core functions of human vision," said co-author of the paper and SEAS graduate student Henry Hinton.
Next, the team plans to scale up the device to produce a high-resolution, 2D material-based imaging system. Advances like this one, sitting at the intersection of AI, graphene, and 2D materials science, could one day help lead to artificial vision, possibly even as a basis for future bipedal or human-like AI robots living among us like a second intelligent life-form. And we're here for it.