Researchers from the University of Washington have shown that human brain signals can be decoded at almost the speed of perception using brain implants and sophisticated software. The technology allowed the scientists to determine, from brain activity alone, which two-dimensional image a person was viewing on a page or computer screen.

The brain's response to each image was transformed almost instantly into a form the researchers could interpret. The experiment allowed the team to better understand a neurological process that had previously been a mystery to science.

The study, published in PLOS Computational Biology, involved seven epilepsy patients whose medication could no longer alleviate their seizures. The patients received temporary brain implants, with electrodes used to locate the focal points of their seizures, reports Gizmodo.

Since the patients would receive the electrodes regardless of the study's outcome, the team gave the seven patients extra tasks to do during their confinement. They were asked to view photos in random sequences showing houses, human faces and blank grey pages. The images were shown on computer screens at intervals of only 400 milliseconds.

The patients were instructed to look for a photo of an upside-down house. During this time, the electrodes in their brains were connected to software that extracted two distinct properties of the brain signals. The first, which the team called “event-related potentials,” occurred when large batches of neurons lit up simultaneously in response to an image. The second, called “broadband spectral” changes, consisted of signals that lingered after an image was viewed.
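The difference between the two signal types can be illustrated with a small sketch. This is not the paper's actual pipeline; the function name, time window and frequency band below are hypothetical, chosen only to show how an ERP-like feature (a voltage average time-locked to the image) differs from a broadband feature (power spread across high frequencies):

```python
import numpy as np

def extract_features(epoch, fs=1000, erp_window=(0, 300), band=(50, 150)):
    """Toy feature extraction for one electrode's epoch (hypothetical parameters).

    epoch: 1-D voltage trace sampled at fs Hz, time-locked to image onset.
    Returns (erp_amplitude, broadband_power).
    """
    t0, t1 = (int(ms * fs / 1000) for ms in erp_window)
    # ERP-like feature: mean time-locked voltage shortly after stimulus onset.
    erp_amplitude = float(np.mean(epoch[t0:t1]))

    # Broadband feature: mean spectral power in a high-frequency band.
    spectrum = np.abs(np.fft.rfft(epoch)) ** 2
    freqs = np.fft.rfftfreq(len(epoch), d=1.0 / fs)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    broadband_power = float(np.mean(spectrum[in_band]))
    return erp_amplitude, broadband_power
```

With a 400 ms epoch at 1,000 samples per second, this yields one pair of numbers per electrode per image.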

The computer sampled and digitised the incoming brain signals 1,000 times per second as the photos appeared on the monitor. The software then determined which combination of electrode locations and signal types correlated best with the images the patients saw.
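Finding the best-correlated electrode locations can be sketched as a simple ranking problem. The scoring rule here (absolute correlation between a channel's feature values and the image category) is a stand-in, not the study's exact method, and the function name is hypothetical:

```python
import numpy as np

def rank_channels(features, labels):
    """Rank electrode/feature columns by a simple class-separability score.

    features: (n_trials, n_channels) array of per-trial feature values.
    labels:   (n_trials,) array of 0/1 category labels.
    Returns channel indices sorted from most to least informative.
    """
    scores = []
    for ch in range(features.shape[1]):
        # Absolute correlation between this channel's feature and the label.
        r = np.corrcoef(features[:, ch], labels)[0, 1]
        scores.append(abs(r) if np.isfinite(r) else 0.0)
    return np.argsort(scores)[::-1]
```

A channel whose feature values track the stimulus category would be ranked first; an uninformative channel falls to the bottom.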

According to neuroscientist Rajesh Rao and neurosurgeon Jeff Ojemann, who led the study, the team received different responses from different electrode locations. In some cases, the responses were more sensitive to faces; in others, to houses, Rao says.

The scientists then trained the software and exposed the patients to a new set of images. Even though these photos had not been used in training, the computer predicted with 96 percent accuracy, and at almost the speed of perception, whether a patient was seeing a grey screen, a house or a face.
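The train-then-predict workflow described above can be sketched with a deliberately simple classifier. The nearest-centroid decoder below is only an illustrative stand-in for the study's statistical model; the class and method names are hypothetical:

```python
import numpy as np

class NearestCentroid:
    """Minimal nearest-centroid decoder (a stand-in for the study's classifier)."""

    def fit(self, X, y):
        # Learn one centroid per image category from labelled training trials.
        self.classes_ = np.unique(y)
        self.centroids_ = np.array([X[y == c].mean(axis=0) for c in self.classes_])
        return self

    def predict(self, X):
        # Assign each new trial to the category with the closest centroid.
        d = np.linalg.norm(X[:, None, :] - self.centroids_[None, :, :], axis=2)
        return self.classes_[np.argmin(d, axis=1)]
```

The key point mirrored here is that the model is fitted once on labelled trials and then applied to brain responses it has never seen, which is how the 96 percent figure was obtained.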

But the authors admit that despite the interesting results, the study was “exceptionally limited,” and that a much larger set of images spanning different categories would be a true test of the system. Once the system is refined, the team could use this kind of brain decoding to construct communication mechanisms for patients who are “locked in” by paralysis or stroke. It could also aid brain mapping, letting neuroscientists pinpoint in real time the areas of the brain responsible for certain types of information.