Neuroscientists Are Using AI Models to Create "Toy Versions" of the Human Brain

Jack Crosbie
MARCH 29, 2018
cognitive neuroscience, machine learning, deep learning, hca news

Human beings have been cracking open heads to see what’s inside for thousands of years, but the brain remains modern medicine’s most complex mystery. New research, however, suggests that there may be a way to study the brain in a more controlled format than the human body—through a digital simulation.

Researchers at the 25th annual Cognitive Neuroscience Society meeting earlier this month reported significant progress in teaching a machine learning algorithm to analyze images and other data much as the human brain does.

In one study, the artificial intelligence (AI) simulation of a brain was shown 10 million different images of daily life, and it managed to learn some interesting patterns. Researchers initially thought it would pick up on common associations that people make every day, like how the shape of a bed or mattress is associated with a bedroom. But, surprisingly, the digital brain went even further, learning to recognize dogs, especially when they were in parks, and cats at rest in living rooms. A human brain learns to recognize the same patterns in infancy, hidden from view; with the digital model, researchers can watch every calculation and computation it makes, providing insight into how a real brain might function.

Still, it’s not perfect.

“The wiring between neurons in the human brain has a level of complexity that current neuroscience technology cannot measure yet,” Aude Oliva, PhD, the principal research scientist at MIT’s Computer Science and Artificial Intelligence Lab, told Healthcare Analytics News. “A computer model is a small approximation of some of the operations and the wiring, but it remains a toy version.”

Oliva described the neural network brain as a “sort of super-power statistician.” It’s primarily good at analyzing patterns and applying them to similar data. That means the algorithms can get pretty good at recognizing images and showing researchers what they have learned. (That sometimes yields surprising results, like the cat example, as researchers didn’t know the computer understood how to spot pets.) Still, “the notion of common sense, abstraction, theory of mind are cognitive concepts that cannot be captured by current computer models,” Oliva noted.
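The "statistician" idea can be illustrated with a deliberately tiny sketch: a model that simply counts how often each object co-occurs with each scene label, then predicts the most likely scene for an object. The data and function names here are hypothetical, and a real network learns such statistics from millions of images rather than a hand-written list, but the underlying principle, learning associations from observed frequencies, is the same.

```python
from collections import Counter, defaultdict

# Hypothetical toy data: each pair records an object spotted in an image
# together with that image's scene label.
observations = [
    ("bed", "bedroom"), ("bed", "bedroom"), ("bed", "hotel room"),
    ("dog", "park"), ("dog", "park"), ("dog", "living room"),
    ("cat", "living room"), ("cat", "living room"), ("cat", "kitchen"),
]

# Count how often each object appears with each scene.
cooccurrence = defaultdict(Counter)
for obj, scene in observations:
    cooccurrence[obj][scene] += 1

def most_likely_scene(obj):
    """Return the scene most frequently seen with this object, or None."""
    scenes = cooccurrence[obj]
    return scenes.most_common(1)[0][0] if scenes else None

print(most_likely_scene("bed"))  # bedroom
print(most_likely_scene("dog"))  # park
```

Like the network Oliva describes, this counter "knows" that beds go with bedrooms only as a statistical regularity in its training data; it has no common sense or abstraction beyond the patterns it has tallied.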

“The brain is a dynamical organ, ‘alive’ in the sense that it changes form (neural activity) all the time,” Oliva said in an email. “Characterizing the dynamics of the brain network is mandatory to know what the brain may calculate and when.”

The next steps, Oliva said, are to teach the network to analyze video and motion so it can recognize specific actions. Neuroscientists can then use the model to study how the human brain develops.

But the software also has immediate practical uses. Oliva said the visual pattern recognition could be used in cancer research, for example, to detect abnormalities in MRI images or X-rays. So while the digital brain may be a toy for neuroscientists, it could be a useful tool for doctors in general. Just don’t expect it to have any common sense yet.


