Scientists can now observe the brains of lab animals in microscopic detail while the animals are actively behaving. A technique called two-photon imaging, in particular, allows neuroscientists to watch thousands of neurons working in concert to encode information.
The trouble is, two-photon imaging requires the animal’s head to stay fixed in place. That would seem to preclude watching the brain as the animal does anything of much interest.
One creative solution is virtual reality, a computer-generated environment experienced through a headset. A few years ago neuroscientists started designing tiny virtual-reality systems to fool mice into thinking they were navigating a maze when they were really running on top of a large ball, their heads fixed in position.
Until now, however, mice needed weeks of training before they would run on the ball. Jeremy Freeman, working with colleague Nicholas Sofroniew and others at the HHMI Janelia Research Campus in Virginia, created a virtual maze the mice seem to understand right away: they navigate through virtual corridors without training. It is designed to exploit the way mice navigate in nature, Freeman says. Rather than relying primarily on their eyes, mice feel their way through the world largely with their whiskers.
In the whisker-oriented virtual reality, the walls move to give the mouse the illusion that it is running down winding corridors, he says. But the whole time, the rodent’s head is stationary.
This approach doesn’t translate neatly to the human world: mice rely heavily on their whiskers to get around, and the neural imaging requires genetically altering the animals to produce fluorescent proteins. Even so, this mouse-sized VR could shed plenty of light on autism and other conditions that affect decision-making, learning and the senses.