Cornell University researchers have created the MouseGoggles Duo 3D VR headset for laboratory mice using 3D-printed parts and open-source software running on a Raspberry Pi 4. Mice using the Duo see a virtual world of paths, obstacles, and rewards while running endlessly on top of a spherical treadmill. The headset eliminates the need for bulky video projectors and physical mouse mazes.
Mice are widely used in research labs to reveal how the brain works because they adapt quickly to unfamiliar environments and learn fast. These traits help researchers study neurodegenerative diseases such as Alzheimer's, which cause memory and motor-control loss in humans. Obstacles facing researchers include the time and cost of building physical mouse mazes and the difficulty of creating a convincing simulation in virtual reality.
The MouseGoggles Duo headset was designed with these factors in mind. The enclosure is 3D-printed for low cost and rapid design iteration. The case houses two 1.09 in. (2.76 cm) circular LED displays focused by 0.5 in. (1.27 cm) Fresnel lenses, providing a wide field of view (FOV) of 230 degrees horizontally and 140 degrees vertically. Because the headset is large relative to a mouse's head, it is mounted in front of the face of a head-fixed mouse during experiments.
A Raspberry Pi 4 single-board computer (SBC) runs 3D simulations generated by the open-source Godot game engine on top of Raspberry Pi OS. The system renders frames at 80 fps with an input-to-display latency under 130 ms for full-screen updates.
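As a rough sanity check on those figures: at 80 fps each frame has a 12.5 ms budget, so the 130 ms worst-case input-to-display latency corresponds to roughly ten frame periods. A minimal Python sketch of that arithmetic (the numbers come from the article; the variable names are illustrative):

```python
# Frame-budget arithmetic for the reported MouseGoggles Duo display figures.
FPS = 80          # reported render rate
LATENCY_MS = 130  # reported worst-case input-to-display latency

frame_period_ms = 1000 / FPS                    # time budget per rendered frame
frames_of_latency = LATENCY_MS / frame_period_ms  # latency expressed in frames

print(f"Frame period: {frame_period_ms:.1f} ms")      # 12.5 ms
print(f"Latency in frames: {frames_of_latency:.1f}")  # 10.4 frames
```

In other words, a full-screen update can trail the mouse's input by about ten rendered frames in the worst case.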
When mice were put through VR tests such as finding rewards, the researchers found that the MouseGoggles Duo's focus, VR object positioning, and other factors compared well with traditional projected displays. The mice's brains were monitored directly using two-photon calcium imaging of the visual cortex and electrophysiological recording of the hippocampus to capture validation data.
Readers who want to run endlessly in VR worlds can use VR treadmills like the Kat Walk C2 series sold on Amazon. Those who are too tired from walking all day long can relax with a pair of lightweight AR glasses like the Xreal AR glasses sold on Amazon. Readers who don't even want to lift a single finger can put their names on Elon Musk's Neuralink wait list to have a brain-computer interface (BCI) implanted so they can game and Tweet by thoughts alone.
Source(s)
MouseGoggles offer immersive look into neural activity
By David Nutt, Cornell Chronicle
December 18, 2024
Thanks to their genetic makeup, their ability to navigate mazes and their willingness to work for cheese, mice have long been a go-to model for behavioral and neurological studies.
In recent years, they have entered a new arena – virtual reality – and now Cornell researchers have built miniature VR headsets to immerse them more deeply in it.
Cornell researchers built miniature VR headsets to immerse mice more deeply in virtual environments that can help reveal the neural activity that informs spatial navigation and memory function.
The team’s MouseGoggles – yes, they look as cute as they sound – were created using low-cost, off-the-shelf components, such as smartwatch displays and tiny lenses, and offer visual stimulation over a wide field of view while tracking the mouse’s eye movements and changes in pupil size.
The technology has the potential to help reveal the neural activity that informs spatial navigation and memory function, giving researchers new insights into disorders such as Alzheimer’s disease and its potential treatments.
The research, published Dec. 12 in Nature Methods, was led by Chris Schaffer, professor of biomedical engineering in Cornell Engineering, and Ian Ellwood, assistant professor in neurobiology and behavior in the College of Arts and Sciences. The study’s lead authors are postdoctoral researcher Matthew Isaacson and doctoral student Hongyu Chang.
“It’s a rare opportunity, when building tools, that you can make something that is experimentally much more powerful than current technology, and that is also simpler and cheaper to build,” Isaacson said. “It’s bringing more experimental power to neuroscience, and it’s a much more accessible version of the technology, so it could be used by a lot more labs.”
Schaffer’s lab, which he runs with Nozomi Nishimura, associate professor of biomedical engineering, develops optics-based tools and techniques that can be used, along with other methodologies, to investigate the molecular and cellular mechanisms that contribute to loss of function in neurodegenerative diseases. One particular line of research has been studying the unexplained reductions in brain blood flow in mice with Alzheimer’s disease. By unblocking tiny capillaries and increasing that flow, the researchers have shown that memory function in mice improves within hours.
“That was very exciting from the perspective of, hey, maybe there is something you could do in Alzheimer’s disease that could recover some cognitive function,” Schaffer said. “The next steps are to uncover how blood flow improvements are improving the function of neurons in the brain. But to do those experiments, we needed new capabilities compared to what existed in the world before.”
About a decade ago, researchers began rigging up cumbersome – and quite costly – projector screens for mice to navigate virtual-reality environments, but the apparatuses are often clunky, and the resulting light pollution and noise can disrupt the experiments.
“The more immersive we can make that behavioral task, the more naturalistic of a brain function we’re going to be studying,” Schaffer said.
Isaacson, who previously designed display systems for fruit flies, set about assembling a stationary VR setup that would be simpler but even more immersive, so the mice could learn more quickly. It so happened that many of the components he needed – tiny displays, tiny lenses – were already commercially available.
“It definitely benefited from the hacker ethos of taking parts that are built for something else and then applying it to some new context,” Isaacson said. “The perfect size display, as it turns out, for a mouse VR headset is pretty much already made for smart watches. We were lucky that we didn’t need to build or design anything from scratch, we could easily source all the inexpensive parts we needed.”
The goggles aren’t wearable in the traditional sense. A mouse stands on a treadmill, with its head fixed in place, as it peers into a pair of eye pieces. The mouse’s neural activity patterns can then be fluorescently imaged.
Working with Ellwood’s lab, the team conducted a battery of tests on begoggled mice. On the neurological front, they examined two key regions of the mouse brain: the primary visual cortex, to ensure the goggles form sharp, high-contrast images on the retina; and the hippocampus, to confirm that the mouse brain successfully maps its virtual environment. Other tests were more tech-oriented, to see if the goggle displays updated quickly and were responsive to the mouse’s movements.
And most importantly, the researchers needed to observe how the mice behaved in their new eyewear. One of the most effective tests was tricking a mouse into believing that an expanding dark blotch was approaching it.
“When we tried this kind of a test in the typical VR setup with big screens, the mice did not react at all,” Isaacson said. “But almost every single mouse, the first time they see it with the goggles, they jump. They have a huge startle reaction. They really did seem to think they were getting attacked by a looming predator.”
The researchers received an unexpected contribution when they submitted their findings to Nature Methods. An anonymous reviewer pushed the researchers to add a set of cameras in each eye piece that could record the mouse’s pupils and verify the animal’s engagement and arousal.
The request was both a difficult task and a fortuitous blessing.
“They challenged us to do something really hard and make it all work,” Schaffer said. “In the last year, there’s been now three papers published with VR goggles for mice. You know, the field was ripe for this to happen. But we’re the only one with pupillometry and eye tracking, and that is a critical capability for much of neuroscience.”
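Pupillometry of this kind is commonly done by thresholding the dark pupil in each eye-camera frame and counting pixels, since pupil size tracks arousal. A minimal, hypothetical Python sketch of that idea (NumPy only; the threshold value and the toy image are illustrative assumptions, not details from the paper):

```python
import numpy as np

def pupil_area(frame: np.ndarray, threshold: int = 40) -> int:
    """Estimate pupil size as the count of dark pixels in a grayscale eye image.

    The pupil is the darkest region of the eye, so pixels below an intensity
    threshold are counted as pupil. Real pipelines typically also fit an
    ellipse to the dark region to reject eyelid shadows and corneal glints.
    """
    return int(np.count_nonzero(frame < threshold))

# Toy example: a bright 8-bit "eye image" with a dark 20x20 pupil patch.
eye = np.full((100, 100), 200, dtype=np.uint8)
eye[40:60, 40:60] = 10
print(pupil_area(eye))  # 400
```

Tracking this area frame by frame gives a simple engagement/arousal signal of the kind the reviewer asked the team to capture.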
The researchers are looking to further develop the goggles, with a lightweight, mobile version for larger rodents, such as tree shrews and rats, that can include a battery and onboard processing. Schaffer also sees the potential of incorporating more senses, such as taste and smell, into the VR experience.
“I think five-sense virtual reality for mice is a direction to go for experiments,” he said, “where we’re trying to understand these really complicated behaviors, where mice are integrating sensory information, comparing the opportunity with internal motivational states, like the need for rest and food, and then making decisions about how to behave.”
Co-authors include doctoral student Rick Zirkel; postdoctoral researcher Laura Berkowitz; and Yusol Park ’22 and Danyu Hu ’22.
The research was supported by the Cornell Neurotech Mong Family Fellowship program; the BrightFocus Foundation Alzheimer’s disease fellowship program; the Brain and Behavior Research Foundation; and the National Institutes of Health.