Mobile Microscopes to Study Complex Brain Activity

By building a microscope small enough to be carried on a rat's head, scientists at the Max Planck Institute for Biological Cybernetics in Tübingen, Germany, have found a way to study the activity of many brain cells simultaneously while the animal is free to move around. With this new technology, scientists can watch brain cells operate while the animal behaves naturally, opening the way to new insights into perception and attention. (PNAS, Online Early Edition, November 2, 2009)

New data from rats with head-mounted microscopes shed light on how we put the world together seamlessly while we move around. Image: MPI for Biological Cybernetics

Most of our lives are spent moving through a largely static world, and we build our impression of that world from vision and the other senses working together. The ability to freely explore our environment is essential to the view we form of our local surroundings. When we walk down the street and enter a shop to buy fruit, the street, shop and fruit are not moving; we are. The brain most likely keeps updating our position, in real time, from sensory inputs such as the eyes, ears and skin, together with the motor and vestibular systems. The problem for researchers trying to understand how this occurs has always been how to record meaningful signals from the brain cells that do these calculations while the animal is in motion.

To get around this problem, researchers at the Max Planck Institute for Biological Cybernetics in Tübingen have developed a way of watching the activity of many brain cells simultaneously in an animal that is free to move around its environment. Using a small, lightweight laser-scanning microscope, they were able, for the first time, to image activity from fluorescent neurons in animals that were awake and moving, while tracking the animal's exact position in space. The microscope uses a high-powered pulsed laser and fiber optics to scan cells below the surface of the brain, eliminating the need to insert the electrodes traditionally used for such recordings; as a result, the measurement itself does not penetrate the brain tissue.
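To illustrate the kind of data such an instrument produces, here is a minimal sketch of how fluorescence traces from imaged neurons could be converted to relative signal changes (ΔF/F) and aligned with the animal's tracked position. This is not the authors' actual pipeline; the array shapes, sampling rates and function names are illustrative assumptions.

import numpy as np

def delta_f_over_f(traces, baseline_percentile=20):
    # Baseline F0 taken as a low percentile of each cell's trace; a common
    # rough estimate when no separate baseline period is available.
    f0 = np.percentile(traces, baseline_percentile, axis=1, keepdims=True)
    return (traces - f0) / f0

def align_to_position(frame_times, positions, position_times):
    # Interpolate the tracked (x, y) position onto the imaging frame times.
    x = np.interp(frame_times, position_times, positions[:, 0])
    y = np.interp(frame_times, position_times, positions[:, 1])
    return np.column_stack([x, y])

# Synthetic example: 50 cells imaged at ~10 Hz for 60 s, position tracked at 25 Hz.
rng = np.random.default_rng(0)
raw = rng.poisson(100, size=(50, 600)).astype(float)   # cells x frames
frame_times = np.arange(600) / 10.0                     # seconds
track_times = np.arange(1500) / 25.0
track_xy = np.cumsum(rng.normal(0.0, 0.5, size=(1500, 2)), axis=0)

dff = delta_f_over_f(raw)
xy_at_frames = align_to_position(frame_times, track_xy, track_times)
print(dff.shape, xy_at_frames.shape)                    # (50, 600) (600, 2)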

The traditional approach to such questions is to restrain the animal and present it with a series of scenes, movies or images. The miniaturized microscope turns this paradigm around: the animal moves freely through its environment while the scientists monitor the activity of the brain cells that process visual information. Because the brain does not recognize its surroundings one cell at a time, the microscope records from many cells at once, making it possible for the first time to study how the brain generates an internal representation of the outside world during natural vision.
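Once many cells are imaged simultaneously in a moving animal, their joint activity can be related to behaviour. The sketch below is a hypothetical follow-on analysis, not one taken from the study: it correlates each cell's ΔF/F trace with the running speed derived from the tracked position, using synthetic stand-ins for the data.

import numpy as np

def running_speed(xy, frame_rate_hz):
    # Instantaneous speed from per-frame (x, y) positions, padded to full length.
    step = np.diff(xy, axis=0)
    speed = np.linalg.norm(step, axis=1) * frame_rate_hz
    return np.concatenate([speed[:1], speed])

def speed_correlation(dff, speed):
    # Pearson correlation between each cell's dF/F trace and running speed.
    dff_z = (dff - dff.mean(axis=1, keepdims=True)) / dff.std(axis=1, keepdims=True)
    spd_z = (speed - speed.mean()) / speed.std()
    return dff_z @ spd_z / speed.size

# Synthetic stand-ins for the dF/F traces and per-frame positions above.
rng = np.random.default_rng(1)
dff = rng.normal(0.0, 1.0, size=(50, 600))                        # cells x frames
xy_at_frames = np.cumsum(rng.normal(0.0, 0.5, size=(600, 2)), axis=0)

speed = running_speed(xy_at_frames, frame_rate_hz=10.0)
r = speed_correlation(dff, speed)
print("cells with r > 0.2 to running speed:", int(np.sum(r > 0.2)))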

"We need to let the animal behave as naturally as possible if we want to understand how its brain operates during interaction with complex environments. The new technology is a major milestone on the way to helping us understand how perception and attention work", says Jason Kerr, lead author of the study.
