ISSN: 2161-0940
Perspective - (2022)
Microscopic examination of histologic brain sections gives crucial information about the nervous system's organization and structure, which is particularly useful for neurogenomics, transcriptomics, proteomics, and connectomics research. In this setting, the BRAIN Initiative focuses on finding unique anatomic and physiologic features of brain cells. Achieving this accurately requires a systematic approach to assigning cells and axonal projection targets to the appropriate brain regions. Unfortunately, using currently available methods (printed and digital brain atlases) to identify distinct brain regions in a tissue section under study (hereafter, the "experimental section") is time-consuming, inefficient, and prone to error.
An example from the GENSAT (Gene Expression Nervous System Atlas) project illustrates the problems experienced by researchers who use brain atlases to identify brain areas in experimental sections of the mouse brain. Several factors make it difficult to identify the regions of interest using commonly employed methods. First, an investigator must go back and forth between the plates of a mouse brain atlas, looking for the plate that most closely resembles the experimental section. The three-dimensional (3D) boundaries of the regions of interest must then be inferred and translated from the atlas's static medium to the experimental section, which may be viewed as an image or under a light microscope. The procedure must then be repeated for each experimental section under investigation. Second, the cutting angle of histologic sections can deviate from the canonical planes, making it difficult to find a mouse brain atlas plate whose scale and resolution match the experimental section image. When sectioning mouse brains, this problem is virtually inescapable. When an experimental section is not properly aligned with the atlas, matching an image to a plane in the atlas is difficult and results in highly subjective assessments. There is currently no freely available method for researchers to objectively and automatically match oblique histologic sections to the various planes of a brain atlas.
We developed a novel mouse brain navigation system (NeuroInfo) to overcome the challenges of subjective judgments in atlas referencing. It 1) automatically registers images of experimental mouse brain sections with a 3D digital mouse brain atlas that is essentially based on the third version of the Allen Mouse Brain Common Coordinate Framework (CCF v3); 2) retrieves graphical region delineations and annotations from this 3D digital mouse brain atlas; and 3) superimposes the two. The process of data annotation thus consists of two steps: section-to-atlas registration and atlas-based segmentation.
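To make the two-step workflow concrete, the sketch below illustrates section-to-atlas registration followed by atlas-based segmentation using the open-source SimpleITK toolkit as a stand-in. The file names, transform model, and optimizer settings are assumptions for illustration only; this is not NeuroInfo's own implementation.

```python
# Minimal sketch: register a 2D experimental section to a matching atlas plane,
# then transfer the atlas region labels onto the section (atlas-based segmentation).
import SimpleITK as sitk

# Hypothetical inputs: grayscale experimental section, a 2D plane resampled from
# the 3D atlas reference image, and the corresponding plane of region labels.
section = sitk.ReadImage("experimental_section.tif", sitk.sitkFloat32)
atlas_plane = sitk.ReadImage("atlas_plane.tif", sitk.sitkFloat32)
atlas_labels = sitk.ReadImage("atlas_plane_labels.tif")  # integer region IDs per pixel

# Step 1: section-to-atlas registration (affine transform, mutual information metric).
registration = sitk.ImageRegistrationMethod()
registration.SetMetricAsMattesMutualInformation(numberOfHistogramBins=64)
registration.SetOptimizerAsRegularStepGradientDescent(
    learningRate=1.0, minStep=1e-4, numberOfIterations=200)
registration.SetInterpolator(sitk.sitkLinear)
initial = sitk.CenteredTransformInitializer(
    section, atlas_plane, sitk.AffineTransform(2),
    sitk.CenteredTransformInitializerFilter.GEOMETRY)
registration.SetInitialTransform(initial, inPlace=False)
transform = registration.Execute(section, atlas_plane)

# Step 2: atlas-based segmentation - resample the atlas labels onto the section grid
# with nearest-neighbor interpolation so that region IDs are preserved exactly.
labels_on_section = sitk.Resample(
    atlas_labels, section, transform, sitk.sitkNearestNeighbor, 0)
```

The resulting label image can then be overlaid on the experimental section so that every pixel carries the identity of the atlas region it falls in, which is the superposition step described above.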
The term "3D digital mouse brain atlas" refers to a collection of 3D digital documents that include anatomic mouse brain images and associated differentiations, as well as the following elements: a 3D reference image embedded in a physical coordinate system, spatially-aligned 3D annotations, and a related ontology that describes the relationship between the regions. The CCF v3 3D reference picture is based on a 3D, 10 m isotropic, highly detailed population average of 1675 mouse brains photographed with serial two-photon tomography on a TissueCyte 1000 system and aligned to a common frame. The CCF v3 3D annotations provide delineations of 738 areas based on multimodal references and the associated ontology.
We present a detailed description of NeuroInfo's major components and their functionality, as well as the results of extensive validation studies performed on mouse brains generated in two different laboratories and imaged with either fluorescence or bright-field microscopy.
Citation: Saga Y (2022) A Short Note on Automatic Navigation System for the Mouse Brain. Anat Physiol. S8:383.
Received: 22-Apr-2022, Manuscript No. APCR-22-17334; Editor assigned: 25-Apr-2022, Pre QC No. APCR-22-17334 (PQ); Reviewed: 09-May-2022, QC No. APCR-22-17334; Revised: 16-May-2022, Manuscript No. APCR-22-17334 (R); Published: 23-May-2022, DOI: 10.35248/2161-0940.22.12.383
Copyright: © 2022 Saga Y. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.