
Vision and Mission

Neuroimaging methods rely on accurate brain models as ground truth to develop reliable approaches for probing the brain. Researchers in brain simulation and AI have a growing need for detailed descriptions of the internal organisation of brain regions in terms of local morphology, cell densities, or connectivity. Currently established computer-based 3D neuroimaging tools cannot reproduce the anatomical detail visible in freshly cut brain tissue, particularly in highly convoluted cortical regions and in subcortical areas. With the advent of the BigBrain, a human post-mortem brain that has been sectioned, stained for cell bodies, scanned at very high resolution, and digitally reconstructed in 3D (Amunts, Evans et al. 2013), we believe that the precision and quality of neuroimaging support for qualitative and quantitative investigation of the brain will improve substantially. We aim to extend this model by further increasing its resolution and integrating multimodal data, working closely with the neuroimaging, brain modelling, and AI communities to unleash its potential for research. To enable collaboration, we are building a transcontinental data-sharing and computing platform in close interaction with the European "Human Brain Project" and the Canadian "Healthy Brains for Healthy Lives" (HBHL) program.

The BigBrain Goals

Building a highly detailed multimodal atlas

By developing novel AI-based methods for analyzing high-resolution image data, we create highly detailed maps of brain areas, cortical layers, and subcortical structures for our microscopic brain models. We complement the cytoarchitectonic model with modalities that cover fibre- and chemoarchitecture, working towards a multimodal characterization of the human brain at the microscopic scale. Furthermore, we work towards a model with a resolution in the 1-micron range, which resolves individual neuronal cell bodies.
Build on the existing 3D BigBrain as a microscopic reference template

We integrate our ongoing developments with the original BigBrain model (Amunts, Evans et al. 2013) in order to foster a common microscopic reference template. This includes efforts on careful spatial registration across subjects and modalities, and propagation of maps of brain regions.
Building a distributed platform for big neuroscience data

By increasing the resolution to the level of individual neuronal cell bodies, the size of whole-brain models reaches the petabyte scale. To address this Big Data challenge, we establish specific workflows for distributed data management and processing, and tools for remote visualization and annotation of 2D and 3D data. This also requires standardization efforts for data and metadata formats. We align these infrastructure developments with those of the Human Brain Project and the HBHL program to maximize compatibility with these international initiatives.
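The petabyte figure follows from simple voxel arithmetic. A back-of-envelope sketch (the one-terabyte size of the current 20 µm dataset and the one-byte-per-voxel assumption are round numbers for illustration, not exact figures):

```python
# Back-of-envelope: how data volume scales with resolution.
# Assumes ~1 TB for the current 20 um dataset (roughly one byte per voxel).

current_size_tb = 1.0   # approximate size of the 20 um BigBrain volume
current_res_um = 20     # current isotropic resolution (micrometres)
target_res_um = 1       # target resolution resolving single cell bodies

# Shrinking the voxel edge length increases the voxel count per axis,
# so total voxels scale with the cube of the resolution ratio.
scale = (current_res_um / target_res_um) ** 3   # 20**3 = 8000

projected_size_pb = current_size_tb * scale / 1000   # TB -> PB
print(f"Scale factor: {scale:.0f}x")
print(f"Projected size at 1 um: ~{projected_size_pb:.0f} PB")
```

This is why a 1-micron whole-brain model lands firmly in petabyte territory even under conservative assumptions.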
AI link to data science and brain-inspired computing

The BigBrain project is intertwined with AI research in two fundamental ways: by developing novel image-analysis methods based on machine and deep learning, and by contributing knowledge about the microstructural organization of biological neural networks in the brain. The project initiates early cooperation with researchers in brain-inspired AI and computing to incorporate information about the human brain into the design of artificial systems.
Virtual BigBrain and simulation platform

The Virtual Brain (TVB) uses empirical knowledge of brain structure to constrain models that simulate network dynamics, which in turn can be validated against experimental observations of brain function. Complementing related work in the Human Brain Project, we support the development of TVB models with increased spatial resolution and regional heterogeneity by informing them with microscopic-resolution data, and by evaluating optimized software and hardware environments for the CBRAIN platform.
Open Community Resource

The BigBrain Project continues to build an open resource and a lively scientific community within a collaborative research and training environment. We will share our datasets and tools through the platform, aim to integrate them with the EBRAINS and CBRAIN ecosystems, and engage with the BigBrain user and contributor community through open project meetings, workshops, and online services.

History of BigBrain

1. What is the BigBrain

BigBrain is a freely accessible, high-resolution 3D digital model of the human brain, released in June 2013 by a team of researchers at the Montreal Neurological Institute (Canada) and Forschungszentrum Jülich (Germany). The isotropic 3D spatial resolution of the BigBrain atlas is 20 µm, much finer than that of other models. In 2014, MIT Technology Review listed BigBrain among its Top 10 Breakthroughs. Since then, it has been continuously enriched with detailed maps of cortical areas and layers as well as subcortical regions, and it has become the basis of many neuroscientific studies.

The body donor gave written informed consent for the general use of the post-mortem tissue used in this study for research and education. The usage is covered by a vote of the ethics committee of the medical faculty of Heinrich Heine University Düsseldorf (#4863).

Using a microtome, the paraffin-embedded brain was cut into 7,404 histological sections, each 20 micrometers thick.
Copyright: Amunts, Zilles, Evans et al.

2. Acquisition, histological processing, and digitization at 20 μm

The 3D model was created from the brain of a 65-year-old body donor. The post-mortem brain was obtained in accordance with the ethical requirements of the University of Düsseldorf. It was fixed in formalin and imaged in 2003 using an MR scanner. In 2005, it was cut into 7,404 sections, each 20 µm thick, using a large-scale microtome. The brain sections were placed on glass slides and stained for cell bodies using the Merker method, resulting in darkly stained neurons and light neuropil, i.e. the space between neurons filled with the branches of the cells and synapses. The stained sections were scanned and digitized using a flatbed scanner at 2400 dpi (approximately 10 µm), creating a one-terabyte raw dataset. The acquisition process (wet work plus scanning) took about 1,000 hours of labor.
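The correspondence between scanner resolution and pixel size is a one-line conversion (25.4 mm per inch), which shows why 2400 dpi yields pixels of roughly 10 µm. A small sketch (the helper name `dpi_to_um` is ours, not from any scanner software):

```python
# Convert scanner resolution (dots per inch) to pixel edge length in micrometres.
MM_PER_INCH = 25.4

def dpi_to_um(dpi: float) -> float:
    """Pixel edge length in micrometres for a given scan resolution in dpi."""
    return MM_PER_INCH / dpi * 1000  # mm -> um

print(f"{dpi_to_um(2400):.2f} um per pixel at 2400 dpi")  # ~10.58 um
```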

3. Manual repair of tissue artefacts in the images

The resulting digital images were carefully processed by a team of operators to identify and manually fix artifacts such as tears and folds, which arise inevitably during histological processing. The manual repair process was complemented by automated repairs, carefully blending information from neighbouring sections to replace missing pieces of brain tissue where required.

Digital rendering with CIVET
OHBM poster from 2016

4. 3D reconstruction using image processing with HPC

The corrected digital sections were nonlinearly registered to the post-mortem MRI in an iterative process and reassembled into a consistent 3D computer model with an isotropic spatial resolution of 20 µm. The aligned images were optically balanced to remove colouring artifacts from the staining. Dealing with more than 7,000 images, this was a highly complex and compute-intensive process that would not have been possible without supercomputers. Including all manual work, the 3D reconstruction took five years to complete.

5. Rendering and annotating the BigBrain

Standard tools for volumetric image rendering and annotation were not applicable to the BigBrain due to its size: a terabyte of image data does not fit into a typical computer's working memory. Specific software had to be developed that provides very efficient loading and rendering of the image data to visualize the BigBrain and allow smooth interaction and annotation in 3D space. The Atelier3D software developed by the National Research Council Canada (NRC) was one of the few tools available at that time able to render and interactively annotate such huge image volumes, and it was specifically adapted for brain navigation and mapping.
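The core idea behind such out-of-core viewers can be illustrated with the Python standard library alone: memory-map a raw volume file and read only the voxels needed, instead of loading the whole array into RAM. This is a generic sketch (the file layout, dimensions, and helper names are hypothetical and have nothing to do with the actual Atelier3D implementation):

```python
import mmap
import os
import tempfile

# Hypothetical raw 8-bit volume stored x-fastest (z slowest), standing in
# for a terabyte-scale dataset. Dimensions and layout are illustrative only.
NX, NY, NZ = 64, 64, 64

def read_voxel(mm: mmap.mmap, x: int, y: int, z: int) -> int:
    """Fetch one voxel by computing its byte offset; only the touched
    memory page is loaded, not the whole volume."""
    return mm[(z * NY + y) * NX + x]

# Write a small demo volume where each voxel stores (x + y + z) % 256.
path = os.path.join(tempfile.mkdtemp(), "demo_volume.raw")
with open(path, "wb") as f:
    for z in range(NZ):
        for y in range(NY):
            f.write(bytes((x + y + z) % 256 for x in range(NX)))

# Memory-map the file and read single voxels without loading it all.
with open(path, "rb") as f:
    mm = mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ)
    print(read_voxel(mm, 10, 20, 30))  # -> 60
    mm.close()
```

Real viewers add multi-resolution pyramids and caching on top of this basic random-access pattern.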

6. Analysis: Mapping of the BigBrain

We have applied different approaches to map cortical areas and subcortical nuclei in the BigBrain. For the cerebral cortex, we applied image analysis and statistical tools to identify borders between areas (Schleicher et al., 2005), while nuclei were manually delineated. Most recently, automatic methods from artificial intelligence (machine learning and deep learning) have been applied to map brain areas (Spitzer et al.) and to extract cortical layers or count groups of cells (Wagstyl et al.). Check out recent work on our news feed!

7. Dissemination: Sharing the atlas with the world

The BigBrain image datasets are currently hosted by the project on an FTP server. The model has been integrated as a core reference space of the multilevel human brain atlas of the Human Brain Project. In the future, data sharing will be offered through a fully versioned online system developed within the framework of HIBALL. In Summer 2017, the first version of the Human Brain Project's web-based 3D atlas viewer was released, building on Google's open-source project Neuroglancer. It is capable of displaying very large brain volumes, including oblique slicing, a whole-brain overview, surface meshes, and maps. It enables navigating the BigBrain in 3D, exploring the growing set of highly detailed maps of cortical layers and cytoarchitectonic areas, and finding related neuroscience data. BigBrain can be explored in the viewer on the Human Brain Project website.

The BigBrain Featured

Major publications featuring the dataset

Science Magazine

BigBrain: An Ultrahigh-Resolution 3D Human Brain Model

The original publication of the BigBrain

MIT Technology Review

One of the Top 10 Breakthroughs of 2014

MIT Technology Review listed the BigBrain as 1 of the Top 10 Breakthroughs of 2014.

BigBrain team

The people from the original project

Katrin Amunts
Director, Institute Structural and functional organisation of the brain (INM-1)
Forschungszentrum Jülich
Director, C. and O. Vogt-Institute of Brain Research
Universitätsklinikum Düsseldorf
Claude Lepage
Development and programming of image-processing tools for 3D reconstruction/alignment and automatic repair pipelines, manual repairs, registration, quality control and documentation
McGill Centre for Integrative Neuroscience
Louis Borgeat
Volume data modeling and integration, development of the Atelier3D remote visualization and analysis tools
National Research Council of Canada
Hartmut Mohlberg
Institute structural and functional organisation of the brain (INM-1)
Forschungszentrum Jülich
Timo Dickscheid
Institute structural and functional organisation of the brain (INM-1)
Forschungszentrum Jülich
Marc-Étienne Rousseau
Computing platform manager
McGill Centre for Integrative Neuroscience
Sebastian Bludau
Institute structural and functional organisation of the brain (INM-1)
Forschungszentrum Jülich
Pierre-Louis Bazin
Senior Researcher
Lindsay B. Lewis
Manual repairs, quality control and documentation
McGill Centre for Integrative Neuroscience
Ana-Maria Oros-Peusquens
Institute structural and functional organisation of the brain (INM-1)
Forschungszentrum Jülich
Nadim J. Shah
Director of the Institute Medical Imaging Physics (INM-4)
Forschungszentrum Jülich
Thomas Lippert
Director of the Institute for Advanced Simulation, Head of Jülich Supercomputing Centre
Forschungszentrum Jülich
Karl Zilles
Institute structural and functional organisation of the brain (INM-1)
Forschungszentrum Jülich
former Director of the Institute of Neuroscience and Medicine
former Director of the C. and O. Vogt-Institute of Brain Research
Alan Evans
James McGill Professor of Neurology, Psychiatry, and Biomedical Engineering
McGill University

BigBrain FAQ

Frequently asked questions about the BigBrain

  • Q:

What is the BigBrain?

    A: The BigBrain is the brain of a 65-year-old man with no known neurological or psychiatric diseases in the clinical records at time of death. The brain was embedded in paraffin, sectioned into 7,404 coronal histological sections (20 µm thick), and stained for cell bodies. The BigBrain is the digital 3D reconstruction of these high-resolution histological sections at 20 µm isotropic resolution.

  • Q:

How do I cite the BigBrain?

    A: When referring to the BigBrain Project or using the data, please cite the original Science 2013 publication:

    Amunts K, Lepage C, Borgeat L, Mohlberg H, Dickscheid T, Rousseau M-É, Bludau S, Bazin PL, Lewis LB, Oros-Peusquens AM, Shah NJ, Lippert T, Zilles K, Evans AC. BigBrain: An ultrahigh-resolution 3D human brain model. Science. 2013; 340(6139):1472-1475. doi: 10.1126/science.1235381. PMID: 23788795.

  • Q:

What is the ethics statement for the source brain?

    A: The body donor gave written informed consent for the general use of the post-mortem tissue used in this study for research and education. The usage is covered by a vote of the ethics committee of the medical faculty of Heinrich Heine University Düsseldorf (#4863).

  • Q:

    What are all the volumes listed in the table?

    A: The volumes are represented either in stereotaxic space (MNI-ICBM152 or MNI-SYN24) or in native histological space. The stereotaxic registration is not perfect, but it is very good. The templates for registration (ICBM152 and SYN24) are made available in those tables. The BigBrain volumes are offered at 100, 200, 300, and 400 microns isotropic, in both MINC (.mnc) and NIfTI (.nii) formats.

  • Q:

    Why is the aligned BigBrain showing such strong asymmetry?

    A: The asymmetry results from aligning the histological sections to the MRI of the brain taken after it was extracted from the skull and fixed in formalin. We do not have a post-mortem MRI of the undistorted brain inside the head.

  • Q:

    The intensities are wrong in the NIfTI volumes. How can I view them correctly?

    A: There was a problem with the initial data conversion to the NIfTI format. The NIfTI volumes were replaced on September 3, 2013. You will have to download the new volumes to view them in fslview, MRIcron, or AFNI.

  • Q:

    How can I download the MINC volumes? The files appear incomplete.

    A: Your browser may be limiting the maximum size of the file to transfer, which is problematic for the 100-micron MINC volumes. Downloading from the anonymous FTP server with a command-line or GUI FTP client, rather than a web browser, avoids this limit.

  • Q:

    How can I view the MINC volumes?

    A: MINC is an imaging format developed at the MNI. To obtain binaries (mostly Linux and OSX) of the MINC tools, download the MINC Tool Kit. The viewers are called register and Display.

  • Q:

    How do I view the volumes online?

    A: The web-based viewer is TissueStack. As you will soon discover, there is a limitation in the zoom factor in TissueStack.

  • Q:

    How can I view the OBJ surfaces?

    A: The OBJ surfaces are viewable in BrainBrowser, the SurfStat (MATLAB-based) toolbox, or brain-view, or can be overlaid on the corresponding volume in Display.

  • Q:

    Is there an MRI of the BigBrain?

    A: Yes, there is an MRI of the fixed brain (removed from the skull). It has an isotropic voxel size of 0.444 mm and is available in the Raw_Data/MRI subdirectory. Note that the current version is in "processing space", with the y and z axes flipped.
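If needed, the axis exchange can be undone in software after loading. A minimal pure-Python sketch of the index bookkeeping (this assumes a nested list indexed `vol[x][y][z]`; the actual transform for a given file may also require reversing an axis, so verify against a known anatomical landmark):

```python
def swap_yz(vol):
    """Return a copy of a nested-list volume with the y and z axes exchanged.

    Assumes vol is indexed vol[x][y][z]; the result is indexed out[x][z][y].
    Real tools handle this via header transforms, so this only illustrates
    the index arithmetic involved.
    """
    nx, ny, nz = len(vol), len(vol[0]), len(vol[0][0])
    return [[[vol[x][y][z] for y in range(ny)]
             for z in range(nz)]
            for x in range(nx)]

# Tiny 1 x 2 x 3 volume: shape becomes 1 x 3 x 2 after the swap.
vol = [[[1, 2, 3],
        [4, 5, 6]]]
print(swap_yz(vol))  # [[[1, 4], [2, 5], [3, 6]]]
```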

  • Q:

How can I view the volumes as shown in the BigBrain videos?

    A: The BigBrain videos were created using Atelier3D, licensed software that is currently not distributed. The volume read into Atelier3D is 20-micron isotropic, which is too big for file transfers. This is why reduced volumes at 100, 200, 300, and 400 microns have been created.

  • Q:

Is there sound in the BigBrain videos?

    A: No

  • Q:

    Is there a way to mass-download the data files?

    A: Yes. As of March 12, 2014, all sections and volumes are available on an anonymous FTP server located at the same address as this site. We recommend connecting to the FTP server with a command-line or GUI client (e.g., FileZilla), not a web browser. From the command line, run "ftp" with the server address, enter "anonymous" as the username and your e-mail address as the password, then use the command "ls" to list the contents, "cd" to change directory, and "get" to download files.

Additional Partners and Sponsors

Those that made the BigBrain possible