BigNeuron: Revolutionizing Neuron Reconstruction with AI

Summary: Researchers have published a paper detailing BigNeuron, an initiative that seeks to establish standard methods for fast, accurate automated neuron reconstruction using deep learning algorithms.

The project will offer an extensive set of publicly accessible neuron reconstruction images and robust tools for independent analysis, which could help researchers understand how the brain functions and changes over time.

Key Facts:

  1. BigNeuron is an international initiative involving computer scientists and neuroscientists from multiple institutions, aiming to create a standard framework for automatic neuron reconstruction.
  2. The project will provide a vast, publicly available dataset of neural reconstruction images, along with robust tools for analysis.
  3. The team has developed an automated algorithm using deep learning to discern the shape of each neuron in an image, overcoming challenges of species diversity, brain location, developmental stages, and varying image set quality.

Source: Texas A&M

Dr. Shuiwang Ji, a professor in the Department of Computer Science and Engineering at Texas A&M University, is part of a collaborative research community that recently had its paper titled “BigNeuron: a resource to benchmark and predict performance of algorithms for automated tracing of neurons in light microscopy datasets” published in the April issue of the journal Nature Methods.

Initiated in 2015 and led by the Allen Institute for Brain Science, BigNeuron is an international initiative that brings together computer scientists and neuroscientists from a dozen institutions.

Its goal is to develop a standard framework to help researchers define the best methods and algorithms for fast and accurate automatic neuron reconstruction. Then it will “bench test” the algorithms on large-scale datasets of images using supercomputers.


The project will result in a large set of publicly available neural reconstruction data images, along with robust tools and algorithms researchers can use for their own analysis work.

In the human brain alone there are roughly 86 billion neurons, each connected to others via thousands of thin “branches” that form a 3D treelike structure.

To understand how the brain functions and changes over time, scientists must be able to digitally reconstruct these neuronal structures to figure out the shape of each neuron in an image.

Scientists have used high-resolution microscopes to capture 3D images of individual neurons, and for nearly 40 years they have worked on developing fully automated neuron reconstruction methods.

Reconstructing these structures has remained a challenge, however, due to differences in species, brain region, developmental stage, and the quality of the microscopy image sets.

These factors make it difficult for existing algorithms to generalize effectively when they’re applied to volumes of images obtained by different labs.

To mitigate this problem, the team developed an automated algorithm using deep learning to figure out the shape of each neuron inside a particular image.
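
The article does not describe the network architecture itself, but the core idea of deep learning segmentation can be illustrated with a toy model. The Python sketch below (using PyTorch; the layer sizes and training-free setup are illustrative assumptions, not the team's actual model) maps a 3D image volume to a per-voxel probability that each voxel belongs to a neuron.

```python
# Minimal sketch of voxel-wise neuron segmentation with a small 3D CNN.
# The BigNeuron models are not specified in this article; the architecture
# and channel counts below are illustrative assumptions only.
import torch
import torch.nn as nn

class TinySegNet3D(nn.Module):
    """Toy 3D CNN: image volume in, per-voxel neuron logit out."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv3d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv3d(16, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv3d(16, 1, kernel_size=1),  # one logit per voxel
        )

    def forward(self, x):
        return self.net(x)

model = TinySegNet3D()
volume = torch.randn(1, 1, 32, 64, 64)    # (batch, channel, z, y, x)
mask_prob = torch.sigmoid(model(volume))  # neuron probability per voxel
print(mask_prob.shape)                    # torch.Size([1, 1, 32, 64, 64])
```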

About this AI and neuroscience research news

Author: Lesley Henton
Source: Texas A&M
Contact: Lesley Henton – Texas A&M
Image: The image is credited to Neuroscience News

Original Research: Closed access.
“BigNeuron: a resource to benchmark and predict performance of algorithms for automated tracing of neurons in light microscopy datasets” by Shuiwang Ji et al. Nature Methods


Abstract

BigNeuron is an open community bench-testing platform with the goal of setting open standards for accurate and fast automatic neuron tracing. We gathered a diverse set of image volumes across several species that is representative of the data obtained in many neuroscience laboratories interested in neuron tracing.

Here, we report gold standard manual annotations generated for a subset of the available imaging datasets, and we quantified tracing quality for 35 automatic tracing algorithms. The goal of generating such a hand-curated diverse dataset is to advance the development of tracing algorithms and enable generalizable benchmarking.

Together with image quality features, we pooled the data in an interactive web application that enables users and developers to perform principal component analysis, t-distributed stochastic neighbor embedding, correlation and clustering, visualization of imaging and tracing data, and benchmarking of automatic tracing algorithms in user-defined data subsets. The image quality metrics explain most of the variance in the data, followed by neuromorphological features related to neuron size.
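
As a rough illustration of the analyses the web application exposes, the Python sketch below runs PCA, t-SNE, and k-means clustering over a synthetic feature table; the feature values are random stand-ins for the image quality and morphology features described above, not real BigNeuron data.

```python
# Sketch of PCA, t-SNE, and clustering over a synthetic feature table.
# Feature values here are random placeholders, not BigNeuron measurements.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.manifold import TSNE
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
features = rng.normal(size=(200, 12))   # 200 reconstructions x 12 features

pca = PCA(n_components=2).fit(features)
coords_pca = pca.transform(features)
print("variance explained:", pca.explained_variance_ratio_)

coords_tsne = TSNE(n_components=2, perplexity=30,
                   random_state=0).fit_transform(features)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(features)
```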

We observed that diverse algorithms can provide complementary information to obtain accurate results and developed a method to iteratively combine methods and generate consensus reconstructions.

The consensus trees obtained provide estimates of the neuron structure ground truth that typically outperform single algorithms in noisy datasets. However, specific algorithms may outperform the consensus tree strategy in specific imaging conditions.
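
As a loose analogy for the consensus idea (the paper's actual method iteratively combines tracings into consensus trees, which is not reproduced here), the following Python sketch takes a majority vote over binary voxel masks produced by several hypothetical tracers.

```python
# Toy illustration of consensus by majority vote over voxels.
# The paper builds consensus *trees* iteratively; this voxel-level
# vote is a simplified stand-in for that idea.
import numpy as np

def consensus_mask(tracings, threshold=0.5):
    """tracings: list of same-shape binary voxel masks, one per algorithm."""
    stack = np.stack(tracings).astype(float)
    vote = stack.mean(axis=0)   # fraction of algorithms marking each voxel
    return vote >= threshold    # keep voxels most algorithms agree on

rng = np.random.default_rng(1)
masks = [rng.random((16, 16, 16)) > 0.7 for _ in range(5)]
print(consensus_mask(masks).sum(), "voxels in consensus")
```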

Finally, to aid users in predicting the most accurate automatic tracing results without manual annotations for comparison, we used support vector machine regression to predict reconstruction quality given an image volume and a set of automatic tracings.
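
The abstract names support vector machine regression for this prediction step; a minimal Python sketch using scikit-learn's SVR might look like the following, where the features and quality scores are synthetic inventions for illustration.

```python
# Sketch of predicting tracing quality with support vector regression.
# Features and quality scores below are synthetic, not BigNeuron data.
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
X = rng.normal(size=(300, 8))                    # image + tracing features
y = X @ rng.normal(size=8) + rng.normal(scale=0.1, size=300)  # quality score

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = SVR(kernel="rbf", C=1.0).fit(X_train, y_train)
print("R^2 on held-out data:", model.score(X_test, y_test))
```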
