WELCOME TO THE KASTHURI CHALLENGE!

Inspired by the MTNeuro Benchmark Dataset found here, the Kasthuri Challenge introduces new annotations of synapses and membranes in neural tissue from a mouse cortex. BossDB, an open source volumetric database for 3D and 4D neuroscience data, and Neuroglancer were used to derive annotations from the dataset.

Links

Background

In the past decade, there have been major pursuits in understanding large-scale neuroanatomical structures in the brain. These ventures have produced a surplus of brain data that can potentially reveal new phenomena about brain organization. Many machine and deep learning tools are being developed to analyze it; however, there is still a need for new standards for understanding these large-scale brain datasets. To address this challenge, we introduce a new dataset, annotations, and tasks that provide a diverse approach to reading out information about brain structure and architecture. We adapted a previous multitask neuroimaging benchmark (MTNeuro), built on a volumetric, micrometer-resolution X-ray microtomography image spanning a large thalamocortical section of a mouse brain, as a baseline for our challenge. Our new standardized challenge (the Kasthuri Challenge) aims to generate annotations of a saturated reconstruction of a sub-volume of mouse neocortex imaged with a scanning electron microscope. Specifically, synapses and membranes are the regions of interest, as they provide the most insight into how machine and deep learning methods are able to pinpoint the unique connectivity at the microstructure level. Datasets, code, and pre-trained baseline models are provided at: TBD

Dataset Description

Kasthuri et al. 2015 - Mouse Cortex Microstructure

The dataset contains high-resolution images from the mouse cortex acquired at a voxel resolution of 3 x 3 x 30 nanometers. The total size of the dataset amounts to approximately 660 GB of images.


(Figure: somatosensory cortex region covered by the dataset)


This volumetric dataset provides detailed reconstructions of a sub-volume of mouse neocortex, encompassing all cellular objects like axons, dendrites, and glia, as well as numerous sub-cellular components. Notable among these are synapses, synaptic vesicles, spines, spine apparati, postsynaptic densities, and mitochondria.


(Figures: electron microscopy data from the kasthuri2015 dataset)


Exploration and Insights

By leveraging this dataset, the research team gained significant insight into the structural intricacies of neural tissue at nanometer resolution. A key finding was the refutation of Peters' rule: by tracing the pathways of all excitatory axons and examining their juxtapositions with every dendritic spine, the authors dispelled the notion that simple physical proximity suffices to predict synaptic connectivity.

The dataset and its associated labels are hosted publicly on BossDB. To access the data, you can utilize the Python API library, Intern. For anonymous read-only access, use the username "public-access" and password "public".
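
As a minimal sketch of pulling a cutout with intern (the BossDB URI below is a placeholder; take the real collection/experiment/channel path from the project page on BossDB):

from intern import array

# Placeholder URI: replace with the actual Kasthuri channel path listed on BossDB.
em = array("bossdb://<collection>/<experiment>/<channel>")

# Indexing is (z, y, x) and returns a numpy array.
cutout = em[1000:1010, 4000:4512, 4000:4512]
print(cutout.shape, cutout.dtype)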

More details can be found in the cited paper below.

Challenge Tasks

The Kasthuri Challenge comprises two segmentation tasks over a saturated reconstruction of a sub-volume of mouse neocortex imaged with a scanning electron microscope. Annotations of synapses and membranes are the regions of interest, as they offer the clearest view of how machine and deep learning methods pinpoint connectivity at the microstructure level:

  • Membranes: ground truth membrane segmentation
  • Synapses: ground truth synapse segmentation

Getting Started

Installation

To get started, clone this repository and change into its directory, then follow the steps below.

1. Create a virtual environment:

Run the following command to create a virtual environment named kasthuri_env:

python3 -m venv kasthuri_env

Activate the virtual environment:

source kasthuri_env/bin/activate

2. Installing the Packages:

Now, navigate to the directory where you have cloned the Kasthuri repository and run:

pip3 install -e ./
pip3 install -r requirements.txt

The code has been tested with the following versions (a quick way to check your own environment is sketched after the list):

  • Python >= 3.8
  • PIP == 22.1.2
  • torch == 1.11.0
  • torchvision == 0.12.0
  • numpy == 1.19.3
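
To confirm that your environment matches, you can print the installed versions from Python:

import sys
import numpy
import torch
import torchvision

# Compare these against the tested versions listed above.
print("Python     :", sys.version.split()[0])
print("torch      :", torch.__version__)
print("torchvision:", torchvision.__version__)
print("numpy      :", numpy.__version__)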

3. Installing with GPU Support (Optional)

For users with a compatible NVIDIA GPU, you can leverage accelerated training and inference using PyTorch on CUDA. Follow these steps for a seamless GPU setup:

a. Prerequisite GPU Libraries

Make sure an up-to-date NVIDIA driver and a CUDA toolkit compatible with the torch build below (CUDA 11.6 for torch==1.11.0+cu116) are installed on your system.

b. Modify setup.py

  1. Open setup.py.
  2. Look for the install_requires section.
  3. Comment out or remove the existing torch line.
  4. Add the specific version of torch built for GPU support:
    torch==1.11.0+cu116
    Note that CUDA-specific builds such as this are hosted on the PyTorch wheel index, so pip may need to be pointed at it (for example with --extra-index-url https://download.pytorch.org/whl/cu116).

c. Install Kasthuri Repository and Dependencies

  1. Navigate to the directory where you have cloned the Kasthuri repository.
  2. Install the Kasthuri package:
    pip3 install -e ./
  3. Install the dependencies in requirements.txt:
    pip3 install -r requirements.txt

d. Verify GPU Support

After installation, verify if PyTorch recognizes your GPU:

import torch
print(torch.cuda.is_available())
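
If this prints True, PyTorch can see your GPU. A common, repository-agnostic pattern for selecting the device in your own experiments is:

import torch

# Prefer the GPU when one is available, otherwise fall back to the CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
print(f"Using device: {device}")

# Models and batches can then be moved onto the selected device,
# e.g. model.to(device) and inputs.to(device).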

Code structure

Training Scripts

Code for executing training and evaluation of the baseline networks is provided for each task in the scripts folder.

These can all be run as scripts/script_name from the main repository folder.

These can be reconfigured for different networks using the configuration files in networkconfig.

This is the easiest way to build on the example code for network development. A PyTorch dataset is provided in bossdbdataloader and used in our example scripts.
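
The exact interface of bossdbdataloader is defined in this repository; purely as an illustrative sketch (the class name, URIs, and bounds below are placeholders, not the repository's actual API), a BossDB-backed dataset typically wraps intern arrays and serves 2D image/label slices:

import numpy as np
import torch
from torch.utils.data import Dataset
from intern import array

class BossDBSliceSketch(Dataset):
    """Illustrative only: serve 2D image/mask slices from BossDB volumes.

    This is not the repository's bossdbdataloader; replace the URIs and
    ranges with the values used in the networkconfig files.
    """

    def __init__(self, image_uri, mask_uri, z_range, y_range, x_range):
        self.images = array(image_uri)
        self.masks = array(mask_uri)
        self.z_range = z_range
        self.y_range = y_range
        self.x_range = x_range

    def __len__(self):
        return self.z_range[1] - self.z_range[0]

    def __getitem__(self, idx):
        z = self.z_range[0] + idx
        (ys, ye), (xs, xe) = self.y_range, self.x_range
        img = self.images[z:z + 1, ys:ye, xs:xe].astype(np.float32)
        msk = self.masks[z:z + 1, ys:ye, xs:xe].astype(np.int64)
        return torch.from_numpy(img), torch.from_numpy(msk)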

Instructions for adapting our test scripts to a new model can be found here.

To get started running examples, files in the scripts directory can be run as follows:

python3 scripts/task_membrane.py

or

python3 scripts/task_synapse.py

Available Network Configurations

The networkconfig directory contains the available network configuration files.

You can select any of these configurations with the --network argument, like so:

python3 scripts/task_membrane.py --network UNet_2D_attention.json
python3 scripts/task_synapse.py --network UNet_2D_attention.json

When run without arguments, the scripts load the default configuration files and the public authentication credentials. The training script will output trained network weights as a .pt file and produce output figures.
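
As a minimal sketch of reusing such a weights file (the model below is a stand-in, not one of the repository's networks, and the file name is a placeholder):

import torch
import torch.nn as nn

# Stand-in for a network built from a networkconfig file.
model = nn.Sequential(nn.Conv2d(1, 1, kernel_size=3, padding=1))

# Assuming the .pt file holds a state dict; adjust if the script saves the full model.
torch.save(model.state_dict(), "model_weights.pt")         # placeholder name
state = torch.load("model_weights.pt", map_location="cpu")
model.load_state_dict(state)
model.eval()

with torch.no_grad():
    dummy_batch = torch.zeros(1, 1, 64, 64)   # placeholder input slice
    print(model(dummy_batch).shape)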

Access with Jupyter Notebook

Access the notebooks for each task in the notebooks directory and run them cell by cell. By default, the code saves the cutouts as numpy tensors.
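
If you want to persist or reload those cutouts outside the notebooks, standard NumPy serialization works (the file name below is a placeholder):

import numpy as np

# `cutout` stands in for the array produced by a notebook cell.
cutout = np.zeros((10, 512, 512), dtype=np.uint8)

np.save("em_cutout.npy", cutout)      # write the cutout to disk
restored = np.load("em_cutout.npy")   # read it back later
assert restored.shape == cutout.shape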

Notebooks are also provided for running inference with our pretrained models, one for each model variation. Install the AWS CLI to download the pretrained model weights.

Citation

If you find this project useful in your research, please cite the following paper!

Contributions

Thank you to the Benchmark Team - Travis Latchman, Tanvir Grewal, Ashley Pattammady, Kim Barrios, Erik Johnson, and the MTNeuro team!

The Boss Legal Notes

Use or redistribution of the Boss system in source and/or binary forms, with or without modification, are permitted provided that the following conditions are met:

  1. Redistributions of source code or binary forms must adhere to the terms and conditions of any applicable software licenses.
  2. End-user documentation or notices, whether included as part of a redistribution or disseminated as part of a legal or scientific disclosure (e.g. publication) or advertisement, must include the following acknowledgement: The Boss software system was designed and developed by the Johns Hopkins University Applied Physics Laboratory (JHU/APL).
  3. The names "The Boss", "JHU/APL", "Johns Hopkins University", "Applied Physics Laboratory", "MICrONS", or "IARPA" must not be used to endorse or promote products derived from this software without prior written permission. For written permission, please contact BossAdmin@jhuapl.edu.
  4. This source code and library is distributed in the hope that it will be useful, but is provided without any warranty of any kind.