TrackFormer is a transformer-based approach to particle trajectory reconstruction. By applying the transformer's self-attention mechanism to track fitting, the model aims for both better accuracy and better efficiency than conventional methods.
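To make the idea concrete, the sketch below shows, in generic PyTorch, how a transformer encoder can map a sequence of detector hits to track parameters. It is purely illustrative: the layer sizes, the (x, y, z) hit features, and the regression head are assumptions, not the architecture used in this repository.

```python
# Conceptual sketch only: a transformer encoder over a sequence of hits,
# pooled into a fixed-size vector and regressed to track parameters.
import torch
import torch.nn as nn

class ToyTrackTransformer(nn.Module):
    def __init__(self, d_model=64, n_heads=4, n_layers=3, n_params=3):
        super().__init__()
        self.embed = nn.Linear(3, d_model)  # embed (x, y, z) hit coordinates
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, n_params)  # e.g. track curvature/angles

    def forward(self, hits):             # hits: (batch, n_hits, 3)
        h = self.encoder(self.embed(hits))
        return self.head(h.mean(dim=1))  # pool over hits -> track parameters

params = ToyTrackTransformer()(torch.randn(2, 20, 3))  # shape (2, 3)
```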
- Transformer-based architecture: fast, efficient, and accurate particle track fitting.
- Lightning integration: training and evaluation are built on PyTorch Lightning (see the CLI sketch after this list).
- Modular design: model, data, and training components are organised as separate, configurable modules.
- Logging and CLI integration: Weights & Biases logging and a command-line interface for training and testing.
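The `python main.py fit --config ...` commands below suggest that `main.py` wraps Lightning's `LightningCLI`. A minimal sketch of such an entry point is shown here; whether this repository's `main.py` passes explicit model/data classes or resolves them from the YAML configs is an assumption.

```python
# Minimal LightningCLI entry point sketch (this repository's main.py may differ).
from lightning.pytorch.cli import LightningCLI

def main():
    # With no classes passed, LightningCLI resolves the model, datamodule and
    # trainer settings from the YAML files given via --config, and exposes
    # subcommands such as `fit` and `test`.
    LightningCLI()

if __name__ == "__main__":
    main()
```

Multiple `--config` files are merged in order, which is how the wandb trainer config in the quickstart below layers on top of the model config.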
- Clone the repository.
- Download the dataset to a directory of your choice.
- Change into the directory containing the split script and split the dataset into train, validation, and test sets (the numbers are the train/val/test percentages):
./split_dataset.sh /path/to/downloaded/dataset 80 10 10
- Train the model:
python main.py fit --config configs/tformer.yaml
- Train the model with Weights & Biases (wandb) logging (see the sketch after these commands):
python main.py fit --config configs/tformer.yaml --config configs/trainer.yaml
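In the wandb variant above, `configs/trainer.yaml` presumably attaches a Weights & Biases logger to the trainer. The snippet below shows the equivalent in plain Lightning code; the project name is a placeholder.

```python
# Sketch: attaching a Weights & Biases logger to a Lightning Trainer.
# In the CLI workflow this is typically expressed as a trainer.logger entry
# in the trainer YAML config rather than in Python.
from lightning.pytorch import Trainer
from lightning.pytorch.loggers import WandbLogger

logger = WandbLogger(project="trackformer")  # placeholder project name
trainer = Trainer(logger=logger)
# trainer.fit(model, datamodule)  # model/datamodule come from the repository
```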
To reproduce the results, you can use the following commands.
Step 1: Clone the repository
git clone https://github.com/soot-bit/TrackFormer.git
Step 2: Create an environment using conda and install the dependencies
conda create --name TrackFormer python=3.10
conda activate TrackFormer
pip install -r requirements.txt
Step 3: Download train_1.zip from the TrackML dataset and extract it to the Data directory.
unzip train_1.zip -d Data/Tml/train_1
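After extraction, each event consists of four CSV files (hits, cells, particles, truth). As an optional sanity check you can load one event with the challenge organisers' `trackml` helper library; the library itself and the event id shown are assumptions, not part of this repository's instructions.

```python
# Optional sanity check using the TrackML helper library
# (https://github.com/LAL/trackml-library); the event id is illustrative.
from trackml.dataset import load_event

hits, cells, particles, truth = load_event("Data/Tml/train_1/event000001000")
print(hits.columns.tolist())  # hit_id, x, y, z, volume_id, layer_id, module_id
print(len(hits), "hits,", len(particles), "particles")
```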
Step 4: Split the dataset into train, validation, and test sets
cd Data/Tml
bash split_tml_inplace.sh train_1 80 10 10
cd ../..
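For reference, the split can be pictured as the following hypothetical Python: group the per-event CSVs by event id and move 80/10/10 of the events into train/val/test subdirectories. The actual `split_tml_inplace.sh` may organise the files differently.

```python
# Hypothetical illustration of an in-place 80/10/10 event split;
# the real split_tml_inplace.sh script may behave differently.
import random
import shutil
from pathlib import Path

src = Path("Data/Tml/train_1")
events = sorted({f.name.split("-")[0] for f in src.glob("event*-hits.csv")})
random.seed(0)
random.shuffle(events)

n = len(events)
splits = {
    "train": events[: int(0.8 * n)],
    "val":   events[int(0.8 * n): int(0.9 * n)],
    "test":  events[int(0.9 * n):],
}
for name, ids in splits.items():
    out = src / name
    out.mkdir(exist_ok=True)
    for ev in ids:
        for f in src.glob(f"{ev}-*.csv"):
            shutil.move(str(f), str(out / f.name))
```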
Step 5: Train the models
bash run_TrackML_example.sh
Step 6: To evaluate the models, update the model paths in the dataanalysis.ipynb cells and run the notebook.
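When updating the model paths, the notebook presumably loads trained Lightning checkpoints along the lines of the sketch below; the module class name and all paths are placeholders, not names taken from this repository.

```python
# Sketch of loading a trained checkpoint for evaluation; the class name and
# paths are placeholders to be replaced with the repository's actual ones.
from lightning.pytorch import Trainer
from src.model import TrackFormerModule  # placeholder import

ckpt_path = "lightning_logs/version_0/checkpoints/last.ckpt"  # placeholder path
model = TrackFormerModule.load_from_checkpoint(ckpt_path)
Trainer().test(model)  # runs the test loop with the configured metrics
```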