Adversarial-MidiBERT

Article: Zijian Zhao, "Adversarial-MidiBERT: Symbolic Music Understanding Model Based on Unbias Pre-training and Mask Fine-tuning" (arXiv:2407.08306)

Parts of our code are based on wazenmai/MIDI-BERT, the official repository for the paper "MidiBERT-Piano: Large-scale Pre-training for Symbolic Music Understanding".

1. Dataset

The datasets used in the paper are POP1K7, POP909, Pianist8, EMOPIA, and GiantMIDI.

You can refer to the details in our previous work, PianoBART. To run the model, you also need the dict file from the PianoBART repository.

2. Pre-train

python pretrain.py --dict_file <the dictionary in PianoBART>

To run the model, you need to place your pre-training data in ./Data/output_pretrain.
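
A minimal example invocation, assuming the PianoBART dictionary has been downloaded to ./Data/CP.pkl (a hypothetical path; point --dict_file at the actual dict file from PianoBART) and the pre-training data is already in ./Data/output_pretrain:

python pretrain.py --dict_file ./Data/CP.pkl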

3. Fine-tune

python finetune.py --dict_file <the dictionary in PianoBART> --task <task name> --dataset <dataset name> --dataroot <dataset path> --class_num <class number> --model_path <pre-trained model path> --mask --aug

If you do not want to use pre-trained parameters, add --nopretrain. If you do not want to use mask fine-tuning or data augmentation, omit the --mask or --aug flag, respectively. A full example invocation is sketched below.
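
For instance, a sketch of a fine-tuning run under hypothetical settings (a composer-classification task on Pianist8 with 8 classes, the PianoBART dictionary at ./Data/CP.pkl, and a pre-trained checkpoint at ./result/pretrain_model.ckpt; substitute the task name, dataset, paths, and class count for your own setup):

python finetune.py --dict_file ./Data/CP.pkl --task composer --dataset pianist8 --dataroot ./Data/Pianist8 --class_num 8 --model_path ./result/pretrain_model.ckpt --mask --aug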

4. Citation

@misc{zhao2024adversarialmidibertsymbolicmusicunderstanding,
      title={Adversarial-MidiBERT: Symbolic Music Understanding Model Based on Unbias Pre-training and Mask Fine-tuning}, 
      author={Zijian Zhao},
      year={2024},
      eprint={2407.08306},
      archivePrefix={arXiv},
      primaryClass={cs.SD},
      url={https://arxiv.org/abs/2407.08306}, 
}
