# Fine-tuning of various BERT-like transformer models

## Description

Various pre-trained language models are fine-tuned and tested on publicly available datasets.

## Contents

- Fine-tuning of Hindi-BERT with Keras on a review dataset
- Fine-tuning of the T-XLM-RoBERTa-base model with the transformers `Trainer` class on the UMSAB Hindi sentiment analysis dataset (a minimal sketch is shown after this list)
- A general fine-tuning recipe for (BERT-like) Hugging Face transformer models using native PyTorch (the training workflow is demonstrated on the German UMSAB dataset)
- A general hyperparameter optimization recipe for BERT-like Hugging Face transformer models using the transformers `Trainer` class (on the UMSAB dataset)
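The `Trainer`-based notebooks follow the standard Hugging Face fine-tuning pattern. Below is a minimal sketch of that workflow; the checkpoint and dataset identifiers (`cardiffnlp/twitter-xlm-roberta-base`, `cardiffnlp/tweet_sentiment_multilingual`) are assumptions about where a T-XLM-RoBERTa-base checkpoint and the UMSAB Hindi split live on the Hub, not values taken from the notebooks themselves.

```python
# Hedged sketch: fine-tuning a BERT-like model with the transformers Trainer class
# on a UMSAB-style sentiment dataset. Names below are assumptions; adjust them to
# match the actual notebooks in this repository.
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

model_name = "cardiffnlp/twitter-xlm-roberta-base"                 # assumed checkpoint
dataset = load_dataset("cardiffnlp/tweet_sentiment_multilingual",  # assumed UMSAB source
                       "hindi")

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=3)

def tokenize(batch):
    # Truncate/pad to a fixed length so batches can be collated without a custom collator.
    return tokenizer(batch["text"], truncation=True, padding="max_length", max_length=128)

tokenized = dataset.map(tokenize, batched=True)

args = TrainingArguments(
    output_dir="umsab-hindi-finetuned",
    num_train_epochs=3,
    per_device_train_batch_size=16,
    evaluation_strategy="epoch",
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized["train"],
    eval_dataset=tokenized["validation"],
)

trainer.train()
```

For the hyperparameter optimization recipe, the same `Trainer` object also exposes `trainer.hyperparameter_search(...)`, which can be backed by search libraries such as Optuna or Ray Tune; see the corresponding notebook for the exact setup used here.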