CodeBERT is an extension of the BERT model developed by Microsoft in 2020. It is pre-trained on both programming language and natural language, so it can be used for multiple downstream tasks that involve the two, such as suggesting code for a particular task, helping developers translate code between languages, and more.
https://paperswithcode.com/paper/codebert-a-pre-trained-model-for-programming
This repo contains the code and pre-trained models for the CodeBERT series from Microsoft, which includes four models as of July 2022.
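As a quick illustration of how these models are typically used, below is a minimal sketch that loads the base CodeBERT checkpoint ("microsoft/codebert-base", the one released with the paper) through HuggingFace Transformers and encodes a natural language / code pair; the example sentence and function are made up for demonstration.

```python
# Minimal sketch: load the base CodeBERT checkpoint and encode an NL-PL pair.
# Assumes the `transformers` and `torch` packages are installed.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("microsoft/codebert-base")
model = AutoModel.from_pretrained("microsoft/codebert-base")

nl = "return the maximum value in a list"          # natural language query
code = "def max_value(xs): return max(xs)"         # candidate code snippet

# CodeBERT takes the NL and PL segments as a sentence pair; the tokenizer
# inserts the appropriate separator tokens between the two segments.
inputs = tokenizer(nl, code, return_tensors="pt", truncation=True)

with torch.no_grad():
    outputs = model(**inputs)

# outputs.last_hidden_state holds a contextual embedding for every token;
# the first (CLS-style) vector is commonly used as a joint NL-PL summary
# for downstream tasks such as code search.
print(outputs.last_hidden_state.shape)
```

The other models in the series (e.g. GraphCodeBERT, UniXcoder) can be loaded the same way by swapping in their checkpoint names.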