CodeBERT

CodeBERT is an extension of the BERT model, developed by Microsoft in 2020. It is pre-trained on both programming language and natural language, so it can be used for multiple downstream tasks involving the two, such as suggesting code to a developer for a particular task, helping developers translate code, and more.
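
As a rough illustration of these downstream uses, the sketch below loads the publicly released microsoft/codebert-base checkpoint with the Hugging Face transformers library and embeds a natural-language query together with a code snippet. This is not code from the repos linked here, and the query/snippet names are made up for illustration; it assumes the transformers and torch packages are installed.

```python
# Minimal usage sketch: embed a natural-language / code pair with CodeBERT.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("microsoft/codebert-base")
model = AutoModel.from_pretrained("microsoft/codebert-base")

nl_query = "return the maximum value in a list"           # natural-language side
code_snippet = "def max_value(xs):\n    return max(xs)"   # programming-language side

# CodeBERT is bimodal: the tokenizer joins the two segments with special
# tokens so the model can relate the description to the code.
inputs = tokenizer(nl_query, code_snippet, return_tensors="pt", truncation=True)

with torch.no_grad():
    outputs = model(**inputs)

# The hidden state of the first token is a common choice of joint embedding
# for downstream tasks such as natural-language code search.
embedding = outputs.last_hidden_state[:, 0, :]
print(embedding.shape)  # torch.Size([1, 768])
```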

Paper

https://paperswithcode.com/paper/codebert-a-pre-trained-model-for-programming

LINKS

The Microsoft CodeBERT repo (https://github.com/microsoft/CodeBERT) contains code and pretrained models in the CodeBERT series, including four models as of July 2022.

https://codeserra.medium.com/codebert-83171b23c33c