Code2PseudoCode is a transformer-based deep learning model that translates source code into human-readable pseudocode, helping developers, students, and educators understand complex code through natural language. Key features:
- Utilizes a transformer architecture for sequence-to-sequence translation.
- Supports tokenization and vocabulary building for both code and pseudocode.
- Implements positional encoding and attention mechanisms to enhance translation accuracy.
- Uses PyTorch for efficient model training and inference.
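The pieces above fit together roughly as follows. This is a minimal sketch, not the project's actual implementation: the class names, hyperparameters, and vocabulary sizes are illustrative, and it uses PyTorch's built-in `nn.Transformer` with a sinusoidal positional encoding.

```python
import math
import torch
import torch.nn as nn

class PositionalEncoding(nn.Module):
    """Sinusoidal positional encoding added to token embeddings."""
    def __init__(self, d_model, max_len=512):
        super().__init__()
        pos = torch.arange(max_len).unsqueeze(1)
        div = torch.exp(torch.arange(0, d_model, 2) * (-math.log(10000.0) / d_model))
        pe = torch.zeros(max_len, d_model)
        pe[:, 0::2] = torch.sin(pos * div)
        pe[:, 1::2] = torch.cos(pos * div)
        self.register_buffer("pe", pe)

    def forward(self, x):  # x: (batch, seq_len, d_model)
        return x + self.pe[: x.size(1)]

class CodeToPseudo(nn.Module):
    """Minimal encoder-decoder: embed -> add positions -> nn.Transformer -> project."""
    def __init__(self, src_vocab_size, tgt_vocab_size, d_model=128, nhead=4, layers=2):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab_size, d_model)
        self.tgt_emb = nn.Embedding(tgt_vocab_size, d_model)
        self.pos = PositionalEncoding(d_model)
        self.transformer = nn.Transformer(d_model=d_model, nhead=nhead,
                                          num_encoder_layers=layers,
                                          num_decoder_layers=layers,
                                          batch_first=True)
        self.out = nn.Linear(d_model, tgt_vocab_size)

    def forward(self, src, tgt):
        # Causal mask so each target position attends only to earlier positions.
        tgt_mask = nn.Transformer.generate_square_subsequent_mask(tgt.size(1))
        h = self.transformer(self.pos(self.src_emb(src)),
                             self.pos(self.tgt_emb(tgt)),
                             tgt_mask=tgt_mask)
        return self.out(h)

# Shape check with toy token ids (vocab sizes are placeholders).
model = CodeToPseudo(src_vocab_size=50, tgt_vocab_size=40)
src = torch.randint(0, 50, (2, 8))   # batch of 2 code sequences, length 8
tgt = torch.randint(0, 40, (2, 6))   # batch of 2 pseudocode sequences, length 6
logits = model(src, tgt)
print(logits.shape)  # torch.Size([2, 6, 40])
```

During training, the decoder input is the pseudocode shifted right and the loss is cross-entropy against the pseudocode shifted left; the causal mask keeps the decoder from peeking at future tokens.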
Clone the repository and install the dependencies:

```bash
pip install -r requirements.txt
```
Dependencies:

- Python 3.x
- PyTorch
- Pandas
- tqdm
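The tokenization and vocabulary-building step could look roughly like this. The function names and special tokens below are illustrative, not the project's actual API:

```python
def build_vocab(token_lists, specials=("<pad>", "<sos>", "<eos>", "<unk>")):
    """Assign a unique integer id to every token, reserving low ids for specials."""
    vocab = {tok: i for i, tok in enumerate(specials)}
    for tokens in token_lists:
        for tok in tokens:
            vocab.setdefault(tok, len(vocab))
    return vocab

def encode(tokens, vocab):
    """Map tokens to ids, wrapping the sequence in <sos>/<eos> markers."""
    unk = vocab["<unk>"]
    return [vocab["<sos>"]] + [vocab.get(t, unk) for t in tokens] + [vocab["<eos>"]]

code = [["def", "add", "(", "a", ",", "b", ")", ":"]]
vocab = build_vocab(code)
ids = encode(code[0], vocab)
print(ids)  # [1, 4, 5, 6, 7, 8, 9, 10, 11, 2]
```

Separate vocabularies would typically be built for the code side and the pseudocode side, since the two token distributions differ.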
Contributions are welcome! Feel free to submit issues or pull requests.