
Research Paper Implementations

This repository contains implementations of four influential research papers in machine learning and artificial intelligence. Each implementation lives in its own directory, providing clear, modular code for study and reuse.

1. Attention is All You Need

Paper Reference

  • Title: Attention is All You Need
  • Authors: Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Lukasz Kaiser, Illia Polosukhin
  • Link: https://arxiv.org/abs/1706.03762

Implementation

  • Directory: transformerLanguageModel
  • Description: This implementation focuses on the Transformer model, which replaces recurrence in sequence modeling with self-attention mechanisms; a minimal sketch of the attention core follows below.
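
For orientation, here is a minimal sketch of single-head scaled dot-product attention with a causal mask, as used in a decoder-only language model. It assumes PyTorch; the class and names (`CausalSelfAttention`, `d_model`) are illustrative and may not match the code in `transformerLanguageModel`.

```python
import math
import torch
import torch.nn as nn

class CausalSelfAttention(nn.Module):
    """Single-head scaled dot-product attention with a causal mask (a sketch,
    not the repository's exact implementation)."""
    def __init__(self, d_model: int):
        super().__init__()
        self.d_model = d_model
        self.q_proj = nn.Linear(d_model, d_model)
        self.k_proj = nn.Linear(d_model, d_model)
        self.v_proj = nn.Linear(d_model, d_model)

    def forward(self, x):
        # x: (batch, seq_len, d_model)
        q, k, v = self.q_proj(x), self.k_proj(x), self.v_proj(x)
        scores = q @ k.transpose(-2, -1) / math.sqrt(self.d_model)
        # causal mask: each position attends only to itself and earlier positions
        seq_len = x.size(1)
        causal = torch.triu(torch.ones(seq_len, seq_len, dtype=torch.bool,
                                       device=x.device), diagonal=1)
        scores = scores.masked_fill(causal, float("-inf"))
        return scores.softmax(dim=-1) @ v  # attention-weighted sum of values

x = torch.randn(2, 10, 64)
print(CausalSelfAttention(64)(x).shape)  # torch.Size([2, 10, 64])
```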

2. PixelCNN

Paper Reference

  • Title: Conditional Image Generation with PixelCNN Decoders
  • Authors: Aaron van den Oord, Nal Kalchbrenner, Oriol Vinyals, Lasse Espeholt, Alex Graves, Koray Kavukcuoglu
  • Link: https://arxiv.org/abs/1606.05328

Implementation

  • Directory: pixelCNN
  • Description: This implementation covers the PixelCNN model, an autoregressive generative model that produces images pixel by pixel, conditioning each pixel on the previously generated ones (and, in the conditional variant, on extra information such as a class label); a sketch of the masked convolution it relies on follows below.
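
The autoregressive ordering is enforced by masked convolutions. Below is a minimal sketch in PyTorch, assuming the standard mask types 'A' (first layer, hides the current pixel) and 'B' (later layers); the class name `MaskedConv2d` is an assumption and need not match the code in `pixelCNN`.

```python
import torch
import torch.nn as nn

class MaskedConv2d(nn.Conv2d):
    """Conv2d whose kernel is zeroed so each output pixel only sees pixels
    above it and to its left; mask type 'A' also hides the centre pixel."""
    def __init__(self, mask_type: str, *args, **kwargs):
        super().__init__(*args, **kwargs)
        assert mask_type in ("A", "B")
        self.register_buffer("mask", torch.ones_like(self.weight))
        _, _, h, w = self.weight.shape
        self.mask[:, :, h // 2, w // 2 + (mask_type == "B"):] = 0  # centre row, right of centre
        self.mask[:, :, h // 2 + 1:, :] = 0                        # all rows below centre

    def forward(self, x):
        self.weight.data *= self.mask  # enforce the autoregressive ordering
        return super().forward(x)

x = torch.randn(1, 1, 28, 28)
conv = MaskedConv2d("A", 1, 64, kernel_size=7, padding=3)
print(conv(x).shape)  # torch.Size([1, 64, 28, 28])
```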

3. VQ-VAE (Vector Quantized Variational Autoencoder)

Paper Reference

  • Title: Neural Discrete Representation Learning
  • Authors: Aaron van den Oord, Oriol Vinyals, Koray Kavukcuoglu
  • Link: https://arxiv.org/abs/1711.00937

Implementation

  • Directory: vq_vae
  • Description: This implementation focuses on the VQ-VAE model, a variational autoencoder that learns discrete latent representations in an unsupervised manner by snapping encoder outputs to entries of a learned codebook; a sketch of the quantization step follows below.
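
Here is a minimal sketch of the vector-quantization step in PyTorch: nearest-codebook lookup, the codebook and commitment losses, and the straight-through gradient estimator. The name `VectorQuantizer` and the default `beta` are assumptions and may differ from the code in `vq_vae`.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class VectorQuantizer(nn.Module):
    def __init__(self, num_codes: int, code_dim: int, beta: float = 0.25):
        super().__init__()
        self.codebook = nn.Embedding(num_codes, code_dim)
        self.codebook.weight.data.uniform_(-1 / num_codes, 1 / num_codes)
        self.beta = beta  # commitment-loss weight

    def forward(self, z_e):
        # z_e: (batch, n, code_dim) continuous encoder outputs
        flat = z_e.reshape(-1, z_e.shape[-1])
        dists = torch.cdist(flat, self.codebook.weight)          # (batch*n, num_codes)
        indices = dists.argmin(dim=-1).reshape(z_e.shape[:-1])   # nearest codebook entry
        z_q = self.codebook(indices)
        # codebook loss pulls codes toward encoder outputs;
        # commitment loss keeps encoder outputs near their codes
        loss = F.mse_loss(z_q, z_e.detach()) + self.beta * F.mse_loss(z_e, z_q.detach())
        # straight-through estimator: gradients flow from z_q back to z_e
        z_q = z_e + (z_q - z_e).detach()
        return z_q, loss, indices
```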

4. BPE (Byte Pair Encoding)

Paper Reference

  • Title: Neural Machine Translation of Rare Words with Subword Units
  • Authors: Rico Sennrich, Barry Haddow, Alexandra Birch
  • Link: https://arxiv.org/abs/1508.07909

Implementation

  • Directory: bpe
  • Description: This implementation covers Byte Pair Encoding, a subword segmentation technique adapted from data compression that lets neural machine translation systems represent rare and out-of-vocabulary words as sequences of subword units; a sketch of the merge-learning loop follows below.
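
The core algorithm repeatedly merges the most frequent adjacent symbol pair in a character-level vocabulary. A minimal pure-Python sketch, using a toy vocabulary similar to the paper's example; the function names here are illustrative and need not match the code in `bpe`.

```python
import re
from collections import Counter

def get_pair_counts(vocab):
    """Count adjacent symbol pairs, weighted by word frequency."""
    pairs = Counter()
    for word, freq in vocab.items():
        symbols = word.split()
        for a, b in zip(symbols, symbols[1:]):
            pairs[(a, b)] += freq
    return pairs

def merge_pair(pair, vocab):
    """Replace every occurrence of the pair with its concatenation."""
    pattern = re.compile(r"(?<!\S)" + re.escape(" ".join(pair)) + r"(?!\S)")
    return {pattern.sub("".join(pair), word): freq for word, freq in vocab.items()}

# toy corpus: words as space-separated characters with an end-of-word marker
vocab = {"l o w </w>": 5, "l o w e r </w>": 2,
         "n e w e s t </w>": 6, "w i d e s t </w>": 3}
for _ in range(10):
    pairs = get_pair_counts(vocab)
    if not pairs:
        break
    best = max(pairs, key=pairs.get)
    vocab = merge_pair(best, vocab)
    print(best)  # the merge rule learned at this step
```

Applying a learned merge list in order to new text reproduces the same segmentation at test time, which is what makes the open vocabulary tractable.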

How to Use

Each directory contains a README with specific instructions for setting up and running the code. Refer to the individual README files for details.

Contributions

Contributions and improvements are welcome. Feel free to fork the repository, create a new branch, and submit a pull request.

License

This project is licensed under the MIT License - see the LICENSE file for details.
