21 Lessons, Get Started Building with Generative AI 🔗 https://microsoft.github.io/generative-ai-for-beginners/
🧑‍🏫 60+ implementations/tutorials of deep learning papers with side-by-side notes 📝; including transformers (original, XL, switch, feedback, ViT, ...), optimizers (Adam, AdaBelief, Sophia, ...), GANs (CycleGAN, StyleGAN2, ...), 🎮 reinforcement learning (PPO, DQN), CapsNet, distillation, ... 🧠
Unified Efficient Fine-Tuning of 100+ LLMs (ACL 2024)
Implementation of Vision Transformer, a simple way to achieve SOTA in vision classification with only a single transformer encoder, in PyTorch
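A minimal sketch of how the vit-pytorch ViT class is typically instantiated, following the pattern in its README; the hyperparameters below are illustrative, not prescriptive:

```python
import torch
from vit_pytorch import ViT

# Illustrative hyperparameters; adjust image_size, patch_size, etc. to your data.
model = ViT(
    image_size=256,
    patch_size=32,
    num_classes=1000,
    dim=1024,
    depth=6,
    heads=16,
    mlp_dim=2048,
    dropout=0.1,
    emb_dropout=0.1,
)

img = torch.randn(1, 3, 256, 256)   # dummy batch of one RGB image
preds = model(img)                  # logits of shape (1, 1000)
```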
A collection of CVPR 2024 papers and open-source projects
AI orchestration framework to build customizable, production-ready LLM applications. Connect components (models, vector DBs, file converters) to pipelines or agents that can interact with your data. With advanced retrieval methods, it's best suited for building RAG, question answering, semantic search or conversational agent chatbots.
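As a rough sketch of the component/pipeline model described above (assuming Haystack 2.x; component names and import paths differ across versions), a keyword retriever wired into a pipeline might look like this:

```python
from haystack import Document, Pipeline
from haystack.components.retrievers.in_memory import InMemoryBM25Retriever
from haystack.document_stores.in_memory import InMemoryDocumentStore

# Index a couple of toy documents in an in-memory store.
store = InMemoryDocumentStore()
store.write_documents([
    Document(content="Haystack connects components into pipelines."),
    Document(content="RAG combines retrieval with generation."),
])

# Wire a BM25 retriever into a pipeline and run a query against it.
pipeline = Pipeline()
pipeline.add_component("retriever", InMemoryBM25Retriever(document_store=store))
result = pipeline.run({"retriever": {"query": "What does Haystack do?"}})
print(result["retriever"]["documents"])
```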
🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning.
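A minimal sketch of parameter-efficient fine-tuning with PEFT's LoRA adapter, mirroring the library's documented pattern; the base checkpoint below is just an example:

```python
from transformers import AutoModelForSeq2SeqLM
from peft import LoraConfig, TaskType, get_peft_model

# Example base checkpoint; swap in the model you actually want to adapt.
base_model = AutoModelForSeq2SeqLM.from_pretrained("bigscience/mt0-small")

# LoRA injects small trainable low-rank matrices while freezing the base weights.
lora_config = LoraConfig(
    task_type=TaskType.SEQ_2_SEQ_LM,
    r=8,
    lora_alpha=32,
    lora_dropout=0.1,
)

model = get_peft_model(base_model, lora_config)
model.print_trainable_parameters()  # typically well under 1% of parameters are trainable
```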
Chatbot for documentation that allows you to chat with your data. Privately deployable, it provides AI knowledge sharing and integrates knowledge into your AI workflow.
RWKV is an RNN with transformer-level LLM performance. It can be directly trained like a GPT (parallelizable), so it combines the best of RNNs and transformers: great performance, fast inference, low VRAM usage, fast training, "infinite" ctx_len, and free sentence embeddings.
State-of-the-art Machine Learning for the web. Run 🤗 Transformers directly in your browser, with no need for a server!
👑 Easy-to-use and powerful NLP and LLM library with 🤗 Awesome model zoo, supporting a wide range of NLP tasks from research to industrial applications, including 🗂 Text Classification, 🔍 Neural Search, ❓ Question Answering, ℹ️ Information Extraction, 📄 Document Intelligence, 💌 Sentiment Analysis, etc.
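A hedged example of PaddleNLP's out-of-the-box usage via its Taskflow API (task names and default models vary by release; the input sentence is illustrative):

```python
from paddlenlp import Taskflow

# Taskflow bundles pretrained models behind named tasks; "sentiment_analysis" is one example.
senta = Taskflow("sentiment_analysis")

# Chinese input, roughly "This product is really smooth to use."
print(senta("这个产品用起来真的很流畅"))
```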
Machine Learning Engineering Open Book
Ongoing research training transformer models at scale
Semantic segmentation models with 500+ pretrained convolutional and transformer-based backbones.
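A minimal sketch of building a segmentation model with this library, following its documented constructor pattern; the encoder and tensor shapes below are illustrative:

```python
import torch
import segmentation_models_pytorch as smp

# U-Net decoder on top of a pretrained ResNet-34 encoder; transformer encoders
# such as "mit_b0" can be substituted where the architecture supports them.
model = smp.Unet(
    encoder_name="resnet34",
    encoder_weights="imagenet",
    in_channels=3,
    classes=1,
)

x = torch.randn(1, 3, 256, 256)   # dummy RGB image batch
mask_logits = model(x)            # shape (1, 1, 256, 256)
```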
💡 All-in-one open-source embeddings database for semantic search, LLM orchestration and language model workflows
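A small sketch of txtai's embeddings index for semantic search, following the pattern in the project's README (argument forms vary slightly between versions):

```python
from txtai.embeddings import Embeddings

data = [
    "US tops 5 million confirmed virus cases",
    "Beijing mobilises invasion craft along coast",
    "Maine man wins $1M from $25 lottery ticket",
]

# Build a semantic index over the documents; the model path is a common README default.
embeddings = Embeddings({"path": "sentence-transformers/nli-mpnet-base-v2"})
embeddings.index([(i, text, None) for i, text in enumerate(data)])

# Returns a list of (id, score) pairs ranked by semantic similarity.
print(embeddings.search("public health story", 1))
```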
This repository contains demos I made with the Transformers library by HuggingFace.
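For reference, the simplest entry point in the HuggingFace Transformers library that such demos build on is the pipeline API (the notebooks themselves go deeper into the model classes):

```python
from transformers import pipeline

# Downloads a default pretrained model for the task on first use.
classifier = pipeline("sentiment-analysis")
print(classifier("These tutorial notebooks make the library much easier to learn."))
```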
💥 Fast State-of-the-Art Tokenizers optimized for Research and Production
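A short sketch of training a BPE tokenizer with the library, following its quick-tour pattern; corpus.txt is a placeholder for your own training files:

```python
from tokenizers import Tokenizer
from tokenizers.models import BPE
from tokenizers.pre_tokenizers import Whitespace
from tokenizers.trainers import BpeTrainer

# Byte-pair-encoding model with an explicit unknown token.
tokenizer = Tokenizer(BPE(unk_token="[UNK]"))
tokenizer.pre_tokenizer = Whitespace()

trainer = BpeTrainer(special_tokens=["[UNK]", "[CLS]", "[SEP]", "[PAD]", "[MASK]"])
tokenizer.train(files=["corpus.txt"], trainer=trainer)  # placeholder training file

encoding = tokenizer.encode("Hello, y'all! How are you?")
print(encoding.tokens)
```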
A PyTorch-based Speech Toolkit
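A hedged example of the toolkit's pretrained-model interface (import paths changed around SpeechBrain 1.0, where speechbrain.pretrained moved to speechbrain.inference; example.wav is a placeholder):

```python
from speechbrain.pretrained import EncoderDecoderASR  # speechbrain.inference in newer releases

# Fetch a pretrained LibriSpeech ASR model from the Hugging Face Hub.
asr = EncoderDecoderASR.from_hparams(
    source="speechbrain/asr-crdnn-rnnlm-librispeech",
    savedir="pretrained_models/asr-crdnn-rnnlm-librispeech",
)

print(asr.transcribe_file("example.wav"))  # placeholder audio path
```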
An implementation of model parallel GPT-2 and GPT-3-style models using the mesh-tensorflow library.
Implementation of RLHF (Reinforcement Learning with Human Feedback) on top of the PaLM architecture. Basically ChatGPT but with PaLM