# DistilBERT Model with Hugging Face Transformers
A basic implementation of **DistilBERT** using the Hugging Face Transformers library to demonstrate its capabilities for NLP tasks such as text classification, sentiment analysis, and more.

---

## Features
- Load and use pre-trained DistilBERT without custom fine-tuning.
- Perform text-based tasks out of the box.
- Simple and reusable implementation.

---

## Requirements
- Python >= 3.7
- Transformers
- PyTorch
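The dependencies listed above can be installed with pip (package names assumed from the requirements list; pin versions as needed):

```shell
pip install transformers torch
```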

---

## Usage

### **1. Text Classification**
Use the pre-trained DistilBERT model to classify text:
```bash
python app.py --text "Enter your text here."
```
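Under the hood, classification like this is typically done with the Transformers `pipeline` API. The sketch below uses the default DistilBERT sentiment checkpoint; the actual contents of `app.py`/`classify.py` may differ:

```python
from transformers import pipeline

# Load a DistilBERT model fine-tuned for sentiment analysis
# (the default checkpoint Transformers uses for this task).
classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

# The pipeline returns a list with one dict per input text.
result = classifier("Enter your text here.")[0]
print(result["label"], round(result["score"], 4))
```

The first call downloads the model weights (~260 MB); subsequent runs use the local cache.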
### **2. Text Similarity**
Compute the similarity of input text using DistilBERT embeddings:
```bash
python app.py --text "Enter your text here."
```
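One common way to implement this (and a plausible sketch of what `similarity.py` does, though the repository's actual code may differ) is to mean-pool DistilBERT's hidden states into sentence vectors and compare them with cosine similarity:

```python
import torch
from transformers import AutoTokenizer, AutoModel

# Load the base DistilBERT encoder (no task-specific head).
tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
model = AutoModel.from_pretrained("distilbert-base-uncased")

def embed(text: str) -> torch.Tensor:
    """Mean-pool the last hidden state into a single sentence vector."""
    inputs = tokenizer(text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state  # (1, seq_len, 768)
    return hidden.mean(dim=1).squeeze(0)           # (768,)

a = embed("The cat sat on the mat.")
b = embed("A cat is sitting on a mat.")
score = torch.nn.functional.cosine_similarity(a, b, dim=0).item()
print(f"cosine similarity: {score:.4f}")
```

Scores close to 1.0 indicate semantically similar texts; unrelated texts score noticeably lower.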

---

## Project Structure
- `classify.py`: Classifies text using the pre-trained DistilBERT model.
- `similarity.py`: Computes the similarity of input texts.

---

## License
This project is licensed under the MIT License.