# DistilBERT Model with Hugging Face Transformers
A basic implementation of **DistilBERT** using the Hugging Face Transformers library to demonstrate its capabilities for NLP tasks such as text classification, sentiment analysis, and more.
---
## Features
- Load and use pre-trained DistilBERT without custom fine-tuning.
- Perform text-based tasks out of the box.
- Simple and reusable implementation.
---
## Requirements
- Python >= 3.7
- Transformers
- PyTorch
---
## Usage
### **1. Text Classification**
Use the pre-trained DistilBERT model to classify text:
```bash
python app.py --text "Enter your text here."
```
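Below is a minimal sketch of what the classification script might look like. It assumes the script wraps the Transformers `sentiment-analysis` pipeline with its default DistilBERT checkpoint fine-tuned on SST-2; the `--text` argument follows the command above, while the checkpoint name and output format are assumptions, not taken from the original code.

```python
# Hypothetical sketch of the classification script (not the project's actual code).
import argparse

from transformers import pipeline


def main():
    parser = argparse.ArgumentParser(description="Classify text with pre-trained DistilBERT")
    parser.add_argument("--text", required=True, help="Text to classify")
    args = parser.parse_args()

    # Assumed checkpoint: the default DistilBERT model used by the
    # sentiment-analysis pipeline (fine-tuned on SST-2).
    classifier = pipeline(
        "sentiment-analysis",
        model="distilbert-base-uncased-finetuned-sst-2-english",
    )

    result = classifier(args.text)[0]
    print(f"Label: {result['label']}, score: {result['score']:.4f}")


if __name__ == "__main__":
    main()
```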
### **2. Text Similarity**
Use DistilBERT to compute text similarity:
```bash
python app.py --text "Enter your text here."
```
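The sketch below shows one common way to implement this step: encode each text with the base DistilBERT model, mean-pool the token embeddings, and compare them with cosine similarity. The `--other` flag, the pooling strategy, and the `distilbert-base-uncased` checkpoint are assumptions for illustration and are not taken from the original script.

```python
# Hypothetical sketch of the similarity script (not the project's actual code).
import argparse

import torch
from transformers import DistilBertModel, DistilBertTokenizer


def embed(text, tokenizer, model):
    # Tokenize and run the base DistilBERT encoder (no task head).
    inputs = tokenizer(text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        outputs = model(**inputs)
    # Mean-pool the last hidden state over the token dimension.
    return outputs.last_hidden_state.mean(dim=1)


def main():
    parser = argparse.ArgumentParser(description="Text similarity with DistilBERT")
    parser.add_argument("--text", required=True, help="First text")
    parser.add_argument("--other", required=True, help="Second text to compare against")
    args = parser.parse_args()

    tokenizer = DistilBertTokenizer.from_pretrained("distilbert-base-uncased")
    model = DistilBertModel.from_pretrained("distilbert-base-uncased")

    emb_a = embed(args.text, tokenizer, model)
    emb_b = embed(args.other, tokenizer, model)

    similarity = torch.nn.functional.cosine_similarity(emb_a, emb_b).item()
    print(f"Cosine similarity: {similarity:.4f}")


if __name__ == "__main__":
    main()
```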
---
## Project Structure
- `classify.py`: Script to classify text using the pre-trained DistilBERT model.
- `similarity.py`: Script to compute similarity between input texts.
---
## License
This project is licensed under the MIT License.