This repository contains implementations of various normalization techniques used in machine learning to preprocess data. Normalization brings features onto a comparable scale, which can improve the performance and convergence of many machine learning algorithms.
## 📘 Notebooks
- 📊 `feature-scaling-min-max-scaling.ipynb`: Demonstrates Min-Max Scaling.
- 📏 `Mean-Max-Scaling.ipynb`: Implements Mean-Max Scaling.
- 📈 `Max-Abs-Scaling.ipynb`: Explains Max-Abs Scaling.
- 📉 `RobustScaling.ipynb`: Covers Robust Scaling.
## ⚖️ Min-Max Scaling
Scales the data to a fixed range, typically [0, 1].

🔗 Example notebook: `feature-scaling-min-max-scaling.ipynb`
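As a quick illustration, Min-Max scaling maps each value to `(x - min) / (max - min)` per column. A minimal sketch using scikit-learn's `MinMaxScaler` (the data here is illustrative, not taken from the notebook):

```python
import numpy as np
from sklearn.preprocessing import MinMaxScaler

# Illustrative single-feature data
X = np.array([[1.0], [5.0], [10.0]])

# Each value becomes (x - min) / (max - min), so the column spans [0, 1]
scaler = MinMaxScaler(feature_range=(0, 1))
X_scaled = scaler.fit_transform(X)

print(X_scaled.ravel())  # ≈ [0.0, 0.444, 1.0]
```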
## 📏 Mean-Max Scaling
Normalizes data using the mean and maximum values.
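Mean-Max scaling is not a built-in scikit-learn transformer; one common formulation centers each column on its mean and divides by the column maximum. A NumPy sketch under that assumption (the notebook may use a slightly different formula):

```python
import numpy as np

def mean_max_scale(X):
    """Assumed Mean-Max formulation: x' = (x - mean(x)) / max(x), per column."""
    X = np.asarray(X, dtype=float)
    return (X - X.mean(axis=0)) / X.max(axis=0)

# Illustrative data: mean is 16/3, column max is 10
X = np.array([[1.0], [5.0], [10.0]])
print(mean_max_scale(X).ravel())  # centered on 0, divided by the column max
```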
## 📈 Max-Abs Scaling
Scales each feature by its maximum absolute value.
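A short example with scikit-learn's `MaxAbsScaler` (illustrative data): each value is divided by the column's maximum absolute value, mapping the data into [-1, 1] while preserving sign and sparsity.

```python
import numpy as np
from sklearn.preprocessing import MaxAbsScaler

# Signed, illustrative data; max |x| in the column is 8
X = np.array([[-4.0], [2.0], [8.0]])

# Each value becomes x / max(|x|)
X_scaled = MaxAbsScaler().fit_transform(X)
print(X_scaled.ravel())  # ≈ [-0.5, 0.25, 1.0]
```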
## 🛡️ Robust Scaling
Removes the median and scales data according to the interquartile range, making it robust to outliers.
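A sketch with scikit-learn's `RobustScaler` (illustrative data containing one outlier): the median is subtracted and the result is divided by the interquartile range, so the outlier barely influences how the other points are scaled.

```python
import numpy as np
from sklearn.preprocessing import RobustScaler

# Illustrative data with one large outlier (100.0)
X = np.array([[1.0], [2.0], [3.0], [4.0], [100.0]])

# Subtracts the median (3.0) and divides by the IQR (Q3 - Q1 = 4.0 - 2.0 = 2.0)
X_scaled = RobustScaler().fit_transform(X)
print(X_scaled.ravel())  # ≈ [-1.0, -0.5, 0.0, 0.5, 48.5]
```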
## 🚀 Usage
1. Clone the repository:
   ```bash
   git clone https://github.com/Himel-Sarder/ML-Normalization.git
   ```
2. Navigate to the repository:
   ```bash
   cd ML-Normalization
   ```
3. Open the desired Jupyter Notebook to explore and run the examples:
   ```bash
   jupyter notebook
   ```
## 🛠️ Requirements
- 🐍 Python 3.x
- 📓 Jupyter Notebook
- Required libraries:
  - 🧮 NumPy
  - 📊 Pandas
  - 🤖 Scikit-learn
  - 🎨 Matplotlib (optional, for visualization)

Install the required libraries with:

```bash
pip install numpy pandas scikit-learn matplotlib
```
## 🤝 Contributing
Contributions are welcome! 🎉 If you have suggestions for improvements or additional techniques, feel free to fork the repository and open a pull request.
## 📜 License
This project is licensed under the MIT License. See the LICENSE file for details.