TinyNet is a straightforward implementation of a Feedforward Neural Network (FFNN) built from scratch in Python. Inspired by @karpathy's micrograd, this project aims to provide an educational resource for understanding the foundational components of neural networks without relying on external libraries.
- Pure Python Implementation: Constructed without dependencies to offer clear insights into the mechanics of neural networks.
- Educational Focus: Designed to help students and enthusiasts grasp the inner workings of FFNNs through hands-on experience.
- Modular Design: Organized code structure for ease of understanding and extension.
Clone the repository to your local machine:
```bash
git clone https://github.com/nMaax/TinyNet.git
cd TinyNet
```
The primary code and examples are contained within the `main.ipynb` Jupyter Notebook. To explore and run the code:

- Install Jupyter Notebook: If you don't have it installed, you can add it using pip:

  ```bash
  pip install notebook
  ```

- Launch Jupyter Notebook:

  ```bash
  jupyter notebook
  ```

- Open `main.ipynb`: In the Jupyter interface, navigate to the `TinyNet` directory and open `main.ipynb`.
- Run the Notebook: Execute the cells sequentially to build and train the neural network; a rough sketch of that workflow is shown below.
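
To give a feel for the build-and-train workflow the notebook walks through, here is a minimal, self-contained sketch of a from-scratch feedforward network in pure Python. The class name `TinyFFNN` and its methods are hypothetical illustrations and may not match the actual API in `NN/`; see `main.ipynb` for the real usage.

```python
# Hypothetical illustration only: these names (TinyFFNN, forward, backward)
# are not necessarily the ones used in NN/ -- check main.ipynb for the real API.
import math
import random

random.seed(0)


def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))


class TinyFFNN:
    """A one-hidden-layer feedforward network with sigmoid activations."""

    def __init__(self, n_in, n_hidden):
        self.w1 = [[random.uniform(-1.0, 1.0) for _ in range(n_in)] for _ in range(n_hidden)]
        self.b1 = [0.0] * n_hidden
        self.w2 = [random.uniform(-1.0, 1.0) for _ in range(n_hidden)]
        self.b2 = 0.0

    def forward(self, x):
        # Hidden activations, then a single sigmoid output.
        self.h = [sigmoid(sum(w * xi for w, xi in zip(row, x)) + b)
                  for row, b in zip(self.w1, self.b1)]
        self.y = sigmoid(sum(w * h for w, h in zip(self.w2, self.h)) + self.b2)
        return self.y

    def backward(self, x, target, lr=0.5):
        # Gradient descent on squared error, with gradients derived by hand
        # for this specific two-layer topology.
        delta_out = (self.y - target) * self.y * (1.0 - self.y)
        for j, hj in enumerate(self.h):
            delta_h = delta_out * self.w2[j] * hj * (1.0 - hj)
            self.w2[j] -= lr * delta_out * hj
            for i, xi in enumerate(x):
                self.w1[j][i] -= lr * delta_h * xi
            self.b1[j] -= lr * delta_h
        self.b2 -= lr * delta_out


# Tiny smoke test on XOR.
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]
net = TinyFFNN(n_in=2, n_hidden=4)
for _ in range(10000):
    for x, t in data:
        net.forward(x)
        net.backward(x, t)

for x, t in data:
    print(x, "->", round(net.forward(x), 3), "(target", t, ")")
```

The sketch follows the project's pure-Python spirit: no external libraries, just lists, `math`, and hand-derived gradients for a single hidden layer.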
The repository is organized as follows:

- `NN/`: Contains the core neural network implementation.
- `main.ipynb`: Jupyter Notebook demonstrating the usage of the neural network.
- `LICENSE`: Project license information.
- `.gitignore`: Specifies files to ignore in the repository.
Note: This project is currently under development. Features and implementations are subject to change.