Developed a microservice-based architecture with five different components:
- Frontend (Simple HTML/CSS + JS website to test the architecture)
- Backend (Flask, Kafka Producer)
- Kafka Consumer (Python script that consumes new data points, incrementally trains the model, and saves it)
- TensorFlow Serving on Docker (serves the saved model versions from a Docker container)
- Saved Models (folder holding the saved versions of the model, used by TensorFlow Serving to provide an API)
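At a high level, data flows between these components like this (a textual summary inferred from the components and steps described in this README):

Frontend → Flask backend (Kafka producer) → Kafka topic → Kafka consumer (incremental training) → Saved Models folder → TensorFlow Serving → predictions back to the frontend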
- First, host frontend/index.html on a web server or open it directly in a browser
https://<localhost/path-to-file>/index.html
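If you don't have a web server handy, Python's built-in one is enough for local testing (run it from the frontend directory; the port choice here is arbitrary):
python -m http.server 8000
Then open http://localhost:8000/index.html.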
- Start the Flask backend (backend/predict_annotate.py) in a terminal
python predict_annotate.py
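For reference, a minimal sketch of what a Flask backend acting as a Kafka producer can look like; the route, topic name, broker address, and message schema below are illustrative assumptions, not the exact contents of predict_annotate.py:

```python
# Minimal sketch of a Flask backend that forwards annotated data to Kafka.
# Route ("/annotate"), topic ("annotations"), and broker address are
# assumptions, not the exact contents of predict_annotate.py.
import json

from flask import Flask, jsonify, request
from kafka import KafkaProducer  # kafka-python

app = Flask(__name__)
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

@app.route("/annotate", methods=["POST"])
def annotate():
    # Publish the new labelled data point; the consumer trains on it.
    producer.send("annotations", request.get_json())
    producer.flush()
    return jsonify({"status": "queued"})

if __name__ == "__main__":
    app.run(port=5000)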
- In a new terminal, start the incremental model (Kafka consumer) (model/incremental_model.py)
python incremental_model.py
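A minimal sketch of the consumer side, assuming TF2-style tf.keras, kafka-python, and a 4-feature input; the topic, message schema, and model are illustrative placeholders, not the repository's exact code:

```python
# Sketch: consume new points from Kafka, update the model incrementally,
# and export each update as a new SavedModel version for TF Serving.
import json

import numpy as np
import tensorflow as tf
from kafka import KafkaConsumer

# Placeholder model; the real incremental_model.py defines its own.
model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(4,))])
model.compile(optimizer="adam", loss="mse")

consumer = KafkaConsumer(
    "annotations",                      # assumed topic name
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)

version = 0
for message in consumer:
    point = message.value               # assumed schema: {"features": [...], "label": ...}
    x = np.array([point["features"]], dtype=np.float32)
    y = np.array([point["label"]], dtype=np.float32)
    model.train_on_batch(x, y)          # one incremental update per message
    version += 1
    # Each export goes into a new numeric subdirectory; TF Serving treats
    # these as model versions (TF2 directory save = SavedModel format).
    # In practice you would batch updates rather than save every message.
    model.save(f"saved_model/{version}")
```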
- Start TensorFlow Serving on Docker
- Pull the TensorFlow Serving image from Docker Hub
docker pull tensorflow/serving
- Run the Docker image, mounting your local model folder into the container
docker run -it -v <path-to-model-folder>\model:/inc_model_kafka -p 8605:8605 --entrypoint /bin/bash tensorflow/serving
Here, inc_model_kafka is just a sample name for the folder that will be created inside the Docker container; you can use any other name.
- After this you should be inside the Docker container's shell; the last step is to start the TensorFlow model server:
tensorflow_model_server --port=8500 --rest_api_port=8605 --model_config_file=/inc_model_kafka/model.config
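The server reads its model list from the file passed via --model_config_file. A minimal example of what that file can contain (the model name and base path are assumptions and must match your folder layout; model_version_policy all serves every exported version):

```
model_config_list {
  config {
    name: "inc_model"
    base_path: "/inc_model_kafka/saved_model"
    model_platform: "tensorflow"
    model_version_policy { all {} }
  }
}
```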
After everything is up and running, you can start interacting with the frontend; as you add new data, each newly trained model version will be stored in the model/saved_model directory.
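You can also query the TensorFlow Serving REST endpoint directly to check that serving works; a sketch assuming the model is registered as inc_model in model.config and takes a 4-feature input (adjust both to your setup):

```python
# Query TF Serving's REST API. The model name and input shape are
# assumptions; they must match model.config and the exported signature.
import requests

resp = requests.post(
    "http://localhost:8605/v1/models/inc_model:predict",
    json={"instances": [[0.1, 0.2, 0.3, 0.4]]},
)
print(resp.json())  # e.g. {"predictions": [...]}
```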
All the components interact with each other as shown in the flowchart below:
Check the flowchart in high resolution here
The online machine learning paradigm differs from the traditional/conventional way of training machine learning models.
- In traditional approaches, the dataset is fixed and the model iterates over it n times.
- In online learning, the model keeps incrementally updating its parameters as soon as new data points become available, and this process is expected to continue indefinitely. The contrast is sketched below.
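A schematic, self-contained contrast of the two training loops (the model and data here are throwaway placeholders, not the repository's model):

```python
# Schematic contrast between batch and online learning.
import numpy as np
import tensorflow as tf

model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(1,))])
model.compile(optimizer="sgd", loss="mse")

# Traditional/batch learning: a fixed dataset, iterated over for n epochs.
x = np.random.rand(100, 1).astype(np.float32)
y = 2 * x
model.fit(x, y, epochs=5, verbose=0)

# Online learning: an unbounded stream, one parameter update per new point.
def stream():
    while True:                          # in principle, never terminates
        xi = np.random.rand(1, 1).astype(np.float32)
        yield xi, 2 * xi

for xi, yi in stream():
    model.train_on_batch(xi, yi)         # incremental update as data arrives
```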