---
layout: default
title: Quick start
nav_order: 1
description: fastDeploy enables you to build production ready APIs for Deep Learning models.
permalink: /
---

ML/DL models -> scalable, efficient and configurable API deployments.

{: .fs-9 }

fastDeploy provides a convenient way to serve DL/ML models with minimal extra code.
{: .fs-6 .fw-300 }

Download CLI{: .btn .btn-primary .fs-5 .mb-4 .mb-md-0 .mr-2 } View it on GitHub{: .btn .fs-5 .mb-4 .mb-md-0 }


## Features

- Optimal batch size is estimated automatically, and incoming inputs are batched accordingly.
- Predictions can be run synchronously or asynchronously via the `/sync` and `/async` endpoints (see the sketch after this list).
- Supports everything supported by Python: TensorFlow, Keras, PyTorch, MXNet, Kaldi, and more.
- Minimal extra code: you don't need to write any fastDeploy-specific code.
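
To make the sync/async distinction concrete, here is a minimal sketch of calling the two endpoints with `curl`. Only the `/sync` and `/async` paths come from fastDeploy itself; the host port and the JSON body are placeholders, since the actual input schema depends on the recipe being served.

```bash
# Hypothetical host port and payload -- the real input format is defined by
# the recipe's source_dir, so adjust both before using this.

# Synchronous prediction: the call blocks until the result is returned.
curl -X POST "http://localhost:8080/sync" \
     -H "Content-Type: application/json" \
     -d '{"data": ["example input"]}'

# Asynchronous prediction: the call returns immediately; the response carries
# an identifier that can be used to fetch the result once it is ready.
curl -X POST "http://localhost:8080/async" \
     -H "Content-Type: application/json" \
     -d '{"data": ["example input"]}'
```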

## Dependencies

Docker is the only dependency.

## Quick start: Run a pre-built fastDeploy recipe via the CLI

fastDeploy's CLI is a simple, helpful wrapper over Docker.

- See the complete list of supported commands here.
- Learn about the structure of `source_dir` here.

```bash
# See all the arguments supported.
./fastDeploy.py --help

# Print the list of available recipes with descriptions.
./fastDeploy.py --list_recipes

# Run a recipe (e.g. craft_text_detection).
./fastDeploy.py --run craft_text_detection --name craft_text_detection_test_run
```
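
Once a recipe is running, plain Docker commands are enough to check on it. The sketch below assumes fastDeploy names the container after the `--name` argument passed above; if your setup differs, run `docker ps` without the filter to find the container.

```bash
# Assumes the container is named after the --name argument used above;
# drop the --filter flag to list all running containers instead.
docker ps --filter "name=craft_text_detection_test_run"

# Follow the container's logs to watch startup progress and see the port
# the prediction API is listening on.
docker logs -f craft_text_detection_test_run
```

Because the CLI is only a wrapper over Docker, anything you would normally do with a container (port mapping, restarting, removal) works here too.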

## About the project

fastDeploy is built by notAI.tech.

### License

fastDeploy is distributed under the MIT license.

### Contributing

When contributing to this repository, please first discuss the change you wish to make via a GitHub issue.

This website is maintained by:

{% for contributor in site.github.contributors %}
- {{ contributor.login }}
{% endfor %}

### Code of Conduct

notAI.tech is committed to fostering a welcoming community.

View our Code of Conduct on our GitHub repository.