-
Hi, I trained a model with a train configuration file and exported it with an export configuration file. The configuration holds information the model needs at serving time, such as the preprocessing method. I want to attach this configuration after creating the BentoService object, but I can't find how to do it. Below is pseudocode showing what I want to do:

```python
bento_svc = PytorchService()

# What I want to do
bento_svc.train_config = {"exp_name": "cls_baseline", "preprocessing": ...}

bento_svc.pack('cls_predictor', model)
```

```python
@artifacts([
    PytorchModelArtifact("cls_predictor"),
])
class PytorchService(BentoService):
    ...
    @api(input=JsonInput(), route="v1/predict", output=JsonOutput())
    def predict(self, parsed_json):
        exp_name = self.train_config["exp_name"]
        preprocessing = self.train_config["preprocessing"]
```

Is there anyone who can help?
Replies: 2 comments 2 replies
-
Hi, @kboseong.

```python
%%writefile some_parameters.py
EXP_NAME = "cls_baseline"
PREPROCESSING = "preprocessing"
```

```python
from some_parameters import *
...

@artifacts([
    PytorchModelArtifact("cls_predictor"),
])
class PytorchService(BentoService):
    ...
    @api(input=JsonInput(), route="v1/predict", output=JsonOutput())
    def predict(self, parsed_json):
        exp_name = EXP_NAME
        preprocessing = PREPROCESSING
```
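Outside a notebook, the same `some_parameters.py` module can be generated at export time with plain file I/O instead of the `%%writefile` magic. A minimal sketch, assuming the parameter values from the question:

```python
# Sketch: generate some_parameters.py from the training config at export
# time, replacing the %%writefile notebook magic with plain file I/O.
train_config = {"EXP_NAME": "cls_baseline", "PREPROCESSING": "preprocessing"}

with open("some_parameters.py", "w") as f:
    for key, value in train_config.items():
        # repr() emits valid Python literals for strings, numbers, etc.
        f.write(f"{key} = {value!r}\n")

# The service code can then import the generated module as usual:
import some_parameters
print(some_parameters.EXP_NAME)  # cls_baseline
```

Since the module is generated before the service is saved, it gets bundled with the rest of the service code and is available at serving time.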
-
Hi @kboseong, exactly as @withsmilo suggested, in 0.13 there's no easy way to customize Service attributes per se: although it is possible to override the `__init__` parameter of your BentoService class, it must have a default value and you can't pass in parameters when starting the server. However, it looks like you can probably use the Artifact metadata to achieve what you need:

```python
bento_svc = PytorchService()
bento_svc.pack('cls_predictor', model, metadata={"exp_name": "cls_baseline", "preprocessing": ...})
bento_svc.save()
```

```python
@artifacts([
    PytorchModelArtifact("cls_predictor"),
])
class PytorchService(BentoService):
    ...
    @api(input=JsonInput(), route="v1/predict", output=JsonOutput())
    def predict(self, parsed_json):
        metadata = self.artifacts.get('cls_predictor').metadata
        exp_name = metadata["exp_name"]
        preprocessing = metadata["preprocessing"]
```

Another way to customize your BentoService's behavior based on parameters is to use environment variables, e.g.:

```python
import os

@artifacts([
    PytorchModelArtifact("cls_predictor"),
])
class PytorchService(BentoService):
    ...
    @api(input=JsonInput(), route="v1/predict", output=JsonOutput())
    def predict(self, parsed_json):
        exp_name = str(os.environ["exp_name"])
        preprocessing_config_1 = int(os.environ["config_key_1"])
```

This is a bit more flexible API and is applicable to other types of Service-level configuration as well. Although for your use case, I'd recommend using the metadata API in Artifact.
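As a stand-alone sketch of the environment-variable approach: using `os.environ.get` with defaults keeps the service usable when a variable is unset. The variable names follow the example above; the value set here is only a simulation of what a deployment would configure:

```python
import os

# Simulate what a deployment would export before starting the server;
# in practice this comes from the container or launch environment.
os.environ["exp_name"] = "cls_baseline"

# Read with defaults and explicit type conversion, so a missing or
# malformed variable fails (or falls back) in a controlled way.
exp_name = os.environ.get("exp_name", "default_exp")
config_key_1 = int(os.environ.get("config_key_1", "0"))

print(exp_name, config_key_1)  # cls_baseline 0
```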
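The metadata flow recommended above relies on BentoML's artifact classes; as a self-contained sketch of just the pattern, with a hypothetical `FakeArtifact` standing in for `PytorchModelArtifact`:

```python
# Hypothetical stand-in for a BentoML artifact, illustrating how metadata
# travels with the packed model and is read back at prediction time.
class FakeArtifact:
    def __init__(self, name):
        self.name = name
        self.model = None
        self.metadata = {}

    def pack(self, model, metadata=None):
        self.model = model
        self.metadata = metadata or {}
        return self

# Export time: pack the model together with its training configuration.
artifact = FakeArtifact("cls_predictor")
artifact.pack("trained-model", metadata={"exp_name": "cls_baseline"})

# Prediction time: the service reads the configuration back.
exp_name = artifact.metadata["exp_name"]
print(exp_name)  # cls_baseline
```

The point of the pattern is that the configuration is versioned and saved alongside the model it describes, rather than supplied separately at serving time.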