Serving using ExportStrategy #4
It might be easier to add support for this directly. Usually you would declare two placeholders, one for features and one for targets, and then feed values into them at inference time.
I think the standard way is to freeze the model, then use feed_dict to feed the input and get the output tensor.
@skyw that's correct, but since the models in this repo use Readers, you first have to declare a new input tensor (a Placeholder) for the model.
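For what it's worth, here is a minimal sketch of the freeze-then-feed approach described above. The file name `model.pb` and the tensor names `features:0` / `predictions:0` are hypothetical; they depend on how the graph was actually frozen and named.

```python
import numpy as np
import tensorflow as tf

# Load a frozen GraphDef from disk. "model.pb" is a hypothetical path.
with tf.gfile.GFile("model.pb", "rb") as f:
    graph_def = tf.GraphDef()
    graph_def.ParseFromString(f.read())

# Import it into a fresh graph so the tensor names are unprefixed.
graph = tf.Graph()
with graph.as_default():
    tf.import_graph_def(graph_def, name="")

# Hypothetical tensor names; look them up from the frozen graph.
inputs = graph.get_tensor_by_name("features:0")
outputs = graph.get_tensor_by_name("predictions:0")

with tf.Session(graph=graph) as sess:
    batch = np.zeros((1, 128), dtype=np.float32)  # dummy feature batch
    result = sess.run(outputs, feed_dict={inputs: batch})
```

The key point is the placeholder-shaped input: the frozen graph has to expose a feedable tensor in place of the Reader-based pipeline, which is exactly the extra step mentioned above.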
Any updates on this feature? I agree with @gidim: currently the model depends on tf.Slim.Dataset, which is hard to use for online inference.
I think there are two options:
1. Replace the input pipeline with placeholders and run inference with feed_dict.
2. Keep the input pipeline and serve the model directly through TensorFlow Serving.
I think the serving feature is very important for this project; it has also been raised in other issues (#114).
@amirj does TF Serving support option two? If so, could you link to the documentation/code? I couldn't find anything about it.
Input pipelines are the standard way of feeding TF models, so I think it is indeed possible, but the current documentation in the TF Serving module is not clear.
@amirj - that's the standard way when performing training. For inference, most models use feed_dict.
It would be strange if TF Serving were not compatible with input queues.
@dennybritz Are the operations performed in _preprocess exported to the graph?
We should figure out how to export models for serving. I think TensorFlow does provide something like an ExportStrategy that can be passed to the Estimator so that it periodically exports the model.
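For reference, here is a rough sketch of how an export strategy could be wired up with the TF 1.x tf.contrib.learn API. The serving_input_fn, the feature name "source_tokens", and its shape are hypothetical and would need to match the model's actual inputs; this is a sketch of the mechanism, not a drop-in implementation for this repo.

```python
import tensorflow as tf
from tensorflow.contrib.learn.python.learn.utils import (
    input_fn_utils, saved_model_export_utils)

def serving_input_fn():
    # For serving, replace the training input pipeline with placeholders.
    # The feature name and shape here are hypothetical.
    features = {
        "source_tokens": tf.placeholder(tf.string, [None, None]),
    }
    return input_fn_utils.InputFnOps(features, None, features)

# Build an ExportStrategy that writes a SavedModel on each export.
export_strategy = saved_model_export_utils.make_export_strategy(
    serving_input_fn=serving_input_fn,
    exports_to_keep=1)

# The strategy would then be passed to the Experiment, e.g.:
# experiment = tf.contrib.learn.Experiment(
#     estimator=estimator,
#     train_input_fn=train_input_fn,
#     eval_input_fn=eval_input_fn,
#     export_strategies=[export_strategy])
```

The exported SavedModel could then be loaded by TF Serving, which would address the online-inference concern raised earlier in the thread.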