Description
Being able to use the tf.io ops within the TensorFlow Serving ecosystem would be a significant development convenience and would likely reduce inference latency. Could there be an official Docker build, or documentation describing how to integrate tensorflow-io with tensorflow-serving?
Related issue: tensorflow/io issue #414