Requests to TensorFlow Serving's predict API return the error "Missing inputs"










I have trained a simple regression model to fit a linear function with the following equation: y = 3x + 1. For testing purposes, I saved the model as checkpoints, so that I could resume training and wouldn't have to start from scratch every time.



Now I want to make this model available via TF Serving. For this reason, I had to convert it into TensorFlow's SavedModel format using this script:



import tensorflow as tf
import restoretest as rt  ## just the module that contains the linear model

tf.reset_default_graph()

latest_checkpoint = tf.train.latest_checkpoint('path/to/checkpoints')
model = rt.LinearModel()
saver = tf.train.Saver()

export_path = 'path/to/export/folder'

with tf.Session() as sess:

    if latest_checkpoint:
        saver.restore(sess, latest_checkpoint)
    else:
        raise ValueError('No checkpoint file found')

    print('Exporting trained model to', export_path)

    builder = tf.saved_model.builder.SavedModelBuilder(export_path)

    ## define inputs and outputs
    tensor_info_x = tf.saved_model.utils.build_tensor_info(model.x)
    tensor_info_y = tf.saved_model.utils.build_tensor_info(model.y_pred)

    prediction_signature = (
        tf.saved_model.signature_def_utils.build_signature_def(
            inputs={'xvals': tensor_info_x},
            outputs={'yvals': tensor_info_y},
            method_name=tf.saved_model.signature_constants.PREDICT_METHOD_NAME))

    builder.add_meta_graph_and_variables(
        sess,
        [tf.saved_model.tag_constants.SERVING],
        signature_def_map={
            tf.saved_model.signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY:
                prediction_signature},
        main_op=tf.tables_initializer(),
        strip_default_attrs=True)

    builder.save()

print('Done exporting')
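
As a quick sanity check of the export, the signature (tag set, input/output keys and shapes) can be inspected with saved_model_cli; a minimal sketch, where the directory is the placeholder path from the script above:

saved_model_cli show --dir path/to/export/folder --tag_set serve --signature_def serving_default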


This creates a folder (as expected) with the contents:



export_folder
|-saved_model.pb
|-variables
  |-variables.index
  |-variables.data-00000-of-00001


To serve this with TF Serving and Docker, I pulled the tensorflow/serving image and ran the container with the command:



sudo docker run -p 8501:8501 --mount type=bind,source=path/to/export/folder,target=/models/linear -e MODEL_NAME=linear -t tensorflow/serving


This seems to execute without problems, as I get a lot of info output. The last line of the output says:




[evhttp_server.cc : 237] RAW: Entering the event loop ...




I guess the server is waiting for requests. Now, when I try to send a request to it via curl, I get an error:



curl -d '{"xvals": [1.0, 2.0, 5.0]}' -X POST http://localhost:8501/v1/models/linear:predict



{ "error": "Missing 'inputs' or 'instances' key" }




What am I doing wrong? The model works when I send dummy values via the saved_model_cli.
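
For reference, the dummy-value check with saved_model_cli looks roughly like this (the exact input expression and shape are assumptions):

saved_model_cli run --dir path/to/export/folder \
    --tag_set serve --signature_def serving_default \
    --input_exprs 'xvals=[1.0, 2.0, 5.0]'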










python python-3.x tensorflow tensorflow-serving






asked Nov 12 '18 at 13:47 by DocDriven






















1 Answer






Looks like the body of the POST request needs to be modified. According to the documentation, the format should be:

{"inputs": {"xvals": [1.0, 2.0, 5.0]}}






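For example, the curl call from the question would then become (host, port and model name taken from the question):

curl -d '{"inputs": {"xvals": [1.0, 2.0, 5.0]}}' \
    -X POST http://localhost:8501/v1/models/linear:predict

Alternatively, the row-oriented "instances" format the error message mentions should also be accepted; with a single named input it can be as simple as '{"instances": [1.0, 2.0, 5.0]}', depending on the shape of the input tensor.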























• Thanks, aside from my original issue I also realized how to choose different signatures due to your link. – DocDriven, Nov 12 '18 at 15:59









