'no SavedModel bundles found!' on tensorflow_hub model deployment to AWS SageMaker

Problem description

I am attempting to deploy the universal-sentence-encoder model to an AWS SageMaker endpoint and am getting the error: raise ValueError('no SavedModel bundles found!')

I have shown my code below; I have a feeling that one of my paths is incorrect.

import tensorflow as tf
import tensorflow_hub as hub
import numpy as np
from sagemaker import get_execution_role, Session
from sagemaker.tensorflow.serving import Model

def tfhub_to_savedmodel(model_name,uri):
    tfhub_uri = uri
    model_path = 'encoder_model/' + model_name

    with tf.Session(graph=tf.Graph()) as sess:
        module = hub.Module(tfhub_uri) 
        input_params = module.get_input_info_dict()
        dtype = input_params['text'].dtype
        shape = input_params['text'].get_shape()

        # define the model inputs
        inputs = {'text': tf.placeholder(dtype, shape, 'text')}

        # define the model outputs
        # the output is the sentence embedding vector produced by the module
        logits = module(inputs['text'])
        outputs = {
            'vector': logits,
        }

        # export the model
        sess.run([tf.global_variables_initializer(), tf.tables_initializer()])
        tf.saved_model.simple_save(
            sess,
            model_path,
            inputs=inputs,
            outputs=outputs)  

    return model_path


sagemaker_role = get_execution_role()

!tar -C "$PWD" -czf encoder.tar.gz encoder_model/
model_data = Session().upload_data(path='encoder.tar.gz',key_prefix='model')

env = {'SAGEMAKER_TFS_DEFAULT_MODEL_NAME': 'universal-sentence-encoder-large'}

model = Model(model_data=model_data, role=sagemaker_role, framework_version='1.12', env=env)
predictor = model.deploy(initial_instance_count=1, instance_type='ml.t2.medium')

Answer

I suppose you started from this example? https://github.com/awslabs/amazon-sagemaker-examples/tree/master/sagemaker-python-sdk/tensorflow_serving_container

It looks like you're not saving the TF Serving bundle properly: the model version number is missing because of this line:

model_path = 'encoder_model/' + model_name

Replacing it with this should fix your problem:

model_path = '{}/{}/00000001'.format('encoder_model', model_name)
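
With that one-line change inside tfhub_to_savedmodel, TensorFlow Serving can locate the bundle, since it scans for numeric version subdirectories under each model directory. Below is a minimal sketch of calling the fixed function; the TF Hub module URL is an assumption for illustration, not taken from the question:

# Assumes the fixed tfhub_to_savedmodel() from the question is in scope.
# The TF Hub URL below is illustrative; use whichever module version you need.
model_name = 'universal-sentence-encoder-large'
tfhub_uri = 'https://tfhub.dev/google/universal-sentence-encoder-large/3'

model_path = tfhub_to_savedmodel(model_name, tfhub_uri)
print(model_path)  # encoder_model/universal-sentence-encoder-large/00000001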

Your model artefact should look like this (I used the model in the notebook above):

mobilenet/
mobilenet/mobilenet_v2_140_224/
mobilenet/mobilenet_v2_140_224/00000001/
mobilenet/mobilenet_v2_140_224/00000001/saved_model.pb
mobilenet/mobilenet_v2_140_224/00000001/variables/
mobilenet/mobilenet_v2_140_224/00000001/variables/variables.data-00000-of-00001
mobilenet/mobilenet_v2_140_224/00000001/variables/variables.index
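
Before packaging, it can help to confirm locally that the export is a loadable serving bundle. A short sketch using the TF 1.x loader API, assuming the versioned export path from the fix above:

import tensorflow as tf

export_dir = 'encoder_model/universal-sentence-encoder-large/00000001'

# True if the directory contains a saved_model.pb (or .pbtxt)
print(tf.saved_model.loader.maybe_saved_model_directory(export_dir))

# Load the bundle under the 'serve' tag, the same tag TF Serving looks for
with tf.Session(graph=tf.Graph()) as sess:
    tf.saved_model.loader.load(
        sess, [tf.saved_model.tag_constants.SERVING], export_dir)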

Then, upload to S3 and deploy.
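
For completeness, here is a sketch of the packaging and deployment steps, mirroring the question's code but keeping the name/version directory layout intact (framework_version and instance type are carried over from the question):

import tarfile
from sagemaker import get_execution_role, Session
from sagemaker.tensorflow.serving import Model

# Archive the whole encoder_model/ tree so the name/version layout is preserved
with tarfile.open('encoder.tar.gz', 'w:gz') as tar:
    tar.add('encoder_model')

model_data = Session().upload_data(path='encoder.tar.gz', key_prefix='model')

# Tell the TFS container which model to serve by default
env = {'SAGEMAKER_TFS_DEFAULT_MODEL_NAME': 'universal-sentence-encoder-large'}

model = Model(model_data=model_data, role=get_execution_role(),
              framework_version='1.12', env=env)
predictor = model.deploy(initial_instance_count=1, instance_type='ml.t2.medium')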
