Errors running Sagemaker Batch Transformation with LDA model


Question

I've successfully trained an LDA model with SageMaker and have been able to set up an inference API, but it limits how many records I can query at a time.

I need to get predictions for a large file and have been trying to use Batch Transform; however, I'm running into a roadblock.

My input data is in the application/x-recordio-protobuf content type; the code is as follows:

import boto3
import sagemaker

s3_client = boto3.client('s3')

# model_name, output_location, input_location and bucket are defined earlier
# Initialize the transformer object
transformer = sagemaker.transformer.Transformer(
    base_transform_job_name='Batch-Transform',
    model_name=model_name,
    instance_count=1,
    instance_type='ml.c4.xlarge',
    output_path=output_location,
    max_payload=20,
    strategy='MultiRecord'
    )
# Start a transform job
transformer.transform(input_location, content_type='application/x-recordio-protobuf', split_type='RecordIO')
# Then wait until the transform job has completed
transformer.wait()

# Fetch validation result
s3_client.download_file(bucket, 'topic_model_batch_transform/output/batch_tansform_part0.pbr.out', 'batch_tansform-result')
with open('batch_tansform-result') as f:
    results = f.readlines()
print("Sample transform result: {}".format(results[0]))

I have chunked my input file into 10 files, each around 19MB in size. I am attempting at first to run on a single chunk, so 19MB in total. I have tried changing the strategy, trying SingleRecord. I have also tried different split_types, including None and "Line".

I've read the documentation, but it's not clear what else I should try, and the error messages are very unclear.

2019-04-02T15:49:47.617:[sagemaker logs]: MaxConcurrentTransforms=1, MaxPayloadInMB=20, BatchStrategy=MULTI_RECORD
#011at java.lang.Thread.run(Thread.java:748)
2019-04-02T15:49:48.035:[sagemaker logs]: du-sagemaker/data/batch_transform/batch_tansform_part0.pbr: Bad HTTP status returned from invoke: 413
2019-04-02T15:49:48.036:[sagemaker logs]: du-sagemaker/data/batch_transform/batch_tansform_part0.pbr:
2019-04-02T15:49:48.036:[sagemaker logs]: du-sagemaker/data/batch_transform/batch_tansform_part0.pbr: Message:
2019-04-02T15:49:48.036:[sagemaker logs]: du-sagemaker/data/batch_transform/batch_tansform_part0.pbr: <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 3.2 Final//EN">
2019-04-02T15:49:48.036:[sagemaker logs]: du-sagemaker/data/batch_transform/batch_tansform_part0.pbr: <title>413 Request Entity Too Large</title>
2019-04-02T15:49:48.036:[sagemaker logs]: du-sagemaker/data/batch_transform/batch_tansform_part0.pbr: <h1>Request Entity Too Large</h1>
2019-04-02T15:49:48.036:[sagemaker logs]: du-sagemaker/data/batch_transform/batch_tansform_part0.pbr: <p>The data value transmitted exceeds the capacity limit.</p>

The above is the last error I got with this configuration; before that I was also getting a 400 HTTP error code.

Any help or pointers would be greatly appreciated! Thank you.

Answer

I managed to resolve the issue; it seems the max_payload I was using was too high. I set MaxPayloadInMB=1 and it now runs like a dream.
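
For reference, here is the same configuration with only max_payload lowered, as a minimal sketch (model_name, output_location and input_location are the same variables as in the question):

transformer = sagemaker.transformer.Transformer(
    base_transform_job_name='Batch-Transform',
    model_name=model_name,
    instance_count=1,
    instance_type='ml.c4.xlarge',
    output_path=output_location,
    max_payload=1,  # cap each request at 1MB so the container's limit isn't exceeded
    strategy='MultiRecord'
    )
transformer.transform(input_location,
                      content_type='application/x-recordio-protobuf',
                      split_type='RecordIO')
transformer.wait()

With split_type='RecordIO' and strategy='MultiRecord', batch transform packs as many records as fit into each HTTP request up to MaxPayloadInMB, so lowering max_payload shrinks each request sent to the model container, which is what was triggering the 413 Request Entity Too Large response.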

