AWS S3 cp creates undefined files


Question

While using the aws cli cp command to copy the files recursively, there is a bug which creates some undefined files.

aws s3 cp --recursive $HOME/$MYHOST-$MYTIMESTAMP/$MYHOST-$MYTIMESTAMP-*.xml  s3://mybucket/$MYHOST-$MYTIMESTAMP/

The program works fine and uploads to the specified bucket. But it also creates some undefined files outside the bucket in the root folder. This happens all the time and I have to rm (delete) those annoying undefined files.

I presumed it to be a bug and then tried uploading the files individually rather than using wildcards, with the same result as the recursive copy: it still creates additional undefined files outside the bucket, in the root folder. This happens only when I run a bunch of the same cp commands in a bash script, and in that case the problem shows up intermittently.

aws s3 cp  $HOME/$MYHOST-$MYTIMESTAMP/$MYHOST-$MYTIMESTAMP-hello.xml  s3://mybucket/$MYHOST-$MYTIMESTAMP/

However, when doing it for just a single file, the problem doesn't show up. My CLI version:

aws-cli/1.14.34 Python/2.7.14+ Linux/4.4.104-39-default botocore/1.8.38

Any help would be highly appreciated on this.

Answer

You have configured S3 access logging to write logs into this bucket. Presumably, these are the log files for this bucket.
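
To confirm this, you can check whether server access logging is enabled on the bucket and where it is pointed. A minimal check with the AWS CLI, using the bucket name mybucket from the question:

# show the server access-logging configuration for the bucket
aws s3api get-bucket-logging --bucket mybucket

If logging is enabled, the output contains a LoggingEnabled block with TargetBucket and TargetPrefix; an empty response means logging is not configured on that bucket.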

Why the filenames begin with "undefined" is not clear -- something may have gone wrong when you set up logging for the bucket so that the log file prefix did not get saved -- but the filenames look like the names of the log files that S3 creates.

https://docs.aws.amazon.com/AmazonS3/latest/dev/ServerLogs.html
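
According to the linked documentation, delivered log objects are named with the configured target prefix followed by a timestamp and a unique string, roughly TargetPrefixYYYY-mm-DD-HH-MM-SS-UniqueString. So if the prefix ended up stored as the literal string "undefined" (or was dropped by whatever tool configured logging), the keys would look something like this hypothetical example:

# hypothetical key of a delivered access-log object when the prefix is the string "undefined"
undefined2018-02-10-09-30-05-UNIQUESTRING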

Best practice is to set up a separate bucket for collecting S3 access logs in each region.
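
A minimal sketch of that setup with the AWS CLI, assuming a dedicated log bucket named my-log-bucket in the same region (the log bucket name, the logs/mybucket/ prefix, and the logging.json file are placeholders, not from the original post):

# allow the S3 log-delivery group to write into the dedicated log bucket
aws s3api put-bucket-acl --bucket my-log-bucket --acl log-delivery-write

# point the data bucket's access logs at the dedicated bucket, with an explicit prefix
cat > logging.json <<'EOF'
{
  "LoggingEnabled": {
    "TargetBucket": "my-log-bucket",
    "TargetPrefix": "logs/mybucket/"
  }
}
EOF
aws s3api put-bucket-logging --bucket mybucket --bucket-logging-status file://logging.json

With an explicit TargetPrefix, the delivered log files should land under logs/mybucket/ in the separate bucket instead of appearing as undefined files at the top of the data bucket.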
