Pointing multiple projects' log sinks to one bucket


Problem description



I have a few GCP projects with log sinks to different storage buckets. I'd like to combine them into a single bucket. But the Stackdriver export doesn't add any distinguishing information to the object names it creates; they all look like cloudaudit.googleapis.com/activity/2017/11/14/00:00:00_00:59:59_S0.json

What will happen if I start pushing them all to a single bucket? Will the different project sinks overwrite each other's objects? Is there any way to distinguish which project created the logs just from the object?

If not, I guess I should switch to Pub/Sub sinks, and then write some code that produces objects with more desirable names. Are there any established patterns or examples for doing this?
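For instance, I'd imagine something like this rough, untested sketch: a Cloud Function subscribed to the Pub/Sub sink that writes each entry under a project-prefixed name (the bucket, function name, and naming scheme below are all made up):

import base64
import json

from google.cloud import storage

BUCKET = "my-combined-log-bucket"  # hypothetical destination bucket


def export_log_entry(event, context):
    """Background Cloud Function: one Pub/Sub message per LogEntry."""
    entry = json.loads(base64.b64decode(event["data"]))
    # logName has the form "projects/<project-id>/logs/<log-id>"
    project_id = entry["logName"].split("/")[1]
    blob_name = f"{project_id}/{context.timestamp}_{context.event_id}.json"
    storage.Client().bucket(BUCKET).blob(blob_name).upload_from_string(
        json.dumps(entry), content_type="application/json"
    )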

Update: I filed https://issuetracker.google.com/issues/69371200 for this issue.

Solution

To enable this, just select custom destination on the sink and point to the bucket with this format: storage.googleapis.com/[BUCKET_ID].
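If you'd rather script it than use the console, a minimal sketch with the google-cloud-logging Python client could look like this (the project, sink, bucket names, and filter are hypothetical):

from google.cloud import logging

client = logging.Client(project="my-project-a")  # hypothetical source project
sink = client.sink(
    "combined-bucket-sink",  # hypothetical sink name
    filter_='logName:"cloudaudit.googleapis.com"',  # hypothetical filter
    destination="storage.googleapis.com/my-combined-log-bucket",
)
sink.create()
# Repeat per source project, and grant each sink's writer identity
# write access to the shared bucket.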

I've just enabled this in a couple of my projects, as I'm curious to see the results when exporting to a bucket. However, I have been using a single BQ sink for all my projects, and the tables created have all the logs mixed, so no logs are lost when using a single BQ sink.
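Separating one project's entries back out of the mixed tables is then just a WHERE clause on logName; a rough sketch with the google-cloud-bigquery client (the dataset and table names are hypothetical):

from google.cloud import bigquery

query = """
SELECT timestamp, logName
FROM `my-project.logs.cloudaudit_googleapis_com_activity_20171114`  -- hypothetical table
WHERE logName LIKE 'projects/my-project-a/%'
ORDER BY timestamp
"""
for row in bigquery.Client().query(query).result():
    print(row["timestamp"], row["logName"])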

I'm assuming a GCS sink will work the same way, but I'll tell you in a couple of days.

If a single bucket sink does not work, you can always use a single BQ sink (that will help in analyzing the logs), and when you no longer want to have them in BQ, export them and store the files wherever you want.
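The export itself can be a single extract job; a sketch (the table and destination URI are hypothetical):

from google.cloud import bigquery

client = bigquery.Client()
job = client.extract_table(
    "my-project.logs.cloudaudit_googleapis_com_activity_20171114",  # hypothetical
    "gs://my-archive-bucket/activity-20171114-*.json",  # hypothetical destination
    job_config=bigquery.ExtractJobConfig(
        destination_format=bigquery.DestinationFormat.NEWLINE_DELIMITED_JSON
    ),
)
job.result()  # block until the extract job finishes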

Also, since you'll be writing to your sink constantly, you can't use Nearline or Coldline, so the storage pricing is better in BQ than in a regional bucket (0.02 USD/GB in BQ vs. somewhere between 0.02 and 0.35 USD/GB for regional storage, depending on the region; BQ has 10 GB free monthly, GCS 5 GB).

I would generally recommend using a BQ sink, but I'll tell you what happens with my bucket logs.

Update:

A few hours later, I've verified that shared bucket sinks work pretty much as you would expect: logs are concatenated chronologically regardless of the project of origin, and only a single file is created for each time window. Hope this helps! (I still prefer BQ as a log sink...)

Update 2:

For the behavior you seek in the feature request, I would use BQ, but you could just as easily grep the project ID and separate the logs:

grep '"logName":"projects/<your-project-id>/' mixed-log.json > single-project-log.json

Or just get a cloud function triggered by bucket updates (so, every time you receive a log file in the sink) to run this for you.
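A rough, untested sketch of such a function (GCS-triggered; the output bucket and splitting scheme are hypothetical):

import json

from google.cloud import storage

OUTPUT_BUCKET = "my-per-project-logs"  # hypothetical


def split_mixed_log(event, context):
    """Background Cloud Function: fires when the sink writes a new object."""
    client = storage.Client()
    mixed = client.bucket(event["bucket"]).blob(event["name"])
    per_project = {}
    # Each line of an exported object is a single LogEntry in JSON.
    for line in mixed.download_as_text().splitlines():
        project_id = json.loads(line)["logName"].split("/")[1]
        per_project.setdefault(project_id, []).append(line)
    for project_id, lines in per_project.items():
        name = f"{project_id}/{event['name']}"
        client.bucket(OUTPUT_BUCKET).blob(name).upload_from_string(
            "\n".join(lines), content_type="application/json"
        )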

Or namespace your buckets and have a Cloud Function move them to wherever you need as soon as they are written.
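And a sketch of that move-on-write variant (the namespace convention and bucket naming are hypothetical):

from google.cloud import storage


def route_log_object(event, context):
    """Background Cloud Function: move each new object to its namespace bucket."""
    client = storage.Client()
    src_bucket = client.bucket(event["bucket"])
    blob = src_bucket.blob(event["name"])
    # Hypothetical convention: the first path segment of the object name
    # identifies the namespace, e.g. "project-a/2017/11/14/....json".
    namespace, _, rest = event["name"].partition("/")
    dest_bucket = client.bucket(f"logs-{namespace}")  # hypothetical bucket naming
    src_bucket.copy_blob(blob, dest_bucket, new_name=rest)
    blob.delete()  # remove the original after copying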

The possibilities are endless!

