AWS Kinesis Firehose not inserting data in Redshift


Problem description

I am trying to have a Kinesis Firehose stream push data into a Redshift table.

The Firehose stream is working and putting data in S3.

But nothing arrives in the destination table in Redshift.

  • In the metrics, DeliveryToRedshift Success is 0 (DeliveryToRedshift Records is empty)
  • The load logs (Redshift web console) and the STL_LOAD_ERRORS table are empty.
  • I checked that Firehose is able to connect to Redshift (I see the connections in STL_CONNECTION_LOG)

How can I troubleshoot this?

Answer

In the end, I made it work by deleting and re-creating the Firehose stream :-/ Probably the repeated edits via the web console made it unstable.

But here is a troubleshooting guide:

  • A good starting point is this procedure: http://docs.aws.amazon.com/firehose/latest/dev/troubleshooting.html
  • Check that data is arriving in S3
      • There must be an IAM role for firehose delivery, with a trust relationship between the firehose service and this role
      • This IAM role must have S3 access policy
      • See the policy jsons here : http://docs.aws.amazon.com/firehose/latest/dev/controlling-access.html#using-iam-s3
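      For reference, a trust relationship allowing the Firehose service to assume the delivery role typically looks like the following sketch; the account id 123456789012 in the ExternalId condition is a placeholder for your own AWS account id:

        {
          "Version": "2012-10-17",
          "Statement": [
            {
              "Effect": "Allow",
              "Principal": { "Service": "firehose.amazonaws.com" },
              "Action": "sts:AssumeRole",
              "Condition": { "StringEquals": { "sts:ExternalId": "123456789012" } }
            }
          ]
        }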
  • Check that your Redshift cluster accepts connections from the Firehose CIDR block for your region:
    • US East (N. Virginia): 52.70.63.192/27
    • US West (Oregon): 52.89.255.224/27
    • EU (Ireland): 52.19.239.192/27
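    If your cluster runs in a VPC, the relevant range must be allowed in the security group attached to the cluster, e.g. with the AWS CLI (a sketch; sg-0123abcd is a placeholder for your cluster's security group id, 5439 is the default Redshift port):

      aws ec2 authorize-security-group-ingress \
          --group-id sg-0123abcd \
          --protocol tcp \
          --port 5439 \
          --cidr 52.70.63.192/27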

      At this point, you should be able to see the connection attempts in the Redshift logs:

      select * from stl_connection_log where remotehost like '52%' order by recordtime desc;  
      

  • Check that the Redshift user used by Firehose has enough privileges on the target table:

      select tablename,
         has_table_privilege(tablename, 'select') as "select",
         has_table_privilege(tablename, 'insert') as "insert",
         has_table_privilege(tablename, 'update') as "update",
         has_table_privilege(tablename, 'delete') as "delete",
         has_table_privilege(tablename, 'references') as "references"
      from pg_tables where schemaname = 'public' order by tablename;
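      If a needed privilege is missing (Firehose needs at least insert on the target table), grant it; the table and user names below are placeholders:

      grant insert on table my_target_table to firehose_user;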
      

  • Then you can check if the COPY command was run:

      select * from stl_query order by endtime desc limit 10;
      

  • Then check load errors, or server errors:

      select * from stl_load_errors  order by starttime desc;
      select * from stl_error where userid!=0 order by recordtime desc;
      

  • If you have format problems in your data, or in the COPY options, or a mismatch between your data and the target columns, you should at least see the COPY attempts, and some load errors.
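      To surface such errors directly, you can also run the COPY command yourself against the files Firehose wrote to S3 (a sketch; the bucket, prefix, role ARN, and the json 'auto' option are placeholder assumptions — use the COPY options configured on your Firehose stream):

      copy my_target_table
      from 's3://my-firehose-bucket/some/prefix/'
      iam_role 'arn:aws:iam::123456789012:role/firehose_delivery_role'
      json 'auto';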

      If you're still stuck, with nothing appearing in those log tables, try deleting and recreating the whole Firehose stream, as there may be some bugs related to the web console. (This step worked for me.)
