Getting error in Azure Stream Analytics with DocumentDB as sink


Question

I'm using Azure Stream Analytics to stream events from Event Hubs to DocumentDB. I have configured the input, query, and output as documented, tested it with sample data, and it returned results as expected.

But when I started the streaming job and sent the same payload as the sample data, I got this error message:

There was a problem formatting the document [id] column as per DocumentDB constraints for DocumentDB db:[my-database-name], and collection:[my-collection-name].

My sample data is a JSON array:

[
 { "Sequence": 1, "Tenant": "T1", "Status": "Started" },
 { "Sequence": 2, "Tenant": "T1", "Status": "Ended" }
]
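
For reference, a minimal sketch of sending that same payload to the events hub with the azure-eventhub Python SDK (v5); the connection string is a placeholder:

import json
from azure.eventhub import EventHubProducerClient, EventData

# Placeholder namespace connection string -- substitute your own.
CONN_STR = "Endpoint=sb://<namespace>.servicebus.windows.net/;SharedAccessKeyName=<policy>;SharedAccessKey=<key>"

events = [
    {"Sequence": 1, "Tenant": "T1", "Status": "Started"},
    {"Sequence": 2, "Tenant": "T1", "Status": "Ended"},
]

producer = EventHubProducerClient.from_connection_string(CONN_STR, eventhub_name="events")
with producer:
    batch = producer.create_batch()
    for event in events:
        # Serialize each event as a UTF-8 JSON object, matching the
        # serialization format configured on the input below.
        batch.add(EventData(json.dumps(event)))
    producer.send_batch(batch)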

I configured the input as follows:

  • Input alias: eventhubs-events
  • Source Type: Data stream
  • Source: Event Hub
  • Subscription: same subscription as where I create the Analytics job
  • Service bus namespace: an existing Event Hub namespace
  • Event hub name: events (existing event hub in the namespace)
  • Event hub policy name: a policy with read access
  • Event hub consumer group: blank
  • Event serialization format: JSON
  • Encoding: UTF-8

And the output:

  • Output alias: documentdb-events
  • Sink: DocumentDB
  • Subscription: same subscription as where I create the Analytics job
  • Account id: an existing DocumentDB account
  • Database: records (an existing database in the account)
  • Collection name pattern: collection (an existing collection in the database)
  • Document id: id
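
As a sanity check (not part of the original steps), the sink collection's configuration can be read back with the azure-cosmos Python SDK; the endpoint and key are placeholders. The partition key path it reports turns out to matter below:

from azure.cosmos import CosmosClient

# Placeholder endpoint and primary key -- substitute your own.
client = CosmosClient("https://<account>.documents.azure.com:443/", credential="<primary-key>")
container = client.get_database_client("records").get_container_client("collection")

# Read the container's properties, including its partition key definition.
props = container.read()
print(props["partitionKey"]["paths"])  # e.g. ['/Tenant']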

My query is simple:

SELECT
    event.Sequence AS id,
    event.Tenant,
    event.Status
INTO [documentdb-events]
FROM [eventhubs-events] AS event

Answer

Turns out all field names in the output are automatically lower-cased.

I had configured my DocumentDB collection in Partitioned mode, with "/Tenant" as the Partition Key.

Since the case didn't match that of the output, it failed the constraint.

Changing the Partition Key to "/tenant" fixed the issue.
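
The partition key of an existing collection can't be edited in place, so "changing" it means dropping and recreating the collection. A minimal sketch with the azure-cosmos Python SDK (endpoint and key are placeholders):

from azure.cosmos import CosmosClient, PartitionKey

client = CosmosClient("https://<account>.documents.azure.com:443/", credential="<primary-key>")
database = client.get_database_client("records")

# Recreate the collection with a lower-cased partition key path so it
# matches the lower-cased field names in the Stream Analytics output.
# Note: deleting the container also deletes any documents in it.
database.delete_container("collection")
database.create_container(id="collection", partition_key=PartitionKey(path="/tenant"))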

Hope sharing the outcome of my findings saves some trouble for people who bump into this.

Second option

Instead of changing the partition key to lower case, we can now change the compatibility level of the Stream Analytics job.

1.0: Field names are changed to lower case when processed by the Azure Stream Analytics engine.

1.1: Field-name casing is preserved when processed by the Azure Stream Analytics engine.
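
As a toy illustration (not actual engine code) of what each level does to the output of the query above:

row = {"id": 1, "Tenant": "T1", "Status": "Started"}  # shape produced by the SELECT

doc_v1_0 = {name.lower(): value for name, value in row.items()}  # 1.0: names lower-cased
doc_v1_1 = dict(row)                                             # 1.1: casing preserved

print(doc_v1_0)  # {'id': 1, 'tenant': 'T1', 'status': 'Started'} -> no '/Tenant' path
print(doc_v1_1)  # {'id': 1, 'Tenant': 'T1', 'Status': 'Started'} -> '/Tenant' key works

So on compatibility level 1.1 the original "/Tenant" partition key works without renaming it.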
