Copying from SQL Server to Azure Data Lake. Error converting byte array to JSON


Problem description



Hi!

I'm trying to copy multiple tables from a SQL database to Azure Data Lake in JSON format. One of the columns in the SQL database is of type timestamp (a byte array), which produces the following error when the pipeline runs:

{ "errorCode": "2200", "message": "Failure happened on 'Sink' side. ErrorCode=DataTypeNotSupported,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=The data type ByteArray is not supported.,Source=Microsoft.DataTransfer.Common,'", "failureType": "UserError", "target": "Copy_prj" }

Is there a way to make the copy skip this column? Or any other way to fix it?
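One way to skip the column itself (a workaround not covered in the answer below) is to replace the table reference in the copy activity's source with an explicit query that leaves the timestamp column out, or converts it to a type the sink accepts. A minimal sketch of the relevant part of the copy activity JSON, trimmed to the properties that matter here; the table and column names (dbo.MyTable, RowVer) are hypothetical, and the exact source/sink type names depend on which connectors you use:

{
    "name": "Copy_prj",
    "type": "Copy",
    "typeProperties": {
        "source": {
            "type": "AzureSqlSource",
            "sqlReaderQuery": "SELECT Id, Name, CONVERT(BIGINT, RowVer) AS RowVer FROM dbo.MyTable"
        },
        "sink": {
            "type": "JsonSink"
        }
    }
}

CONVERT(BIGINT, RowVer) turns the 8-byte timestamp/rowversion value into an integer that serializes to JSON without hitting the ByteArray limitation; simply dropping the column from the SELECT list works just as well if its value is not needed downstream.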

Solution

Hi there,

Did you try setting Skip incompatible rows to true in your scenario?

Skip incompatible rows handles incompatible rows so that they do not cause a failure when copying data from the source data store to the sink data store. Click here to find more details. Thanks.
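For reference, the setting described above corresponds, as far as I know, to the enableSkipIncompatibleRow property on the copy activity, optionally paired with redirectIncompatibleRowSettings to log the rows that were skipped. A minimal sketch of what that looks like in the pipeline JSON; the linked-service reference and path are placeholders:

{
    "name": "Copy_prj",
    "type": "Copy",
    "typeProperties": {
        "source": { "type": "AzureSqlSource" },
        "sink": { "type": "JsonSink" },
        "enableSkipIncompatibleRow": true,
        "redirectIncompatibleRowSettings": {
            "linkedServiceName": {
                "referenceName": "AzureStorageLinkedService",
                "type": "LinkedServiceReference"
            },
            "path": "redirect/incompatiblerows"
        }
    }
}

Note that this fault-tolerance feature skips individual rows that fail to convert, so it is worth testing on a single table first to confirm it handles a whole unsupported column the way you expect.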


