Error 0100 while importing data from Cosmos DB into Azure ML Studio


Problem description


I'm trying to import data into Azure ML Studio from Cosmos DB via the Import Data module.
To populate Cosmos DB, I've uploaded the CSV into Azure Blob Storage and copied it into Cosmos DB via Data Factory.
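
For reference, here is a minimal sketch of the first step described above (uploading the source CSV to Blob Storage) using the azure-storage-blob Python SDK. The connection string, container and blob names are placeholders, not the values actually used in the question:

# Sketch only: push the source CSV to Azure Blob Storage so Data Factory can
# copy it into Cosmos DB. The connection string, container name and blob name
# below are placeholders (assumptions), not the asker's real values.
from azure.storage.blob import BlobServiceClient

CONNECTION_STRING = "<storage-account-connection-string>"  # placeholder
CONTAINER = "indicators-csv"                               # placeholder
BLOB_NAME = "indicators.csv"                               # placeholder

service = BlobServiceClient.from_connection_string(CONNECTION_STRING)
container_client = service.get_container_client(CONTAINER)

# Upload the local CSV; overwrite=True replaces any existing blob of the same name.
with open("indicators.csv", "rb") as data:
    container_client.upload_blob(name=BLOB_NAME, data=data, overwrite=True)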


Here's the full run error transcript:

[Critical] Error: Error 1000: DocumentDb library exception: DocumentDB client threw an exception [Critical] {"InputParameters":
{"Generic":{"source":"DocumentDB",
"documentDbServer":"https://worldindicators.documents.azure.com:443/",
"documentDbDatabaseName":"indicatorsDB",
"documentDbCollection":"indicators",
"inferSchema":false},
"Unknown":["Key: documentDbPassword, ValueType : Microsoft.Analytics.Modules.SecureString",
"Key: documentDbQuery, ValueType : System.IO.StreamReader",
"Key: documentDbQueryParams, ValueType : System.IO.StreamReader"]},
"OutputParameters":[],
"ModuleType":"Microsoft.Analytics.Modules.Reader.Dll",
"ModuleVersion":" Version=6.0.0.0",
"AdditionalModuleInfo":"Microsoft.Analytics.Modules.Reader.Dll, Version=6.0.0.0, Culture=neutral, PublicKeyToken=69c3241e6f0468ca;Microsoft.Analytics.Modules.Reader.Dll.Reader;Load","Errors":"Microsoft.Analytics.Exceptions.ErrorMapping+ModuleException: Error 1000: DocumentDb library exception: DocumentDB client threw an exception ---> Microsoft.Analytics.Modules.DocumentDbException: DocumentDB client threw an exception ---> Microsoft.Azure.Documents.BadRequestException: {\"errors\":[{\"severity\":\"Error\",\"location\":{\"start\":15,\"end\":16},\"code\":\"SC1010\",\"message\":\"Syntax error, invalid token ';'.\"}]} ---> System.Runtime.InteropServices.COMException: Exception from HRESULT: 0x800A0B00\r\n --- End of inner exception stack trace ---\r\n at Microsoft.Azure.Documents.Query.QueryPartitionProvider.GetPartitionedQueryExecutionInfo(SqlQuerySpec querySpec, PartitionKeyDefinition partitionKeyDefinition)\r\n at Microsoft.Azure.Documents.Query.DocumentQueryExecutionContextBase.<GetPartitionedQueryExecutionInfoAsync>d__0.MoveNext()\r\n--- End of stack trace from previous location where exception was thrown ---\r\n at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()\r\n at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n at Microsoft.Azure.Documents.Query.DocumentQueryExecutionContextFactory.<CreateDocumentQueryExecutionContextAsync>d__3.MoveNext()\r\n--- End of stack trace from previous location where exception was thrown ---\r\n at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()\r\n at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n at Microsoft.Azure.Documents.Linq.DocumentQuery`1.<CreateDocumentQueryExecutionContext>d__f.MoveNext()\r\n--- End of stack trace from previous location where exception was thrown ---\r\n at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()\r\n at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n at Microsoft.Azure.Documents.Linq.DocumentQuery`1.<ExecuteAllAsync>d__19.MoveNext()\r\n--- End of stack trace from previous location where exception was thrown ---\r\n at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()\r\n at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n at Microsoft.Azure.Documents.Linq.DocumentQuery`1.<GetEnumeratorAsync>d__5.MoveNext()\r\n --- End of inner exception stack trace ---\r\n at Microsoft.Analytics.Modules.Reader.Dll.DocDbReader.RunImpl(String server, String databaseName, SecureString accountPassword, String collection, String query, String queryParams, Boolean inconsistentSchema) in m:\\AzureMLVS15-004\\_work\\117\\s\\Product\\Source\\Modules\\Reader.Dll\\DocDbReader.cs:line 93\r\n at Microsoft.Analytics.Modules.Reader.Dll.Reader.LoadImpl(ReaderDataSourceOrSink source, String inputURL, DataFormat dataFormat, Nullable`1 csvTsvHasHeader, String databaseServerName, String databaseName, String sqlAccountName, SecureString accountPassword, Boolean trustServerCertificate, StreamReader sqlStreamReader, ReaderAuthenticationType authType, String sas, String accountName, SecureString accountKey, String path, FileTypes sasBlobFormat, ReaderBlobFormat blobFormat, ReaderDelimiterType delimiter, ReaderEncodingType encoding, ReaderExcelFormat excelFormat, String excelName, Boolean blobCsvHasHeader, Boolean sasBlobCsvHasHeader, ReaderAuthenticationType tableAuthType, String tableSas, String 
tableAccountName, SecureString tableAccountKey, String tableNames, PropertyScan scanMode, Int32 rowsToScan, PropertyScan sasScanMode, Int32 sasRowsToScan, StreamReader hiveStreamReader, String hCatUri, String hadoopUsername, SecureString hadoopPassword, DataLocation dataLocation, String hdfsUri, String azureAccountName, SecureString azureStorageKey, String containerName, ReaderUrlContents urlContent, String powerQueryURL, DataGatewayName dataGatewayName, String onPremSqlDatabaseServerName, String onPremSqlDatabaseName, SecureString encryptedCredential, StreamReader onPremSqlStreamReader, String documentDbServer, String documentDbDatabaseName, SecureString documentDbPassword, String documentDbCollection, StreamReader documentDbQuery, StreamReader documentDbQueryParams, Boolean inferSchema) in m:\\AzureMLVS15-004\\_work\\117\\s\\Product\\Source\\Modules\\Reader.Dll\\Reader.cs:line 407\r\n at Microsoft.Analytics.Modules.Reader.Dll.Reader.Load(ReaderDataSourceOrSink source, String inputURL, DataFormat dataFormat, Nullable`1 csvTsvHasHeader, String databaseServerName, String databaseName, String sqlAccountName, SecureString accountPassword, Boolean trustServerCertificate, StreamReader sqlStreamReader, ReaderAuthenticationType authType, String sas, String accountName, SecureString accountKey, String path, FileTypes sasBlobFormat, ReaderBlobFormat blobFormat, ReaderDelimiterType delimiter, ReaderEncodingType encoding, ReaderExcelFormat excelFormat, String excelName, Boolean blobCsvHasHeader, Boolean sasBlobCsvHasHeader, ReaderAuthenticationType tableAuthType, String tableSas, String tableAccountName, SecureString tableAccountKey, String tableNames, PropertyScan scanMode, Int32 rowsToScan, PropertyScan sasScanMode, Int32 sasRowsToScan, StreamReader hiveStreamReader, String hCatUri, String hadoopUsername, SecureString hadoopPassword, DataLocation dataLocation, String hdfsUri, String azureAccountName, SecureString azureStorageKey, String containerName, ReaderUrlContents urlContent, String powerQueryURL, DataGatewayName dataGatewayName, String onPremSqlDatabaseServerName, String onPremSqlDatabaseName, SecureString encryptedCredential, StreamReader onPremSqlStreamReader, String documentDbServer, String documentDbDatabaseName, SecureString documentDbPassword, String documentDbCollection, StreamReader documentDbQuery, StreamReader documentDbQueryParams, Boolean inferSchema) in m:\\AzureMLVS15-004\\_work\\117\\s\\Product\\Source\\Modules\\Reader.Dll\\EntryPoint.cs:line 437\r\n --- End of inner exception stack trace ---","Warnings":[],"Duration":"00:00:01.5502313"}
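
The BadRequestException in the transcript carries error code SC1010, "Syntax error, invalid token ';'", which appears to come from parsing the SQL query text passed to the Import Data module. As a diagnostic, here is a minimal sketch (Python, azure-cosmos SDK) that runs a query directly against the collection named in the transcript, to check whether the same error reproduces outside ML Studio. The account key and the query text are placeholders, not the asker's actual values:

# Diagnostic sketch: query the collection from the error transcript directly,
# outside ML Studio. The account key and the query text are placeholders.
from azure.cosmos import CosmosClient

ENDPOINT = "https://worldindicators.documents.azure.com:443/"  # from the transcript
KEY = "<cosmos-db-account-key>"                                # placeholder
# The transcript's SC1010 error flags an unexpected ';' token, so try the query
# without any trailing semicolon.
QUERY = "SELECT TOP 10 * FROM c"

client = CosmosClient(ENDPOINT, credential=KEY)
container = client.get_database_client("indicatorsDB").get_container_client("indicators")

# enable_cross_partition_query is required for queries that span partitions.
for item in container.query_items(query=QUERY, enable_cross_partition_query=True):
    print(item)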


Recommended answer

Hi,


Thank you for the feedback. I think this is a similar issue to the one below. It could be that you are trying to import a data set that is too big (larger than 10 GB). Please refer to the following post:

https://social.msdn.microsoft.com/Forums/en-US/0ba864c9-90a2-46e3-8ccd-96ca0cea5f0b/error-when-trying-to-many-files-from-azure-cosmos-db


Please let me know if your data set is not that big.
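
If it helps, here is a small sketch (Python, azure-cosmos SDK; the account key is a placeholder) for getting a rough idea of how many documents the collection holds before re-running the import:

# Rough size check: count the documents in the 'indicators' collection.
# Endpoint, database and collection names are taken from the error transcript;
# the account key is a placeholder.
from azure.cosmos import CosmosClient

ENDPOINT = "https://worldindicators.documents.azure.com:443/"
KEY = "<cosmos-db-account-key>"  # placeholder

client = CosmosClient(ENDPOINT, credential=KEY)
container = client.get_database_client("indicatorsDB").get_container_client("indicators")

# SELECT VALUE COUNT(1) returns a single scalar rather than one document per row.
counts = list(container.query_items(
    query="SELECT VALUE COUNT(1) FROM c",
    enable_cross_partition_query=True,
))
print("documents in 'indicators':", counts[0])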

Regards,

宇通

--------------------------------------------------------------------------------

If you find this post helpful, please give it a "Helpful" vote.

If it helps, please remember to mark the reply as the answer.

