Azure Datalake to Azure SQL Database - Best Way To Copy


Problem Description

Gurus,

I was trying to copy tables from my on-prem databases to Azure SQL Database using Azure Data Factory's (ADF) Copy Activity.

Soon I realized it was taking a long time to copy (the tables are >= 10 GB). So I thought of extracting the tables with BCP as CSV files, which took around a minute, and then copying those CSVs into Data Lake. My thinking was that I would cut down the time it takes to copy a table from on-prem to Azure SQL Database if I went through Azure Data Lake, since both are in the cloud. But I haven't seen the improvement I was hoping for.
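For reference, here is a minimal sketch of the load step done client-side with pyodbc instead of through ADF, just to get a baseline for how fast one of those BCP-exported CSVs can be pushed into the target table. This is not my pipeline; the server, credentials, file name, table, and column list are placeholders, and a matching staging table is assumed to already exist.

# Minimal sketch (not the ADF pipeline): bulk-load a BCP-exported CSV into
# Azure SQL Database with pyodbc. Connection details, file name, table and
# column names are placeholders.
import csv
import pyodbc

conn = pyodbc.connect(
    "Driver={ODBC Driver 17 for SQL Server};"
    "Server=tcp:<server>.database.windows.net,1433;"
    "Database=<database>;Uid=<user>;Pwd=<password>;Encrypt=yes;"
)
cursor = conn.cursor()
cursor.fast_executemany = True  # send parameterised inserts in batches instead of row by row

with open("mytable.csv", newline="", encoding="utf-8") as f:
    rows = list(csv.reader(f))

cursor.executemany(
    "INSERT INTO dbo.MyTable (Col1, Col2, Col3) VALUES (?, ?, ?)",  # adjust to the real schema
    rows,
)
conn.commit()
conn.close()

Reading the whole file into memory like this only works for CSVs that fit in RAM, so it is meant as a comparison point rather than a replacement for the Copy Activity.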

I looked at the performance matrix that Microsoft has on their website and was expecting to see a minimum throughput of 5 MB/s for Data Lake Gen1 to Azure SQL Database; however, I am only seeing a throughput of 1.6 MB/s.
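To put those two rates in perspective for a 10 GB table, here is a rough calculation, assuming the rates are sustained for the whole copy:

# Rough estimate of copy duration for a 10 GB table at the observed vs. published rates.
table_size_mb = 10 * 1024   # 10 GB expressed in MB
observed_mb_s = 1.6         # throughput reported by the copy activity
expected_mb_s = 5.0         # minimum figure from the performance matrix

print(f"observed:  {table_size_mb / observed_mb_s / 60:.0f} minutes")  # ~107 minutes
print(f"published: {table_size_mb / expected_mb_s / 60:.0f} minutes")  # ~34 minutes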

Any opinions or ideas on what I might be doing differently from the norm would be greatly appreciated. Since I have just started working, I find the information online either overwhelming or unrelated to what I am trying to do.

Thanks

Recommended Answer

Hello,

Could you provide more statistics or a screenshot of the copy activity, as outlined here:

https://docs.microsoft.com/zh-CN/azure/data-factory/copy-activity-overview#monitoring
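If a screenshot is awkward to share, the per-activity numbers shown in that monitoring view (data read, data written, throughput, copy duration, and so on) can also be pulled programmatically. Here is a sketch using the azure-mgmt-datafactory SDK; the subscription ID, resource group, factory name, and pipeline run ID are placeholders.

# Sketch: query the copy activity's run output for a given pipeline run.
from datetime import datetime, timedelta

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import RunFilterParameters

client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

runs = client.activity_runs.query_by_pipeline_run(
    "<resource-group>",
    "<factory-name>",
    "<pipeline-run-id>",
    RunFilterParameters(
        last_updated_after=datetime.utcnow() - timedelta(days=1),
        last_updated_before=datetime.utcnow(),
    ),
)

for run in runs.value:
    if run.activity_type == "Copy":
        # run.output typically includes dataRead, dataWritten, throughput, copyDuration, etc.
        print(run.activity_name, run.output)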

