SSIS failing to save packages and reboots Visual Studio


Problem description



This is my first experience with SSIS so bear with me... I am using SSIS to migrate tables from Oracle to SSMS, and there are some very large tables I am trying to transfer (50 million rows +). SSIS is now completely freezing up and rebooting VS when I am just trying to save the package (not even running it). It keeps returning insufficient-memory errors; however, I am working on a remote server that has well over the RAM it takes to run this package.

Error message when trying to save

The only thing I can think of is that when this package attempts to run, my Ethernet Kbps go through the roof right as the package starts. Maybe I need to update my pipeline?

Ethernet graph

Also, my largest table will fail when importing due to BYTE sizes (again, not nearly using all the memory on the server). We are using an ODBC Source, as this was the only way we were able to get other large tables to upload more than 1 million rows.

I have tried creating a temporary buffer file to help with memory pressure, but that made no difference. I have changed AutoAdjustBufferSize to True with no change in results, and also changed DefaultBufferMaxRows and DefaultBufferSize; still no change.

ERRORS WHEN RUNNING LARGE TABLE:

    Information: 0x4004300C at SRC_STG_TABLENAME, SSIS.Pipeline: Execute phase is beginning.

    Information: 0x4004800D at SRC_STG_TABLENAME: The buffer manager failed a memory allocation call for 810400000 bytes, but was unable to swap out any buffers to relieve memory pressure. 2 buffers were considered and 2 were locked.

    Either not enough memory is available to the pipeline because not enough are installed, other processes were using it, or too many buffers are locked.

    Information: 0x4004800F at SRC_STG_TABLENAME: Buffer manager allocated 1548 megabyte(s) in 2 physical buffer(s).

    Information: 0x40048010 at SRC_STG_TABLENAME: Component "ODBC Source" (60) owns 775 megabyte(s) physical buffer.

    Information: 0x4004800D at SRC_STG_TABLENAME: The buffer manager failed a memory allocation call for 810400000 bytes, but was unable to swap out any buffers to relieve memory pressure. 2 buffers were considered and 2 were locked.

    Either not enough memory is available to the pipeline because not enough are installed, other processes were using it, or too many buffers are locked.

    Information: 0x4004800F at SRC_STG_TABLENAME: Buffer manager allocated 1548 megabyte(s) in 2 physical buffer(s).

    Information: 0x40048010 at SRC_STG_TABLENAME: Component "ODBC Source" (60) owns 775 megabyte(s) physical buffer.

    Information: 0x4004800D at SRC_STG_TABLENAME: The buffer manager failed a memory allocation call for 810400000 bytes, but was unable to swap out any buffers to relieve memory pressure. 2 buffers were considered and 2 were locked.

    Either not enough memory is available to the pipeline because not enough are installed, other processes were using it, or too many buffers are locked.

    Information: 0x4004800F at SRC_STG_TABLENAME: Buffer manager allocated 1548 megabyte(s) in 2 physical buffer(s).

    Information: 0x40048010 at SRC_STG_TABLENAME: Component "ODBC Source" (60) owns 775 megabyte(s) physical buffer.

    Error: 0xC0047012 at SRC_STG_TABLENAME: A buffer failed while allocating 810400000 bytes.

    Error: 0xC0047011 at SRC_STG_TABLENAME: The system reports 26 percent memory load. There are 68718940160 bytes of physical memory with 50752466944 bytes free. There are 4294836224 bytes of virtual memory with 914223104 bytes free. The paging file has 84825067520 bytes with 61915041792 bytes free.

    Information: 0x4004800F at SRC_STG_TABLENAME: Buffer manager allocated 1548 megabyte(s) in 2 physical buffer(s).

    Information: 0x40048010 at SRC_STG_TABLENAME: Component "ODBC Source" (60) owns 775 megabyte(s) physical buffer.

    Error: 0x279 at SRC_STG_TABLENAME, ODBC Source [60]: Failed to add row to output buffer.

    Error: 0x384 at SRC_STG_TABLENAME, ODBC Source [60]: Open Database Connectivity (ODBC) error occurred.

    Error: 0xC0047038 at SRC_STG_TABLENAME, SSIS.Pipeline: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on ODBC Source returned error code 0x80004005. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure.

This is really holding up my work. HELP!

Solution

I suggest reading the data in chunks:

Instead of loading the whole table, try to split the data into chunks and import them into SQL Server. A while ago I answered a similar question related to SQLite; I will try to adapt it here to fit the Oracle syntax:


Step-by-step guide

In this example each chunk contains 10000 rows.

1. Declare 2 variables of type Int32 (@[User::RowCount] and @[User::IncrementValue])
2. Add an Execute SQL Task that executes a SELECT COUNT(*) command and stores the result set into the variable @[User::RowCount]
3. Add a For Loop container; the original answer showed its settings in a screenshot, which typically amounts to: InitExpression @[User::IncrementValue] = 0, EvalExpression @[User::IncrementValue] <= @[User::RowCount], AssignExpression @[User::IncrementValue] = @[User::IncrementValue] + 10000
4. Inside the For Loop container add a Data Flow Task
5. Inside the Data Flow Task add an ODBC Source and an OLE DB Destination
6. In the ODBC Source select the SQL Command option and write a SELECT * FROM TABLE query (to retrieve the metadata only)
7. Map the columns between source and destination
8. Go back to the Control Flow, click on the Data Flow Task, and hit F4 to view the Properties window
9. In the Properties window go to Expressions and assign the following expression to the [ODBC Source].[SQLCommand] property (for more info refer to How to pass SSIS variables in ODBC SQLCommand expression?)

      "SELECT * FROM MYTABLE ORDER BY ID_COLUMN
      OFFSET " + (DT_WSTR,50)@[User::IncrementValue] + "FETCH NEXT 10000 ROWS ONLY;"
      

Where MYTABLE is the source table name, and ID_COLUMN is your primary key or identity column.
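The chunking that the For Loop performs can be sketched outside SSIS to show the queries it generates per iteration. This is a minimal illustration only, assuming an Oracle 12c+ source (which supports OFFSET ... FETCH) and the placeholder names MYTABLE and ID_COLUMN from above:

```python
# Pure-Python sketch of the For Loop's chunking logic.
# MYTABLE and ID_COLUMN are placeholders for the real table and key column.
CHUNK_SIZE = 10000

def chunk_query(offset, chunk_size=CHUNK_SIZE):
    # The SELECT that the SSIS expression produces for one loop iteration.
    return ("SELECT * FROM MYTABLE ORDER BY ID_COLUMN "
            f"OFFSET {offset} ROWS FETCH NEXT {chunk_size} ROWS ONLY;")

def chunk_offsets(row_count, chunk_size=CHUNK_SIZE):
    # Offsets the loop walks through (0, 10000, 20000, ...)
    # while the counter stays <= row_count, mirroring the EvalExpression.
    return list(range(0, row_count + 1, chunk_size))

# A 25,000-row table is read in three chunks:
for off in chunk_offsets(25000):
    print(chunk_query(off))
```

Each generated query fetches one bounded chunk, so the pipeline's buffer manager never has to hold the full 50-million-row result set at once.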

Control flow screenshot



Update 1 - Other possible workarounds

While searching for similar issues I found some additional workarounds that you can try:

(1) Change the SQL Server max memory

(2) Enable named pipes

(3) If using SQL Server 2008, install the relevant hotfixes


Update 2 - Understanding the error

In the following MSDN link, the error cause is described as follows:

Virtual memory is a superset of physical memory. Processes in Windows typically do not specify which they are to use, as that would (greatly) inhibit how Windows can multitask. SSIS allocates virtual memory. If Windows is able to, all of these allocations are held in physical memory, where access is faster. However, if SSIS requests more memory than is physically available, then that virtual memory spills to disk, making the package operate orders of magnitude slower. And in worst cases, if there is not enough virtual memory in the system, then the package will fail.

