Out of memory error archiving a log file


Problem Description

I am having a problem with a console job that runs and creates a daily log file, which I archive at midnight.

This creates a blank log file for the next day and an archived file with yesterday's date in the name and the contents of the old file, for debugging issues I may have had and not known about until the day after.

However, since I cranked up the BOT's job, I have been hitting System Out of Memory errors when I try to archive the file.

At first I was not able to get an archived file at all; then I worked out a way to get at least the last 100,000 lines, which is not nearly enough.

I wrap everything in three try/catch blocks:


  1. I/O

  2. System out of memory

  3. Standard exceptions

However, it's always the OutOfMemoryException that I get, e.g.

System.OutOfMemoryException Error: Exception of type 'System.OutOfMemoryException' was thrown.

To give you an idea of size, 100,000 lines of log is about an 11MB file.

A standard full log file can be anything from half a GB to 2GB.

What I need to know is:

a) what size of standard text file will throw an out-of-memory error when I try to use File.ReadAllText, or a custom StreamReader function I call ReadFileString, e.g.

public static string ReadFileString(string path)
{
    // Use StreamReader to consume the entire text file.
    using (StreamReader reader = new StreamReader(path))
    {
        return reader.ReadToEnd();
    }
}
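As a point of reference: .NET strings are UTF-16, so ReadToEnd (like File.ReadAllText) on a 2GB log has to materialize roughly 4GB of character data as one contiguous string, well past the CLR's default 2GB single-object limit, so an OutOfMemoryException at that scale is expected however much RAM is free. If the goal is only to archive the file, a byte-for-byte copy never needs the contents in memory at all. A minimal sketch (ArchiveByCopy is an illustrative name, not from the original code):

// Stream the log to the archive in fixed-size chunks; memory use
// stays constant no matter how large the file grows.
public static void ArchiveByCopy(string logpath, string archivepath)
{
    using (FileStream source = File.Open(logpath, FileMode.Open, FileAccess.Read, FileShare.Read))
    using (FileStream target = File.Create(archivepath))
    {
        source.CopyTo(target);  // copies through a small internal buffer
    }
}

When the target file does not already exist, File.Copy(logpath, archivepath) does the same job in one call.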

b) whether it is my computer's memory (I have 16GB of RAM, with 8GB in use at the time of copying) or the objects I am using in C# that are failing when opening and copying files.

When archiving, I first try my custom ReadFileString function (see above); if that returns 0 bytes of content I try File.ReadAllText, and if that fails I try a custom function to get the last 100,000 lines, which is really not enough for debugging errors from earlier in the day.

The log file starts at midnight, when a new one is created, and logs all day. I never used to have out-of-memory errors, but since I turned up the frequency of method calls the logging has expanded, which means the file sizes have as well.

This is my custom function for getting the last 100,000 lines. I am wondering how many lines I could get without it throwing an out-of-memory error and leaving me with no contents from the last day's log file at all.

What do people suggest as the maximum file size for the various methods, or the memory needed to hold X lines, and what is the best method for obtaining as much of the log file as possible?

E.g. some way of looping line by line until an exception is hit, and then saving what I have.
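A minimal sketch of that idea, assuming it lives in the same class as my LogDebug helper (ArchiveLineByLine is an illustrative name): each line goes straight from the reader to the archive, so nothing accumulates in memory, and anything written before a failure is already on disk.

private bool ArchiveLineByLine(string logpath, string archivepath)
{
    try
    {
        using (StreamReader reader = new StreamReader(File.Open(logpath, FileMode.Open, FileAccess.Read, FileShare.Read)))
        using (StreamWriter writer = new StreamWriter(archivepath, true, Encoding.UTF8))
        {
            string line;
            while ((line = reader.ReadLine()) != null)
            {
                writer.WriteLine(line);  // written out immediately, never accumulated
            }
        }
        return true;
    }
    catch (Exception exception)
    {
        // Everything written before the failure is kept in the archive.
        this.LogDebug("ArchiveLineByLine - stopped early: " + exception.Message);
        return false;
    }
}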

This is my GetHundredThousandLines method. It logs to a very small debug file, so I can see what errors happened during the archive process.

private bool GetHundredThousandLines(string logpath, string archivepath)
{
    bool success = false;

    int numberOfLines = 100000;

    if (!File.Exists(logpath))
    {
        this.LogDebug("GetHundredThousandLines - Cannot find path " + logpath + " to archive " + numberOfLines.ToString() + " lines");
        return false;
    }

    var queue = new Queue<string>(numberOfLines);

    using (FileStream fs = File.Open(logpath, FileMode.Open, FileAccess.Read, FileShare.Read))
    using (BufferedStream bs = new BufferedStream(fs))  // May not make much difference.
    using (StreamReader sr = new StreamReader(bs))
    {
        while (!sr.EndOfStream)
        {
            // Keep only the most recent numberOfLines lines in the queue.
            if (queue.Count == numberOfLines)
            {
                queue.Dequeue();
            }

            queue.Enqueue(sr.ReadLine() + "\r\n");
        }
    }

    // The queue now has our set of lines. So print to console, save to another file, etc.
    try
    {
        // Check the count first so an empty queue cannot throw on Dequeue.
        while (queue.Count > 0)
        {
            File.AppendAllText(archivepath, queue.Dequeue(), Encoding.UTF8);
        }
    }
    catch (IOException exception)
    {
        this.LogDebug("GetHundredThousandLines - I/O Error accessing daily log file: " + exception.Message);
    }
    catch (System.OutOfMemoryException exception)
    {
        this.LogDebug("GetHundredThousandLines - Out of Memory Error accessing daily log file: " + exception.Message);
    }
    catch (Exception exception)
    {
        this.LogDebug("GetHundredThousandLines - Exception accessing daily log file: " + exception.Message);
    }

    if (File.Exists(archivepath))
    {
        this.LogDebug("GetHundredThousandLines - Log file exists at " + archivepath);
        success = true;
    }
    else
    {
        this.LogDebug("GetHundredThousandLines - Log file DOES NOT exist at " + archivepath);
    }

    return success;
}
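One design note on the write-out loop above: File.AppendAllText opens and closes the archive file once per call, so draining the queue costs 100,000 separate file opens. A sketch of the same loop through a single writer (only the loop changes, everything else stays as-is):

// Drain the queue through one StreamWriter instead of reopening
// the archive file for every dequeued line.
using (StreamWriter writer = new StreamWriter(archivepath, true, Encoding.UTF8))
{
    while (queue.Count > 0)
    {
        writer.Write(queue.Dequeue());  // lines were enqueued with "\r\n" already appended
    }
}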

Any help would be much appreciated.

Thanks

Answer

Try: keep the queue and the stream position in class scope, call GC.Collect() when you get the out-of-memory exception, and call the function again; seek the stream to the last position and continue.

Or: use a database such as SQLite and keep the newest 100,000 records in each table.
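A minimal sketch of the first suggestion, assuming a plain byte copy is acceptable for the archive (the field and method names are illustrative, not from the answer):

// Keep the read position in class scope so a retry after an
// OutOfMemoryException can seek back to where it left off.
private long lastPosition = 0;

private void ArchiveWithResume(string logpath, string archivepath, int retriesLeft = 3)
{
    try
    {
        using (FileStream source = File.Open(logpath, FileMode.Open, FileAccess.Read, FileShare.Read))
        using (FileStream target = new FileStream(archivepath, FileMode.Append, FileAccess.Write))
        {
            source.Seek(lastPosition, SeekOrigin.Begin);

            byte[] buffer = new byte[81920];
            int read;
            while ((read = source.Read(buffer, 0, buffer.Length)) > 0)
            {
                target.Write(buffer, 0, read);
                lastPosition = source.Position;  // remember progress for the retry
            }
        }
    }
    catch (OutOfMemoryException)
    {
        if (retriesLeft <= 0) throw;
        GC.Collect();  // reclaim what we can, then resume from lastPosition
        ArchiveWithResume(logpath, archivepath, retriesLeft - 1);
    }
}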
