Is there a way to limit the amount of memory that "git gc" uses?

Problem Description

I'm hosting a git repo on a shared host. My repo necessarily has a couple of very large files in it, and every time I try to run "git gc" on the repo now, my process gets killed by the shared hosting provider for using too much memory. Is there a way to limit the amount of memory that git gc can consume? My hope would be that it can trade memory usage for speed and just take a little longer to do its work.

解决方案

Yes, have a look at the help page for git config and look at the pack.* options, specifically pack.depth, pack.window, pack.windowMemory and pack.deltaCacheSize.
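
For example, inside the repository you could tighten those settings before running git gc. The values below are illustrative starting points, not tuned recommendations; lower them further if the process still exceeds your host's limit:

    git config pack.depth 10
    git config pack.window 5
    git config pack.windowMemory 64m
    git config pack.deltaCacheSize 32m
    git gc

Note that pack.windowMemory is a per-thread limit for git pack-objects, so on a multi-core host the total memory used can be several times that value.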

It's not a totally exact limit, as git needs to map each object into memory, so one very large object can cause a lot of memory usage regardless of the window and delta cache settings.

You may have better luck packing locally and transferring the pack files to the remote side "manually", adding a .keep file so that the remote git never tries to completely repack everything.
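
A rough sketch of that workflow, assuming a bare repository at repo.git on the remote host (user, host, and paths are placeholders, and <hash> stands for the pack name git actually generates):

    # In a local clone, where memory isn't constrained,
    # repack everything into a single fresh pack:
    git repack -a -d -f

    # Copy the resulting pack and its index into the remote repository:
    scp .git/objects/pack/pack-<hash>.pack user@host:repo.git/objects/pack/
    scp .git/objects/pack/pack-<hash>.idx user@host:repo.git/objects/pack/

    # Mark the pack as kept, so "git repack -a" on the remote
    # leaves its objects alone:
    ssh user@host 'touch repo.git/objects/pack/pack-<hash>.keep'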
