Is there a way to limit the amount of memory that "git gc" uses?

Problem description

I'm hosting a git repo on a shared host. My repo necessarily has a couple of very large files in it, and every time I try to run "git gc" on the repo now, my process gets killed by the shared hosting provider for using too much memory. Is there a way to limit the amount of memory that git gc can consume? My hope would be that it can trade memory usage for speed and just take a little longer to do its work.

Recommended answer

Yes, have a look at the help page for git config and check the pack.* options, specifically pack.depth, pack.window, pack.windowMemory, and pack.deltaCacheSize.
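For example, the limits could be tightened along these lines; the exact values here are arbitrary examples that need tuning to whatever your hosting provider allows (pack.threads is an extra setting, beyond the four above, that keeps the per-thread limits from multiplying):

    # Run inside the repository on the shared host; values are illustrative only.
    git config pack.windowMemory "64m"    # memory budget for the delta-search window
    git config pack.deltaCacheSize "64m"  # cache of computed deltas during repacking
    git config pack.window 5              # compare each object against fewer candidates
    git config pack.depth 10              # allow shorter delta chains
    git config pack.threads 1             # one thread, so the limits apply once, not per core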

It's not an exact limit, though: git needs to map each object into memory, so one very large object can cause a lot of memory usage regardless of the window and delta cache settings.

You may have better luck packing locally and transferring the pack files to the remote side "manually", adding a .keep file so that the remote git never tries to completely repack everything.
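A rough sketch of that workflow, assuming the host has a bare repository at ~/repo.git (the path and user@host are placeholders, and pack-<hash> stands for whatever pack file git actually produces):

    # On a machine without the memory limit, build one pack holding everything:
    git repack -a -d

    # Copy the new pack and its index into the remote repository's pack directory:
    scp .git/objects/pack/pack-<hash>.pack .git/objects/pack/pack-<hash>.idx \
        user@host:~/repo.git/objects/pack/

    # Mark the pack as kept so gc/repack on the host never tries to rewrite it:
    ssh user@host 'touch ~/repo.git/objects/pack/pack-<hash>.keep'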
