Unable to push to Heroku after importing thousands of records


Problem description

I have a problem where I believe my sqlite3 database is too big. I imported around 100,000 records into the database and was able to "git push" and "git push heroku". Then I probably made a mistake and imported too many records: 500,000. I can push to git (Bitbucket now reports the repo at around 336 MB) and that seems to work, but when I push to Heroku this is what I get:

/workspace/new_foodback$ git push heroku
Counting objects: 26, done.
Delta compression using up to 8 threads.
Compressing objects: 100% (25/25), done.
Writing objects: 100% (26/26), 159.26 MiB | 1.43 MiB/s, done.
Total 26 (delta 20), reused 1 (delta 0)
remote: 
remote: !       Size of checkout and restored submodules exceeds 1 GB. Reduce size and try pushing again.
remote: 
To https://git.heroku.com/magnetic-beach-35611.git
 ! [remote rejected]   master -> master (pre-receive hook declined)
error: failed to push some refs to 'https://git.heroku.com/magnetic-beach-35611.git'
ubuntu@colin339-rails-tutorial-482323864:~/workspace/new_foodback$ 

I suspect I may have to split the commits. I've run sqlite3 business.db and then the command VACUUM FULL;. I've tried pushing several times, and I've tried splitting the commits a couple of times with an interactive rebase, but I'm not 100% sure that's even the right way to go or that I'm splitting them correctly (first time doing this). The error always happens at 159.26 MiB | 1.43 MiB/s, but after some of the splits the (25/25) counts have been increasing (previously (18/18)). Any ideas how I can resolve this push to Heroku?

Recommended answer

Heroku limits git repos to 1 GB in size for the entire commit history (not the current file size). Your repo likely exceeds 1 GB.

https://devcenter.heroku.com/articles/limits#git-repos

You have at least two options, both sketched below. Both will end up with rewritten history, but should allow you to shrink the size of your git repository.
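For example (a sketch under the assumption that the bloat comes from repeatedly committing the sqlite file; N and my-repo.git are placeholders, not values from the question), you could squash the import commits with an interactive rebase, or strip every historical copy of the large file with the BFG Repo-Cleaner:

# Option 1: squash the import commits into one with an interactive rebase
# (N = how many commits back the imports started -- a placeholder you'd
# have to determine from your own log)
git rebase -i HEAD~N

# Option 2: strip every historical copy of the database with the
# BFG Repo-Cleaner (https://rtyley.github.io/bfg-repo-cleaner/),
# run against a fresh mirror clone of the repo
java -jar bfg.jar --delete-files business.db my-repo.git
cd my-repo.git
git reflog expire --expire=now --all
git gc --prune=now --aggressive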

Even if you vacuum, you still likely have a significant number of changes to the binary sqlite file baked into your history, so I'm not sure any action you take on the database now will make things better (in fact, it will make them worse by adding additional commits and size).
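As an aside, VACUUM FULL is PostgreSQL syntax; sqlite3 only understands plain VACUUM;. Either way, what is more likely to help (a sketch, assuming the business.db file from the question is committed to the repo) is to stop tracking the database entirely so future commits stop growing the history:

# SQLite's command is plain VACUUM; ("VACUUM FULL" is a syntax error in sqlite3)
sqlite3 business.db "VACUUM;"

# Stop tracking the database so future commits no longer grow the repo
git rm --cached business.db
echo "business.db" >> .gitignore
git commit -m "Stop tracking sqlite database"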

Heroku doesn't appear to work with shallow clones and requires the full history, so you may need to rewrite your history.
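If you want to check whether your local clone is shallow before pushing (a quick sanity check; requires Git 2.15+):

git rev-parse --is-shallow-repository   # prints "true" for a shallow clone
git fetch --unshallow                   # converts a shallow clone to a full one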

Your git repository is over 1 GB in size. This is likely due to the total size of all the previous commits in your history (Heroku requires a full clone to deploy). Splitting and adding new commits is only going to add to that size. You need to determine the source of the bloat: it could be the continued addition of binary files, or even the addition and later deletion (via a git commit) of a large binary file.

You can inspect the size of your repo by running git count-objects -vH locally and looking at the size-pack value.
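Run it from the repository root and read the size-pack line:

git count-objects -vH
# The "size-pack" line reports the size of the packed history; if it is
# near or above 1 GB, the history itself is what Heroku is rejecting.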

You might also try using a script to compare the difference between commits and to get blob sizes:
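One common sketch of such a script (a widely used pipeline, not a definitive tool) lists the largest blobs anywhere in history along with their paths:

git rev-list --objects --all |
  git cat-file --batch-check='%(objecttype) %(objectname) %(objectsize) %(rest)' |
  awk '$1 == "blob" {print $3, $4}' |
  sort -rn |
  head -20
# Prints the 20 largest blobs (size in bytes, then path), which shows
# exactly which files are bloating the history.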

There are also some other options for cleaning a repository that don't involve rewriting history:
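For instance, repacking and pruning unreachable objects cleans up without touching history (though it cannot remove blobs that are still reachable from old commits):

git reflog expire --expire=now --all
git gc --aggressive --prune=now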

If you've already fixed the repo locally, you may need to force-push to Heroku if it's rejecting your push (see the sketch after the list below). Other than that, I don't think there's anything you can do on the Heroku side to fix this. You need to either:

  • Reduce the file size of the sqlite files in your current commit (if you're just adding them and they're too large)
  • Rewrite your history to reduce the total size of the repository
  • Compress your files using one of the methods in the articles above
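A force-push sketch (assuming the remote and branch names from the transcript above; only do this after rewriting history locally, since it overwrites the remote branch):

git push --force heroku master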

Since Heroku doesn't support LFS, GitHub doesn't support files over a certain size (100 MB) without Git LFS, and Bitbucket doesn't seem to list a limit, this is most likely a scenario where you stacked a bunch of commits containing hundreds of MB worth of binary file changes on top of each other, bringing you over the 1 GB limit.

Bitbucket also has more info on how to figure out your actual repository size (not the current default branch's total file size): https://confluence.atlassian.com/bitbucket/what-kind-of-limits-do-you-have-on-repository-file-size-273877699.html
