Huge files in Docker containers


Question

I need to create a Docker image (and consequently containers from that image) that use large files (containing genomic data, thus reaching ~10GB in size).

How am I supposed to optimize their usage? Am I supposed to include them in the container (such as COPY large_folder large_folder_in_container)? Is there a better way of referencing such files? The point is that it sounds strange to me to push such container (which would be >10GB) in my private repository. I wonder if there is a way of attaching a sort of volume to the container, without packing all those GBs together.

Thanks.

Answer

Am I supposed to include them in the container (such as COPY large_folder large_folder_in_container)?

If you do so, that would include them in the image, not the container: you could launch 20 containers from that image, and the actual disk space used would still be 10 GB.
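A minimal sketch of that approach, assuming a Dockerfile alongside the data directory (the base image and image tag are assumptions; `large_folder` comes from the question):

```dockerfile
# Hypothetical Dockerfile: bakes the genomic data into an image layer.
FROM ubuntu:22.04
# Copies ~10 GB into a single image layer. That layer is stored once
# on the host; every container started from this image shares it.
COPY large_folder /large_folder_in_container
```

Building with `docker build -t genomics-base .` and then starting several containers with `docker run genomics-base` would not multiply the 10 GB: containers add only a thin writable layer on top of the shared, read-only image layers.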

If you were to make another image from your first image, the layered filesystem would reuse the layers from the parent image, and the new image would still be "only" 10 GB.
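For example, a child image built on top of the first one reuses the parent's layers (the image names and installed tool below are hypothetical):

```dockerfile
# Hypothetical child Dockerfile: builds on the 10 GB parent image.
# The parent's layers are referenced, not copied; only the new
# layer created by RUN adds to disk usage.
FROM genomics-base:latest
RUN apt-get update && apt-get install -y samtools
```

Running `docker image ls` would report the child at roughly the same total size, but `docker system df` would show that the 10 GB data layer is counted once and shared between both images.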
