Bitbucket Pipelines - How to use the same Docker container for multiple steps?


Problem description

I have set up Continuous Deployment for my web application using the configuration below (bitbucket-pipelines.yml).

pipelines:
  branches:
    master:
        - step:
            name: Deploy to production
            trigger: manual
            deployment: production
            caches:
              - node
            script:
              # Install dependencies
              - yarn install
              - yarn global add gulp-cli

              # Run tests
              - yarn test:unit
              - yarn test:integration

              # Build app
              - yarn run build

              # Deploy to production
              - yarn run deploy

Although this works, I would like to increase the build speed by running the unit and integration test steps in parallel.

pipelines:
  branches:
    master:
        - step:
            name: Install dependencies
            script:
              - yarn install
              - yarn global add gulp-cli

        - parallel:
            - step:
                name: Run unit tests
                script:
                  - yarn test:unit
            - step:
                name: Run integration tests
                script:
                  - yarn test:integration

        - step:
            name: Build app
            script:
              - yarn run build

        - step:
            name: Deploy to production
            trigger: manual
            deployment: production
            script:
              - yarn run deploy

This also has the advantage of seeing the different steps in Bitbucket including the execution time per step.

This does not work because for each step a clean Docker container is created and the dependencies are no longer installed on the testing steps.

I know that I can share files between steps using artifacts, but that would still require multiple containers to be created which increases the total execution time.
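
For completeness, the artifacts approach mentioned above would look roughly like this (a sketch; the `node_modules/**` glob is an assumption about the project layout):

```yaml
- step:
    name: Install dependencies
    script:
      - yarn install
    artifacts:
      # Files matched here are downloaded into each later step's clean container
      - node_modules/**
```

Later steps then start with `node_modules` already present, at the cost of uploading and downloading the artifact between containers.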

How can I share the same Docker container between multiple steps?

Recommended answer

I had the same issue a while ago, found a way to do it, and I'm using it successfully right now.

You can do this using Docker's save and load commands along with Bitbucket's artifacts. You just need to make sure your image isn't too large, because Bitbucket's artifact limit is 1GB; you can easily stay under it using multi-stage builds and other tricks.

- step:
    name: Build app
    script:
      - yarn run build
      - docker save --output <backup-file-name>.tar <image-to-export>
    artifacts:
      - <backup-file-name>.tar

- step:
    name: Deploy to production
    trigger: manual
    deployment: production
    script:
      - docker load --input <backup-file-name>.tar
      - yarn run deploy
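
If the saved image hovers near the 1GB artifact limit, compressing the tarball in the same step can help (a sketch; `my-app:latest` and the file names are placeholder assumptions):

```yaml
- step:
    name: Build app
    script:
      - yarn run build
      # Pipe the image through gzip to shrink the artifact
      - docker save my-app:latest | gzip > my-app.tar.gz
    artifacts:
      - my-app.tar.gz

- step:
    name: Deploy to production
    trigger: manual
    deployment: production
    script:
      # docker load understands gzip-compressed archives directly
      - docker load --input my-app.tar.gz
      - yarn run deploy
```
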

You might also want to use Bitbucket's caches, which can make building Docker images much faster. For example, you can arrange things so that NPM packages are only installed when the package.json and yarn.lock files change.
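
One common way to get that behaviour inside the image build itself is Docker layer caching: copy the dependency manifests before the rest of the source, so the install layer is only rebuilt when they change (a sketch; the base image and build commands are assumptions about the project):

```dockerfile
FROM node:18-alpine
WORKDIR /app

# These layers are only invalidated when the manifests change,
# so the install step is served from cache on unrelated code changes
COPY package.json yarn.lock ./
RUN yarn install --frozen-lockfile

# Copying the rest of the source afterwards keeps the cached layers above intact
COPY . .
RUN yarn run build
```
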

  • docker save (Docker 17): https://devdocs.io/docker~17/engine/reference/commandline/save/index
  • docker load (Docker 17): https://devdocs.io/docker~17/engine/reference/commandline/load/index
  • BitBucket Artifacts: https://confluence.atlassian.com/bitbucket/using-artifacts-in-steps-935389074.html
  • BitBucket Pipelines Caches: https://confluence.atlassian.com/bitbucket/caching-dependencies-895552876.html
