MLflow storing artifacts (Google Cloud Storage) but not displaying them in the MLflow UI

Problem description

I am working in a Docker environment (docker-compose) with a Jupyter notebook Docker image and a Postgres Docker image for running ML models, and I use Google Cloud Storage to store the model artifacts. Storing the models in cloud storage works fine, but I can't get them to show up in the MLflow UI. I have seen similar problems, but none of the solutions used Google Cloud Storage as the artifact store. The error message says the following: "Unable to list artifacts stored under <gs-location> for the current run. Please contact your tracking server administrator to notify them of this error, which can happen when the tracking server lacks permission to list artifacts under the current run's root artifact directory." What could possibly be causing this problem?

Recommended answer

I had exactly the same issue. Keywords: docker-compose, Google Cloud Storage, storing in GCS succeeds, but listing artifacts in the UI fails. In my case, it turned out that in the docker-compose file, if you assign the environment variables by reading from a .env file (e.g. GOOGLE_APPLICATION_CREDENTIALS), the server might start before the assignment takes effect. The quick fix is to assign the environment variable directly under the environment: key instead of using the env_file: key. For sensitive data that you still need to keep in a .env file, you can add a wait time for the server and use depends_on: in the docker-compose file to make sure the database container starts before the MLflow server, if you are using a database-backed store. A minimal sketch of such a compose file is shown below.
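For reference, here is a minimal docker-compose sketch of that arrangement. The service names, image, bucket path, and credentials file path are illustrative assumptions, not taken from the original post; the point being demonstrated is that GOOGLE_APPLICATION_CREDENTIALS is set directly under environment: and the MLflow service declares depends_on: on the database.

```yaml
# Minimal docker-compose sketch (illustrative names and paths; adapt to your setup).
# Key points: pass GOOGLE_APPLICATION_CREDENTIALS via `environment:` so it is set
# before the MLflow server process starts, mount the credentials file into the
# container, and use `depends_on:` so the database container starts first.
version: "3.8"

services:
  db:
    image: postgres:13
    environment:
      POSTGRES_USER: mlflow
      POSTGRES_PASSWORD: mlflow
      POSTGRES_DB: mlflow

  mlflow:
    # Illustrative image: in practice you typically need an image that has
    # mlflow plus psycopg2-binary and google-cloud-storage installed.
    image: ghcr.io/mlflow/mlflow:latest
    depends_on:
      - db
    environment:
      # Assigned directly here instead of via env_file so the variable is
      # guaranteed to be present when the server launches.
      GOOGLE_APPLICATION_CREDENTIALS: /secrets/gcs-key.json
    volumes:
      - ./gcs-key.json:/secrets/gcs-key.json:ro
    command: >
      mlflow server
      --backend-store-uri postgresql://mlflow:mlflow@db:5432/mlflow
      --default-artifact-root gs://<your-bucket>/mlflow-artifacts
      --host 0.0.0.0
    ports:
      - "5000:5000"
```

Note that the artifact listing shown in the UI is performed by the tracking server process itself, so that container needs both the GCS credentials and the google-cloud-storage package; clients can still upload artifacts directly to GCS even when the server cannot list them, which matches the symptom described above.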
