Hyperparameter tuning for TensorFlow


Question

I am searching for a hyperparameter tuning package for code written directly in TensorFlow (not Keras or TFLearn). Could you make some suggestions?

Answer

Usually you don't need to have your hyperparameter optimisation logic coupled with the optimised model (unless your hyperparameter optimisation logic is specific to the kind of model that you are training, in which case you would need to tell us a bit more). There are several tools and packages available for the task. Here is a good paper on the topic, and here is a more practical blog post with examples.

  • hyperopt implements random search and Tree of Parzen Estimators (TPE) optimization.
  • Scikit-Optimize implements a few others, including Gaussian process Bayesian optimization (see the sketch after this list).
  • SigOpt is a convenient service (paid, although with a free tier and an extra allowance for students and researchers) for hyperparameter optimization. It is based on Yelp's MOE, which is open source (although the published version doesn't seem to be updated much) and can, in theory, be used on its own, although that would take some additional effort.
  • Spearmint is a commonly referenced package too, also open source but not free for commercial purposes (although you can fall back to a less restrictive older version). It looks good but is not very active, and the available version is not compatible with Python 3 (even though pull requests have been submitted to fix that).
  • BayesOpt seems to be the gold standard in Bayesian optimization, but it is mainly C++, and the Python interface does not look very well documented.
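As a concrete illustration of the Scikit-Optimize entry above, here is a minimal sketch of wrapping Gaussian process optimization around a TensorFlow training routine. The search space, parameter names, and the dummy objective below are assumptions for illustration only; in practice the body of objective would build and train your TensorFlow model and return its validation loss.

import numpy as np
from skopt import gp_minimize
from skopt.space import Real, Integer

# Hypothetical search space: a log-uniform learning rate and an integer layer width.
search_space = [
    Real(1e-5, 1e-1, prior='log-uniform', name='learning_rate'),
    Integer(32, 256, name='hidden_units'),
]

def objective(params):
    learning_rate, hidden_units = params
    # Dummy stand-in for a real TensorFlow training run: build the graph,
    # train with these hyperparameters, and return the validation loss.
    return (np.log10(learning_rate) + 3.0) ** 2 + 0.001 * hidden_units

# Run 30 evaluations of Gaussian-process-based Bayesian optimization.
result = gp_minimize(objective, search_space, n_calls=30, random_state=0)
print('best loss:', result.fun)
print('best parameters:', result.x)  # [learning_rate, hidden_units]

The same objective can also be passed to skopt's forest_minimize or dummy_minimize if a Gaussian process becomes too slow for a large number of evaluations.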

Out of these, I have only really (that is, with a real problem) used hyperopt with TensorFlow, and it didn't take too much effort. The API is a bit weird at some points and the documentation is not terribly thorough, but it does work and seems to be under active development, with more optimization algorithms and adaptations (e.g. specifically for neural networks) possibly coming. However, as suggested in the previously linked blog post, Scikit-Optimize is probably just as good, and SigOpt looks quite easy to use if it fits your needs.
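For reference, a minimal sketch of that hyperopt wiring might look as follows. The search space and the train_and_evaluate function are illustrative assumptions; the latter is a placeholder you would replace with your own TensorFlow training and evaluation code returning a validation loss.

import numpy as np
from hyperopt import fmin, tpe, hp, Trials, STATUS_OK

def train_and_evaluate(learning_rate, hidden_units):
    # Placeholder: build and train the TensorFlow model with these
    # hyperparameters and return the validation loss as a float.
    return (np.log10(learning_rate) + 3.0) ** 2 + 0.001 * hidden_units

# Hypothetical search space expressed with hyperopt's stochastic expressions.
search_space = {
    'learning_rate': hp.loguniform('learning_rate', np.log(1e-5), np.log(1e-1)),
    'hidden_units': hp.choice('hidden_units', [32, 64, 128, 256]),
}

def objective(params):
    loss = train_and_evaluate(**params)
    return {'loss': loss, 'status': STATUS_OK}

trials = Trials()  # keeps a record of every evaluation for later inspection
best = fmin(fn=objective, space=search_space, algo=tpe.suggest,
            max_evals=50, trials=trials)
print(best)  # note: for hp.choice parameters, fmin reports the index of the chosen option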
