Tensorflow hub fine-tune and evaluate


Problem Description


I want to use TensorFlow Hub to retrain one of its modules in my graph and then use that module, but my problem is that when I create the module with trainable = True and tags = {"train"}, I cannot run an evaluation because of the batch normalization layers. Reading about this issue, I found that I should also create another graph for evaluation, without setting tags = {"train"}, but I don't know how to restore the variables from the training graph into the evaluation graph. I tried creating both modules with the same name and using reuse = True in the evaluation graph, but that didn't help.
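To make the setup concrete, here is a minimal sketch of the kind of training graph being described, assuming a TF1 environment; the module handle and input shape are only illustrative:

```python
# Minimal sketch (TF1, illustrative handle): creating the module with
# trainable=True and tags={"train"} puts batch normalization into training
# mode, so this graph is not usable for evaluation.
import tensorflow as tf
import tensorflow_hub as hub

MODULE_HANDLE = "https://tfhub.dev/google/imagenet/mobilenet_v2_100_224/feature_vector/2"

train_graph = tf.Graph()
with train_graph.as_default():
    images = tf.placeholder(tf.float32, [None, 224, 224, 3])
    module = hub.Module(MODULE_HANDLE, trainable=True, tags={"train"})
    features = module(images)
```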

Recommended Answer


For the benefit of the community, the solution referenced by Arno is specified here in the answer section, even though it originally appeared in the comments.

The answer is:


With hub.Module for TF1, the situation is as you say: either the training or the inference graph is instantiated, and there is no good way to import both and share variables between them in a single tf.Session. That's informed by the approach used by Estimators and many other training scripts in TF1 (esp. distributed ones): there's a training Session that produces checkpoints, and a separate evaluation Session that restores model weights from them. (The two will likely also differ in the dataset they read and the preprocessing they perform.)
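A minimal sketch of that checkpoint-based pattern with hub.Module follows; the module handle, input shape, and checkpoint path are illustrative, and the classification head, loss, and training loop are omitted:

```python
# Sketch (TF1): separate training and evaluation graphs that share weights
# only through a checkpoint. Handle, shapes, and paths are illustrative.
import tensorflow as tf
import tensorflow_hub as hub

MODULE_HANDLE = "https://tfhub.dev/google/imagenet/mobilenet_v2_100_224/feature_vector/2"
CKPT_PATH = "/tmp/finetune/model.ckpt"

# Training graph: batch norm in training mode, module variables are trainable.
train_graph = tf.Graph()
with train_graph.as_default():
    images = tf.placeholder(tf.float32, [None, 224, 224, 3])
    module = hub.Module(MODULE_HANDLE, trainable=True, tags={"train"})
    features = module(images)
    # ... add a head, a loss, and an optimizer here ...
    saver = tf.train.Saver()
    with tf.Session(graph=train_graph) as sess:
        sess.run(tf.global_variables_initializer())
        # ... run training steps ...
        saver.save(sess, CKPT_PATH)

# Evaluation graph: no "train" tags, so batch norm runs in inference mode.
eval_graph = tf.Graph()
with eval_graph.as_default():
    images = tf.placeholder(tf.float32, [None, 224, 224, 3])
    module = hub.Module(MODULE_HANDLE)  # inference version of the graph
    features = module(images)
    saver = tf.train.Saver()
    with tf.Session(graph=eval_graph) as sess:
        saver.restore(sess, CKPT_PATH)  # load the fine-tuned weights by name
        # ... run evaluation ...
```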


With TF2 and its emphasis on Eager mode, this has changed. TF2-style Hub modules (as found at https://tfhub.dev/s?q=tf2-preview) are really just TF2-style SavedModels, and these don't come with multiple graph versions. Instead, the __call__ function on the restored top-level object takes an optional training=... parameter if the train/inference distinction is required.
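For example, with a TF2-preview feature-vector module that follows the reusable SavedModel interface (the handle below is illustrative), a single restored object serves both cases:

```python
# Sketch (TF2, illustrative handle): one restored SavedModel; train/inference
# behaviour is selected per call via the optional training=... argument.
import tensorflow as tf
import tensorflow_hub as hub

model = hub.load("https://tfhub.dev/google/tf2-preview/mobilenet_v2/feature_vector/4")

images = tf.random.uniform([8, 224, 224, 3])
train_features = model(images, training=True)   # batch norm in training mode
eval_features = model(images, training=False)   # batch norm in inference mode
```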


With this, TF2 should match your expectations. See the interactive demo tf2_image_retraining.ipynb and the underlying code in tensorflow_hub/keras_layer.py for how it can be done. The TF Hub team is working on making a more complete selection of modules available for the TF2 release.
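The KerasLayer route used in that notebook looks roughly like the sketch below (the module handle and the size of the classification head are illustrative); Keras then passes the appropriate training flag to the module during fit() and evaluate():

```python
# Sketch (TF2/Keras, illustrative handle and head): hub.KerasLayer wraps the
# module so Keras sets the training/inference flag automatically.
import tensorflow as tf
import tensorflow_hub as hub

model = tf.keras.Sequential([
    hub.KerasLayer(
        "https://tfhub.dev/google/tf2-preview/mobilenet_v2/feature_vector/4",
        trainable=True,                      # fine-tune the module's weights
        input_shape=(224, 224, 3)),
    tf.keras.layers.Dense(5, activation="softmax"),  # hypothetical 5-class head
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(train_ds, ...)      # the module is called with training=True
# model.evaluate(eval_ds, ...)  # ... and with training=False here
```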
