How do I implement transfer learning in NiftyNet?

Question

I'd like to perform some transfer learning using the NiftyNet stack, as my dataset of labeled images is rather small. In TensorFlow, this is possible--I can load a variety of pre-trained networks and directly work with their layers. To fine-tune the network, I could freeze training of the intermediate layers and only train the final layer, or I could just use the output of the intermediate layers as a feature vector to feed into another classifier.
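
For illustration, here is a minimal sketch of what I mean in plain TensorFlow 1.x (the framework NiftyNet is built on); the scope names, shapes and layer choices below are placeholders of my own, not anything from NiftyNet:

```python
import tensorflow as tf

images = tf.placeholder(tf.float32, [None, 64, 64, 1])
labels = tf.placeholder(tf.int64, [None])

with tf.variable_scope('backbone'):
    # Stand-in for a pre-trained feature extractor; trainable=False keeps
    # its weights out of TRAINABLE_VARIABLES, i.e. the layers are frozen.
    x = tf.layers.conv2d(images, 16, 3, activation=tf.nn.relu,
                         trainable=False)
    features = tf.reduce_mean(x, axis=[1, 2])  # global average pooling

with tf.variable_scope('classifier'):
    # Only this final layer is trained on the small labeled dataset.
    logits = tf.layers.dense(features, 2)

loss = tf.reduce_mean(
    tf.nn.sparse_softmax_cross_entropy_with_logits(labels=labels,
                                                   logits=logits))
# Because the backbone is frozen, the optimizer only updates the classifier.
train_op = tf.train.GradientDescentOptimizer(0.01).minimize(loss)

# Alternatively, `features` can be evaluated with sess.run() and fed into a
# separate classifier as a fixed feature vector.
```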

How do I do this in NiftyNet? The only mention of "transfer learning" in the documentation or the source code is in reference to the model zoo, but for my task (image classification), there are no networks available in the zoo. The ResNet architecture seems to be implemented and available to use, but as far as I can tell, it's not trained on anything yet. In addition, it seems the only way I can train a network is by running net_classify train, using the various TRAIN configuration options in the config file, none of which have options for freezing networks. The various layers in niftynet.layer also do not seem to have options to enable them to be trained or not.

I suppose my questions are:

  1. Is it possible to port over a pre-trained TensorFlow network?
    • If I manually recreate the layer architecture in NiftyNet, is there a way to import the weights from a pre-trained TF network? (see the sketch below)
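
To make question 1 concrete, this is a self-contained sketch of what such an import looks like in plain TensorFlow, restoring a variable saved under one name into a re-created graph that uses a different name; the scope names and checkpoint path are made up for the example:

```python
import tensorflow as tf

CKPT = '/tmp/pretrained_demo.ckpt'   # illustrative path

# Pretend this graph is the pre-trained network being saved elsewhere.
g1 = tf.Graph()
with g1.as_default():
    with tf.variable_scope('old_net'):
        tf.get_variable('kernel', shape=[16, 4])
    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        tf.train.Saver().save(sess, CKPT)

# Re-created architecture whose layer ends up with a different scope name.
g2 = tf.Graph()
with g2.as_default():
    with tf.variable_scope('my_net'):
        kernel = tf.get_variable('kernel', shape=[16, 4])

    # Inspect what the checkpoint contains before deciding what to map.
    reader = tf.train.NewCheckpointReader(CKPT)
    print(reader.get_variable_to_shape_map())

    # Map checkpoint names to the new variables; shapes must match exactly.
    saver = tf.train.Saver(var_list={'old_net/kernel': kernel})
    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        saver.restore(sess, CKPT)
```

In practice the name mapping would be built programmatically (for example by substituting scope prefixes), but the mechanism is the same.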

Answer

Here are the docs for transfer learning with NiftyNet.

This feature is currently being worked on. See here for full details.

Expected features include:

  • Command for printing all trainable variable names (with optional regular expression matching)
  • Ability to randomly initialize a subset of variables, where the subset is selected by regex name matching
  • Ability to restore (from an existing checkpoint) and continue updating a subset of the variables; if the optimization method is changed, deal with method-specific variables (e.g. momentum)
  • Ability to restore (from an existing checkpoint) and freeze the trained weights of the remaining variables (a rough sketch of the underlying TF pattern follows this list)
  • Saving all trainable variables after training
  • Configuration parameters for fine-tuning (variable name regexes) and unit tests
  • A demo/tutorial
  • Preprocessing of checkpoints to catch compatibility issues
  • Handling of batch norm and dropout layers (editing networks to remove batch norm variables)
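
Until those features land, the underlying TensorFlow pattern for most of the points above can be written by hand. The following is only a rough sketch under my own placeholder scope names and checkpoint path, not NiftyNet code:

```python
import re
import tensorflow as tf

images = tf.placeholder(tf.float32, [None, 64, 64, 1])
labels = tf.placeholder(tf.int64, [None])

with tf.variable_scope('feature_extractor'):
    x = tf.layers.conv2d(images, 16, 3, activation=tf.nn.relu)
    features = tf.reduce_mean(x, axis=[1, 2])
with tf.variable_scope('new_head'):
    logits = tf.layers.dense(features, 2)

loss = tf.reduce_mean(
    tf.nn.sparse_softmax_cross_entropy_with_logits(labels=labels,
                                                   logits=logits))

# Print all trainable variable names (a regex filter could be applied here).
for v in tf.trainable_variables():
    print(v.op.name, v.shape)

# Variables matching the regex are restored from the checkpoint and frozen;
# everything else (the new head and optimizer slot variables such as Adam's
# moment estimates) is freshly initialized, and only the new head is trained.
pattern = re.compile(r'^feature_extractor/')
restore_vars = [v for v in tf.global_variables()
                if pattern.match(v.op.name)]
restore_names = {v.op.name for v in restore_vars}
train_vars = [v for v in tf.trainable_variables()
              if v.op.name not in restore_names]

train_op = tf.train.AdamOptimizer(1e-4).minimize(loss, var_list=train_vars)
saver = tf.train.Saver(var_list=restore_vars)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())        # random init of the rest
    saver.restore(sess, '/path/to/pretrained.ckpt')    # placeholder checkpoint
    # ...training loop running train_op goes here...
```

The planned NiftyNet support is presumably meant to expose this same mechanism through the configuration parameters and variable-name regexes listed above.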
