Finetune a Torch model


Problem description

I have loaded a model in Torch and I would like to fine-tune it. For now I'd like to retrain the last 2 layers of the network (though in the future I may want to add layers). How can I do this? I have been looking for tutorials, but I haven't found what I am looking for. Any tips?

Recommended answer

I don't know if I understood what you are asking for. If you want to leave the net as it was, except for the 2 layers you want to train (or fine-tune), you have to stop the backpropagation on the layers you don't want to train, like this:

-- Freeze the first x layers (x = index of the last layer you do NOT want to train)
for i = 1, x do
  local c = model:get(i)
  -- Replacing these methods with no-ops stops gradient flow and parameter-gradient
  -- accumulation for this layer, so it is effectively frozen
  c.updateGradInput = function(self, inp, out) end
  c.accGradParameters = function(self, inp, out) end
end
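For the case in the question (retrain only the last 2 layers), a minimal sketch of how x might be chosen, assuming the loaded model is an nn.Sequential (that container type is an assumption, not stated in the question):

require 'nn'

-- Assumption: `model` is a loaded nn.Sequential; retrain only its last 2 modules
local x = model:size() - 2   -- freeze modules 1 .. x, leave the last 2 trainable

After the loop above runs with this x, only model:get(x+1) and model:get(x+2) keep updating their parameters during training.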

Now only the layers outside of this loop will update their parameters. If you want to add new layers, just call model:insert(module, position); have a look at the Torch containers documentation, and at the sketch below.
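As a minimal sketch of model:insert, assuming `model` is an nn.Sequential; the nn.Linear layer and its sizes (4096 -> 10) are placeholders for illustration, not taken from the question:

require 'nn'

-- Append a new module at the end of the container
model:insert(nn.Linear(4096, 10), model:size() + 1)

Passing model:size() + 1 as the position appends the module; any smaller index inserts it at that position instead.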

If that was not what you were looking for, please elaborate more on the question.

