PyTorch: Change the learning rate based on the number of epochs

This article describes how to change the learning rate based on the number of epochs in PyTorch.

Problem Description

I set the learning rate as follows and find that the accuracy cannot increase after training for a few epochs:

import torch.optim as optim

optimizer = optim.Adam(model.parameters(), lr=1e-4)

n_epochs = 10
for i in range(n_epochs):
    # some training here

If I want to use step decay (reduce the learning rate by a factor of 10 every 5 epochs), how can I do so?

Recommended Answer

You can use the learning-rate scheduler torch.optim.lr_scheduler.StepLR:

from torch.optim.lr_scheduler import StepLR
scheduler = StepLR(optimizer, step_size=5, gamma=0.1)
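
Wired into the loop from the question, a minimal sketch could look like this (model, criterion, and train_loader are placeholders standing in for your own training code):

import torch.optim as optim
from torch.optim.lr_scheduler import StepLR

optimizer = optim.Adam(model.parameters(), lr=1e-4)      # model is a placeholder nn.Module
scheduler = StepLR(optimizer, step_size=5, gamma=0.1)    # multiply lr by 0.1 every 5 epochs

n_epochs = 10
for epoch in range(n_epochs):
    for inputs, targets in train_loader:                 # placeholder DataLoader
        optimizer.zero_grad()
        loss = criterion(model(inputs), targets)         # placeholder loss function
        loss.backward()
        optimizer.step()
    scheduler.step()                                     # step the scheduler once per epoch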

StepLR decays the learning rate of each parameter group by gamma every step_size epochs; see the PyTorch docs. Example from the docs:

# Assuming optimizer uses lr = 0.05 for all groups
# lr = 0.05     if epoch < 30
# lr = 0.005    if 30 <= epoch < 60
# lr = 0.0005   if 60 <= epoch < 90
# ...
scheduler = StepLR(optimizer, step_size=30, gamma=0.1)
for epoch in range(100):
    train(...)
    validate(...)
    scheduler.step()

Example:

import torch
import torch.optim as optim

optimizer = optim.SGD([torch.rand((2,2), requires_grad=True)], lr=0.1)
scheduler = optim.lr_scheduler.StepLR(optimizer, step_size=5, gamma=0.1)

for epoch in range(1, 21):
    print('Epoch-{0} lr: {1}'.format(epoch, optimizer.param_groups[0]['lr']))
    scheduler.step()  # in PyTorch >= 1.1, call scheduler.step() at the end of the epoch
    if epoch % 5 == 0:
        print()

Epoch-1 lr: 0.1
Epoch-2 lr: 0.1
Epoch-3 lr: 0.1
Epoch-4 lr: 0.1
Epoch-5 lr: 0.1

Epoch-6 lr: 0.010000000000000002
Epoch-7 lr: 0.010000000000000002
Epoch-8 lr: 0.010000000000000002
Epoch-9 lr: 0.010000000000000002
Epoch-10 lr: 0.010000000000000002

Epoch-11 lr: 0.0010000000000000002
Epoch-12 lr: 0.0010000000000000002
Epoch-13 lr: 0.0010000000000000002
Epoch-14 lr: 0.0010000000000000002
Epoch-15 lr: 0.0010000000000000002

Epoch-16 lr: 0.00010000000000000003
Epoch-17 lr: 0.00010000000000000003
Epoch-18 lr: 0.00010000000000000003
Epoch-19 lr: 0.00010000000000000003
Epoch-20 lr: 0.00010000000000000003

More on how to adjust the learning rate: torch.optim.lr_scheduler provides several methods to adjust the learning rate based on the number of epochs.
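
For instance, here is a quick sketch using MultiStepLR, which decays the learning rate at a user-chosen list of milestone epochs (the milestones below are arbitrary, chosen only for illustration):

import torch
import torch.optim as optim
from torch.optim.lr_scheduler import MultiStepLR

optimizer = optim.SGD([torch.rand((2, 2), requires_grad=True)], lr=0.1)
# decay the lr by a factor of 10 at milestones 3 and 7 (illustrative values)
scheduler = MultiStepLR(optimizer, milestones=[3, 7], gamma=0.1)

for epoch in range(1, 11):
    print('Epoch-{0} lr: {1}'.format(epoch, optimizer.param_groups[0]['lr']))
    scheduler.step()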

