In PyTorch, this code means setting the learning rate (learning_rate) to 0.000001.
But setting the learning rate is not simply done by assigning a value to learning_rate.
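If you do want to change it by hand, the value lives inside the optimizer's param_groups rather than in a free-standing variable. A minimal sketch (the model and optimizer here are placeholders; 0.000001 is the value from your snippet):

import torch.nn as nn
import torch.optim as optim

model = nn.Linear(10, 2)  # placeholder model
optimizer = optim.SGD(model.parameters(), lr=0.05)
# Overwrite the learning rate of every parameter group in place
for param_group in optimizer.param_groups:
    param_group['lr'] = 0.000001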
More commonly, you let a scheduler adjust the learning rate as training progresses. PyTorch's torch.optim.lr_scheduler provides six commonly used methods (LR below is short for learning rate):
1. Step decay at equal intervals
optim.lr_scheduler.StepLR(optimizer, step_size, gamma=0.1, last_epoch=-1)
2. Cosine annealing
optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max, eta_min=0)
3. Exponential decay
optim.lr_scheduler.ExponentialLR(optimizer, gamma, last_epoch=-1)
4. Adaptive adjustment (reduce the LR when a monitored metric stops improving)
optim.lr_scheduler.ReduceLROnPlateau(optimizer, mode='min', factor=0.1, patience=10, verbose=False, threshold=1e-4, threshold_mode='rel', cooldown=0, min_lr=0, eps=1e-8)
5. Decay at non-uniform intervals (milestones)
optim.lr_scheduler.MultiStepLR(optimizer, milestones, gamma=0.1, last_epoch=-1)
6. Custom adjustment via a user-defined function, which can differ per parameter group or layer (methods 5 and 6 are illustrated in the sketch after this list)
optim.lr_scheduler.LambdaLR(optimizer, lr_lambda, last_epoch=-1)
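To show how these constructors are typically wired up, here is a minimal sketch of methods 5 and 6 (the model and the milestone/decay values are made up for illustration; in real code you would attach one scheduler per optimizer):

import torch.nn as nn
import torch.optim as optim
from torch.optim import lr_scheduler

model = nn.Linear(10, 2)  # placeholder model
optimizer = optim.SGD(model.parameters(), lr=0.05)
# Multiply lr by gamma at epochs 30 and 80 (non-uniform intervals)
multistep = lr_scheduler.MultiStepLR(optimizer, milestones=[30, 80], gamma=0.1)
# Custom rule: each group's lr = initial lr * lr_lambda(epoch)
custom = lr_scheduler.LambdaLR(optimizer, lr_lambda=lambda epoch: 0.95 ** epoch)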
Here is a complete, runnable example of the first method, step decay at equal intervals (StepLR); take a look:
import torch
import torch.optim as optim
from torch.optim import lr_scheduler
from torchvision.models import AlexNet
model = AlexNet(num_classes=2)
optimizer = optim.SGD(params=model.parameters(), lr=0.05)
# Every 10 epochs (i.e. every 10 scheduler.step() calls): lr = lr * gamma
scheduler = lr_scheduler.StepLR(optimizer, step_size=10, gamma=0.1)
for epoch in range(40):
    optimizer.step()    # a real loop would compute losses and gradients before this
    scheduler.step()    # advance the schedule once per epoch
    print(epoch, scheduler.get_last_lr()[0])  # get_last_lr() replaces the older get_lr() pattern

(Note the indentation of the loop body in the source code.)
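One thing the example above doesn't show: ReduceLROnPlateau (method 4) is driven by a monitored metric rather than the epoch counter, so its step() takes a value. A minimal sketch, where the constant val_loss is a stand-in for a real validation loss:

import torch.optim as optim
from torch.optim import lr_scheduler
from torchvision.models import AlexNet

model = AlexNet(num_classes=2)
optimizer = optim.SGD(model.parameters(), lr=0.05)
# Multiply lr by 0.5 after 5 epochs without improvement in the metric
scheduler = lr_scheduler.ReduceLROnPlateau(optimizer, mode='min', factor=0.5, patience=5)

for epoch in range(20):
    val_loss = 1.0  # stand-in metric; a constant never improves, so patience runs out
    optimizer.step()            # a real loop would run training batches here
    scheduler.step(val_loss)    # pass the monitored metric
    print(epoch, optimizer.param_groups[0]['lr'])  # read the current lr directly from the optimizer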