
3.2.linear-regression-scratch

Aug 6, 2024

Does a larger batch size make gradient descent faster?

(ai) ref: https://www.perplexity.ai/search/zai-pytorch-zhong-ruo-wo-diao-XuETZW8eSoSQFqW3H_kRqw

If you are using SGD, yes:

import torch

# linreg, squared_loss, data_iter, sgd, batch_size, w, b, features, labels
# are all defined earlier in d2l section 3.2 (sketched after this block)
lr = 0.03
num_epochs = 3
net = linreg
loss = squared_loss

for epoch in range(num_epochs):
    for X, y in data_iter(batch_size, features, labels):
        l = loss(net(X, w, b), y)  # minibatch loss on X and y
        # l has shape (batch_size, 1) rather than being a scalar, so all of
        # its elements are summed and the gradient w.r.t. [w, b] is computed
        # from that sum
        l.sum().backward()
        sgd([w, b], lr, batch_size)  # update parameters using their gradients
    with torch.no_grad():
        train_l = loss(net(features, w, b), labels)
        print(f'epoch {epoch + 1}, loss {float(train_l.mean()):f}')
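
The loop relies on helpers built earlier in d2l section 3.2. A minimal sketch of those definitions, reproduced from memory of the book's version, so treat the exact bodies as approximate:

import random
import torch

def linreg(X, w, b):
    # Linear regression model: y_hat = Xw + b.
    return torch.matmul(X, w) + b

def squared_loss(y_hat, y):
    # Elementwise squared loss; shape (batch_size, 1), not a scalar.
    return (y_hat - y.reshape(y_hat.shape)) ** 2 / 2

def sgd(params, lr, batch_size):
    # Minibatch SGD. Dividing by batch_size turns the gradient of the
    # summed loss back into a per-sample average.
    with torch.no_grad():
        for param in params:
            param -= lr * param.grad / batch_size
            param.grad.zero_()

def data_iter(batch_size, features, labels):
    # Yield shuffled minibatches of (features, labels).
    num_examples = len(features)
    indices = list(range(num_examples))
    random.shuffle(indices)
    for i in range(0, num_examples, batch_size):
        batch_indices = torch.tensor(indices[i:i + batch_size])
        yield features[batch_indices], labels[batch_indices]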
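
Back to the opening question: l.sum().backward() differentiates a summed (not averaged) loss, so the gradient magnitude grows linearly with the batch size. d2l's sgd divides by batch_size to cancel that growth, but a fixed-lr update applied directly to the summed loss (e.g. a plain torch.optim.SGD step) moves proportionally further per update as the batch grows. A small illustration with toy tensors of my own choosing, not from the book:

import torch

w = torch.tensor([2.0], requires_grad=True)
for batch_size in (1, 10, 100):
    X = torch.ones(batch_size, 1)  # identical samples, for a clean comparison
    y = torch.zeros(batch_size)
    l = ((torch.matmul(X, w) - y) ** 2 / 2).sum()  # summed squared loss
    l.backward()
    print(batch_size, w.grad.item())  # 2.0, 20.0, 200.0: grad scales with batch
    w.grad.zero_()

On this reading the answer is "yes": with a summed loss and no normalization, a larger batch takes a larger step per update, which is presumably what the linked Perplexity thread is getting at.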