
[PyTorch] Softmax & Cross Entropy

Learning Objectives

  • Softmax
  • Cross Entropy
import torch
import torch.nn as nn
import torch.nn.functional as F
import torch.optim as optim

Softmax

z = torch.FloatTensor([1, 2, 3])
hypothesis = F.softmax(z, dim=0)
print(hypothesis)
tensor([0.0900, 0.2447, 0.6652])

All the probabilities sum to 1.

hypothesis.sum()
tensor(1.)
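
This is because softmax simply exponentiates each element and normalizes by the total. A minimal hand-rolled check (my own addition, not part of the original code):

# Manual softmax: exponentiate, then divide by the sum of the exponentials
manual = torch.exp(z) / torch.exp(z).sum()
print(manual)  # matches F.softmax(z, dim=0): tensor([0.0900, 0.2447, 0.6652])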

Cross Entropy

z = torch.rand(3, 5, requires_grad=True)
hypothesis = F.softmax(z, dim=1)
print(hypothesis)
tensor([[0.1703, 0.2118, 0.1857, 0.1963, 0.2359],
        [0.2328, 0.2268, 0.1060, 0.2356, 0.1987],
        [0.3189, 0.2760, 0.1258, 0.1234, 0.1559]], grad_fn=<SoftmaxBackward0>)
y = torch.randint(5, (3,)).long()
print(y)
tensor([2, 1, 0])

F.nll_loss(F.log_softmax)

# Cross entropy, method 1: log_softmax followed by NLL (negative log likelihood) loss
F.nll_loss(F.log_softmax(z, dim=1), y)
tensor(1.4368, grad_fn=<NllLossBackward0>)
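
F.nll_loss expects log-probabilities as input, which is why F.log_softmax is applied first. As a quick check (my own addition), F.log_softmax gives the same values as taking the log of the softmax output, just computed in a numerically stabler way:

# log_softmax(z) == log(softmax(z)), up to floating-point tolerance
torch.allclose(F.log_softmax(z, dim=1), torch.log(F.softmax(z, dim=1)))  # True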

F.cross_entropy

# Cross entropy, method 2: F.cross_entropy combines log_softmax and nll_loss
F.cross_entropy(z, y)
tensor(1.4368, grad_fn=<NllLossBackward0>)
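
Both methods give the same value. The same number can also be derived by hand from the softmax output; a sketch for reference (the scatter_-based one-hot encoding is my own addition, not from the original code):

# One-hot encode the labels: shape (3, 5), with a 1 at each target index
y_one_hot = torch.zeros_like(hypothesis)
y_one_hot.scatter_(1, y.unsqueeze(1), 1)

# Cross entropy = mean over samples of -sum(one_hot * log(prob))
(y_one_hot * -torch.log(hypothesis)).sum(dim=1).mean()  # equals F.cross_entropy(z, y)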

nn.Module

x_train = [[1, 2, 1, 1],
           [2, 1, 3, 2],
           [3, 1, 3, 4],
           [4, 1, 5, 5],
           [1, 7, 5, 5],
           [1, 2, 5, 6],
           [1, 6, 6, 6],
           [1, 7, 7, 7]]
y_train = [2, 2, 2, 1, 1, 1, 0, 0]

x_train = torch.FloatTensor(x_train)
y_train = torch.LongTensor(y_train)
print(x_train.dim())
print(x_train.shape)

print(y_train.dim())
print(y_train.shape)
2
torch.Size([8, 4])
1
torch.Size([8])
class SoftmaxClassifierModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.linear = nn.Linear(4, 3)  # 4 input features -> 3 classes; returns raw logits
    
    def forward(self, x):
        return self.linear(x)
model = SoftmaxClassifierModel()
# Set up the optimizer
optimizer = optim.SGD(model.parameters(), lr=0.1)

nb_epochs = 1000
for epoch in range(nb_epochs + 1):
    # H(x)
    prediction = model(x_train)

    # Cost
    cost = F.cross_entropy(prediction, y_train)

    optimizer.zero_grad()
    cost.backward()
    optimizer.step()

    # Log every 100 epochs
    if epoch % 100 == 0:
        print('Epoch {:4}/{} Cost: {:.6f}'.format(epoch, nb_epochs, cost.item()))
Epoch    0/1000 Cost: 1.777960
Epoch  100/1000 Cost: 0.654127
Epoch  200/1000 Cost: 0.561501
Epoch  300/1000 Cost: 0.505037
Epoch  400/1000 Cost: 0.460010
Epoch  500/1000 Cost: 0.420253
Epoch  600/1000 Cost: 0.383131
Epoch  700/1000 Cost: 0.347032
Epoch  800/1000 Cost: 0.310779
Epoch  900/1000 Cost: 0.274060
Epoch 1000/1000 Cost: 0.244281
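
After training, the predicted class for each sample is the index of the largest logit. A minimal inference sketch (my own addition, not in the original):

# Pick the class with the largest logit for each training sample
with torch.no_grad():
    pred = model(x_train).argmax(dim=1)
print(pred)     # predicted labels
print(y_train)  # ground truth: tensor([2, 2, 2, 1, 1, 1, 0, 0])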