Introduction to Deep Learning with PyTorch
Jasmin Ludolf
Senior Data Science Content Developer, DataCamp

Sigmoid function:
$$\sigma(x) = \frac{1}{1 + e^{-x}}$$
Gradients: for very large or very small inputs, the gradient of the sigmoid approaches zero. The function saturates, which slows learning (vanishing gradients).
Softmax also saturates
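A minimal sketch of sigmoid saturation (the input values 0 and 6 are illustrative): the gradient at 0 is at its maximum of 0.25, while at 6 it is already close to zero.

```python
import torch
import torch.nn as nn

sigmoid = nn.Sigmoid()

# Compare gradients at a moderate input (0) and a large input (6)
x = torch.tensor([0.0, 6.0], requires_grad=True)
sigmoid(x).sum().backward()
print(x.grad)  # gradient at 0 is 0.25; at 6 it is roughly 0.0025
```

The derivative of the sigmoid is σ(x)(1 − σ(x)), so it shrinks toward zero as |x| grows, which is exactly the saturation behavior described above.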

Rectified Linear Unit (ReLU):
$$f(x) = \max(x, 0)$$
In PyTorch:
import torch.nn as nn

relu = nn.ReLU()
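A small sketch of ReLU's gradient behavior (the input values are illustrative): the gradient is 1 for positive inputs and 0 for negative inputs, so ReLU does not saturate for positive values.

```python
import torch
import torch.nn as nn

relu = nn.ReLU()

# Negative inputs get zero gradient; positive inputs get gradient 1
x = torch.tensor([-2.0, 3.0], requires_grad=True)
relu(x).sum().backward()
print(x.grad)  # tensor([0., 1.])
```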

Leaky ReLU:
$$f(x) = \begin{cases} x & \text{if } x \geq 0 \\ 0.05x & \text{if } x < 0 \end{cases}$$
In PyTorch:
import torch.nn as nn

leaky_relu = nn.LeakyReLU(negative_slope=0.05)
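A small sketch contrasting Leaky ReLU with ReLU (the input values are illustrative): negative inputs keep a small nonzero gradient equal to `negative_slope`, so neurons are never entirely switched off.

```python
import torch
import torch.nn as nn

leaky_relu = nn.LeakyReLU(negative_slope=0.05)

# Negative inputs get gradient 0.05 instead of 0
x = torch.tensor([-2.0, 3.0], requires_grad=True)
leaky_relu(x).sum().backward()
print(x.grad)  # tensor([0.0500, 1.0000])
```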
