One of the many activation functions is the hyperbolic tangent function (also known as tanh), which is defined as

$$\tanh(x) = \frac{e^{x} - e^{-x}}{e^{x} + e^{-x}}$$
The hyperbolic tangent function outputs values in the range (-1, 1), mapping strongly negative inputs to strongly negative outputs. Unlike the sigmoid function, only near-zero inputs are mapped to near-zero outputs, which mitigates the "vanishing gradients" problem to some extent. The hyperbolic tangent function is differentiable at every point, and its derivative turns out to be

$$\frac{d}{dx}\tanh(x) = 1 - \tanh^{2}(x)$$

Since this expression includes the tanh function itself, its value can be reused to speed up backpropagation.
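As a quick illustration of that reuse, here is a minimal sketch (the variable names y and grad are illustrative) that computes the gradient from the already-computed activation, with no extra exponentials, and checks it against autograd:

```python
import torch

x = torch.linspace(-3.0, 3.0, 7)

# Forward pass: compute the activation once
y = torch.tanh(x)

# Backward pass: the derivative 1 - tanh^2(x) reuses y,
# so no additional exp() evaluations are needed
grad = 1 - y ** 2

# Cross-check against autograd
x2 = torch.linspace(-3.0, 3.0, 7, requires_grad=True)
torch.tanh(x2).sum().backward()
print(torch.allclose(grad, x2.grad))  # True
```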
Although a network using tanh is less likely to get stuck during training than one using the sigmoid function, the hyperbolic tangent function still suffers from "vanishing gradients." A rectified linear unit (ReLU) can be used to overcome this problem.
The torch.tanh() function provides support for the hyperbolic tangent function in PyTorch. The input type is a tensor of real values, and the output lies in the range (-1, 1). If the input contains more than one element, the element-wise hyperbolic tangent is calculated.
Syntax: torch.tanh(x, out=None)
Parameters:
x: Input tensor
out (optional): Output tensor
Return type: A tensor with the same type as that of x.
Code # 1:
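A minimal listing that reproduces the output shown below (the variable names a and b are illustrative; the input values are taken from that output):

```python
# Import the PyTorch library
import torch

# A constant tensor of size 6
a = torch.FloatTensor([1.0, -0.5, 3.4, -2.1, 0.0, -6.5])
print(a)

# Applying the tanh function element-wise and storing the result in 'b'
b = torch.tanh(a)
print(b)
```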
Output:
1.0000 -0.5000 3.4000 -2.1000 0.0000 -6.5000 [torch.FloatTensor of size 6]
0.7616 -0.4621 0.9978 -0.9705 0.0000 -1.0000 [torch.FloatTensor of size 6]
Code # 2: Visualization
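A sketch of the visualization (the input torch.linspace(-5, 5, 15) is inferred from the output below; the matplotlib styling choices are assumptions). Besides printing the tensor, the script displays the tanh curve:

```python
import torch
import matplotlib.pyplot as plt

# A vector of 15 evenly spaced values from -5 to 5
a = torch.linspace(-5, 5, 15)

# Applying the tanh function element-wise and storing the result in 'b'
b = torch.tanh(a)
print(b)

# Plotting the tanh curve
plt.plot(a.numpy(), b.numpy(), color='red', marker='o')
plt.title("torch.tanh")
plt.xlabel("X")
plt.ylabel("Y")
plt.show()
```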
Output:
-0.9999 -0.9996 -0.9984 -0.9934 -0.9728 -0.8914 -0.6134 0.0000 0.6134 0.8914 0.9728 0.9934 0.9984 0.9996 0.9999 [torch.FloatTensor of size 15]