The tensorflow.nn module provides support for many basic neural network operations.
One of the many activation functions is the hyperbolic tangent function (also known as tanh), which is defined as:

tanh(x) = (e^x - e^(-x)) / (e^x + e^(-x))
The hyperbolic tangent function outputs values in the range (-1, 1), thus mapping strongly negative inputs to negative values. Unlike the sigmoid function, only near-zero inputs are mapped to near-zero outputs, which mitigates the "vanishing gradients" problem to some extent. The hyperbolic tangent function is differentiable at every point, and its derivative turns out to be 1 - tanh^2(x). Since this expression includes the tanh function itself, its value can be reused to speed up backpropagation.
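A minimal NumPy sketch (the helper names below are illustrative, not part of any library) showing how the cached forward-pass output gives the gradient almost for free:

```python
import numpy as np

def tanh_forward(x):
    # Forward pass: compute and cache the activation itself
    return np.tanh(x)

def tanh_backward(y, grad_out):
    # Backward pass: reuse the cached output y = tanh(x),
    # since d/dx tanh(x) = 1 - tanh(x)**2 -- no extra exp() calls needed
    return grad_out * (1.0 - y ** 2)

x = np.array([-2.0, 0.0, 2.0])
y = tanh_forward(x)
grad = tanh_backward(y, np.ones_like(x))
print(grad)  # largest at x = 0, shrinking toward 0 as |x| grows
```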
Although the network is less likely to get stuck compared to the sigmoid function, the hyperbolic tangent function still suffers from “vanishing gradients.” A rectified linear unit (ReLU) can be used to overcome this problem.
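A quick comparison (a sketch, not from the original article) of why ReLU helps: for a strongly positive input, the tanh gradient is effectively zero, while the ReLU gradient stays at 1 regardless of magnitude:

```python
import numpy as np

x = 10.0  # a strongly positive input

# tanh gradient: 1 - tanh(x)^2, effectively zero for large |x|
tanh_grad = 1.0 - np.tanh(x) ** 2

# ReLU gradient: 1 for x > 0, 0 otherwise, regardless of magnitude
relu_grad = 1.0 if x > 0 else 0.0

print(tanh_grad)  # on the order of 1e-8: a "vanishing" gradient
print(relu_grad)  # 1.0
```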
The tf.nn.tanh() [alias tf.tanh] function provides support for the hyperbolic tangent function in TensorFlow.
Syntax: tf.nn.tanh(x, name=None) or tf.tanh(x, name=None)
x: A tensor of any of the following types: float16, float32, double, complex64, or complex128.
name (optional): The name for the operation.
Return: A tensor of the same type as x.
Code # 1:
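The code listing itself did not survive here; a minimal sketch that should reproduce the input/output shown below, assuming TensorFlow 2.x with eager execution:

```python
import numpy as np
import tensorflow as tf

# 15 evenly spaced values in [-5, 5]
a = np.linspace(-5, 5, 15)

# Applying the tanh function and storing the result
b = tf.nn.tanh(a, name='tanh')

print('Input:', a)
print('Output:', b.numpy())
```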
Input:
[-5. -4.28571429 -3.57142857 -2.85714286 -2.14285714 -1.42857143 -0.71428571 0. 0.71428571 1.42857143 2.14285714 2.85714286 3.57142857 4.28571429 5.]

Output:
[-0.9999092 -0.99962119 -0.99842027 -0.99342468 -0.97284617 -0.89137347 -0.61335726 0. 0.61335726 0.89137347 0.97284617 0.99342468 0.99842027 0.99962119 0.9999092]