The tensorflow.nn module provides support for many basic neural network operations.
One of the many activation functions is the sigmoid function, which is defined as $\sigma(x) = \frac{1}{1 + e^{-x}}$.
The sigmoid function outputs values in the range (0, 1), which makes it ideal for binary classification tasks where we need to find the probability that the data belongs to a certain class. The sigmoid function is differentiable at every point, and its derivative turns out to be $\sigma'(x) = \sigma(x)\,(1 - \sigma(x))$. Since this expression is written in terms of the sigmoid value itself, the value computed in the forward pass can be reused to speed up backpropagation.
The sigmoid function suffers from the "vanishing gradients" problem: it flattens out at both ends, so the gradients propagated back through it become very small and the weights barely change. This can cause the neural network to stop learning and get stuck. For this reason, the sigmoid function has largely been superseded by other non-linear activation functions, such as the rectified linear unit (ReLU).
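To make both points concrete, here is a minimal NumPy sketch (an illustration added for clarity, not code from the original article): the gradient is formed directly from the saved forward-pass value s, and it collapses toward zero as |x| grows.

import numpy as np

x = np.array([0.0, 2.5, 10.0])
s = 1.0 / (1.0 + np.exp(-x))  # forward pass: sigmoid(x)
grad = s * (1.0 - s)          # derivative reuses s; no extra exp() needed
print(grad)                   # approx. [0.25, 0.07, 4.5e-05]: tiny at large |x|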
tf.nn.sigmoid() [alias tf.sigmoid] provides sigmoid function support in TensorFlow.
Syntax: tf.nn.sigmoid(x, name=None) or tf.sigmoid(x, name=None)
Parameters:
x: A tensor of any of the following types: float16, float32, float64, complex64, or complex128.
name (optional): The name for the operation.
Return type: A tensor of the same type as x.
Code #1:
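The original listing did not survive extraction; the following is a reconstruction that reproduces the printed output, assuming TensorFlow 1.x (under TensorFlow 2, call tf.compat.v1.disable_eager_execution() and use tf.compat.v1.Session()).

import tensorflow as tf

# Input vector of float32 values (chosen to match the output shown below)
a = tf.constant([1.0, -0.5, 3.4, -2.1, 0.0, -6.5], dtype=tf.float32)

# Applying the sigmoid function and storing the result in 'b'
b = tf.nn.sigmoid(a, name='sigmoid')

# Evaluating the tensors in a TF 1.x session
with tf.Session() as sess:
    print('Input type:', a)
    print('Input:', sess.run(a))
    print('Return type:', b)
    print('Output:', sess.run(b))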
Output:
Input type: Tensor("Const_1:0", shape=(6,), dtype=float32)
Input: [ 1.        -0.5        3.4000001 -2.0999999  0.        -6.5      ]
Return type: Tensor("sigmoid:0", shape=(6,), dtype=float32)
Output: [0.7310586  0.37754068 0.96770459 0.10909683 0.5        0.00150118]
Code #2: Visualization
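Again a reconstruction rather than the original listing, under the same TF 1.x assumption; matplotlib and the 15-point input grid are inferred from the printed output and the plot below.

import numpy as np
import matplotlib.pyplot as plt
import tensorflow as tf

# 15 evenly spaced inputs in [-5, 5] (matches the output shown below)
a = np.linspace(-5, 5, 15)

# Applying the sigmoid function
b = tf.nn.sigmoid(a, name='sigmoid')

# Evaluating the tensor in a TF 1.x session
with tf.Session() as sess:
    out = sess.run(b)

print('Input:', a)
print('Output:', out)

# Plotting the sigmoid curve
plt.plot(a, out, color='red', marker='o')
plt.title('tensorflow.nn.sigmoid')
plt.xlabel('x')
plt.ylabel('sigmoid(x)')
plt.show()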
Output:
Input: [-5.         -4.28571429 -3.57142857 -2.85714286 -2.14285714 -1.42857143
 -0.71428571  0.          0.71428571  1.42857143  2.14285714  2.85714286
  3.57142857  4.28571429  5.        ]
Output: [0.00669285 0.01357692 0.02734679 0.05431327 0.10500059 0.19332137
 0.32865255 0.5        0.67134745 0.80667863 0.89499941 0.94568673
 0.97265321 0.98642308 0.99330715]
[Figure: plot of the sigmoid curve produced by Code #2]