Python | TensorFlow nn.softplus()


The tensorflow.nn module provides support for many basic neural network operations.

Activation function: a function applied to the output of a neural network layer, which is then passed as input to the next layer. Activation functions are an integral part of neural networks because they provide non-linearity, without which a neural network reduces to a simple logistic regression model. One of the many activation functions is the softplus function, which is defined as f(x) = ln(1 + e^x).

Traditional activation functions such as the sigmoid and the hyperbolic tangent have lower and upper bounds, whereas the softplus function produces outputs in the range (0, ∞). The derivative of the softplus function is f'(x) = 1 / (1 + e^(-x)), which is the sigmoid function. The softplus function is very similar to the rectified linear unit (ReLU) function, the main difference being that softplus is differentiable at x = 0. The research paper “Improving deep neural networks using softplus blocks” by Zheng et al. (2015) suggests that softplus provides more stabilization and better performance for deep neural networks than ReLU. However, ReLU is generally preferred because it and its derivative are cheaper to compute. Evaluating the activation function and its derivative is a frequent operation in neural networks, and ReLU provides faster forward and backward propagation than softplus.
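To make these relationships concrete, here is a minimal NumPy sketch (independent of TensorFlow; the helper names softplus, sigmoid and relu are illustrative, not library functions) that evaluates the three functions and numerically checks that the derivative of softplus is the sigmoid:

import numpy as np

def softplus(x):
    # ln(1 + e^x), using log1p for better accuracy when e^x is small
    return np.log1p(np.exp(x))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    return np.maximum(0.0, x)

x = np.array([-2.0, 0.0, 2.0])
print(softplus(x))   # strictly positive, smooth everywhere
print(relu(x))       # exactly zero for negative inputs, kink at x = 0

# Central-difference estimate of d/dx softplus(x) should match sigmoid(x)
h = 1e-6
numeric_grad = (softplus(x + h) - softplus(x - h)) / (2 * h)
print(np.allclose(numeric_grad, sigmoid(x), atol=1e-5))   # True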

nn.softplus() [alias math.softplus] provides support for softplus in TensorFlow.

Syntax: tf.nn.softplus(features, name=None) or tf.math.softplus(features, name=None)

Parameters:
features: A tensor of any of the following types: float32, float64, int32, uint8, int16, int8, int64, bfloat16, uint16, half, uint32, uint64.
name (optional): The name for the operation.

Return type: A tensor with the same type as that of features.
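The examples below use the TensorFlow 1.x session API. If you are running TensorFlow 2.x (an assumption about your setup), the same op can be evaluated eagerly without a session, for example:

import tensorflow as tf   # assuming TensorFlow 2.x, where eager execution is the default

a = tf.constant([1.0, -0.5, 3.4, -2.1, 0.0, -6.5], dtype=tf.float32)
b = tf.nn.softplus(a)     # the result is available immediately
print(b.numpy())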

Code #1:

# Importing the Tensorflow library
import tensorflow as tf

# A constant vector of size 6
a = tf.constant([1.0, -0.5, 3.4, -2.1, 0.0, -6.5], dtype=tf.float32)

# Applying the softplus function and
# storing the result in 'b'
b = tf.nn.softplus(a, name='softplus')

# Initiating a Tensorflow session
with tf.Session() as sess:
    print('Input type:', a)
    print('Input:', sess.run(a))
    print('Return type:', b)
    print('Output:', sess.run(b))

Output:

Input type: Tensor("Const:0", shape=(6,), dtype=float32)
Input: [ 1.        -0.5        3.4000001 -2.0999999  0.        -6.5      ]
Return type: Tensor("softplus:0", shape=(6,), dtype=float32)
Output: [1.31326163e+00 4.74076986e-01 3.43282866e+00 1.15519524e-01
 6.93147182e-01 1.50233845e-03]
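As a quick sanity check, the fifth output value corresponds to softplus(0) = ln(1 + e^0) = ln 2 ≈ 0.6931, matching the 6.93147182e-01 printed above.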

Code #2: Visualization

# Importing the Tensorflow library
import tensorflow as tf

# Importing the NumPy library
import numpy as np

# Importing the matplotlib.pyplot library
import matplotlib.pyplot as plt

# A vector of size 15 with values from -5 to 5
a = np.linspace(-5, 5, 15)

# Applying the softplus function and
# storing the result in 'b'
b = tf.nn.softplus(a, name='softplus')

# Initiating a Tensorflow session
with tf.Session() as sess:
    print('Input:', a)
    print('Output:', sess.run(b))

    plt.plot(a, sess.run(b), color='red', marker='o')
    plt.title('tensorflow.nn.softplus')
    plt.xlabel('X')
    plt.ylabel('Y')
    plt.show()

Output:

Input: [-5.         -4.28571429 -3.57142857 -2.85714286 -2.14285714 -1.42857143
 -0.71428571  0.          0.71428571  1.42857143  2.14285714  2.85714286
  3.57142857  4.28571429  5.        ]
Output: [0.00671535 0.01366993 0.02772767 0.05584391 0.11093221 0.21482992
 0.39846846 0.69314718 1.11275418 1.64340135 2.25378936 2.91298677
 3.59915624 4.29938421 5.00671535]
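Note how the output stays close to 0 for strongly negative inputs and approaches the input itself for large positive inputs (for example, softplus(5) ≈ 5.0067), which is the shape traced by the plot below.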

[Figure: plot of tensorflow.nn.softplus over the input range -5 to 5]




