TensorFlow nn.softplus() | Python Methods and Functions


The **tensorflow.nn** module provides support for many basic neural network operations.

An activation function is a function applied to the output of a neural network layer, which is then passed as input to the next layer. Activation functions are an integral part of neural networks because they provide non-linearity; without it, a neural network reduces to a simple logistic regression model. One of the many activation functions is the softplus function, defined as f(x) = ln(1 + e^x).

Traditional activation functions such as the sigmoid and the hyperbolic tangent have lower and upper bounds, whereas the softplus function outputs in the range (0, ∞). The derivative of the softplus function is f'(x) = 1 / (1 + e^(-x)), which is the sigmoid function. The softplus function is very similar to the rectified linear unit (ReLU) function, the main difference being that softplus is differentiable at x = 0. The research paper “Improving deep neural networks using softplus units” by Zheng et al. (2015) suggests that softplus provides more stabilization and performance for deep neural networks than ReLU. However, ReLU is generally preferred because both the function and its derivative are cheaper to compute. Evaluating the activation function and its derivative is a common operation in neural networks, and ReLU provides faster forward and backward propagation than softplus.
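The claim that the softplus derivative is the sigmoid can be checked numerically. The sketch below (a NumPy illustration; the helper names are ours, not part of any library) compares a central-difference derivative of softplus against the sigmoid directly:

```python
import numpy as np

def softplus(x):
    # softplus(x) = ln(1 + e^x); log1p keeps accuracy when e^x is small
    return np.log1p(np.exp(x))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

x = np.linspace(-4.0, 4.0, 9)

# Numerical derivative of softplus via central differences
h = 1e-5
numeric_grad = (softplus(x + h) - softplus(x - h)) / (2 * h)

# The numerical derivative matches the sigmoid to high precision
print(np.allclose(numeric_grad, sigmoid(x), atol=1e-6))
```

This also makes the boundedness contrast concrete: sigmoid(x) stays in (0, 1), while softplus(x) grows without bound as x → ∞.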

`tf.nn.softplus()` [alias `tf.math.softplus()`] provides softplus support in TensorFlow.

Syntax: `tf.nn.softplus(features, name=None)` or `tf.math.softplus(features, name=None)`

Parameters:

features: A tensor of any of the following types: float32, float64, int32, uint8, int16, int8, int64, bfloat16, uint16, half, uint32, uint64.

name(optional): The name for the operation.

Return type: A tensor with the same type as that of features.

** Code # 1: **

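The original code listing did not survive extraction. The following sketch, written for TensorFlow 2.x eager mode, reproduces the computation; the input constants are inferred from the printed output, which was produced under TensorFlow 1.x (hence its graph-style `Tensor("Const:0", ...)` reprs):

```python
import tensorflow as tf

# Input values inferred from the printed output
a = tf.constant([1.0, -0.5, 3.4, -2.1, 0.0, -6.5], dtype=tf.float32)
print('Input type:', a)
print('Input:', a.numpy())

# Apply the softplus activation element-wise
b = tf.nn.softplus(a, name='softplus')
print('Return type:', b)
print('Output:', b.numpy())
```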

** Output:**

Input type: Tensor("Const:0", shape=(6,), dtype=float32)
Input: [ 1.  -0.5  3.4000001 -2.0999999  0.  -6.5]
Return type: Tensor("softplus:0", shape=(6,), dtype=float32)
Output: [1.31326163e+00 4.74076986e-01 3.43282866e+00 1.15519524e-01 6.93147182e-01 1.50233845e-03]

** Code # 2: ** Visualization

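This listing was also lost in extraction. The sketch below regenerates the printed values (15 evenly spaced points on [-5, 5]) and plots the curve; the use of matplotlib and the styling choices are our assumptions, not recovered from the original:

```python
import numpy as np
import tensorflow as tf
import matplotlib
matplotlib.use("Agg")  # render off-screen; drop this line to view interactively
import matplotlib.pyplot as plt

# 15 evenly spaced inputs on [-5, 5], matching the printed output
x = np.linspace(-5.0, 5.0, 15)
y = tf.nn.softplus(x).numpy()
print('Input:', x)
print('Output:', y)

plt.plot(x, y, color='red', marker='o')
plt.title('tensorflow.nn.softplus')
plt.xlabel('x')
plt.ylabel('softplus(x)')
plt.savefig('softplus.png')
```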

** Output: **

Input: [-5. -4.28571429 -3.57142857 -2.85714286 -2.14285714 -1.42857143 -0.71428571 0. 0.71428571 1.42857143 2.14285714 2.85714286 3.57142857 4.28571429 5.]
Output: [0.00671535 0.01366993 0.02772767 0.05584391 0.11093221 0.21482992 0.39846846 0.69314718 1.11275418 1.64340135 2.25378936 2.91298677 3.59915624 4.29938421 5.00671535]

[Figure: plot of the softplus activation function]
