Introduction to Tensors with TensorFlow

An introduction to TensorFlow and tensors, along with the implementation of tensors in TensorFlow.

What is TensorFlow and how does it work?

TensorFlow is an open-source software library for dataflow programming across a range of tasks. It is a symbolic math library that is also used for machine learning applications such as neural networks. Google open-sourced TensorFlow in November 2015, and since then it has become the most popular machine learning repository on GitHub (https://github.com/tensorflow/tensorflow).

Why TensorFlow? Its popularity comes from several things, chief among them the computational graph concept, automatic differentiation, and the approachable structure of the Python TensorFlow API. Together these make solving real machine learning problems accessible to most programmers.

Google's TensorFlow engine has a distinctive way of solving problems, and this approach lets it solve machine learning problems very efficiently. We'll go over the basic steps to understand how TensorFlow works.

What is a tensor in TensorFlow?

TensorFlow, as the name suggests, is a platform for defining and running computations that involve tensors. A tensor is a generalization of vectors and matrices to potentially higher dimensions. Internally, TensorFlow represents tensors as n-dimensional arrays of basic data types. Every element in a tensor has the same data type, and the data type is always known. The shape (that is, the number of dimensions and the size of each dimension) may be only partially known. Most operations produce tensors of fully known shapes if the shapes of their inputs are also fully known, but in some cases the shape of a tensor can only be determined at graph execution time.
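
As a small illustration (a sketch only; the names below are not used elsewhere in this article), tensors of different ranks can be created with tf.constant and their shapes inspected:

 import tensorflow as tf

 scalar = tf.constant(3)                   # rank 0, shape ()
 vector = tf.constant([1.0, 2.0, 3.0])     # rank 1, shape (3,)
 matrix = tf.constant([[1, 2], [3, 4]])    # rank 2, shape (2, 2)

 print(scalar.shape, vector.shape, matrix.shape)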

General TensorFlow Algorithm Flow

Here we will outline the general flow of a TensorFlow algorithm.

  1. Import or generate data

    All our machine learning algorithms depend on data. In practice, we will either generate data or use an external data source. Sometimes it is better to rely on generated data because we then know what result to expect. TensorFlow also ships with well-known datasets such as MNIST and CIFAR-10.
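
    As a sketch (assuming NumPy has been imported as np; the arrays are purely illustrative), generated regression data might look like:

     x_vals = np.random.normal(1.0, 0.1, 100)
     y_vals = np.repeat(10.0, 100)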

  2. Data transformation and normalization

    Data is usually not in the size or type that our TensorFlow algorithms expect, so we will have to transform it before we can use it. Most algorithms also expect normalized data. TensorFlow has built-in functions that can normalize the data for you.

     data = tf.nn.batch_norm_with_global_normalization(...)
  3. Set algorithm parameters

    Our algorithms usually have a set of parameters that we keep constant throughout the procedure, for example the number of iterations, the learning rate, or other fixed parameters of our choice. It is considered good form to initialize them together so that a reader or user can easily find them.

     learning_rate = 0.001
     iterations = 1000
  4. Initialize variables and placeholders

    TensorFlow depends on us telling it what it can and cannot modify. It will modify the variables during optimization to minimize a loss function, and we feed data in through placeholders. We need to initialize both the variables and the placeholders with a size and type so that TensorFlow knows what to expect.

     a_var = tf.constant(42)
     x_input = tf.placeholder(tf.float32, [None, input_size])
     y_input = tf.placeholder(tf.float32, [None, num_classes])
  5. Define the structure of the model

    Once we have the data and our variables and placeholders are initialized, we need to define the model. This is done by building a computational graph. We tell Tensorflow what operations need to be performed on variables and placeholders to arrive at our model predictions.

     y_pred = tf.add(tf.matmul(x_input, weight_matrix), b_matrix)

  6. Declare loss functions

    After defining the model, we must evaluate its output. Here we declare the loss function. The loss function is very important because it tells us how far our predictions are from the actual values.

     loss = tf.reduce_mean(tf.square(y_actual - y_pred))
  7. Initialize and train the model

    Now that we have everything in place, we instantiate our computational graph, feed the data in through the placeholders, and let TensorFlow change the variables to better predict our training data. Here is one way to initialize the computational graph.

     with tf.Session(graph=graph) as session:
         ...
         session.run(...)
         ...

    Note that we can also start our graph with

     session = tf.Session(graph=graph)
     session.run(...)
  8. Evaluate the model (optional)

    After we have built and trained the model, we should evaluate it by seeing how well it performs on new data according to some specified criteria.
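
    For example (a sketch assuming hypothetical held-out arrays x_test and y_test, NumPy as np, and the session and y_pred from the earlier steps):

     test_preds = session.run(y_pred, feed_dict={x_input: x_test})
     test_mse = np.mean((test_preds - y_test) ** 2)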

  9. Predict new results (optional)

    It is also important to know how to make predictions on new, unseen data. We can do this with any of our models once we have trained them.
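
    For example (again only a sketch, with new_data standing in for a hypothetical batch of unseen inputs):

     new_preds = session.run(y_pred, feed_dict={x_input: new_data})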

Summary

In TensorFlow, we set up the data, variables, placeholders, and model before we tell the program to train and adjust the variables to improve its predictions. TensorFlow accomplishes this through a computational graph: we tell it to minimize a loss function, and it does so by modifying the variables in the model. TensorFlow knows how to change the variables because it keeps track of the computations in the model and automatically computes gradients for every variable. This makes it easy to change things and try different data sources.

In general, TensorFlow algorithms are designed to be cyclical. We set up this loop as a computational graph and (1) feed in data through placeholders, (2) compute the output of the computational graph, (3) compare the output to the desired output using a loss function, (4) modify the model variables according to automatic backpropagation, and finally (5) repeat the process until a stopping criterion is met.
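
To make this loop concrete, here is a minimal sketch in the TensorFlow 1.x style used throughout this article. It fits a single weight to synthetic data; every name in it (x_data, weight, train_step, and so on) is illustrative rather than part of any fixed API.

 import numpy as np
 import tensorflow as tf

 # (1) Generate data
 x_data = np.random.normal(1.0, 0.1, size=(100, 1)).astype(np.float32)
 y_data = 10.0 * x_data + np.random.normal(0.0, 0.05, size=(100, 1)).astype(np.float32)

 # Placeholders for the inputs, a variable for the model parameter
 x_input = tf.placeholder(tf.float32, [None, 1])
 y_input = tf.placeholder(tf.float32, [None, 1])
 weight = tf.Variable(tf.random_normal([1, 1]))

 # (2) Compute the output of the computational graph
 y_pred = tf.matmul(x_input, weight)

 # (3) Compare the output to the desired output with a loss function
 loss = tf.reduce_mean(tf.square(y_input - y_pred))

 # (4) Let automatic backpropagation adjust the model variables
 train_step = tf.train.GradientDescentOptimizer(0.02).minimize(loss)

 with tf.Session() as session:
     session.run(tf.global_variables_initializer())
     # (5) Repeat until the stopping criterion is met (here: a fixed number of steps)
     for step in range(100):
         session.run(train_step, feed_dict={x_input: x_data, y_input: y_data})
     print(session.run(weight))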

Now the hands-on part of the tutorial begins, where we implement these ideas.

First, we need to import the required libraries.

 import tensorflow as tf
 from tensorflow.python.framework import ops
 ops.reset_default_graph()

Then start a graph session

 sess = tf.Session()

Now the main part begins: creating tensors.

TensorFlow has built-in functions for creating tensors to use in variables. For example, we can create a zero-filled tensor of a predefined shape using the tf.zeros() function as follows.

 my_tensor = tf.zeros([1, 20])

We can evaluate tensors by calling the run() method on our session.

 sess.run(my_tensor)

TensorFlow algorithms need to know which objects are variables and which are constants, so we create variables with the TensorFlow tf.Variable() function. Note that you cannot run sess.run(my_var) yet; that would result in an error. Because TensorFlow works with computational graphs, we must create a variable initialization operation before we can evaluate a variable. For this script, we can initialize one variable at a time by calling its initializer, my_var.initializer.

 my_var = tf.Variable(tf.zeros([1, 20]))
 sess.run(my_var.initializer)
 sess.run(my_var)

Output:

 array([[0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.]], dtype=float32)

Now let's create variables of a specific shape and initialize them with all zeros or ones.

 row_dim = 2
 col_dim = 3
 zero_var = tf.Variable(tf.zeros([row_dim, col_dim]))
 ones_var = tf.Variable(tf.ones([row_dim, col_dim]))

To evaluate their values, we again run the initializer operations on our variables and then print the results.

 sess.run(zero_var.initializer)
 sess.run(ones_var.initializer)
 print(sess.run(zero_var))
 print(sess.run(ones_var))

Output:

 [[0. 0. 0.]
  [0. 0. 0.]]
 [[1. 1. 1.]
  [1. 1. 1.]]

And this list goes on. The rest is for you to explore; follow my Jupyter notebook for more information on tensors.

Visualizing variable creation in TensorBoard

To visualize variable creation in TensorBoard, we will reset the computational graph and create a global initialization operation.

# Reset the graph
ops.reset_default_graph()

# Start a graph session
sess = tf.Session()

# Create a variable
my_var = tf.Variable(tf.zeros([1, 20]))

# Add summaries to TensorBoard
merged = tf.summary.merge_all()

# Initialize the graph writer
writer = tf.summary.FileWriter("/tmp/variable_logs", graph=sess.graph)

# Create the initialization operation
initialize_op = tf.global_variables_initializer()

# Run the variable initialization
sess.run(initialize_op)

Now run the following command from the command line.

 tensorboard --logdir=/tmp

It will print a URL that we can open in the browser to view the graph in TensorBoard.

Here is the full code to create all of these kinds of tensors and evaluate them.

import tensorflow as tf
from tensorflow.python.framework import ops
ops.reset_default_graph()

# Start a graph session
sess = tf.Session()

# Create a tensor
my_tensor = tf.zeros([1, 20])

# Declare a variable
my_var = tf.Variable(tf.zeros([1, 20]))

# Variables of different shapes
row_dim = 2
col_dim = 3

# Zero-initialized variable
zero_var = tf.Variable(tf.zeros([row_dim, col_dim]))

# Ones-initialized variable
ones_var = tf.Variable(tf.ones([row_dim, col_dim]))

# Variables shaped like another variable (the source variables are initialized first)
sess.run(zero_var.initializer)
sess.run(ones_var.initializer)
zero_similar = tf.Variable(tf.zeros_like(zero_var))
ones_similar = tf.Variable(tf.ones_like(ones_var))
sess.run(ones_similar.initializer)
sess.run(zero_similar.initializer)

# Fill a shape with a constant
fill_var = tf.Variable(tf.fill([row_dim, col_dim], -1))

# Create a variable from a constant
const_var = tf.Variable(tf.constant([8, 6, 7, 5, 3, 0, 9]))
# This can also be used to fill an array:
const_fill_var = tf.Variable(tf.constant(-1, shape=[row_dim, col_dim]))

# Sequence generation
linear_var = tf.Variable(tf.linspace(start=0.0, stop=1.0, num=3))    # Generates [0.0, 0.5, 1.0], includes the end
sequence_var = tf.Variable(tf.range(start=6, limit=15, delta=3))     # Generates [6, 9, 12], does not include the end

# Random numbers

# Random normal
rnorm_var = tf.random_normal([row_dim, col_dim], mean=0.0, stddev=1.0)

# Add summaries to TensorBoard
merged = tf.summary.merge_all()

# Initialize the graph writer
writer = tf.summary.FileWriter("/tmp/variable_logs", graph=sess.graph)

# Create the initialization operation
initialize_op = tf.global_variables_initializer()

# Run the variable initialization
sess.run(initialize_op)


Reference links:

1) TensorFlow documentation