Tensorflow Control Practice with Live Codes Graphs and Sessions- Part 4 | AI Sangam

Introduction to Tensorflow Control Practice:

In the recent blog series on TensorFlow, we have learned about the following concepts:

  1. Constants, variables and placeholders in tensorflow
  2. Optimizers and loss functions
  3. Graphs, sessions and tensorboard

If you face any problem with graphs, sessions, constants, optimizers or control flow in TensorFlow, please refer to the earlier tutorials in the series below:

Tensorflow tutorial from basics for beginners- Part1 | AI Sangam

Optimize Parameters Tensorflow Tutorial – Part 2 | AI Sangam

Low Level Introduction of Tensorflow in a simple way – Part 3 | AI Sangam

This tutorial will cover Tensorflow Control Practice, i.e. live code for each control construct along with its session and graph. Please look at the list below to see what this specific tutorial is focused on.

  1. Code with explanation for Dependencies
    1. tf.control_dependencies
    2. tf.group
    3. tf.tuple
  2. Code with explanation for conditional statement
    1. tf.cond
    2. tf.case
  3. Code with explanation for Loop
    1. tf.while

1. Code with explanation for Dependencies

The theory behind dependencies, as well as their different types, was discussed in the previous part. If you want to read about it in detail, please don't miss the previous blog in the tensorflow series; click on this link for it. Here, we will write live code for each of the dependencies and visualize each of them in TensorBoard to learn more about edges and nodes.

a.) tf.control_dependencies:- Please look at the code below. Since this is the practical section, there will be code and graphs, as promised above.

Explanation of the code above

  1. x is a placeholder to which different values can be fed. Since its shape is not defined, we can feed a tensor of any shape; "x" is the name given to the operation.
  2. y is a variable whose value will change; to make this happen we call global_variables_initializer() inside the session.
  3. Next we write assign_op, which assigns y the value y*2. It is passed as an argument to tf.control_dependencies so that it executes before output = x*y. We discussed this in our previous blog, so please read it there.
  4. The value of x is fed as 1 in a loop 3 times; each time, a new value of y is assigned first inside the tf.control_dependencies block, and then the output is calculated. Please see the output below:

output: 4
output: 8
output: 16

Let us see the result in TensorBoard. Please run the command below to load the event file in TensorBoard.
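The exact command from the original post is not preserved here; a typical workflow, assuming the graph was written to a ./graphs directory with tf.summary.FileWriter (the directory name is an assumption), would be:

```shell
# inside the script, write the graph first (logdir "./graphs" is an assumption):
#   writer = tf.summary.FileWriter("./graphs", sess.graph)
#   writer.close()

# then launch TensorBoard against the same log directory
tensorboard --logdir=./graphs
# and open http://localhost:6006 in the browser to view the graph
```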

Let us also see the graph for it, where we can get a better understanding of edges (tensors) and nodes (operations).

Graph for tf.control_dependencies

b.) tf.group: When you need to run multiple operations together, you can use tf.group. Please see the code below.

Explanation of this code

First of all, op1 will be executed, so the value of c will be a*2, i.e. 2. control_dependencies is then entered in a with block, with op1 as its parameter, and a new value of c is assigned by op2. Both operations are then combined and run through tf.group. Please see the graph for this code.

Graph for tf.group

c.) tf.tuple: The definition and intuition were covered in the previous blog. Let us move on to the code with some explanation and live graphs. Please look at the code below.

Explanation of this code

tf.tuple returns a list of tensors; c is the first tensor and d is the second. x and y are both placeholders, so their values are passed in a feed dictionary. Please see the output and graph below.

[array([1, 4, 9]), array([ 1, 8, 27])]

Graph for tf.tuple

Let us now move on to conditional statements. Without wasting any time, let us build the code for tf.cond together.

a.) tf.cond:- The example for tf.cond has been taken from Stack Overflow, so it is only fair to give credit: here is the link for the code being discussed.

Here is the code:

Explanation of the code

If you set pred = False, the output will not change; understanding why is the key to this example. The reason is that both branches passed to tf.cond (update_x_2 and lambda: tf.identity(x)) use the value of x after it has been assigned, so it makes no difference whether pred is True or False: the output remains the same. Please note that any tensor or operation created outside the true function (update_x_2) and the false function (lambda: tf.identity(x)) is executed unconditionally, before the cond itself. To understand this in more detail, please see the graph below, which was created after running this code.

tf.cond case 1

How to resolve the problem?

If we read the documentation carefully, the fix is to create the assign op tf.assign(x, [2]) inside the control_dependencies block of update_x_2 itself, so that it belongs to the true branch. You can find the elaborate answer in the Stack Overflow post linked above. This updates the value of x inside the true function update_x_2 while leaving it unchanged in the false function lambda: tf.identity(x), which finally differentiates the true and false functions inside tf.cond. Please see the code below to understand more.

Output of the code

[1]
[2]

Graph for the above:

tf.cond case 2

b.) tf.case: This is another type of conditional control, where we subject a variable to different conditions, append the resulting (predicate, function) pairs to a list, and pass that list as the argument to tf.case. Please see the code below with its explanation and graph.

Please see the output of this code as below

[0, 0.1]
[1, 0.1]
[2, 0.001]
[3, 0.1]
[4, 0.001]
[5, 0.1]

Explanation of the code:

  1. a is a variable whose initial value is 0.
  2. b assigns the value a = a + 1.
  3. c is the default function of tf.case, used when none of the conditions is true.
  4. case is initialized as an empty list.
  5. (predicate, function) pairs are created according to pred = tf.equal(a, d). If a condition is true, tf.case takes the value from e; otherwise it falls back to c.

Graph for this is below:

Graph for tf.case

a.) tf.while: Now let us move on to loop control. Let us build code for tf.while (implemented by the tf.while_loop API), which provides loop execution in TensorFlow. We hope you have followed our previous article on how all these operations work; if not, please refer to the previous blog using the link below.

Low Level introduction of Tensorflow in a simple way – Part 3| AI Sangam

Please look at the code below to understand tf.while in a better way.

Explanation of the code:

  1. There are three things in the code: the condition, the body to which the condition is applied, and the iteration variable whose value gets modified and is returned as the output.
  2. The value of i starts from 4, and the loop ends when i becomes 10.

Output of the code:

10

Graph:

Graph for tf.while

Conclusion:

Let us revise what we have learned in this blog about Tensorflow Control Practice. First, we continued from the last blog and carried out a practical session for the different control constructs available in TensorFlow. Each control is described with proper code, output and live graphs, and the explanation of each piece of code is given point by point. We hope this creates a deeper understanding of TensorFlow. We will keep working on TensorFlow and bringing out simple, understandable blogs. You can follow us on the social media channels listed in the footer section of the blog, drop an email at aisangamofficial@gmail.com, or visit the official website www.aisangam.com to learn about our company's services and features. If you have any questions related to this blog, please mention them in the comment section so that we can assist you better.

Also Read:

Low level introduction of Tensorflow in a simple way- Part 3 | AI Sangam
