This is the fifth tutorial in our TensorFlow tutorial series.

The preceding tutorials introduced a variety of functions that create, transform, and process tensors. This tutorial explains the key concepts behind how TensorFlow is designed and how it works, with simple and intuitive examples.

## What is a Computation Graph?

A dataflow graph, or computation graph, is the basic unit of computation in TensorFlow. When an application calls a function that creates, transforms, or processes a tensor, TensorFlow does not execute the operation immediately; instead, it records the operation in a data structure called a computation graph. A TensorFlow program is essentially a computation graph. A graph can hold many operations, which are executed in order when a session runs the graph.

A computation graph comprises nodes and edges. Each node represents an operation, possibly taking some inputs, and can produce an output that is passed on to other nodes; each edge represents a tensor transferred between nodes. By analogy, we can think of graph computation as an assembly line where each machine (node) receives raw material (input), refines it, and passes its output to other machines in an orderly fashion, building subcomponents and ultimately a final product when the assembly process ends.

A computational graph is essentially a dataflow graph. The figure below shows a computational graph for a simple computation such as sum = a + b:

Each circle represents a tensor or an operation, and each line carries tensor data. When an application executes a function that constructs a tensor or an operation, TensorFlow adds the corresponding data structure to a container called a computation graph. Computation graphs can't be nested, and only one can be active at a time.
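As a toy illustration of this idea (plain Python, not TensorFlow's actual implementation), a graph can be modeled as nodes that each hold an operation and input edges, with nothing computed until the graph is explicitly run:

```python
# Toy computation graph (illustration only, not TensorFlow's implementation).
class Node:
    def __init__(self, op, inputs=(), name=""):
        self.op = op          # callable that computes this node's output
        self.inputs = inputs  # edges: upstream nodes whose outputs feed this one
        self.name = name

def run(node):
    """Execute the graph rooted at `node`: evaluate inputs first, then the op."""
    values = [run(inp) for inp in node.inputs]
    return node.op(*values)

# Building the graph records the structure; nothing is computed yet.
a = Node(lambda: 5, name="a")                      # constant node
b = Node(lambda: 7, name="b")                      # constant node
s = Node(lambda x, y: x + y, (a, b), name="sum")   # add op with two input edges

print(run(s))  # -> 12
```

This mirrors the deferred-execution model described above: constructing `s` only wires up nodes and edges; the addition happens when `run` walks the graph, much as a TensorFlow session does.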

The Graph class has many methods to access and modify a computation graph's content.

### 1) Example of Graph Methods

When we import TensorFlow, an empty default graph is created, and every node we create is associated with that default graph.

```python
# TensorFlow program with example Graph class methods
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function

import tensorflow as tf

t1 = tf.constant([[5, 10, 15], [4, 8, 12]], name="T1")
t2 = tf.constant([[5, 10, 15], [4, 8, 12]], name="T2")
sum_tensor = t1 + t2

print(tf.get_default_graph().get_operations())
print("\n")
print(tf.get_default_graph().get_operation_by_name("add"))
print("\n")
print(tf.get_default_graph().get_tensor_by_name("T1:0"))
print(tf.get_default_graph().get_tensor_by_name("T2:0"))
```

###### Sample output of the above program is shown below:

The get_default_graph() function returns the default graph for the current thread. The get_operations() function returns the list of operations in the graph. The get_operation_by_name() function returns the Operation with the given name. The get_tensor_by_name() function returns the Tensor with the given name.
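The string "T1:0" follows TensorFlow's tensor-naming convention: the producing operation's name, a colon, and the index of that operation's output. A minimal sketch of how a graph might index operations by name and resolve such tensor references (illustrative plain Python with hypothetical helper names, not TensorFlow's real code):

```python
# Sketch of name-based lookup in the spirit of get_operation_by_name /
# get_tensor_by_name (illustration only; ToyGraph is a made-up class).
class ToyGraph:
    def __init__(self):
        self._ops = {}  # operation name -> list of its output values

    def add_op(self, name, outputs):
        self._ops[name] = outputs

    def get_operation_by_name(self, name):
        return self._ops[name]

    def get_tensor_by_name(self, ref):
        op_name, index = ref.split(":")   # "T1:0" -> ("T1", "0")
        return self._ops[op_name][int(index)]

g = ToyGraph()
g.add_op("T1", [[5, 10, 15]])           # one op with a single output tensor
print(g.get_tensor_by_name("T1:0"))     # -> [5, 10, 15]
```

The key point is that a tensor name is always derived from its operation's name plus an output index, which is why "T1:0" retrieves the first (and only) output of the T1 constant op.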

### 2) Export Graph data to a file

Using the get_default_graph() function together with tf.train.write_graph(), we can also export the graph's data to a file. Note that despite the .json extension used below, write_graph stores the GraphDef in protobuf text format rather than true JSON. Each tensor and operation is represented by a node element, and each node has a name field, an op field, and other attr fields.

```python
# TensorFlow program to export Graph data.
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function

import os

import tensorflow as tf

t1 = tf.constant([[5, 10, 15], [4, 8, 12]], name="T1")
t2 = tf.constant([[5, 10, 15], [4, 8, 12]], name="T2")
sum_tensor = t1 + t2

tf.train.write_graph(tf.get_default_graph(), os.getcwd(), 'graph.json')
tf.train.write_graph(tf.get_default_graph(), os.getcwd(), 'graph.txt')
```

###### Sample output of the above program is shown below:

```
node {
  name: "T1"
  op: "Const"
  attr {
    key: "dtype"
    value {
      type: DT_INT32
    }
  }
  attr {
    key: "value"
    value {
      tensor {
        dtype: DT_INT32
        tensor_shape {
          dim {
            size: 2
          }
          dim {
            size: 3
          }
        }
        tensor_content: "\005\000\000\000\n\000\000\000\017\000\000\000\004\000\000\000\010\000\000\000\014\000\000\000"
      }
    }
  }
}
node {
  name: "T2"
  op: "Const"
  attr {
    key: "dtype"
    value {
      type: DT_INT32
    }
  }
  attr {
    key: "value"
    value {
      tensor {
        dtype: DT_INT32
        tensor_shape {
          dim {
            size: 2
          }
          dim {
            size: 3
          }
        }
        tensor_content: "\005\000\000\000\n\000\000\000\017\000\000\000\004\000\000\000\010\000\000\000\014\000\000\000"
      }
    }
  }
}
node {
  name: "add"
  op: "Add"
  input: "T1"
  input: "T2"
  attr {
    key: "T"
    value {
      type: DT_INT32
    }
  }
}
versions {
  producer: 24
}
```

The write_graph function stores the graph's data in a file. The output file contains three nodes: two represent the tensors T1 and T2, and one represents the Add operation. The versions object at the end of the file records the producer version, 24 in this case (the value depends on the TensorFlow release).
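The tensor_content field in the output above stores the six int32 values of each constant as raw little-endian bytes. Decoding those bytes by hand in plain Python (using only the byte string shown in the exported graph) recovers the original matrix values:

```python
import struct

# Bytes exactly as printed in the exported graph for T1 and T2.
tensor_content = (b"\005\000\000\000\n\000\000\000\017\000\000\000"
                  b"\004\000\000\000\010\000\000\000\014\000\000\000")

# DT_INT32 with shape 2x3 -> six little-endian 4-byte signed integers.
values = struct.unpack("<6i", tensor_content)
print(values)  # -> (5, 10, 15, 4, 8, 12)
```

Read row by row against the 2x3 tensor_shape, this reproduces the constant passed to tf.constant: [[5, 10, 15], [4, 8, 12]].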