In TensorFlow, a computational graph is used to represent a computation. A computational graph is a directed graph that defines the structure of the computation; in effect, it is a composition of a series of functions. By expressing computations as graphs, users can build complex operations out of simple, easy-to-understand mathematical building blocks.
Using a graph in TensorFlow involves two steps: building the computational graph, and executing it.
A graph is formally composed of nodes and edges:
- Nodes, drawn as circles, represent computations or operations on data.
- Edges, drawn as arrows, represent the actual values (tensors) passed between operations.
graph = tf.Graph()
The graph's context manager
Using "with" invokes the graph's context manager, which tells TensorFlow to add subsequent ops to that specific graph rather than to the default graph.
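As a minimal sketch (assuming the TensorFlow 1.x API; in TensorFlow 2.x the same calls live under tf.compat.v1), building a graph and adding ops to it inside the context manager looks like this:

```python
import tensorflow as tf  # TensorFlow 1.x API

# Create a new, empty graph.
graph = tf.Graph()

# Ops created inside the context manager are added to `graph`,
# not to TensorFlow's default graph.
with graph.as_default():
    a = tf.constant(2, name="a")
    b = tf.constant(3, name="b")
    c = tf.add(a, b, name="c")

print(c.graph is graph)  # True
```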
The graph is executed inside a session. The session distributes the graph's ops to CPUs or GPUs and provides methods to execute them; those methods return the resulting tensors.
sess = tf.Session()
Once the session is open, you can use run() to compute the value of any tensor you want.
When you are done with the session, remember to close it with sess.close().
The fetches argument of session.run() can receive any op or tensor we want to execute, or a list of them.
- If a fetch is a tensor, session.run() returns a numpy array for it.
- If a fetch is an op, session.run() returns None for it.
For example, sess.run(b) tells the session to find all the nodes needed to compute b, execute them in order, and return the result.
Global variable initialization
This returns an op that makes all defined variables ready for subsequent use. The op is itself passed to session.run(). For example:
init = tf.global_variables_initializer()
...
sess.run(init)
Tensors and ops
Tensors represent all data in TensorFlow, and tensors are what is passed between op nodes.
- When defining tensors, numpy arrays and native Python values can be passed directly to an op node, because TensorFlow ops convert Python data types into tensors, including numbers, booleans, strings, and lists.
- Every node in the graph is called an operation (op).
- The output of each op can be passed to other ops or fetched with sess.run().
A variable maintains state across executions of the graph; it is used for parameter values that must be kept, updated, and adjusted dynamically.
- Tensors and operations are immutable, while a variable's value can change over time.
- Variables can be used in any op that accepts a tensor; the variable's current value is passed to the op that uses it.
- Usually a variable's initial value is a large tensor of zeros, ones, or random values, often created with built-in ops such as tf.zeros() or tf.ones().
- In the graph, a variable's state is managed by a session and initialized within the session; the session tracks the variable's current value.
- A session can also initialize only a subset of variables:
- Use tf.variables_initializer() and pass in the list of variables to initialize.
- The value of a variable can be changed:
- For example, with variable.assign(new_value) or variable.assign_add(1).
- Each session maintains its own independent variable values, so the same variable can hold different values in different sessions.
- When an optimizer trains a machine learning model, the variables change accordingly; creating a variable with trainable=False prevents the optimizer from changing it.
- tf.name_scope() can be used to organize the graph by grouping a set of ops into a named block.
That is all for this article. I hope it helps you learn, and I hope you will continue to support Developer.