Tensorflow basics (machine learning open source software library)


Note: The examples in this article use Python 3.5.6 and Tensorflow 2.0.

Introduction

Tensorflow is an open source machine learning library launched by Google. It has good language support for Python, runs on hardware such as CPUs, GPUs and Google's TPUs, and already ships with a variety of models and algorithms. Tensorflow is now widely used in many machine learning and deep learning fields such as text processing, speech recognition, and image recognition.

Basic framework

The Tensorflow framework is divided into three layers: the application layer, the interface layer and the core layer.


Application layer

Provides machine-learning training and prediction libraries, as well as programming environments for languages such as Python, C and Java. Similar to the front end of a web system, this layer is mainly responsible for building the computation graph.

Interface layer

Encapsulates Tensorflow function modules to facilitate calls from other language platforms.

Core layer

The most important part. It consists of the device layer, network layer, data operation layer and graph computing layer, and carries out the computations requested by the application layer.

1. Device layer

Contains Tensorflow's implementations for different hardware devices. It mainly supports CPUs, GPUs, mobile devices and others, translating computation commands for each kind of hardware and providing a unified interface to the upper layers, which is what makes programs cross-platform.
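
As an illustration of how this looks from the application layer, a computation can be pinned to a particular device with tf.device (a minimal sketch; the device string '/cpu:0' is just an example and assumes a CPU is visible to Tensorflow):

import tensorflow as tf
tf.compat.v1.disable_eager_execution()

# Ask the device layer to place these operations on the first CPU.
with tf.device('/cpu:0'):
    a = tf.constant([1.0, 2.0])
    b = tf.constant([3.0, 4.0])
    c = a + b

with tf.compat.v1.Session() as session:
    print(session.run(c))  # [4. 6.]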

2. Network layer

The network layer mainly contains the RPC and RDMA communication protocols, which handle data transmission and parameter updates between different devices. These protocols are used in distributed computing.
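
As a rough illustration (not part of the original article's examples), the compat.v1 distributed API exposes this layer as a cluster definition plus a server that communicates over gRPC by default. The job name, host and port below are made-up values.

import tensorflow as tf

# A hypothetical single-task cluster; "local" and port 2222 are arbitrary examples.
cluster = tf.compat.v1.train.ClusterSpec({"local": ["localhost:2222"]})
server = tf.compat.v1.train.Server(cluster, job_name="local", task_index=0)
# The target can be handed to a session to run graphs on this server.
print(server.target)  # e.g. grpc://localhost:2222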

3. Data operation layer

takes tensor as the processing object to implement various operations and calculations of tensor.

4. Graph computing layer

Includes the implementation of distributed computing graphs and local computing graphs, and realizes the creation, compilation, optimization and execution of graphs.
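
A minimal sketch of working with a graph object directly: nodes are created inside a tf.Graph, and the session then executes that graph (compilation and optimization happen inside the runtime and are not visible in user code):

import tensorflow as tf
tf.compat.v1.disable_eager_execution()

graph = tf.Graph()
with graph.as_default():
    # Graph creation: these nodes belong to `graph` rather than the default graph.
    x = tf.constant(2)
    y = tf.constant(3)
    z = x * y

# Graph execution: the session runs the graph that was just built.
with tf.compat.v1.Session(graph=graph) as session:
    print(session.run(z))  # 6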

Design Concept

Tensorflow can be understood as the "flow of tensors" through a computation graph: Tensor refers to the data travelling along the graph's edges, and Flow refers to the movement of that data as it is transformed by the operations at the graph's nodes.

The design is based on data flow: after the machine learning model has been built, the training data flows through the model, the results are fed back to the model's parameters via backpropagation to adjust them, and the adjusted parameters are then used in the next iteration over the training data.
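
A minimal sketch of this loop using the compat.v1 API (the model, learning rate, data and iteration count are arbitrary example choices): the graph defines the model and loss once, and every run of the training operation flows the data through the model and adjusts the parameter by backpropagation.

import tensorflow as tf
tf.compat.v1.disable_eager_execution()

# A tiny linear model y = w * x, trained to fit y = 2 * x.
x = tf.compat.v1.placeholder(tf.float32)
y_true = tf.compat.v1.placeholder(tf.float32)
w = tf.Variable(0.0)
loss = tf.reduce_mean(tf.square(w * x - y_true))
train_op = tf.compat.v1.train.GradientDescentOptimizer(0.1).minimize(loss)

with tf.compat.v1.Session() as session:
    session.run(tf.compat.v1.global_variables_initializer())
    for _ in range(100):
        # Each step flows the training data through the graph and backpropagates into w.
        session.run(train_op, feed_dict={x: [1.0, 2.0, 3.0], y_true: [2.0, 4.0, 6.0]})
    print(session.run(w))  # close to 2.0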

Programming features

There are two programming features:

The definition of the graph and the operation of the graph are completely separated

In Tensorflow, the variables must be defined in advance, the data flow graph must be built, and the computational relationships between the variables in the graph must be established; this completes the definition of the graph. No output value is produced at this point: it only appears once input data has been fed into the operations.

The calculation of the graph is performed in the session

Tensorflow's calculations are defined in the graph, while the graph's actual running environment is the session. Calculation can only start once a session has been opened, and once the session is closed no further calculations can be performed.

For example:

import tensorflow as tf
tf.compat.v1.disable_eager_execution()
# Plain Python numbers: tf.add builds a graph node, and printing y
# only shows the tensor, not its value.
a = 3
b = 4
c = 5
y = tf.add(a*b, c)
print(y)
# The same computation with explicit constant tensors.
a = tf.constant(3, tf.int32)
b = tf.constant(4, tf.int32)
c = tf.constant(5, tf.int32)
y = tf.add(a*b, c)
print(y)
# Only running the graph in a session produces the actual value 17.
session = tf.compat.v1.Session()
print(session.run(y))
session.close()

It can be seen that after the graph is created, data calculation is performed in the session and the final result is output.

The advantage of this design is that the most expensive part of machine learning is training on the data; with this design, the graph is already fixed by the time calculation starts, and the calculation itself is just a repeated iterative process.

Basic concept

Tensor

Tensor is the most important data structure in Tensorflow; tensors are what carry data through the computation graph. After creating a tensor, you need to assign it to a variable or a placeholder before it is added to the computation graph.
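
For example, a few ways to create tensors of different ranks (a minimal sketch; the shapes and values are arbitrary):

import tensorflow as tf
tf.compat.v1.disable_eager_execution()

scalar = tf.constant(3)                  # rank-0 tensor
vector = tf.constant([1.0, 2.0, 3.0])    # rank-1 tensor
matrix = tf.zeros([2, 3])                # rank-2 tensor filled with zeros
var = tf.Variable(matrix)                # tensor assigned to a variable
print(scalar.shape, vector.shape, matrix.shape)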

session

A session is the concrete executor of a computation graph in Tensorflow and is what actually interacts with the graph. A program can contain several graphs, and a session executes a particular graph. The main purpose of a session is to feed training data into the graph for calculation; it can also be used to modify the structure of the graph.

It is recommended to call the session with a with statement:

with session:
    session.run()
Variable

Variables represent the various computation parameters in the graph; a machine learning algorithm is optimized by adjusting the state of these variables. A variable is created with tf.Variable(), which takes a tensor as input and returns a variable. A variable must be initialized after it is declared before it can be used.

Example:

import tensorflow as tf
tf.compat.v1.disable_eager_execution()
tensor = tf.ones([1, 3])
test_var = tf.Variable(tensor)
# Initialize the variables; running test_var before this would raise an error.
init_op = tf.compat.v1.global_variables_initializer()
session = tf.compat.v1.Session()
with session:
    print("tensor is ", session.run(tensor))
    # print("test_var is ", session.run(test_var))  # would fail: not yet initialized
    session.run(init_op)
    print("after init, test_var is", session.run(test_var))

Placeholder

A placeholder describes the format of input and output data: it declares a slot for data and specifies the type and shape of the data that can be passed in. The actual data is supplied through the session's feed_dict parameter and is used for the calculation while the graph is running; it is discarded once the calculation is complete.

Example:

import tensorflow as tf
tf.compat.v1.disable_eager_execution()
# Placeholders declare the type of data that will be fed in at run time.
x = tf.compat.v1.placeholder(tf.int32)
y = tf.compat.v1.placeholder(tf.int32)
z = tf.add(x, y)
session = tf.compat.v1.Session()
with session:
    # The actual values are supplied through feed_dict.
    print(session.run([z], feed_dict={x: [1, 2], y: [2, 3]}))

Operation

An operation is a node in the graph; its inputs and outputs are tensors, and its job is to carry out some kind of computation. Common categories include (a few of them are demonstrated in the sketch after this list):

Mathematical operations: add, sub, mul, div, exp...

Array operations: concat, slice, split, rank...

Matrix operations: matmul, matrixinverse...

Neural network construction: softmax, sigmoid, relu...

Checkpoint operations: save, restore ...

Queue and synchronization: enqueue, dequeue, mutexacquire, mutexrelease ...

Tensor control flow: merge, switch, enter, leave ...
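
A small sketch combining a few of these operation types, namely a mathematical, a matrix and a neural-network operation (the input values are arbitrary):

import tensorflow as tf
tf.compat.v1.disable_eager_execution()

m1 = tf.constant([[1.0, 2.0], [3.0, 4.0]])
m2 = tf.constant([[1.0, 0.0], [0.0, 1.0]])

added = tf.add(m1, m2)           # mathematical operation
product = tf.matmul(m1, m2)      # matrix operation
activated = tf.nn.relu(m1 - 2)   # neural network construction

with tf.compat.v1.Session() as session:
    print(session.run([added, product, activated]))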

Queue

A queue is a stateful node in the graph. It provides two operations, enqueue and dequeue: enqueue returns an operation node in the computation graph, while dequeue returns a tensor value.

Tensorflow provides two kinds of queues:

1. FIFOQueue

A queue that dequeues elements in the order they were enqueued; use it when the training samples must be read in order. For example:

# A FIFO queue with capacity 10 that holds int32 values.
fifo_queue = tf.compat.v1.FIFOQueue(10, 'int32')
# enqueue_many returns an operation node that pushes all six values.
init = fifo_queue.enqueue_many(([1, 2, 3, 4, 5, 6], ))
with tf.compat.v1.Session() as session:
    session.run(init)
    queue_size = session.run(fifo_queue.size())
    for item in range(queue_size):
        # dequeue returns the elements in the order they were enqueued.
        print('fifo_queue', session.run(fifo_queue.dequeue()))

2. RandomShuffleQueue

A queue that dequeues elements in random order; use it when the order of the training samples does not matter. For example:

# min_after_dequeue=0 allows the queue to be drained completely.
rs_queue = tf.compat.v1.RandomShuffleQueue(capacity=5, min_after_dequeue=0, dtypes='int32')
init = rs_queue.enqueue_many(([1, 2, 3, 4, 5], ))
with tf.compat.v1.Session() as session:
    session.run(init)
    queue_size = session.run(rs_queue.size())
    for i in range(queue_size):
        # The elements come back in a random order.
        print('rs_queue', session.run(rs_queue.dequeue()))

