Tsinghua open-sources Jittor, a deep learning framework with innovative meta-operators and a unified computation graph, boosting inference speed by 10%–50%

Time: 2020-05-22


Recently, the Graphics Lab of the Department of Computer Science at Tsinghua University announced a new open-source deep learning framework, Jittor, the first open-source deep learning framework developed by a Chinese university.

According to the official website, Jittor is a just-in-time (JIT) compiled deep learning framework that internally uses innovative meta-operators and a unified computation graph.

According to the official feature comparison, Jittor offers a number of advanced capabilities relative to mainstream international platforms. At the same convergence accuracy, it achieves a 10%–50% performance improvement over frameworks of the same type.

Two core innovations: meta-operators and a unified computation graph


According to the official website, Jittor follows three main design principles:

  1. Easy to use and customizable: new operators and models can be defined in just a few lines of code.
  2. Separation of implementation and optimization: the front-end interface focuses on the implementation, which the back-end optimizes automatically.
  3. Everything is just-in-time: all of Jittor's code, including Jittor itself, is compiled and run on the fly, so users can modify any part of Jittor at any time and run it dynamically (see the minimal sketch after this list).
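To make the third point concrete, here is a minimal sketch (assuming the `jittor` package is installed and importable as `jt`): operations build graph nodes lazily, and compilation to native code happens on the fly the first time a result is fetched.

```python
import jittor as jt

x = jt.random((3, 3))      # a 3x3 variable with random values
y = (x * 2 + 1).sum()      # element-wise ops and a reduction build graph nodes lazily
print(y.data)              # fetching the result triggers JIT compilation and execution
```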

These ideas are realized through Jittor's two core innovations: meta-operators and a unified computation graph.


Readers familiar with deep learning will know that the neural networks used in deep learning are computation networks composed of operators. Due to architectural design and continuous expansion, current deep learning frameworks contain as many as 2,000 kinds of operators, which makes the systems complex and hard to optimize and port.

Jittor decomposes operator computations further into a closed set of roughly 20 meta-operators in three categories. At present, the common operators of neural networks can all be expressed as combinations of these meta-operators, as sketched below.
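As an illustration of meta-operator composition, the sketch below expresses matrix multiplication with broadcast (a reindex-type meta-operator), an element-wise multiply, and sum (a reduce-type meta-operator). It is a minimal sketch based on Jittor's public broadcast/sum interface, not a claim about the framework's internal implementation.

```python
import jittor as jt

def matmul(a, b):
    # a: (n, m), b: (m, k) -> output: (n, k)
    n, m = a.shape
    k = b.shape[-1]
    a = a.broadcast([n, m, k], dims=[2])   # reindex-type meta-operator
    b = b.broadcast([n, m, k], dims=[0])   # reindex-type meta-operator
    return (a * b).sum(dim=1)              # element-wise multiply, then reduce

x = jt.random((4, 5))
w = jt.random((5, 3))
print(matmul(x, w).shape)                  # [4, 3]
```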

On the other hand, to keep pace with where deep learning frameworks are heading, Jittor exploits the expressiveness of meta-operator composition, proposes a unified computation graph for optimization, and designs a new dynamic compilation architecture from the ground up.

This architecture supports multiple compilers, enables just-in-time compilation and dynamic execution of all code, keeps implementation separate from optimization, and greatly improves the flexibility, scalability, and portability of application development.

In addition, the team made the meta-operators closed under backpropagation: the backward pass of a meta-operator is itself composed of meta-operators. This avoids duplicated development and also makes it possible to compute derivatives of any order.
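As a hedged illustration of higher-order differentiation, the sketch below differentiates a gradient a second time using jt.grad, assuming Jittor's public jt.grad(y, x) interface keeps the graph needed for another backward pass.

```python
import jittor as jt

x = jt.array([3.0])
y = x * x * x              # y = x^3
dy = jt.grad(y, x)         # first derivative: 3*x^2 = 27
d2y = jt.grad(dy, x)       # second derivative: 6*x = 18 (the backward pass is meta-operators too)
print(dy.data, d2y.data)
```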


According to the official comparison of computation-graph features between Jittor and other platforms, Jittor offers a number of advanced capabilities relative to mainstream international frameworks.


At the same convergence accuracy, Jittor achieves a 10%–50% performance improvement over frameworks of the same type.

In terms of programming languages, Jittor uses Python as its front-end language, with a modular design similar to PyTorch and Keras. Users write meta-operator computations in Python, and Jittor dynamically compiles them to C++ for better performance.

The back end is written directly in high-performance languages such as CUDA and C++.
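To show what the PyTorch/Keras-like modular front-end looks like, here is a minimal sketch of a two-layer network, assuming Jittor's jt.Module and jittor.nn interface (Jittor modules define execute() where PyTorch uses forward()); the Python-level operators are then JIT-compiled to C++/CUDA by the back end.

```python
import jittor as jt
from jittor import nn

class MLP(jt.Module):
    def __init__(self):
        self.fc1 = nn.Linear(784, 128)
        self.relu = nn.Relu()
        self.fc2 = nn.Linear(128, 10)

    def execute(self, x):          # Jittor's counterpart to PyTorch's forward()
        return self.fc2(self.relu(self.fc1(x)))

model = MLP()
batch = jt.random((32, 784))       # a dummy input batch
print(model(batch).shape)          # [32, 10]
```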

A look at deep learning frameworks developed in China


Tsinghua's open-source Jittor has drawn wide attention because it is the first open-source deep learning framework from a Chinese university research institution. Previously, the only university-born open-source deep learning frameworks were Theano from the University of Montreal in Canada and Caffe from UC Berkeley.

As the largest market for the development and application of the AI industry, China should have a place across the entire AI industry chain and ecosystem. Let's take a look at the major domestically developed deep learning frameworks; if anything is missing, feel free to add it in the comments:

1. Tsinghua Jittor (计图)

According to core members of the Jittor development team, Jittor's basic functionality was completed by the end of 2019; after internal testing, it was recently officially released and open-sourced.

The Chinese name 计图 (Jìtú) roughly means "to plan and devise." The term first appears in the Tang-dynasty text "Reply to Shi Pu of Xuzhou" by Cui Zhiyuan, in a line about generals in the city planning whether to stay or go.

Although the team has not officially explained the choice of the Chinese name, according to the developers, deep learning is evolving rapidly: established mainstream frameworks such as TensorFlow and PyTorch do not always perform well on new models, algorithms, and hardware, so a new framework that is both easy to extend and efficient is needed.

Jittor official website: https://cg.cs.tsinghua.edu.cn…
GitHub address: https://github.com/Jittor/jittor

2. Tencent Youtu ncnn

ncnn was the first open-source project from Tencent's Youtu Lab, officially open-sourced in July 2017.

It is a high-performance neural-network inference (forward computation) framework optimized for mobile devices. Mobile deployment was a core consideration from the start of its design: ncnn has no third-party dependencies, is cross-platform, and runs faster on mobile CPUs than all other known open-source frameworks. With ncnn, developers can easily port deep learning algorithms to mobile devices for efficient execution and build AI apps.

ncnn is already used in many Tencent applications, such as QQ, Qzone, WeChat, and Pitu (天天P图).

GitHub address: https://github.com/Tencent/ncnn

3. Baidu PaddlePaddle

PaddlePaddle is China's first open-source deep learning platform; Baidu began investing in its R&D in 2013 and officially open-sourced it at the end of August 2016.

PaddlePaddle is a comprehensive open-source platform that includes a core framework for development, training, and deployment, as well as a very rich model library. Building on this model library, PaddlePaddle covers many classic application scenarios that developers can use directly or extend with secondary development. It also provides end-to-end development kits focused on common AI tasks and scenarios.

On top of the development kits sits a full set of tool components that help developers solve a wider range of problems in AI applications. PaddlePaddle also provides a variety of deployment toolchains for developers to deploy their own applications.

GitHub address: https://github.com/PaddlePaddle

4. Alibaba X-Deep Learning (XDL)

X-Deep Learning (hereinafter XDL) was independently developed by Alimama, Alibaba's big-data marketing platform, based on its own advertising business, and has been widely deployed in core production scenarios.

XDL adopts a "bridging" architecture design that lets it connect seamlessly with the open-source community. For example, users can easily run state-of-the-art open-source deep learning algorithms built on TensorFlow or PyTorch within the XDL framework. Enterprises or individuals already using other open-source frameworks can likewise extend their existing systems and benefit from the strong distributed capability XDL provides for high-dimensional sparse data scenarios.

GitHub address: https://github.com/alibaba/x-…

Deep learning is widely used across fields of artificial intelligence such as computer vision, machine translation, natural language processing, and intelligent robotics, where it has enabled unprecedented breakthroughs.

At present, on the one hand, as new deep learning techniques emerge and task complexity grows, architectures that remain efficient while being easy to extend are the clear trend; on the other hand, with the rapid development of China's AI industry, building a homegrown open-source deep learning ecosystem is both a necessity and a great opportunity.

Although open-source frameworks represented by TensorFlow and PyTorch have achieved considerable success, China started late in this field and is currently at some disadvantage. Even so, looking at open-source frameworks alone, there is still plenty of room for improvement, both in low-level core technology and in solutions for specific application scenarios.

It is hoped that domestic technology companies and research institutions can catch up and no longer be "choked" on core technologies.
