Key takeaways! A full recap of the DataPipeline 2021 Data Management and Innovation Conference

Date: 2022-05-14

On July 29, the DataPipeline 2021 Data Management and Innovation Conference, themed "Wanxiang Gengxin" (everything renewed), was successfully held in Beijing. Wei Kai, deputy director of the Cloud Computing and Big Data Research Institute at the China Academy of Information and Communications Technology (CAICT); Wei Dong, director of the Architecture Management Division in the Information Technology Management Department of China CITIC Bank; Zhong Xing, technical expert in the Technology Big Data Management Department at Minsheng Bank's head office; Ni Juntian, technical expert in the core operations and maintenance group of the Shandong City Commercial Bank Alliance; Fan Kefeng, expert of the China Information Association; and Hu Zhengce, general manager of openGauss database products in Huawei's computing product line, joined DataPipeline's management team, experts and scholars in the field, user representatives and ecosystem partners to discuss the industry's opportunities and challenges and explore new models for data management and innovation practice, aiming to converge many sources of data, give data new life, and create new value for the industry.
DataPipeline noted that as digital transformation accelerates, industrial upgrading places unprecedented demands on real-time data management: all data must be interconnected and remain visible, perceivable and callable at any time. Building a real-time data management system requires both a long-cycle perspective of "building the real-time data platform early on, deepening the release of real-time data value in the medium term, and driving the business with real-time data over the long term", and the patience to continuously improve "people, process and technology", starting from the platform's infrastructure. This commitment to long-termism coincides with DataPipeline's philosophy of product and technology development.

01 Data as a production factor drives changes in technology and management

Full-lifecycle data management and value-oriented operations have become the direction of development

Wei Kai, deputy director of the Cloud Computing and Big Data Research Institute, CAICT

With the development of the information economy, data has long been woven, together with other factors, into the process of economic value creation. Wei Kai, deputy director of the Cloud Computing and Big Data Research Institute at CAICT, said that we are now in a new era marked by the rise of data as a factor of production. In the process of digital transformation, data as a factor of production plays an increasingly important role in driving industrial intelligence, giving rise to new forms of production organization and promoting the creation of new products and services; as a factor of distribution, data touches on changes in the economic structure. This places great demands on data technology and the underlying infrastructure, and therefore also demands innovation in management systems, concepts and methodology. This is mainly reflected in two aspects:

First, data as a production factor requires a technological revolution:
Organizations have moved from the earlier pursuit of efficient data processing, of ever faster and ever larger, to paying more attention to data itself, to intelligence and to good governance. Data no longer produces value simply by sitting in a database; it must be integrated with high quality and truly become a necessary element in a closed loop. In the past, data between organizations was mostly protected and isolated, locked in a safe and kept confidential. In the future, for the industrial Internet to work, data must flow upstream and downstream, between government and enterprises, and between enterprises, so the keywords have become openness and integration, giving rise to privacy-preserving computation and blockchain. We can see that data flows within organizations and organizational structures are changing, and the technical focus of organizations is changing with them. The technology behind big data products is becoming richer and more widely used, and one-stop, full-lifecycle data management is the direction we see.

Second, data as a production factor is also driving reform in management:
The data asset management system centered on data governance is gradually maturing, and a data asset management system guided by data operations is emerging. Data Management 2.0, the integration of data development and operations, shifts the perspective from management standards to serving the business. Data management increasingly exists for business innovation and development, which places higher requirements on platform automation, rapid response to needs, multi-team collaboration and operations-driven management. Under this trend, many enterprises have begun to practice DataOps. DataOps is a collaborative data management practice committed to improving the communication, integration and automation of data flows between data managers and data users within an organization. Its core concepts include agile development, closed-loop governance, security and trustworthiness, and continuous operations, and it is indeed the inevitable direction of data management.


Fan Kefeng, an expert of the China Information Association, discussed the changes in data management from the perspective of digital transformation, a view consistent with the above. On the path to digital transformation, product selection and organization are both important links. For the former, implementation approaches vary greatly, and enterprises should choose digital tools suited to their own stage of development and industry attributes. For the latter, the role of "people" must not be ignored: management support and employee cooperation are needed to improve the enterprise's overall digital capability.

Fan Kefeng, expert of China Information Association

02 DataPipeline: focusing on the "first kilometer" of the data value chain

DataOps is an important principle guiding the development of DataPipeline's products and services

According to Chen Cheng, founder and CEO of DataPipeline, "focusing on the first kilometer" best captures how DataPipeline provides the industry with the driving force for digital transformation and innovation. He pointed out that as enterprises transform digitally, more and more data applications are being built, data volumes have grown by leaps, data has become ever more complex, and requirements on data timeliness have become ever stronger on both the demand side and the supply side. Compared with the "last kilometer" of scenario applications that deliver value directly to customers, the "first kilometer" of data fusion is becoming more and more important and richer in meaning.

Chen Cheng, founder & CEO of DataPipeline

Across a large number of data management engagements aimed at improving collaboration between data science and operations teams, DataPipeline has gradually found that DataOps is an effective path across the gaps that data-driven business change must bridge, such as those between business and technology, data and scenarios, and scale and quality. DataOps emphasizes managing the data journey so that data flow is transformed into value flow, and this is also an important principle guiding the development of DataPipeline's products and services. The three most important points are:

First, embrace change. Technology scenarios are differentiating rapidly, producing many different types of storage and compute engines; excellent domestic databases are emerging under the broader Xinchuang (IT application innovation) trend; data structures iterate quickly under business pressure; and network environments keep changing. These changes make data scheduling and data movement extremely important capabilities. The "first kilometer" has become the key to laying a solid foundation for data management, and it is the original reason DataPipeline chose this field.

Second, organizational collaboration. Data management involves many roles, such as data scientists, data analysts, data engineers, big data platform operations staff and DBAs. These roles and the content they carry change frequently and quickly, and day-to-day collaboration between them is often lacking. DataPipeline helps organizations through four layers of product abstraction, system resource management, node management, link management and data task management, and provides each type of role with its own entry point and an easy-to-use, visual interface, forming clear working boundaries and good collaboration; a sketch of this layering follows below.
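As an illustration only, the sketch below uses hypothetical Python classes and field names to picture the four abstraction layers named above and how different roles work at different layers; it is not DataPipeline's actual API.

```python
# Hypothetical model of the layers: nodes, links and data tasks (system
# resources are implicit). Names are illustrative, not DataPipeline's API.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class Node:
    """A registered data endpoint, e.g. a source or destination database."""
    name: str
    kind: str          # e.g. "mysql", "oracle", "kafka"
    owner_role: str    # e.g. "DBA", "platform ops"

@dataclass
class Link:
    """A managed connection between two nodes; policies attach here."""
    source: Node
    destination: Node
    policies: Dict[str, str] = field(default_factory=dict)

@dataclass
class DataTask:
    """A concrete synchronization job running over a link."""
    link: Link
    tables: List[str]
    schedule: str = "realtime"

# Each role works at its own layer: a DBA registers nodes, a data engineer
# defines links and policies, analysts consume the output of tasks.
orders_db = Node("orders-db", "mysql", owner_role="DBA")
ods_topic = Node("ods-topic", "kafka", owner_role="platform ops")
link = Link(orders_db, ods_topic, policies={"on_schema_change": "pause_and_alert"})
task = DataTask(link, tables=["orders", "payments"])
```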

Third, automation and agility. The underlying logic of this principle is that low-code and no-code beat hand development, and configuring policies beats adjusting code. DataPipeline provides dozens of policy configurations at each level of abstraction, detailed enough to cover write conflicts in sink components, changes in data structure, and whether an error queue is used. When customers respond to new changes, the cycle can shrink from months to hours or even minutes, which significantly improves efficiency. All of this stems from DataPipeline's analysis and understanding of customer scenarios over the past five years, and from the resulting breakdown of technical functions; a sketch of such declarative policies follows below.
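A minimal sketch, again with hypothetical field names, of what such declarative, policy-driven configuration could look like: behaviour changes by editing strategies (write-conflict handling, schema-change handling, error queue, rate limits) rather than by rewriting integration code.

```python
# Hypothetical link-level policies; the keys are illustrative only.
link_policies = {
    "write_conflict": "last_write_wins",      # alternatives: "skip", "error_queue"
    "on_schema_change": {
        "added_column": "auto_extend_destination",
        "dropped_column": "pause_and_alert",
    },
    "error_queue": {"enabled": True, "max_retries": 3},
    "rate_limit_rows_per_sec": 50_000,
}

def effective_policies(link_policies, task_overrides):
    """Task-level settings override the link-level defaults."""
    return {**link_policies, **task_overrides}

# A single task throttled harder than the link default, with no code changes.
print(effective_policies(link_policies, {"rate_limit_rows_per_sec": 10_000}))
```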


03 User-centered: focusing on scenario practice to create new value for the industry

Rooted in the financial industry, with its high standards for security, stability and performance

DataPipeline insists on putting users at the center, finding where business scenarios meet real-time data management technology, putting scenario-based, innovative solutions into practice, and accelerating the industry's digital and intelligent upgrade.


DataPipeline helped Minsheng Bank build a real-time data management platform that meets its needs for accurate real-time synchronization of heterogeneous data, system stability, ease of use and security, and realizes enterprise-level real-time data collection, synchronization and integration for the bank. Standardized data development and data transmission based on DataPipeline have reduced Minsheng Bank's development costs, accelerated the release of real-time data value, and laid a solid foundation for the bank's data middle platform strategy.

Zhong Xing, technical expert in the Technology Big Data Management Department, Minsheng Bank head office

Zhong Xing, a technical expert in the Technology Big Data Management Department at Minsheng Bank's head office, said that for real-time data preprocessing and application-layer data synchronization, the bank chose DataPipeline as a partner to jointly implement its real-time data synchronization pipeline components, for two main reasons. First, the financial industry has entered a period of rapid infrastructure iteration, and Minsheng Bank is actively evaluating and introducing various open-source and commercial basic components to meet its data needs. DataPipeline is a company focused on enterprise-level heterogeneous data fusion solutions; it can continuously track industry changes in computing resources, operating systems, databases, middleware and other areas, and continuously support its partners' needs. Second, the functionality and performance of the DataPipeline enterprise real-time data fusion platform meet the bank's current needs in real-time data preprocessing and synchronization well. Besides supporting a rich set of data sources, the product is soundly designed in task resource control, status monitoring, and exception handling and recovery, and is easy to integrate with the bank's existing data management and centralized monitoring systems. Compared with a self-developed solution based on open-source components, building on DataPipeline's product accelerates project delivery and reduces costs.

Architecture of the real-time data management platform DataPipeline helped Minsheng Bank build

Ni Juntian, a technical expert in the core operations and maintenance group of the Shandong City Commercial Bank Alliance, said in his speech that the quasi-real-time database data collection system DataPipeline helped the alliance build is of great significance for advancing its digital transformation, data standardization and intensive management, empowering enterprise operations and strengthening its lasting core competitiveness. DataPipeline can collect data in real time at second-level latency; the product offers a unified, easy-to-use and user-friendly interface, and its rich configuration strategies allow resources to be used efficiently and fully. The product is also open and scalable while remaining standardized, compliant and forward-looking. Most important of all are its financial-grade stability and high fault tolerance.

Ni Juntian, technical expert in the core operations and maintenance group, Shandong City Commercial Bank Alliance

04 Upgrading data technology to strengthen the drivers of enterprise digital transformation

User demand drives the continuous evolution of technology and builds the infrastructure for digital innovation

Technology has become an important driving force of industrial change, and the "new energy" of the intelligent industry will increasingly come from the deeper integration of technology and the economy. Chen Su, partner & CTO of DataPipeline, said that the evolution of DataPipeline's technical architecture has essentially been a user-demand-driven process: the architecture we see in version 3.0 today is the result of years of continuous refinement and evolution in customers' real environments.

Chen Su, partner & CTO of DataPipeline

Looking back at the technology development of the past 40 years, DataPipeline believes that technical architectures continuously learn from and merge with one another, yet a diversity of data architectures blooming and coexisting over the long term is an inevitable trend. Building enterprise data infrastructure comes down to three things: selection, combination and connection, and connection is the key to making combination work. However, building a stable, high-performance real-time data pipeline that can adapt to the dynamic changes of a growing business requires costly investment in time, money and manpower, and is full of challenges to put into practice. DataPipeline hopes to offer a productized solution that handles these complex concerns for users in an integrated way. This is the basic rationale behind DataPipeline's choices, based on its judgment of technology trends and the commercial value of the product.


In May this year, DataPipeline officially released version 3.0 of its real-time data fusion platform, a milestone release and a key step in the product's evolution from a tool into an enterprise-level data fusion platform. Version 3.0 makes a leap in technical maturity, mainly in high performance, high availability and manageability.

First, high performance. In version 3.0, DataPipeline introduces a memory-based data exchange method that avoids the performance degradation caused by growth in the number of message partitions; end-to-end processing based on this mode is more than twice as fast as in version 2.0. A sketch of the idea follows below.
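The following Python sketch illustrates the general idea of memory-based exchange, not DataPipeline's implementation: the reader and writer share a bounded in-process queue instead of staging every record in an external, partitioned message queue, which removes the extra hop and the dependence on partition count.

```python
# Illustrative only: an in-process, bounded queue between a source reader
# and a destination writer, standing in for memory-based data exchange.
import queue
import threading

buffer = queue.Queue(maxsize=10_000)   # bounded, so it applies back-pressure
SENTINEL = object()

def reader():
    """Stand-in for a CDC/source reader producing change records."""
    for i in range(100_000):
        buffer.put({"id": i, "op": "insert"})
    buffer.put(SENTINEL)

def writer():
    """Stand-in for a destination writer applying records in batches."""
    batch = []
    while True:
        record = buffer.get()
        if record is SENTINEL:
            break
        batch.append(record)
        if len(batch) >= 1_000:
            batch.clear()              # flush the batch to the destination here

threads = [threading.Thread(target=reader), threading.Thread(target=writer)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```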

Second, high availability. Since version 1.0, the underlying runtime environment of DataPipeline has supported high availability. In version 3.0, DataPipeline further makes every platform component of the product highly available, and users can flexibly deploy component nodes according to their availability requirements to avoid single points of failure.

Third, manageability. To meet the needs of tiered management in the enterprise, resources in the system are abstracted into nodes, links and tasks, and each layer can be managed and authorized independently. Users can define field type mappings, rate limits, alarms and other strategies on a link and apply them at the task level, achieving fine-grained, tiered management. At the same time, all important events and alarms inside DataPipeline can be pushed to a user-defined mailbox, file path or webhook, integrating seamlessly with the enterprise's existing operations monitoring system; a small sketch of such a webhook push follows below. DataPipeline 3.0 also ships with a four-level monitoring system covering containers, applications, threads and business, for all-round improvements to operations and maintenance.
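As a small sketch of the webhook integration mentioned above (payload fields and URL are hypothetical), an alarm event can be POSTed as JSON to a user-defined endpoint so it lands in the enterprise's existing monitoring stack.

```python
# Illustrative webhook push using only the Python standard library.
import json
import urllib.request

def push_alert(webhook_url: str, event: dict, timeout: float = 5.0) -> int:
    """POST an alert event as JSON and return the HTTP status code."""
    req = urllib.request.Request(
        webhook_url,
        data=json.dumps(event).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        return resp.status

# Example alarm: a task paused because the source schema changed.
alert = {
    "level": "WARNING",
    "layer": "task",          # container / application / thread / business
    "task": "orders-sync",
    "reason": "source schema changed; task paused per policy",
}
# push_alert("https://ops.example.com/hooks/datapipeline", alert)
```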

05 Working together: symbiosis and win-win with every role in the industry

Heading for the vast frontier of real-time data management

Hu Zhengce, general manager of openGauss database products, Huawei computing product line

Hu Zhengce, general manager of openGauss database products in Huawei's computing product line, spoke highly of the past cooperation between Huawei openGauss and DataPipeline, and introduced the openGauss enterprise open-source database's ecosystem philosophy of "co-construction, sharing and co-governance" and its achievements. Recently, DataPipeline completed compatibility testing with GaussDB (for openGauss), signed the CLA (Contributor License Agreement) and officially joined the openGauss community. Going forward, DataPipeline will leverage its independent innovation capabilities and product strengths, work with Huawei Cloud to create a 1 + 1 > 2 effect, and contribute to the development of data management in finance and other industries.


At the conference's round-table forum, the participating experts discussed topics such as "opportunities and challenges for the development of the Xinchuang (IT application innovation) industry" and "the financial industry's demands on core data systems". Wei Dong, director of the Architecture Management Division in the Information Technology Management Department of China CITIC Bank, said that Xinchuang is the cornerstone of the safe and healthy development of China's digital economy: the industry is accelerating toward ease of use and broad adoption, and its ecosystem has begun to take shape. Facing the rapid development of the digital economy, Xinchuang will bring immeasurable value to thousands of industries. Technical staff should keep an open mind and learn from the best worldwide. More importantly, while demand is pulled by the market, the IT industry represented by basic software should also keep deepening supply-side structural reform. Product and technical experts should be able to capture requirements from rich scenarios and proactively propose new functions and features at the level of product creation, which is very important for vendors; it also calls for relatively strong strategic research capabilities, including high-level capabilities such as working jointly with users and co-creating with them.

Round table dialogue

At the round-table forum, partners from across the industry also shared their insights and expectations for the data management industry. With on-site guests leaving messages for DataPipeline such as "potential, innovation, focus, integration and co-construction, a prosperous ecosystem and differentiated highlights", the DataPipeline 2021 Data Management and Innovation Conference came to a successful close.

Building a real-time data management platform is just the beginning; the development of the data management industry is a long road. Taking "persisting in technology-driven enterprise services" as its founding aspiration and anchor, DataPipeline will work to open up the end-to-end capabilities of real-time data integration, service and quality, build a full-link, real-time data asset management business system, grow and win together with its partners, provide industry users worldwide with stable, efficient, secure, reliable, open and compatible infrastructure for digital innovation, and accelerate enterprise business innovation, transformation and upgrading.


Click here to learn more about DataPipeline and try it for free