Quickly understand microservice applications in the cloud native era (benefits included)

Time: 2020-09-16

[Abstract] The areas affected by cloud native applications are gradually expanding from the Internet industry to other industries, and applications are moving from traditional architectures to cloud native ones. Today, the maturity of cloud native technology deeply affects how individuals, enterprises, and even society as a whole produce and live.

“Software of the future must grow on the cloud”

Applications in the cloud native era

In the cloud native era, with the continuous innovation and rapid development of container technology, the microservice architecture, and product R&D and operations models, the barrier to application design and development has dropped to a historic low. According to a survey by IDC, a well-known international market research firm, more than 500 million applications will be created between 2018 and 2023, as many as were created in the previous 40 years.

In addition, IDC's report "IDC FutureScape: Worldwide Cloud Computing 2020 Predictions - China Implications", released in February 2020, shows that the areas affected by cloud native applications are gradually expanding from the Internet industry to other industries, and applications are moving from traditional architectures to cloud native ones. Today, the maturity of cloud native technology deeply affects how individuals, enterprises, and even society as a whole produce and live.

The following are several predictions that are strongly related to cloud native applications:

  • Distributed cloud: by 2021, more than 90% of Chinese enterprises will rely on a combination of on-premises/dedicated private clouds, multiple public clouds, and legacy platforms to meet their infrastructure needs.
  • API ecosystem: by 2023, 90% of new digital services will build composite applications using services provided by public-cloud and internal APIs; half of them will utilize artificial intelligence (AI) and machine learning (ML).
  • Multi-cloud management: by 2022, 50% of enterprises will deploy unified VMs, Kubernetes, and multi-cloud management processes and tools to support multi-cloud management and governance across on-premises and public-cloud deployments.
  • Cloud stack expansion: by 2024, 10% of enterprises' internal workloads will be supported by public cloud stacks located outside the public cloud provider's data centers, in customer data centers and at the edge.
  • Hyper-agile apps: by 2023, 50% of Chinese enterprise applications will be deployed in containerized hybrid-cloud/multi-cloud environments that provide an agile, seamless deployment and management experience.

In this transformation, more and more application owners will hand over the application infrastructure to more professional public-cloud or hybrid-cloud service providers, manage the infrastructure through APIs, and let the providers deliver agile, seamless deployment and management capabilities. In this way, application owners can focus their investment and manpower on the design, development, operations, and experience optimization of the application itself, greatly shortening time to market and gaining better scalability, thereby maximizing the return on investment (ROI) of application development.

Building cloud native applications with a microservice architecture

There are many versions of the definition of a cloud native application. Pivotal first proposed a definition in 2015; the CNCF also defined cloud native in 2015 and revised its definition in 2018. For the specific definitions, please refer to the Kubernetes Handbook. It is clear that ever since the cloud native concept emerged, the microservice architecture has been part of cloud native applications.

In this section, we will look at the usage scenarios of the microservice architecture, the position of microservice applications in the overall application technology stack, and what is needed to develop a microservice.

Scenarios for using microservices

To build cloud native applications, enterprises and individuals first want to free their time and energy from developing and maintaining complex underlying dependencies, so they can focus on designing and implementing business scenarios and develop each module of the application independently, in a decoupled way. This means the developers of a single module or a single business can make full use of the DevOps toolchain provided by the cloud vendor to work toward the common goal of developing and operating the application. In this way, the application can be released and updated quickly as a set of loosely coupled services, reducing costs and avoiding single points of failure.

The position of microservice applications in the technology stack

Assuming the application owner has completed the business design of the microservices, let us look at the position of microservice applications in product R&D and operations at the implementation stage:

[Figure: the position of microservice applications in the product R&D and operations technology stack]

Red part: The core module of a microservice application is the runtime microservice application developed and maintained by the application owner. As the business grows and system load increases, microservices need to be governed in order to improve their availability, reliability, and resilience. Common governance measures include load balancing, circuit breaking, rate limiting, degradation, fault tolerance, and isolation; due to space constraints, they will not be discussed in detail here.
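To make one of these governance measures concrete, here is a minimal, illustrative sketch of the circuit-breaking pattern mentioned above. The class name and parameters are invented for this example and are not ServiceComb's actual API.

```python
import time

class CircuitBreaker:
    """Minimal circuit breaker: opens after `max_failures` consecutive
    failures, rejects calls while open, and allows one trial call once
    `reset_timeout` seconds have elapsed (the "half-open" state)."""

    def __init__(self, max_failures=3, reset_timeout=30.0):
        self.max_failures = max_failures
        self.reset_timeout = reset_timeout
        self.failures = 0
        self.opened_at = None  # None means the circuit is closed

    def call(self, func, *args, **kwargs):
        if self.opened_at is not None:
            if time.monotonic() - self.opened_at < self.reset_timeout:
                # Fail fast instead of hammering an unhealthy dependency.
                raise RuntimeError("circuit open: call rejected")
            # Timeout elapsed: half-open, allow one trial call through.
            self.opened_at = None
        try:
            result = func(*args, **kwargs)
        except Exception:
            self.failures += 1
            if self.failures >= self.max_failures:
                self.opened_at = time.monotonic()
            raise
        self.failures = 0  # any success closes the circuit again
        return result
```

Production frameworks add per-instance statistics, sliding windows, and fallback handlers on top of this basic open/half-open/closed state machine.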

Yellow part: From left to right, this represents the technology stack from Dev to Ops. First, application developers select the microservice development framework (chassis) that the application depends on according to the application type. With the framework, cross-cutting concerns faced by the microservice at runtime, such as logging (log4j/logback), health checks, metrics, and distributed tracing, can be handled by adding annotations. Next, after coding is complete, the DevOps toolchain provided by the cloud service provider can be used for code hosting, compilation and build, and release and deployment, placing the microservices into the runtime environment. Finally, the operations capabilities provided by the cloud service provider can be used to monitor and maintain the microservices. Generally, the application platform capabilities offered by cloud service providers are independent and decoupled, so application owners can pick and customize the services they need according to their requirements and budget.
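The idea of handling a cross-cutting concern "by adding annotations" can be sketched with a Python decorator, which plays a similar role to a Java framework annotation: the metrics concern stays out of the business logic. The names (`timed`, `get_order`) are illustrative, not part of any real framework.

```python
import functools
import time

def timed(metrics):
    """Decorator that records call count and total latency for a
    handler; the business function itself stays metrics-free."""
    def wrap(func):
        @functools.wraps(func)
        def inner(*args, **kwargs):
            start = time.perf_counter()
            try:
                return func(*args, **kwargs)
            finally:
                stats = metrics.setdefault(
                    func.__name__, {"calls": 0, "seconds": 0.0})
                stats["calls"] += 1
                stats["seconds"] += time.perf_counter() - start
        return inner
    return wrap

metrics = {}

@timed(metrics)
def get_order(order_id):
    # Pure business logic; latency accounting happens in the decorator.
    return {"id": order_id, "status": "shipped"}
```

A framework chassis applies the same separation to health checks, tracing, and logging, so each handler declares the concerns it needs instead of implementing them.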

Purple part: This is the runtime technology stack, and the blue arrows represent the flow of traffic. After the microservices are deployed and running, traffic from various clients enters through an entry point (such as a service gateway or ELB), which distributes requests to the corresponding business microservices according to request characteristics; the requests then go through a series of processing steps and the results are returned. Running microservices also depend on a variety of middleware, such as caches and message queues. Some microservice capabilities, such as the service mesh and service registration and discovery, are likewise provided by the framework or by cloud service vendors. Microservices and middleware are upper-layer services deployed on infrastructure such as virtual machines, containers, or CCI instances.
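Service registration and discovery, mentioned above, can be illustrated with a toy in-memory registry: instances register under a service name, and a client discovers one instance at random, which doubles as a crude client-side load balancer. This sketch is purely for illustration; real registries (such as ServiceComb's service center) add heartbeats, health checks, and clustering.

```python
import random

class ServiceRegistry:
    """Toy in-memory service registry for illustration only."""

    def __init__(self):
        self._instances = {}  # service name -> list of "host:port" strings

    def register(self, service, address):
        """Called by a service instance when it starts up."""
        self._instances.setdefault(service, []).append(address)

    def deregister(self, service, address):
        """Called on graceful shutdown (or by a failed health check)."""
        self._instances.get(service, []).remove(address)

    def discover(self, service):
        """Return one live instance address, chosen at random."""
        instances = self._instances.get(service)
        if not instances:
            raise LookupError(f"no instance of {service!r} available")
        return random.choice(instances)
```

A gateway or framework client would call `discover()` per request (or cache the instance list), so callers never hard-code peer addresses.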

To sum up, putting an application into production actually involves many technologies and scenarios. Developing an application with a microservice architecture lets the application owner simplify the management and operation of the underlying facilities and middleware as much as possible. By adopting the cloud service provider's application platform capabilities across all scenarios, end to end, resources can be focused on business innovation and delivery (the red box in the figure).

Putting it all into practice

Finally, the Huawei Cloud CSE microservice platform provides a configuration center and a service center based on the open-source ServiceComb framework, offering dynamic configuration and a reliable service registry. The ServiceComb service center has been through large-scale production practice inside Huawei (it supports the operation of Huawei Vmall), sustaining service clusters at hundreds of thousands of TPS, and its reliability has been fully verified.

Developers can enjoy out-of-the-box microservice middleware on the cloud. Through the CSE microservice platform, they can use it both for learning and for production, and keep up with the latest technology trends while learning microservices.

How to join the Huawei Cloud giveaway

For just one yuan, experience a subscription (package-period) microservice engine (100 instances) originally priced at 500 yuan. Click here to try it!

Click Follow to keep up with Huawei Cloud's latest technologies~
