- brief introduction
What EZ share is: EZ share is a side project built by four fresh graduates at Alibaba. It serves WeChat users who secretly want to show off a little (I asked my colleagues in our department, but WeChat doesn't expose an external interface for this, so it can't be done through official channels). It matches a suitably cool picture to the user's mood, lays the user's text out sensibly, and composites it into the picture, which makes the result look that much cooler.
- system architecture
Because the functionality is relatively simple, the system is not complex either. It is mainly divided into two parts, PhoneGap and native Android/iOS, as shown in the figure below.
Besides providing the interface and interaction with the app, PhoneGap has one more important job: using canvas to process the pictures and composite the text into them. You might wonder why I did it this way. Native image processing does perform better, but then Android and iOS would each need their own implementation, and native image processing is more troublesome.
The native Android/iOS part is actually very simple: it integrates the WeChat SDK and handles sharing to WeChat.
For communication between PhoneGap and native, my approach is to set up a socket server on the native side and talk to it over a socket.
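The bridge described above can be sketched roughly like this: a minimal line-based socket server on the native (Java) side that acknowledges each command it receives. The class name, the ephemeral port, and the "share:"-style message format are my own illustrative assumptions, not the app's actual protocol.

```java
import java.io.*;
import java.net.ServerSocket;
import java.net.Socket;
import java.nio.charset.StandardCharsets;

// Minimal sketch of the native-side socket server: the PhoneGap side
// connects, sends one command per line, and reads one reply per line.
// The "ack:" protocol here is illustrative, not the app's real one.
public class NativeBridgeServer implements Runnable {
    private final ServerSocket serverSocket;

    public NativeBridgeServer(int port) throws IOException {
        this.serverSocket = new ServerSocket(port);
    }

    public int getPort() {
        return serverSocket.getLocalPort();
    }

    @Override
    public void run() {
        try (Socket client = serverSocket.accept();
             BufferedReader in = new BufferedReader(
                     new InputStreamReader(client.getInputStream(), StandardCharsets.UTF_8));
             Writer out = new OutputStreamWriter(client.getOutputStream(), StandardCharsets.UTF_8)) {
            String line;
            while ((line = in.readLine()) != null) {
                // A real implementation would dispatch here, e.g. kick off
                // the WeChat share when a "share:" command arrives.
                out.write("ack:" + line + "\n");
                out.flush();
            }
        } catch (IOException ignored) {
            // connection closed by the webview side
        }
    }
}
```

On Android this would run on a background thread while the PhoneGap webview connects from JavaScript (for example via a plugin or a local HTTP shim, since raw sockets are not directly available to page script).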
- Technical details of PhoneGap
3.1 PhoneGap debugging
I find debugging PhoneGap troublesome, so I use weinre for remote debugging. The principle is as follows:
Configuration steps (watch the details; I lost hours to a couple of them):
【1】 Install weinre with npm
【2】 Start the weinre server, bound to an address the device can reach
【3】 Inject weinre's target script into index.html (when testing in the emulator, take special care not to write localhost or 127.0.0.1, otherwise the request goes to the emulator itself)
【4】 Remember to add the access permission in config.xml
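Put together, the setup is a small config fragment like the one below. The host IP and port are placeholders you must replace with the machine running weinre (weinre's default HTTP port is 8080):

```shell
# install weinre globally via npm
npm install -g weinre

# start the debug server, bound to all interfaces so the
# device/emulator can reach it (not just localhost)
weinre --boundHost -all- --httpPort 8080
```

The script injected in step 【3】 is weinre's target script, e.g. `<script src="http://<host-ip>:8080/target/target-script-min.js"></script>`, and the step-【4】 permission in config.xml is something like `<access origin="*" />` (tighten the origin for anything beyond debugging).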
3.2 Processing pictures with canvas
As shown in the figure below, this part mainly uses three canvas layers:
【1】 Layer 1: renders the background of the picture.
【2】 Layer 2: renders a solid-color backdrop together with the text the user types. It listens for the input box's focus events; when focus is lost, the text is redrawn onto the layer-2 canvas.
【3】 Layer 3: this canvas is kept hidden. When the user confirms sending, the contents of the layer-1 and layer-2 canvases are rendered onto layer 3, and the result is exported as a picture and saved to the SD card.
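To make the layer-3 merge concrete, here is a minimal sketch of the same compositing idea using Java's BufferedImage. The app itself does this with canvas drawImage calls in JavaScript; this Java version is only an illustration, and the class and method names are my own.

```java
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;

public class LayerCompositor {
    // Stack the text layer (layer 2) on top of the background (layer 1)
    // onto a fresh "layer 3" image, the same way two canvas drawImage
    // calls would: transparent pixels in the text layer let the
    // background show through.
    public static BufferedImage merge(BufferedImage background, BufferedImage textLayer) {
        BufferedImage layer3 = new BufferedImage(
                background.getWidth(), background.getHeight(), BufferedImage.TYPE_INT_ARGB);
        Graphics2D g = layer3.createGraphics();
        g.drawImage(background, 0, 0, null);  // layer 1 first
        g.drawImage(textLayer, 0, 0, null);   // layer 2 over it
        g.dispose();
        return layer3;
    }
}
```

Because only opaque pixels of the text layer cover the background, layer 2 can be redrawn on every blur event without ever touching layer 1.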
- Technical details of native Android
In fact it just integrates the WeChat SDK: when it receives a message from PhoneGap, it passes it on to WeChat. For now I have only done the Android side, because I'm too poor to own an iOS device; I'll buy one when I start work after graduation ^_^ In any case there isn't much native code, so it should go quickly.
One complaint: integrating WeChat's Android SDK is a bit of a hassle, ha ha! Make sure the package name and signing certificate are correct before sending. I also found that some people manage to bypass the WeChat SDK and wake up WeChat's share activity directly; my guess is that they read the source code in the WeChat SDK package and tweaked it slightly.
Because our EZ team is still on school vacation and everyone is busy with graduation and travel, it will take a little while to launch. Ha ha, it will definitely be worth having!