How do programmers create a Tanabata atmosphere?

Time: 2022-4-22

In the blink of an eye, the annual Qixi Festival (Tanabata) has come around again.

At this time of year, gift-giving guides are everywhere and choice paralysis sets in; even people who are good at romance can be stumped by picking a gift!
If you ask me, for a romantic festival like Tanabata the most important thing is to create an atmosphere! So today I'd like to introduce a small, atmosphere-filled Tanabata gift for programmers: with the help of the Huawei Image Kit image service, we will develop a love-themed Tanabata animated image. It plays a special "Tanabata" effect when a specific keyword is entered, and the picture also responds dynamically to fingertip touches... See the effect first ↓↓↓

Demo effect

If you have a partner, show it to them. Single friends can join the Tanabata atmosphere group first; keep the method handy, and maybe you will use it next time.

Enough talk; let's get started!

Development steps

1. Keyword-triggered animation playback

Step 1: material preparation

First, find a suitable picture. Here we chose a picture of the Cowherd and the Weaver Girl:

Then extract the parts to be animated from the picture. Here we extracted four elements: the cloud, the Cowherd, the Weaver Girl, and the red heart.

Step 2: integration preparation

First complete the developer registration, application creation and signature configuration according to the following instructions:
https://developer.huawei.com/consumer/cn/doc/development/Media-Guides/config-agc-0000001050199019?ha_source=hms1

Then configure the Maven repository and build dependencies as follows:

  1. Configure the project-level build.gradle file:
buildscript {
    repositories {
        google()
        jcenter()
        // Configure the Maven repository address for the HMS Core SDK.
        maven { url 'https://developer.huawei.com/repo/' }
    }
    dependencies {
        ...
        // Add the AppGallery Connect plugin (AGCP) dependency.
        classpath 'com.huawei.agconnect:agcp:1.4.2.300'
    }
}

allprojects {
    repositories {
        google()
        jcenter()
        // Configure the Maven repository address for the HMS Core SDK.
        maven { url 'https://developer.huawei.com/repo/' }
    }
}
  2. Configure the build dependencies in the app-level build.gradle file (the latest version at the time of writing is 1.0.3.301):
dependencies {
    implementation 'com.huawei.hms:image-render:1.0.3.301'
    implementation 'com.huawei.hms:image-render-fallback:1.0.3.301'
}
  3. Configure permissions

Declare the permissions required by the app in the AndroidManifest.xml file.
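The original does not show the declaration itself; as a minimal sketch, the storage permission requested later in the code would be declared like this (any other permissions listed in the official guide are added the same way):

<!-- AndroidManifest.xml: needed so the demo can save the recorded MP4/GIF files -->
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />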

Step 3: function development

  1. Interface design

Here we use the simplest possible interface: a text input box and a button placed over a FrameLayout. A sketch of such a layout is shown below.

We will attach and display the rendered animation in this FrameLayout.
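The layout file itself is not included in the original, so the following is only a rough sketch. The textinput and enter ids match the findViewById() calls used later; the FrameLayout id (content) and the exact attributes are assumptions:

<!-- Activity layout (sketch): a FrameLayout that hosts the rendered view, plus an input box and a button -->
<FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:id="@+id/content"
    android:layout_width="match_parent"
    android:layout_height="match_parent">

    <EditText
        android:id="@+id/textinput"
        android:layout_width="match_parent"
        android:layout_height="wrap_content"
        android:layout_gravity="top"
        android:hint="Enter a keyword" />

    <Button
        android:id="@+id/enter"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:layout_gravity="bottom|center_horizontal"
        android:text="OK" />
</FrameLayout>

The rendered animation view is added to this FrameLayout programmatically in the initialization step.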

  2. Configure the storage permission request

In the onCreate() method of MainActivity, check whether the app already has permission to write to storage. If not, call requestPermissions() to apply for the WRITE_EXTERNAL_STORAGE permission:

int permissionCheck = ContextCompat.checkSelfPermission(ImageKitRenderDemoActivity.this, Manifest.permission.WRITE_EXTERNAL_STORAGE);
if (permissionCheck == PackageManager.PERMISSION_GRANTED) {
    initData();
    initImageRender();
} else {
    ActivityCompat.requestPermissions(ImageKitRenderDemoActivity.this, new String[]{Manifest.permission.WRITE_EXTERNAL_STORAGE}, PERMISSION_REQUEST_CODE);
}

If the permission is already granted, or once the permission request succeeds, initialize the image rendering module:

@Override
public void onRequestPermissionsResult(int requestCode, @NonNull String[] permissions, @NonNull int[] grantResults) {
    if (requestCode == PERMISSION_REQUEST_CODE) {
        if (grantResults.length > 0
                && grantResults[0] == PackageManager.PERMISSION_GRANTED) {
            // The permission is granted.
            initData();
            initImageRender();
        } else {
            // The permission is rejected.
            Log.w(TAG, "permission denied");
            Toast.makeText(ImageKitRenderDemoActivity.this, "Please grant the app the permission to read the SD card", Toast.LENGTH_SHORT).show();
        }
    }
}
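Both branches above call initData() before initializing the renderer, but the original article does not show it. The sketch below is one possible implementation, assuming the animation resources (the Image Kit manifest.xml and the extracted element images) are bundled in an assets/AnimationSource folder and copied to app storage, whose path is then used as sourcePath in the next step. The folder name and field are assumptions for this example.

// Sketch of initData(): copy the bundled animation resources from assets to app storage
// so that the render service can read them from a local path.
// Requires java.io.File, InputStream, OutputStream, FileOutputStream, IOException imports.
private String sourcePath;

private void initData() {
    File targetDir = new File(getFilesDir(), "AnimationSource");
    if (!targetDir.exists()) {
        targetDir.mkdirs();
    }
    try {
        String[] assetFiles = getAssets().list("AnimationSource");
        if (assetFiles == null) {
            return;
        }
        for (String name : assetFiles) {
            File outFile = new File(targetDir, name);
            try (InputStream in = getAssets().open("AnimationSource/" + name);
                 OutputStream out = new FileOutputStream(outFile)) {
                byte[] buffer = new byte[4096];
                int length;
                while ((length = in.read(buffer)) > 0) {
                    out.write(buffer, 0, length);
                }
            }
        }
        // The directory holding the copied resources is passed to doInit() later.
        sourcePath = targetDir.getAbsolutePath();
    } catch (IOException e) {
        Log.e(TAG, "initData failed: " + e.getMessage());
    }
}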
  3. Image rendering module initialization

Obtain the ImageRender instance, initialize it, and get the render view. The directory containing the animation resources (sourcePath) is passed in here:

ImageRender.getInstance(context, new ImageRender.RenderCallBack() {
    // Called when the scene animation service instance is obtained successfully.
    @Override
    public void onSuccess(ImageRenderImpl imageRender) {
        imageRenderAPI = imageRender;
        if (imageRenderAPI != null) {
            int initResult = imageRenderAPI.doInit(sourcePath, Utils.getAuthJson());
            Log.i(TAG, "DoInit result == " + initResult);
            if (initResult == 0) {
                // Obtain the rendered view.
                RenderView renderView = imageRenderAPI.getRenderView();
                if (renderView.getResultCode() == ResultCode.SUCCEED) {
                    View view = renderView.getView();
                    if (null != view) {
                        // Add the rendered view to the layout.
                        contentView.addView(view);
                        hashCode = String.valueOf(view.hashCode());
                    } else {
                        Log.w(TAG, "GetRenderView fail, view is null");
                    }
                }
            }
        }
    }

    // Called when obtaining the scene animation service instance fails; the callback returns an error code.
    @Override
    public void onFailure(int errorCode) {
        ...
    }
});
  4. Configure the keyword that triggers animation playback

Remember the input box and button we set up earlier? We use the keyword "Love" to trigger animation playback; simply calling the ImageRender API playAnimation() starts it:

wordInput = findViewById(R.id.textinput);
enterBtn = findViewById(R.id.enter);
enterBtn.setOnClickListener(v -> {
    String inputContent = wordInput.getText().toString();
    if (inputContent.contentEquals("Love")) {
        if (null != imageRenderAPI) {
            imageRenderAPI.playAnimation();
            wordInput.setVisibility(View.GONE);
            enterBtn.setVisibility(View.GONE);
        }
    } else {
        Toast.makeText(this, "Think again?", Toast.LENGTH_SHORT).show();
    }
});
  5. Configure the animations

The framework is now in place, so let's move on to the animations. Image Kit provides 5 basic animation effects and 9 advanced animation effects, which meet the needs of most scenarios.
Here we use the transparency, displacement, scaling, and falling animations.

Image Kit's animation configuration lives in a manifest.xml file; don't confuse it with the AndroidManifest.xml file.

First, configure the virtual screen width and the background picture. Once the virtual screen width is set, the system scales the animation for different resolutions so that the effect stays consistent.

We want the Cowherd and the Weaver Girl to gradually approach each other until they meet. To do this, the Cowherd and Weaver Girl elements are each given their own movement path using the displacement animation effect:

In this way, the Cowherd and the Weaver Girl approach the middle from the two sides of the screen until they meet.

After they meet, a beating red heart appears in the center. This uses a combination of the transparency and scaling animations:

So far the key elements are in place, but it still looks a little plain; we need something to embellish it.
The clouds in the sky can also drift slightly, which makes the scene livelier:

If you want it even more romantic, sprinkle some petals; this uses the falling animation effect:

  6. We're almost done. Finally, you can record and save the animation:
// Start recording.
int resultCode = imageRenderAPI.startRecord(json, new IStreamCallBack() {
    // Recording success callback: save the video or GIF byte array as an MP4 or GIF file.
    @Override
    public void onRecordSuccess(HashMap map) {
        ...
        String recordType = (String) map.get("recordType");
        byte[] videoBytes = (byte[]) map.get("videoBytes");
        byte[] gifBytes = (byte[]) map.get("gifBytes");
        try {
            if (recordType.equals("1")) {
                if (videoBytes != null) {
                    // Save the MP4 file.
                    saveFile(videoBytes, mp4Path);
                }
            } else if (recordType.equals("2")) {
                ...
            } else if (recordType.equals("3")) {
                ...
            }
        } catch (IOException e) {
            ...
        }
        ...
    }

    // Recording failure callback.
    @Override
    public void onRecordFailure(HashMap map) {
        ...
    }

    // Recording progress callback; progress ranges from 0 to 100.
    @Override
    public void onProgress(int progress) {
        runOnUiThread(new Runnable() {
            @Override
            public void run() {
                textProgress.setText("Current recording progress: " + progress + "%");
            }
        });
    }
});
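The saveFile() helper called above is not shown in the original. The following is only a minimal sketch that writes the returned byte array to the given path; the helper name and the mp4Path variable come from the snippet above, but this implementation is an assumption:

// Sketch of saveFile(): write the recorded bytes to the target file path.
// Requires java.io.File, java.io.FileOutputStream, java.io.IOException imports.
private void saveFile(byte[] bytes, String filePath) throws IOException {
    File file = new File(filePath);
    File parent = file.getParentFile();
    if (parent != null && !parent.exists()) {
        parent.mkdirs();
    }
    try (FileOutputStream fos = new FileOutputStream(file)) {
        fos.write(bytes);
    }
}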

With that, you get the final effect shown at the beginning~

Besides animation effects, Image Kit also provides a filter capability that can add a romantic tint to pictures, as well as sticker and text functions for adding love-themed elements to users' photos.

Learn more >>

Visit the official website of the Huawei Image Kit image service

Visit the official website of the Huawei Developer Alliance

Get the development guidance documents

Huawei Mobile Services open source repositories: GitHub, Gitee

Follow us to get the latest **HMS Core** technical information as soon as it is released~