As a multimedia application developer, do you want to quickly build innovative AI features into your media player? For example:
- Applying super-resolution frame by frame while playing low-quality video
- Making the bullet comments that fill the screen automatically avoid the main characters in the frame
AV Pipeline Kit, opened up in HMS Core 6.0.0, helps media application developers reduce the difficulty of building such innovative features. It defines standard plugin interfaces and the way data flows between plugins, so developers only need to implement their plugins against the standard interfaces to quickly build new media scenarios.
AV Pipeline Kit defines a set of standard plugin interfaces and provides built-in data flow management, thread management, memory management, message management, and so on for plugins. Developers only need to implement a plugin's core processing logic, without having to deal with thread synchronization, asynchrony, flow control, or audio/video synchronization. At present, three pipelines for playback scenarios are preset: video playback, video super-resolution, and sound event detection. They are exposed through Java interfaces; developers can also call a single preset plugin directly through the C++ interface. If the preset plugins or pipelines do not meet their requirements, developers can customize their own plugins and pipelines.
Technical solution
Video super-resolution
The following describes in detail the built-in high-performance video super-resolution plugin. Inserted between the decoding and display stages of the video stream, it converts low-resolution video into high-resolution video in real time, making the picture sharper, bringing out more detail, and improving the viewing experience.
Development preparation
1. Create a new Android Studio project and modify the project-level build.gradle file as follows.
Add the Maven repository address in "allprojects > repositories".
allprojects {
    repositories {
        google()
        jcenter()
        maven { url 'https://developer.huawei.com/repo/' }
    }
}
2. Modify the app-level build.gradle file as follows.
Set targetSdkVersion to 28 (a minimal sketch follows the dependencies block) and add the compilation dependencies in dependencies.
dependencies {
    implementation 'com.huawei.hms:avpipelinesdk:6.0.0.302'
    implementation 'com.huawei.hms:avpipeline-aidl:6.0.0.302'
    implementation 'com.huawei.hms:avpipeline-fallback-base:6.0.0.302'
    implementation 'com.huawei.hms:avpipeline-fallback-cvfoundry:6.0.0.302'
}
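The snippet above only covers the dependencies; targetSdkVersion itself normally sits in the android > defaultConfig block of the same app-level build.gradle file. A minimal sketch, with other required settings such as compileSdkVersion omitted:
android {
    defaultConfig {
        targetSdkVersion 28
    }
}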
3. Configure manifest
Modify the AndroidManifest.xml file to add the permission to read external storage.
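The standard declaration, placed inside the <manifest> element of AndroidManifest.xml, is:
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />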
4. Sync the project
Click the Gradle sync icon in the toolbar to sync the build.gradle file and download the required dependencies locally.
Development steps
See the sample code on GitHub for details.
1. Dynamically request the storage permission
String[] permissionLists = {
        Manifest.permission.READ_EXTERNAL_STORAGE
};
int requestPermissionCode = 1;
for (String permission : permissionLists) {
    if (ContextCompat.checkSelfPermission(this, permission) != PackageManager.PERMISSION_GRANTED) {
        ActivityCompat.requestPermissions(this, permissionLists, requestPermissionCode);
    }
}
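If you also want to react to the user's choice, you can override the standard Android callback onRequestPermissionsResult in your Activity. This is not part of the original sample; it is only a minimal sketch:
@Override
public void onRequestPermissionsResult(int requestCode, String[] permissions, int[] grantResults) {
    super.onRequestPermissionsResult(requestCode, permissions, grantResults);
    //1 is the requestPermissionCode used in the request above
    if (requestCode == 1 && grantResults.length > 0
            && grantResults[0] == PackageManager.PERMISSION_GRANTED) {
        //Storage permission granted; local media files can now be read
    }
}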
2. Initialize the AV Pipeline framework
Context context = getApplicationContext();
boolean ret = AVPLoader.initFwk(context);
if(!ret) return;
3. Create a MediaPlayer instance
This instance controls the playback process.
MediaPlayer mPlayer = MediaPlayer.create(MediaPlayer.PLAYER_TYPE_AV);
if (mPlayer == null) return;
4. Set the graph configuration file
The AV Pipeline framework relies on this configuration file to orchestrate the plugins. In addition, MEDIA_ENABLE_CV must be set to 1 to enable the video super-resolution plugin.
MediaMeta meta = new MediaMeta();
meta.setString(MediaMeta.MEDIA_GRAPH_PATH, getExternalFilesDir(null).getPath() + "/PlayerGraphCV.xml");
meta.setInt32(MediaMeta.MEDIA_ENABLE_CV, 1);
mPlayer.setParameter(meta);
5. Set the following parameters, then call the prepare interface to start MediaPlayer preparation.
(Optional) If you need to listen for certain events, set callback functions through interfaces such as setOnPreparedListener and setOnErrorListener.
//Set the surface for video rendering
SurfaceView mSurfaceVideo = findViewById(R.id.surfaceViewup);
SurfaceHolder mVideoHolder = mSurfaceVideo.getHolder();
mVideoHolder.addCallback(new SurfaceHolder.Callback() {
    //For the user-defined callback implementation, refer to the video playback codelab
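    //A minimal illustrative sketch of the three required methods (assumption: the full logic is in the codelab)
    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        //The surface is now available and can be rendered into
    }
    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
        //React to surface size or format changes if needed
    }
    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
        //Stop using the surface here if playback is still running
    }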
});
mPlayer.setVideoDisplay(mVideoHolder.getSurface());
//Set the path of the media file to be played
mPlayer.setDataSource(mFilePath);
//If you need to listen for events, also set the callback through the corresponding setXxxListener interface
//For example, to listen for the prepare-completed event, set the following listener
mPlayer.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
    @Override
    public void onPrepared(MediaPlayer mp, int param1, int param2, MediaParcel parcel) {
        //User-defined callback content
    }
});
mPlayer.prepare();
6. Call start to start playing
mPlayer.start();
7. Call stop to stop playing
mPlayer.stop();
8. Destroy the player
mPlayer.reset();
mPlayer.release();
9. Other precautions
See this file for the constraints of the video super-resolution plugin.
Visit the official website of AV Pipeline Kit to learn more about it
Obtain the AV Pipeline Kit development guide
AV Pipeline Kit open-source repository addresses: GitHub, Gitee
Huawei HMS Core official forum
Solve integration problems on Stack Overflow
Click Follow next to the avatar in the upper-right corner to be the first to learn about the latest HMS Core technologies.