Texture ASCollectionNode Combined with Tencent Cloud TRTC to Implement Multi-Person Live Broadcast

Date: 2020-11-04


Recently I used the Tencent Cloud real-time audio/video TRTC SDK while learning to develop an educational live-broadcast app. One requirement comes up in almost every live-broadcast scenario:

How to have multiple people live on stage at the same time.

Here are the effect screenshots first:

[Screenshots: multi-person live broadcast effect]

(Keep a serious face and ignore the greasy face on the screen.)

So today, let's talk about how to use Texture's ASCollectionNode to build this feature.

To learn Texture, refer to the Texture official website.

Before we start writing code, we need to introduce Tencent's real-time audio/video service TRTC. With TRTC we can quickly render real-time video data into a view without having to worry about how real-time audio/video interaction is implemented, so we can focus on our own business logic.

Tencent Real-Time Communication (TRTC)

Tencent Real-Time Communication (TRTC) builds on Tencent's 21 years of accumulated experience in networking and audio/video technology. It is exposed to developers through Tencent Cloud with two scenario solutions, multi-person audio/video calls and low-latency interactive live broadcast, and aims to help developers quickly build low-cost, low-latency, high-quality audio/video interactive solutions.

TRTC focuses on these two scenarios and provides SDKs for Mini Program, Web, Android, iOS, Electron, Windows, macOS, Linux, and other platforms, so developers can quickly integrate it and connect to the TRTC cloud service backend. Together with other Tencent Cloud products, TRTC can easily be combined with Instant Messaging (IM), Cloud Streaming Services (CSS), Video on Demand (VOD), and so on to cover more business scenarios.

The product architecture of TRTC is shown in the following figure:

[Figure: TRTC product architecture]

During development we found that integrating the TRTC SDK is quite fast, and the latency of live video and voice is within an acceptable range in real use. At present we use the following core features:

[Figure: TRTC core features used in this project]

For a detailed introduction and usage, refer directly to the TRTC product details page on the official website.

Tencent demo

First, since we use the TRTC SDK provided by Tencent, if you look at the accompanying demo you will find that every live picture on stage is a UIView, and these UIViews are added or removed dynamically as users go on or off stage. The relevant code is as follows:

@property (weak, nonatomic) IBOutlet UIView *renderViewContainer;

@property (nonatomic, strong) NSMutableArray *renderViews;

#pragma mark - Accessor
- (NSMutableArray *)renderViews
{
    if(!_renderViews){
        _renderViews = [NSMutableArray array];
    }
    return _renderViews;
}

#pragma mark - render view
- (void)updateRenderViewsLayout
{
    NSArray *rects = [self getRenderViewFrames];
    if(rects.count != self.renderViews.count){
        return;
    }
    for (int i = 0; i < self.renderViews.count; ++i) {
        UIView *view = self.renderViews[i];
        CGRect frame = [rects[i] CGRectValue];
        view.frame = frame;
        if(!view.superview){
            [self.renderViewContainer addSubview:view];
        }
    }
}

- (NSArray *)getRenderViewFrames
{
    CGFloat height = self.renderViewContainer.frame.size.height;
    CGFloat width = self.renderViewContainer.frame.size.width / 5;
    CGFloat xOffset = 0;
    NSMutableArray *array = [NSMutableArray array];
    for (int i = 0; i < self.renderViews.count; i++) {
        CGRect frame = CGRectMake(xOffset, 0, width, height);
        [array addObject:[NSValue valueWithCGRect:frame]];
        xOffset += width;
    }
    return array;
}

- (TICRenderView *)getRenderView:(NSString *)userId streamType:(TICStreamType)streamType
{
    for (TICRenderView *render in self.renderViews) {
        if([render.userId isEqualToString:userId] && render.streamType == streamType){
            return render;
        }
    }
    return nil;
}

#pragma mark - event listener
- (void)onTICUserVideoAvailable:(NSString *)userId available:(BOOL)available
{
    if(available){
        TICRenderView *render = [[TICRenderView alloc] init];
        render.userId = userId;
        render.streamType = TICStreamType_Main;
        [self.renderViewContainer addSubview:render];
        [self.renderViews addObject:render];
        [[[TICManager sharedInstance] getTRTCCloud] startRemoteView:userId view:render];
    }
    else{
        TICRenderView *render = [self getRenderView:userId streamType:TICStreamType_Main];
        [self.renderViews removeObject:render];
        [render removeFromSuperview];
        [[[TICManager sharedInstance] getTRTCCloud] stopRemoteView:userId];
    }
    [self updateRenderViewsLayout];
}

- (void)onTICUserSubStreamAvailable:(NSString *)userId available:(BOOL)available
{
    if(available){
        TICRenderView *render = [[TICRenderView alloc] init];
        render.userId = userId;
        render.streamType = TICStreamType_Sub;
        [self.renderViewContainer addSubview:render];
        [self.renderViews addObject:render];
        [[[TICManager sharedInstance] getTRTCCloud] startRemoteSubStreamView:userId view:render];
    }
    else{
        TICRenderView *render = [self getRenderView:userId streamType:TICStreamType_Sub];
        [self.renderViews removeObject:render];
        [render removeFromSuperview];
        [[[TICManager sharedInstance] getTRTCCloud] stopRemoteSubStreamView:userId];
    }
    [self updateRenderViewsLayout];
}

The demo essentially adds and removes TICRenderView instances one by one through an array. I am not sure what advantage this way of writing has, but the code does not feel very comfortable to me. It is easy enough to follow, though: when someone comes on stage, create a UIView, set its frame, append it to the array, add it to the renderViewContainer, and then let the TRTC SDK render the remote or local stream onto that UIView.

However, in our specific business scenario, each live picture is not just a video stream: it also contains other interactive elements and states, such as the nickname of the user on stage, whether they have permission to speak, whether they are currently speaking, and so on. So each live picture is less like a plain UIView and more like an item of a UICollectionView.

So I needed to modify, or rather refactor, this code.

Now let's bring in our main character: ASCollectionNode.

ASCollectionNode

ASCollectionNode is equivalent to UIKit’s UICollectionView and can be used in place of any UICollectionView.

You can read the official documentation directly; if you have used UICollectionView before, it is very simple. For details, see the official website.

Initialization

@interface ZJRendersView : UIView <ASCollectionDataSourceInterop, ASCollectionDelegate, ASCollectionViewLayoutInspecting>

@property (nonatomic, strong) ASCollectionNode *collectionNode;
@property (nonatomic, strong) NSMutableDictionary<NSString*, NSMutableDictionary*> *onlineUsers;
@property (nonatomic, strong) NSMutableArray<NSString *> *keys; // on-stage user IDs, in item order (referenced as _keys below)

We create onlineUsers, an NSMutableDictionary keyed by user ID, to store the information of each online user.
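The article does not spell out what a single user entry looks like; judging from the keys read later in ZJRenderNode (voice, zan, nickname, isteacher, board), an entry presumably looks roughly like this hypothetical sketch:

// Hypothetical example of one entry in onlineUsers, inferred from the keys
// read later in ZJRenderNode (voice / zan / nickname / isteacher / board):
_onlineUsers[@"user_123"] = [@{
    @"nickname"  : @"Alice",   // display name shown in the bottom bar
    @"voice"     : @YES,       // microphone permission granted?
    @"zan"       : @3,         // number of likes / trophies won
    @"isteacher" : @NO,        // is this user the instructor?
    @"board"     : @YES        // whiteboard (drawing) permission granted?
} mutableCopy];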

- (instancetype)init {
    self = [super init];
    if (self) {
        _onlineUsers = [NSMutableDictionary new];
        UICollectionViewFlowLayout* flowLayout = [[UICollectionViewFlowLayout alloc] init];
        flowLayout.minimumInteritemSpacing = 0.1;
        flowLayout.minimumLineSpacing = 0.1;
        _collectionNode = [[ASCollectionNode alloc] initWithCollectionViewLayout:flowLayout];
        _collectionNode.dataSource = self;
        _collectionNode.delegate = self;
        _collectionNode.backgroundColor = UIColorClear;
        _collectionNode.layoutInspector = self;
        [self addSubnode:_collectionNode];
        [_collectionNode.view mas_makeConstraints:^(MASConstraintMaker *make) {
            make.left.mas_equalTo(self);
            make.top.mas_equalTo(self);
            make.right.mas_equalTo(self);
            make.bottom.mas_equalTo(self);
        }];
    }
    return self;
}

Initialization is straightforward. The layout mainly uses Masonry, which saves a lot of effort. Because this is a team project we avoid storyboards as much as possible; layout and views are done in code.

ASCollectionDataSource

#pragma mark - ASCollectionNode data source.

- (ASCellNodeBlock)collectionNode:(ASCollectionNode *)collectionNode nodeBlockForItemAtIndexPath:(NSIndexPath *)indexPath
{
    NSString* key = _keys[indexPath.item];
    NSDictionary *user = _onlineUsers[key];
    ASCellNode *(^cellNodeBlock)(void) = ^ASCellNode *() {
        return [[ZJRenderNode alloc] initWithHashID:key user:user];
    };

    return cellNodeBlock;
}

// The below 2 methods are required by ASCollectionViewLayoutInspecting, but ASCollectionLayout and its layout delegate are the ones that really determine the size ranges and directions
// TODO Remove these methods once a layout inspector is no longer required under ASCollectionLayout mode
- (ASSizeRange)collectionView:(ASCollectionView *)collectionView constrainedSizeForNodeAtIndexPath:(NSIndexPath *)indexPath
{
    return ASSizeRangeMake(CGSizeMake([UIScreen mainScreen].bounds.size.width / 7.0, self.bounds.size.height), CGSizeMake([UIScreen mainScreen].bounds.size.width / 7.0, self.bounds.size.height));
}

- (ASScrollDirection)scrollableDirections
{
    return ASScrollDirectionHorizontalDirections;
}

- (NSInteger)numberOfSectionsInCollectionNode:(ASCollectionNode *)collectionNode
{
    return 1;
}

- (NSInteger)collectionNode:(ASCollectionNode *)collectionNode numberOfItemsInSection:(NSInteger)section
{
    return _keys.count;
}

Here, according to our business needs, all the live pictures are placed on a single row: the scroll direction is set to ASScrollDirectionHorizontalDirections, and numberOfSectionsInCollectionNode returns 1. Every live picture has the same size; the width of the landscape screen is divided into 7 equal parts: CGSizeMake([UIScreen mainScreen].bounds.size.width / 7.0, self.bounds.size.height).
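The snippets above reference _keys but never show how it and onlineUsers are updated when users go on or off stage. A minimal sketch, assuming _keys mirrors the item order of the collection node and using the hypothetical method names onUserOnStage:user: and onUserOffStage:, might look like this:

// Hypothetical helpers in ZJRendersView for keeping the data and the
// collection node in sync when a user goes on or off stage.
- (void)onUserOnStage:(NSString *)userId user:(NSMutableDictionary *)user {
    if (self.onlineUsers[userId]) { return; }           // already on stage
    self.onlineUsers[userId] = user;
    [_keys addObject:userId];
    NSIndexPath *path = [NSIndexPath indexPathForItem:_keys.count - 1 inSection:0];
    [_collectionNode insertItemsAtIndexPaths:@[path]];  // add one new item
}

- (void)onUserOffStage:(NSString *)userId {
    NSUInteger index = [_keys indexOfObject:userId];
    if (index == NSNotFound) { return; }
    [_keys removeObjectAtIndex:index];
    [self.onlineUsers removeObjectForKey:userId];
    NSIndexPath *path = [NSIndexPath indexPathForItem:index inSection:0];
    [_collectionNode deleteItemsAtIndexPaths:@[path]];  // remove that item
}

Using insertItemsAtIndexPaths:/deleteItemsAtIndexPaths: keeps the node in sync without reloading every item, which is one of the nicer side effects of modelling each live picture as a collection item.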

The next step is how to lay out each item.

ZJRenderNode

As shown in the figure below, each live picture contains many elements: an instructor tag, the user's nickname, a voice volume bar, the number of trophies won, and so on.

[Figure: elements of a single live picture item]

ASButtonNode, ASAbsoluteLayoutSpec, and ASInsetLayoutSpec were introduced in the previous article.

Let’s take a look at the other ones today.

- (instancetype)initWithHashID:(NSString *)key user:(NSDictionary *)user {
    self = [super init];
    if (self) {
        // Keep the user ID and user info for later use (matches the data source call above)
        _key = key;
        _user = user;
        _backgroundNode = [[ASDisplayNode alloc] init];
        [self addSubnode:_backgroundNode];

        _bottomBackgroundNode = [[ASDisplayNode alloc] init];
        _bottomBackgroundNode.backgroundColor = [UIColorMakeWithHex(@"#3d3d3d") colorWithAlphaComponent:0.522];
        [self addSubnode:_bottomBackgroundNode];

        _nicknameNode = [[ASTextNode alloc] init];
        _nicknameNode.maximumNumberOfLines = 1;
        _nicknameNode.backgroundColor = [UIColorMakeWithHex(@"#3d3d3d") colorWithAlphaComponent:0];
        [_bottomBackgroundNode addSubnode:_nicknameNode];

        _permissionNode = [ASImageNode new];
        _permissionNode.image = UIImageMake(@"icon_permission");
        _permissionNode.backgroundColor = [UIColorMakeWithHex(@"#3d3d3d") colorWithAlphaComponent:0];
        [self addSubnode:_permissionNode];

        _microNode = [ASImageNode new];
        _microNode.backgroundColor = [UIColorMakeWithHex(@"#3d3d3d") colorWithAlphaComponent:0];
        [_bottomBackgroundNode addSubnode:_microNode];

        _zanNode = [[ASButtonNode alloc] init];
        [_zanNode setImage:UIImageMake(@"icon_zan") forState:UIControlStateNormal];
        [_zanNode setContentHorizontalAlignment:ASHorizontalAlignmentMiddle];
        [_zanNode setContentSpacing:2];
        _zanNode.backgroundColor = [UIColorMakeWithHex(@"#3d3d3d") colorWithAlphaComponent:0];
        _zanNode.hidden = YES;
        [_bottomBackgroundNode addSubnode:_zanNode];

        _volumnNode = [[ASDisplayNode alloc] init];
        _volumnNode.backgroundColor = [UIColorMakeWithHex(@"#3d3d3d") colorWithAlphaComponent:0];
        [self addSubnode:_volumnNode];

        _teacherIconNode = [ASImageNode new];
        _teacherIconNode.image = UIImageMake(@"icon_jiangshi");
        _teacherIconNode.backgroundColor = [UIColorMakeWithHex(@"#3d3d3d") colorWithAlphaComponent:0];
        [self insertSubnode:_teacherIconNode aboveSubnode:_volumnNode];

        [self updatePermission:user];
    }
    return self;
}

There are three main layouts to consider.

The first is the backgroundNode, which receives the remote or local video stream and displays the live picture. In our design the video stream is the background layer and the other elements are placed on top of it, so we use ASBackgroundLayoutSpec.

ASBackgroundLayoutSpec

ASBackgroundLayoutSpec lays out a component (blue), stretching another component behind it as a backdrop (red).

[Figure: ASBackgroundLayoutSpec lays out a child component (blue) and stretches another component behind it as a backdrop (red)]

The background spec's size is calculated from the child's size. In the diagram above, the child is the blue layer. The child's size is then passed as the constrainedSize to the background layout element (red layer). Thus, it is important that the child (blue layer) has an intrinsic size or a size set on it.

// contentSpec here is the foreground layout built in the "Complete layout" section below
ASInsetLayoutSpec *backgroundInsetLayoutSpec = [ASInsetLayoutSpec
        insetLayoutSpecWithInsets:UIEdgeInsetsMake(0, 0, 0, 0)
                            child:_backgroundNode];

return [ASBackgroundLayoutSpec backgroundLayoutSpecWithChild:contentSpec background:backgroundInsetLayoutSpec];

The second is the bottomBackgroundNode at the bottom, which lays out the microphone icon, nickname, likes, and other information. Here we use Masonry to make the constraints.

dispatch_async(dispatch_get_main_queue(), ^{
    //Update audio
    NSString* voiceIcon = [_user[@"voice"] boolValue] ? @"icon_microphone_good" : @"icon_microphone_bad";
    _microNode.image = UIImageMake(voiceIcon);

    if ([_key isEqualToString:_my_key]) {
        //Update your audio status
        if ([_user[@"voice"] boolValue]) {
            [[[TICManager sharedInstance] getTRTCCloud] startLocalAudio];
        } else {
            [[[TICManager sharedInstance] getTRTCCloud] stopLocalAudio];
        }
        [[[TICManager sharedInstance] getTRTCCloud] muteLocalAudio:![_user[@"voice"] boolValue]];
    }

    //Update likes
    if (_user && [_user[@"zan"] intValue] > 0) {
        _zanNode.hidden = NO;
        [_zanNode setTitle:_user[@"zan"] withFont:UIFontMake(10) withColor:UIColor.ZJ_tintColor forState:UIControlStateNormal];
    }

    //User nickname information
    if (_user[@"nickname"] != nil) {
        NSString *nickname = [_user[@"nickname"] stringValue].length > 7 ? [[_user[@"nickname"] stringValue] substringWithRange:NSMakeRange(0, 7)] : [_user[@"nickname"] stringValue];
        _nicknameNode.attributedText = [[NSAttributedString alloc] initWithString:nickname attributes:@{
                NSFontAttributeName : UIFontMake(10),
                NSForegroundColorAttributeName: UIColor.ZJ_tintColor,
        }];
    }
    _teacherIconNode.hidden = ![_user[@"isteacher"] boolValue];

    _permissionNode.hidden = [_user[@"isteacher"] boolValue] || ![_user[@"board"] boolValue];

    [_permissionNode.view mas_updateConstraints:^(MASConstraintMaker *make) {
        make.top.mas_equalTo(self.view.mas_top).offset(4);
        make.right.mas_equalTo(self.view.mas_right).offset(-4);
        make.width.mas_equalTo(11);
        make.height.mas_equalTo(10);
    }];

    [_microNode.view mas_updateConstraints:^(MASConstraintMaker *make) {
        make.centerY.mas_equalTo(_bottomBackgroundNode.view.mas_centerY);
        make.left.mas_equalTo(_bottomBackgroundNode.view).offset(4);
        make.width.mas_equalTo(7.5);
        make.height.mas_equalTo(9);
    }];

    [_zanNode.view mas_updateConstraints:^(MASConstraintMaker *make) {
        make.centerY.mas_equalTo(_bottomBackgroundNode.view.mas_centerY);
        make.right.mas_equalTo(_bottomBackgroundNode.view.mas_right).offset(-4);
        make.width.mas_equalTo(18);
        make.height.mas_equalTo(13.5);
    }];

    CGSize size = [_nicknameNode calculateSizeThatFits:CGSizeMake(20, 16)];
    [_nicknameNode.view mas_updateConstraints:^(MASConstraintMaker *make) {
        make.left.mas_equalTo(self.microNode.view.mas_right).offset(4);
        make.centerY.mas_equalTo(_bottomBackgroundNode.view.mas_centerY);
        make.right.mas_equalTo(_zanNode.view.mas_left);
        make.height.mas_equalTo(size.height);
    }];
});

When the user has not been granted microphone permission, i.e. [_user[@"voice"] boolValue] is NO, turn off the local audio:

[[[TICManager sharedInstance] getTRTCCloud] stopLocalAudio];

Otherwise, turn on local audio:

[[[TICManager sharedInstance] getTRTCCloud] startLocalAudio];

// Also mute or unmute the local audio stream accordingly
[[[TICManager sharedInstance] getTRTCCloud] muteLocalAudio:![_user[@"voice"] boolValue]];

The whole bottom layout uses Masonry constraints to keep the controls vertically centered:

make.centerY.mas_equalTo(_bottomBackgroundNode.view.mas_centerY);

One thing to note is the layout of _nicknameNode: its size has to be calculated first before the constraints can be applied.

The layout here needs to be executed on the main thread:

dispatch_async(dispatch_get_main_queue(), ^{});

The third is the layout of the voice volume bar:

[_volumnNode.view mas_updateConstraints:^(MASConstraintMaker *make) {
    make.left.equalTo(self.view.mas_left).offset(5);
    make.bottom.mas_equalTo(_bottomBackgroundNode.view.mas_top);
    make.height.mas_equalTo(30);
    make.width.mas_equalTo(5.5);
}];

for (NSUInteger i = 0; i < 10; i++) {
    ASImageNode *itemView = [[ASImageNode alloc] init];
    itemView.image = UIImageMake(@"icon_voiced");
    [itemView setHidden:YES];
    [_volumnNode addSubnode:itemView];
    [_renderNodes addObject:itemView];
    [_renderViews addObject:itemView.view];
}
[_renderViews mas_distributeViewsAlongAxis:MASAxisTypeVertical withFixedSpacing:0.5 leadSpacing:0 tailSpacing:0];

[_renderViews mas_updateConstraints:^(MASConstraintMaker *make) {
    // Center each segment horizontally; vertical positions are handled by mas_distributeViewsAlongAxis
    make.centerX.mas_equalTo(self.volumnNode.view.mas_centerX);
    make.width.mas_equalTo(5.5);
    make.height.mas_equalTo(2.5);
}];

We split the volume bar into 10 ASImageNode segments and stack them vertically. Here mas_distributeViewsAlongAxis lays them out along the vertical axis with a fixed spacing of 0.5. Each segment is 2.5 points high, so the whole bar comes to about 30 points, exactly filling the volumnNode.

Complete layout

- (ASLayoutSpec *)layoutSpecThatFits:(ASSizeRange)constrainedSize
{
    NSMutableArray *mainStackContent = [[NSMutableArray alloc] init];
    if ([_user[@"isteacher"] boolValue]) {
        _teacherIconNode.style.preferredSize = CGSizeMake(22, 22.5);
        _teacherIconNode.style.layoutPosition = CGPointMake(0, 0);
        UIEdgeInsets insets = UIEdgeInsetsMake(0, 0, 0, 0);
        ASInsetLayoutSpec *teacherIconSpec = [ASInsetLayoutSpec insetLayoutSpecWithInsets:insets child:_teacherIconNode];
        [mainStackContent addObject:teacherIconSpec];
    }
    _volumnNode.style.preferredSize = CGSizeMake(8.5, 50);
    _volumnNode.style.layoutPosition = CGPointMake(5, 20);

    _bottomBackgroundNode.style.preferredSize = CGSizeMake(constrainedSize.max.width, 16);
    _bottomBackgroundNode.style.layoutPosition = CGPointMake(0, constrainedSize.max.height - 16);

    [mainStackContent addObject:_volumnNode];
    [mainStackContent addObject:_bottomBackgroundNode];

    // Absolute layout for the foreground elements
    ASAbsoluteLayoutSpec *contentSpec = [ASAbsoluteLayoutSpec absoluteLayoutSpecWithChildren:mainStackContent];

    // The video stream node fills the whole item as the background layer
    ASInsetLayoutSpec *backgroundInsetLayoutSpec = [ASInsetLayoutSpec
            insetLayoutSpecWithInsets:UIEdgeInsetsMake(0, 0, 0, 0)
                                child:_backgroundNode];

    return [ASBackgroundLayoutSpec backgroundLayoutSpecWithChild:contentSpec background:backgroundInsetLayoutSpec];
}

Because the structure is simple and the positions are fixed, we adopted ASAbsoluteLayoutSpec, which was introduced in the previous article, so I won't repeat it here.

Combined with TRTC

With the ASCollectionNode layout done, the next step is to combine it with TRTC to complete the stream pushing and the on-stage/off-stage logic.

Initialize TRTC

// Podfile
use_frameworks!
pod 'TEduBoard_iOS','2.4.6.1'
pod 'TXIMSDK_iOS','4.6.101'
pod 'TXLiteAVSDK_TRTC','6.9.8341'

According to the instructions for TIC, the education solution provided by Tencent Cloud, it is recommended to install the three pods above (whiteboard, IM chat, and TRTC real-time audio/video).

Initialize it in the AppDelegate:

[[TICManager sharedInstance] init:sdkAppid callback:^(TICModule module, int code, NSString *desc) {
    if(code == 0){
        [[TICManager sharedInstance] addStatusListener:self];
    }
}];

We directly use the code provided by the official demo and extend it according to our business needs. We do not wrap it further, which makes it easier to keep up with the official SDK updates.


Note: the socket library used by the official demo is CocoaAsyncSocket; see robbiehanson/CocoaAsyncSocket.

The next step is to log in and join the room.

[[TICManager sharedInstance] login:userId userSig:userSig callback:^(TICModule module, int code, NSString *desc) {
    if(code == 0){
        [JMLoadingHUD hide];
        [QMUITips showSucceed:@"Login succeeded" inView:[[UIApplication sharedApplication] keyWindow] hideAfterDelay:3];
        ZJClassRoomViewController *classRoom = [ZJClassRoomViewController new];
        TICClassroomOption *option = [[TICClassroomOption alloc] init];
        option.classId = (UInt32)[json[@"room"][@"id"] intValue];
        classRoom.option = option;
        [ws.navigationController pushViewController:classRoom animated:YES];
    }
    else{
        [JMLoadingHUD hide];
        [[JMToast sharedToast] showDialogWithMsg:[NSString stringWithFormat:@"Login failed: %d, %@", code, desc]];
    }
}];

The userSig here needs to be generated by your backend; see the generation rules and API documentation.
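How the app obtains the userSig is not shown in this article. As a rough sketch (the endpoint /api/usersig and the JSON field name are hypothetical, not part of the TRTC SDK), the client might fetch it from the backend before logging in:

// Hypothetical sketch: fetch the userSig from our own backend before logging in.
// The URL and the JSON field name are assumptions, not part of the TRTC SDK.
NSURL *url = [NSURL URLWithString:[NSString stringWithFormat:@"https://example.com/api/usersig?userId=%@", userId]];
[[[NSURLSession sharedSession] dataTaskWithURL:url completionHandler:^(NSData *data, NSURLResponse *response, NSError *error) {
    if (error || !data) { return; }
    NSDictionary *json = [NSJSONSerialization JSONObjectWithData:data options:0 error:nil];
    NSString *userSig = json[@"userSig"];
    dispatch_async(dispatch_get_main_queue(), ^{
        // Then call [[TICManager sharedInstance] login:userId userSig:userSig callback:...] as shown above
    });
}] resume];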

[[TICManager sharedInstance] addMessageListener:self];
[[TICManager sharedInstance] addEventListener:self];
__weak typeof(self) ws = self;
[[TICManager sharedInstance] joinClassroom:option callback:^(TICModule module, int code, NSString *desc) {
    if(code == 0) {
//            [JMLoadingHUD hide];
        [QMUITips showSucceed:@"Class preparation finished" inView:[[UIApplication sharedApplication] keyWindow] hideAfterDelay:3];
        //Other business code
        // ...
        //
    } else {
        if(code == 10015){
            [[JMToast sharedToast] showDialogWithMsg:@"Class does not exist, please \"create a class\" first"];
        }
        else {
            [[JMToast sharedToast] showDialogWithMsg:[NSString stringWithFormat:@"Failed to join class: %d, %@", code, desc]];
        }
        [JMLoadingHUD hide];
    }
}];

Joining the classroom mainly initializes the whiteboard, joins the IM group, and handles other such logic; refer to the demo provided by Tencent:

- (void)joinClassroom:(TICClassroomOption *)option callback:(TICCallback)callback
{
    _option = option;
    _enterCallback = callback;
    
    //Whiteboard initialization
    __weak typeof(self) ws = self;
    void (^createBoard)(void) = ^(void){
        TEduBoardAuthParam *authParam = [[TEduBoardAuthParam alloc] init];
        authParam.sdkAppId = ws.sdkAppId;
        authParam.userId = ws.userId;
        authParam.userSig = ws.userSig;
        TEduBoardInitParam *initParam = option.boardInitParam;
        if(!initParam){
            initParam = [[TEduBoardInitParam alloc] init];
        }
        [ws report:TIC_REPORT_INIT_BOARD_START];
        ws.boardController = [[TEduBoardController alloc] initWithAuthParam:authParam roomId:ws.option.classId initParam:initParam];
        [ws.boardController addDelegate:ws];
        if(option.boardDelegate){
            [ws.boardController addDelegate:option.boardDelegate];
        }
    };
    
    [self report:TIC_REPORT_JOIN_GROUP_START];
    // Join the IM group; create the whiteboard once joining succeeds
    void (^succ)(void) = ^{
        [ws report:TIC_REPORT_JOIN_GROUP_END];
        createBoard();
    };
    
    void (^fail)(int, NSString*) = ^(int code, NSString *msg){
        [ws report:TIC_REPORT_JOIN_GROUP_END code:code msg:msg];
        TICBLOCK_SAFE_RUN(callback, TICMODULE_IMSDK, code, msg);
    };
    
    [self joinIMGroup:[@(_option.classId) stringValue] succ:^{
        if(ws.option.compatSaas){
            NSString *chatGroup = [self getChatGroup];
            [self joinIMGroup:chatGroup succ:^{
                succ();
            } fail:^(int code, NSString *msg) {
                fail(code, msg);
            }];
        }
        else{
            succ();
        }
    } fail:^(int code, NSString *msg) {
        fail(code, msg);
    }];
};

Whiteboard

The whiteboard is a core feature of educational live broadcast. Instructors, or users who have been authorized, can draw on the whiteboard and interact through it.

UIView *boardView = [[[TICManager sharedInstance] getBoardController] getBoardRenderView];

//Whiteboard cannot be operated by default
[[[TICManager sharedInstance] getBoardController] setDrawEnable:NO];
boardView.frame = self.boardBackgroudView.bounds;
[self.boardBackgroudView addSubview:boardView];
[[[TICManager sharedInstance] getBoardController] addDelegate:self];
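In our business, a user's right to draw follows the board flag in their user info (the same flag read in ZJRenderNode above). A minimal sketch of toggling it, assuming that flag and a hypothetical helper name, might be:

// Hypothetical helper: grant or revoke whiteboard drawing permission
// based on the user's "board" flag pushed by the server.
- (void)updateBoardPermissionForUser:(NSDictionary *)user {
    BOOL canDraw = [user[@"board"] boolValue];
    [[[TICManager sharedInstance] getBoardController] setDrawEnable:canDraw];
}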

Some whiteboard functions that are used in actual business scenarios:

/**
 * @brief Set the whiteboard tool to use
 * @param toolType The whiteboard tool to set
 */
- (void)onSelectToolType:(int)toolType
{
    [[[TICManager sharedInstance] getBoardController] setToolType:(TEduBoardToolType)toolType];
}

/**
 * @brief Set the brush color
 * @param color The brush color to set
 *
 * The brush color is used for all doodle drawing
 */
- (void)onSelectBrushColor:(UIColor *)color
{
    [[[TICManager sharedInstance] getBoardController] setBrushColor:color];
}

/**
 * @brief Set the brush thickness
 * @param thin The brush thickness to set
 *
 * The brush thickness is used for all doodle drawing. The actual pixel value is (thin * whiteboard height / 10000) px; if the result is less than 1 px, the doodle line will look blurry
 */
- (void)onBrushThinChanged:(float)thin
{
    [[[TICManager sharedInstance] getBoardController] setBrushThin:thin];
}

/**
 * @brief Set the text color
 * @param color The text color to set
 */
- (void)onSelectTextColor:(UIColor *)color
{
    [[[TICManager sharedInstance] getBoardController] setTextColor:color];
}

/**
 * @brief Set the background color of the current whiteboard page
 * @param color The background color to set
 *
 * The default background color of a newly created whiteboard page is set via the setDefaultBackgroundColor interface
 */
- (void)onSelectBackgroundColor:(UIColor *)color
{
    [[[TICManager sharedInstance] getBoardController] setBackgroundColor:color];
}

/**
 * @brief Set the text size
 * @param size The text size to set
 *
 * The actual pixel value is (size * whiteboard height / 10000) px
 */
- (void)onTextSizeChanged:(float)thin
{
    [[[TICManager sharedInstance] getBoardController] setTextSize:thin];
}

/**
 * @brief Set whether doodling is allowed on the whiteboard
 * @param enable Whether doodling is allowed: YES means the whiteboard can be drawn on, NO means it cannot
 *
 * After the whiteboard is created, doodling is allowed by default
 */
- (void)onDrawStateChanged:(BOOL)state
{
    [[[TICManager sharedInstance] getBoardController] setDrawEnable:state];
}

/**
 * @brief Set whether data synchronization is enabled on the whiteboard
 * @param enable Whether synchronization is enabled
 *
 * After the whiteboard is created, data synchronization is enabled by default; when it is turned off, local whiteboard operations are not synchronized to the remote end or the server
 */
- (void)onSyncDataChanged:(BOOL)state
{
    [[[TICManager sharedInstance] getBoardController] setDataSyncEnable:state];
}

/**
 * @brief Set the background H5 page of the current whiteboard page
 * @param url The background H5 page URL to set
 *
 * This interface is mutually exclusive with the setBackgroundImage interface
 */
- (void)onSetBackgroundH5:(NSString *)url
{
    [[[TICManager sharedInstance] getBoardController] setBackgroundH5:url];
}

/**
 * @brief Set the text style
 * @param style The text style to set
 */
- (void)onSetTextStyle:(int)style
{
    [[[TICManager sharedInstance] getBoardController] setTextStyle:(TEduBoardTextStyle)style];
}

/**
 * @brief Undo the last action on the current whiteboard page
 */
- (void)onUndo
{
    [[[TICManager sharedInstance] getBoardController] undo];
}

/**
 * @brief Redo the last undone action on the current whiteboard page
 */
- (void)onRedo
{
    [[[TICManager sharedInstance] getBoardController] redo];
}

/**
 * @brief Clear the doodles, and also clear the background color and background image
 */
- (void)onClear
{
    [[[TICManager sharedInstance] getBoardController] clear];
}

/**
 * @brief Clear the doodles only
 */
- (void)onClearDraw
{
    [[[TICManager sharedInstance] getBoardController] clearDraws];
}

/**
 * @brief Reset the whiteboard
 *
 * After calling this interface, all whiteboard pages and files are deleted
 */
- (void)onReset
{
    [[[TICManager sharedInstance] getBoardController] reset];
}

/**
 * @brief Set the background image of the current whiteboard page
 * @param url The URL of the background image to set, UTF-8 encoded
 * @param mode The image fill/alignment mode to use
 *
 * When the URL is a valid local file path, the file is automatically uploaded to COS
 */
- (void)onSetBackgroundImage:(NSString *)path
{
    [[[TICManager sharedInstance] getBoardController] setBackgroundImage:path mode:TEDU_BOARD_IMAGE_FIT_MODE_CENTER];
}

Pushing and receiving video streams

When a remote stream starts or stops pushing, our event callback is triggered:

/**
 * Status notification for the remote main stream (i.e. camera) picture of the given userId
 * @param userId The user ID
 * @param available Whether the remote picture is available
 **/
- (void)onTICUserVideoAvailable:(NSString *)userId available:(BOOL)available
{
    NSLog(@"onTICUserVideoAvailable userId: %@, available = %d", userId, available);
    [self.rendersView onTICUserVideoAvailable:userId available:available];
}

In response, we start or stop receiving the remote stream:

- (void)onTICUserVideoAvailable:(NSString *)userId available:(BOOL)available {
    [[[TICManager sharedInstance] getTRTCCloud] muteRemoteVideoStream:userId mute:!available];
}

When the server pushes a message saying a user has come on stage, we first add an ASCollectionNode item (as sketched earlier), and then switch the stream on or off in our ZJRenderNode:

- (void)updateVideoStatus:(bool)available {
    dispatch_async(dispatch_get_main_queue(), ^{
        if ([_key isEqualToString:_my_key]) { // this item is the local user: use local preview
            if (available) {
                NSLog(@"startLocalPreview:");
                [[[TICManager sharedInstance] getTRTCCloud] startLocalPreview:YES view:_backgroundNode.view];
            } else {
                NSLog(@"stopLocalPreview:");
                [[[TICManager sharedInstance] getTRTCCloud] stopLocalPreview];
            }
        } else {
            if (available) {
                [[[TICManager sharedInstance] getTRTCCloud] startRemoteView:_hash_id view:_backgroundNode.view];
            } else {
                [[[TICManager sharedInstance] getTRTCCloud] stopRemoteView:_hash_id];
            }
        }
    });
}

Finally, when the server push includes yourself in the off-stage list, you can directly stop pushing your local stream:

// Going off stage: stop pushing video and audio, and revoke whiteboard permission
if ([key isEqualToString:_my_key]) {
    // Stop pushing the local video stream
    [[[TICManager sharedInstance] getTRTCCloud] stopLocalPreview];
    // Stop pushing the local audio stream
    [[[TICManager sharedInstance] getTRTCCloud] stopLocalAudio];
    // Revoke whiteboard drawing permission
    [[[TICManager sharedInstance] getBoardController] setDrawEnable:NO];
}

Audio volume operation

//Hardware device event callback
- (void)onUserVoiceVolume:(NSArray<TRTCVolumeInfo *> *)userVolumes totalVolume:(NSInteger)totalVolume {
    [self.rendersView onUserVoiceVolume:userVolumes totalVolume:totalVolume];
}

// ZJRendersView.m
- (void)onUserVoiceVolume:(NSArray<TRTCVolumeInfo *> *)userVolumes totalVolume:(NSInteger)totalVolume {
    for (TRTCVolumeInfo *info in userVolumes) {
        // keys maps a userId to the NSIndexPath of that user's item
        if (keys[info.userId]) {
            ZJRenderNode *node = [_collectionNode nodeForItemAtIndexPath:keys[info.userId]];
            [node updateVolumn:(info.volume / 10)];
        }
    }
}

// ZJRenderNode
//Update the volume UI: show or hide segments by toggling their hidden property
- (void)updateVolumn:(NSUInteger)count {
    dispatch_async(dispatch_get_main_queue(), ^{
        NSUInteger i = 0;
        for (i = 0; i < 10 - count; ++i) {
            [_renderNodes[i] setHidden:YES];
        }

        for (NSUInteger j = i; j < 10; ++j) {
            [_renderNodes[j] setHidden:NO];
        }
    });
}

Summary

At this point the development of our core features is complete. The IM part is not covered here; it can be combined with the chat interface design from the previous article.

Combining ASCollectionNode with Tencent Cloud's real-time audio/video TRTC SDK, we built an interactive live broadcast with multiple people on stage at the same time. Judging from the experience and the live effect, Tencent Cloud's real-time audio/video capability is quite good: the latency stays within an acceptable range of a few hundred milliseconds. Worth recommending.
