Custom Capturing and Rendering

This document describes how to use the TRTC SDK to implement custom video capturing and rendering.

Custom Video Capturing

Using the custom video capturing feature of the TRTC SDK takes two steps: enable the feature, then send your video frames to the SDK. The specific APIs are described below, and API examples are also provided for each platform.

Enabling custom video capturing

To enable the custom video capturing feature of the TRTC SDK, you need to call the enableCustomVideoCapture API of TRTCCloud. Then, the TRTC SDK's camera capturing and image processing logic will be skipped, and only its encoding and transfer capabilities will be retained. Below is the sample code:
Android
// `context` is your Application or Activity context.
TRTCCloud mTRTCCloud = TRTCCloud.sharedInstance(context);
mTRTCCloud.enableCustomVideoCapture(TRTCCloudDef.TRTC_VIDEO_STREAM_TYPE_BIG, true);

iOS & macOS
self.trtcCloud = [TRTCCloud sharedInstance];
[self.trtcCloud enableCustomVideoCapture:TRTCVideoStreamTypeBig enable:YES];

Windows
liteav::ITRTCCloud* trtc_cloud = liteav::ITRTCCloud::getTRTCShareInstance();
trtc_cloud->enableCustomVideoCapture(TRTCVideoStreamType::TRTCVideoStreamTypeBig, true);
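For context, here is a rough Android sketch of how the enabling call fits into a session: create the TRTCCloud instance, enter a room, enable custom capture, and then push frames from your own capturer. This is a sketch rather than the official sample; `context`, `sdkAppId`, `userId`, `userSig`, and `roomId` are placeholders for your own values, and you should verify the exact call order against the API reference for your SDK version.

```
// A rough outline (sketch only): enter a room, enable custom capture,
// then feed frames from your own capturer via sendCustomVideoData().
TRTCCloud mTRTCCloud = TRTCCloud.sharedInstance(context);

TRTCCloudDef.TRTCParams params = new TRTCCloudDef.TRTCParams();
params.sdkAppId = sdkAppId;   // your application ID
params.userId = userId;       // your user ID
params.userSig = userSig;     // signature generated for this user
params.roomId = roomId;       // numeric room ID
mTRTCCloud.enterRoom(params, TRTCCloudDef.TRTC_APP_SCENE_VIDEOCALL);

// Skip the SDK's own camera pipeline; keep encoding and transfer.
mTRTCCloud.enableCustomVideoCapture(TRTCCloudDef.TRTC_VIDEO_STREAM_TYPE_BIG, true);

// From here on, push frames with sendCustomVideoData() as shown in the next section.
```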


Sending custom video frames

Then, you can use the sendCustomVideoData API of TRTCCloud to feed your own video data into the TRTC SDK. Below is the sample code:
Note:
To avoid performance loss, the video data fed to the TRTC SDK must meet different format requirements on different platforms. For more information, see LiteAVSDK Overview.
Android
// Two schemes are available for Android: Texture (recommended) and Buffer.
// Texture is used as the example here; a Buffer-scheme sketch follows the platform samples below.
TRTCCloudDef.TRTCVideoFrame videoFrame = new TRTCCloudDef.TRTCVideoFrame();
videoFrame.texture = new TRTCCloudDef.TRTCTexture();
videoFrame.texture.textureId = textureId;
videoFrame.texture.eglContext14 = eglContext;
videoFrame.width = width;
videoFrame.height = height;
videoFrame.timestamp = timestamp;
videoFrame.pixelFormat = TRTCCloudDef.TRTC_VIDEO_PIXEL_FORMAT_Texture_2D;
videoFrame.bufferType = TRTCCloudDef.TRTC_VIDEO_BUFFER_TYPE_TEXTURE;
mTRTCCloud.sendCustomVideoData(TRTCCloudDef.TRTC_VIDEO_STREAM_TYPE_BIG, videoFrame);

iOS & macOS
// On iOS and macOS, the camera captures video in NV12 format. CVPixelBufferRef is the natively
// supported and best-performing frame format; I420 and OpenGL 2D textures are also supported.
// The recommended CVPixelBufferRef is used as the example here.
TRTCVideoFrame *videoFrame = [[TRTCVideoFrame alloc] init];
videoFrame.pixelFormat = TRTCVideoPixelFormat_NV12;
videoFrame.bufferType = TRTCVideoBufferType_PixelBuffer;
videoFrame.pixelBuffer = imageBuffer;
videoFrame.timestamp = timeStamp;

[[TRTCCloud sharedInstance] sendCustomVideoData:TRTCVideoStreamTypeBig frame:videoFrame];

Windows
// Currently, only the Buffer scheme is available on Windows.
liteav::TRTCVideoFrame frame;
frame.timestamp = getTRTCShareInstance()->generateCustomPTS();
frame.videoFormat = liteav::TRTCVideoPixelFormat_I420;
frame.bufferType = liteav::TRTCVideoBufferType_Buffer;
frame.length = buffer_size;
frame.data = array.data();
frame.width = YUV_WIDTH;
frame.height = YUV_HEIGHT;
getTRTCShareInstance()->sendCustomVideoData(&frame);
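The Android comment above mentions a Buffer scheme as an alternative to Texture. Below is a minimal sketch of what that might look like, assuming the frame is available as an I420 byte array (`yuvData`, a hypothetical variable) of width * height * 3 / 2 bytes; the constants are the same ones used elsewhere in this document, but confirm the exact format requirements for your SDK version.

```
// Hypothetical Buffer-scheme sketch on Android (not the official sample).
// `yuvData` is assumed to hold one I420 frame of width * height * 3 / 2 bytes.
TRTCCloudDef.TRTCVideoFrame videoFrame = new TRTCCloudDef.TRTCVideoFrame();
videoFrame.data = yuvData;                                              // raw I420 bytes
videoFrame.pixelFormat = TRTCCloudDef.TRTC_VIDEO_PIXEL_FORMAT_I420;
videoFrame.bufferType = TRTCCloudDef.TRTC_VIDEO_BUFFER_TYPE_BYTE_ARRAY;
videoFrame.width = width;
videoFrame.height = height;
videoFrame.timestamp = timestamp;                                       // capture timestamp in ms, as in the Texture example
mTRTCCloud.sendCustomVideoData(TRTCCloudDef.TRTC_VIDEO_STREAM_TYPE_BIG, videoFrame);
```

Compared with the Texture scheme, this keeps the frame on the CPU, which is simpler to integrate but typically costs an extra memory copy.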



Custom Video Rendering

Custom rendering covers two cases: rendering the local video and rendering remote videos. After you set a custom rendering callback for the local or remote video, the TRTC SDK delivers the corresponding video frames (TRTCVideoFrame) through the onRenderVideoFrame callback, and you can then render them yourself. This process requires some knowledge of OpenGL. API examples are also provided for each platform.

Setting the local video rendering callback

Android
mTRTCCloud.setLocalVideoRenderListener(TRTCCloudDef.TRTC_VIDEO_PIXEL_FORMAT_Texture_2D, TRTCCloudDef.TRTC_VIDEO_BUFFER_TYPE_TEXTURE, new TRTCCloudListener.TRTCVideoRenderListener() {
    @Override
    public void onRenderVideoFrame(String userId, int streamType, TRTCCloudDef.TRTCVideoFrame frame) {
        // For more information, see the custom rendering tool class `com.tencent.trtc.mediashare.helper.CustomFrameRender` in `TRTC-API-Example`
    }
});

iOS & macOS
self.trtcCloud = [TRTCCloud sharedInstance];
[self.trtcCloud setLocalVideoRenderDelegate:self pixelFormat:TRTCVideoPixelFormat_NV12 bufferType:TRTCVideoBufferType_PixelBuffer];

Windows
```
// For specific implementation, see `test_custom_render.cpp` in `TRTC-API-Example-Qt`
void TestCustomRender::onRenderVideoFrame(const char* userId,
                                          liteav::TRTCVideoStreamType streamType,
                                          liteav::TRTCVideoFrame* frame) {
  if (gl_yuv_widget_ == nullptr) {
    return;
  }

  if (streamType == liteav::TRTCVideoStreamType::TRTCVideoStreamTypeBig) {
    // Adjust the rendering window
    emit renderViewSize(frame->width, frame->height);
    // Draw video frames
    gl_yuv_widget_->slotShowYuv(reinterpret_cast<uchar*>(frame->data),
                                frame->width, frame->height);
  }
}
```
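If your local renderer consumes raw YUV buffers rather than OpenGL textures, you can request CPU-accessible frames instead. The following Android sketch reuses the listener signature shown above with the I420/byte-array constants used in the remote example below; treat the exact combination of supported formats as something to verify for your SDK version.

```
// Hypothetical variant: request I420 byte-array frames for the local video
// instead of OpenGL textures (assuming your renderer works on raw YUV).
mTRTCCloud.setLocalVideoRenderListener(
        TRTCCloudDef.TRTC_VIDEO_PIXEL_FORMAT_I420,
        TRTCCloudDef.TRTC_VIDEO_BUFFER_TYPE_BYTE_ARRAY,
        new TRTCCloudListener.TRTCVideoRenderListener() {
            @Override
            public void onRenderVideoFrame(String userId, int streamType, TRTCCloudDef.TRTCVideoFrame frame) {
                // frame.data holds one I420 frame of frame.width x frame.height;
                // hand it to your own renderer here.
            }
        });
```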


Setting the remote video rendering callback

Android
mTRTCCloud.setRemoteVideoRenderListener(userId, TRTCCloudDef.TRTC_VIDEO_PIXEL_FORMAT_I420, TRTCCloudDef.TRTC_VIDEO_BUFFER_TYPE_BYTE_ARRAY, new TRTCCloudListener.TRTCVideoRenderListener() {
    @Override
    public void onRenderVideoFrame(String userId, int streamType, TRTCCloudDef.TRTCVideoFrame frame) {
        // For more information, see the custom rendering tool class `com.tencent.trtc.mediashare.helper.CustomFrameRender` in `TRTC-API-Example`
    }
});

iOS & macOS
- (void)onRenderVideoFrame:(TRTCVideoFrame *)frame
                    userId:(NSString *)userId
                streamType:(TRTCVideoStreamType)streamType
{
    // If `userId` is `nil`, the image rendered is the local image; otherwise it is a remote image.
    CFRetain(frame.pixelBuffer);
    __weak __typeof(self) weakSelf = self;
    dispatch_async(dispatch_get_main_queue(), ^{
        TestRenderVideoFrame *strongSelf = weakSelf;
        UIImageView *videoView = nil;
        if (userId) {
            videoView = [strongSelf.userVideoViews objectForKey:userId];
        } else {
            videoView = strongSelf.localVideoView;
        }
        videoView.image = [UIImage imageWithCIImage:[CIImage imageWithCVImageBuffer:frame.pixelBuffer]];
        videoView.contentMode = UIViewContentModeScaleAspectFit;
        CFRelease(frame.pixelBuffer);
    });
}

Windows
```
// For specific implementation, see `test_custom_render.cpp` in `TRTC-API-Example-Qt`
void TestCustomRender::onRenderVideoFrame(const char* userId,
                                          liteav::TRTCVideoStreamType streamType,
                                          liteav::TRTCVideoFrame* frame) {
  if (gl_yuv_widget_ == nullptr) {
    return;
  }

  if (streamType == liteav::TRTCVideoStreamType::TRTCVideoStreamTypeBig) {
    // Adjust the rendering window
    emit renderViewSize(frame->width, frame->height);
    // Draw video frames
    gl_yuv_widget_->slotShowYuv(reinterpret_cast<uchar*>(frame->data),
                                frame->width, frame->height);
  }
}
```
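To make the Android stub above a bit more concrete, here is a minimal sketch of a helper that forwards remote I420 frames to a renderer of your own. `YuvRenderer` and `RemoteRenderSetup` are hypothetical names defined here for illustration only (they are not part of the TRTC SDK), and the threading and buffering a production renderer needs (see `CustomFrameRender` in `TRTC-API-Example`) are omitted.

```
import com.tencent.trtc.TRTCCloud;
import com.tencent.trtc.TRTCCloudDef;
import com.tencent.trtc.TRTCCloudListener;

// Hypothetical renderer interface for illustration only; not part of the TRTC SDK.
interface YuvRenderer {
    void renderI420(byte[] yuvData, int width, int height);
}

final class RemoteRenderSetup {
    // Registers a remote render listener that forwards each I420 frame to `renderer`.
    static void listen(TRTCCloud cloud, String userId, final YuvRenderer renderer) {
        cloud.setRemoteVideoRenderListener(userId,
                TRTCCloudDef.TRTC_VIDEO_PIXEL_FORMAT_I420,
                TRTCCloudDef.TRTC_VIDEO_BUFFER_TYPE_BYTE_ARRAY,
                new TRTCCloudListener.TRTCVideoRenderListener() {
                    @Override
                    public void onRenderVideoFrame(String uid, int streamType, TRTCCloudDef.TRTCVideoFrame frame) {
                        if (streamType == TRTCCloudDef.TRTC_VIDEO_STREAM_TYPE_BIG && frame.data != null) {
                            renderer.renderI420(frame.data, frame.width, frame.height);
                        }
                    }
                });
    }
}
```

You would call this in place of the anonymous listener in the Android sample above, passing the TRTCCloud instance, the target userId, and your renderer.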