WebRTC Audio/Video Calls: Implementing a GPUImage Beauty-Filter Effect on Video

(Screenshot: the GPUImage beauty-filter effect applied during a WebRTC call.)
For setting up the ossrs service, see: https://blog.csdn.net/gloryFlow/article/details/132257196
For implementing iOS-side audio/video calls against ossrs, see: https://blog.csdn.net/gloryFlow/article/details/132262724
For the earlier issue of WebRTC calls showing no picture at high resolutions, see: https://blog.csdn.net/gloryFlow/article/details/132262724
For modifying the bitrate in the SDP, see: https://blog.csdn.net/gloryFlow/article/details/132263021
1. What is GPUImage?

GPUImage is an open-source iOS framework for OpenGL-based image processing. It ships with a large number of built-in filters and has a flexible architecture, so it is easy to build all kinds of image-processing features on top of it.
GPUImage contains many filters; I will not use most of them here. This post uses GPUImageLookupFilter together with GPUImagePicture.
GPUImage provides a filter dedicated to lookup-table (LUT) processing, GPUImageLookupFilter, which lets you apply a color filter to an image directly. The code is as follows:
/**
 Apply a GPUImageLookupFilter (lookup-table filter) to an image.

 @param image the original image you want to filter
 @param lookUpImage the lookup-table (LUT) image, e.g. a saved NewLookupTable.png
 @return the filtered image
 */
+ (UIImage *)applyLookupFilter:(UIImage *)image lookUpImage:(UIImage *)lookUpImage {
    if (lookUpImage == nil) {
        return image;
    }
    UIImage *inputImage = image;
    UIImage *outputImage = nil;
    GPUImagePicture *stillImageSource = [[GPUImagePicture alloc] initWithImage:inputImage];
    // Create the lookup filter
    GPUImageLookupFilter *lookUpFilter = [[GPUImageLookupFilter alloc] init];
    // Load the previously saved LUT image (e.g. NewLookupTable.png)
    GPUImagePicture *lookupImg = [[GPUImagePicture alloc] initWithImage:lookUpImage];
    [lookupImg addTarget:lookUpFilter atTextureLocation:1];
    [stillImageSource addTarget:lookUpFilter atTextureLocation:0];
    // Must be called before processing, or imageFromCurrentFramebuffer returns nil
    [lookUpFilter useNextFrameForImageCapture];
    if ([lookupImg processImageWithCompletionHandler:nil] &&
        [stillImageSource processImageWithCompletionHandler:nil]) {
        outputImage = [lookUpFilter imageFromCurrentFramebuffer];
    }
    return outputImage;
}
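As a quick illustration, the method above can be called like this (a minimal sketch; `photo` and the `lookup_jiari` asset name are assumptions, the latter taken from the filter class shown later in this post):

```objectivec
// Sketch: apply a LUT filter to an arbitrary UIImage.
// "photo" is any UIImage you already have; "lookup_jiari" is an assumed LUT asset name.
UIImage *photo = [UIImage imageNamed:@"photo"];
UIImage *lut = [UIImage imageNamed:@"lookup_jiari"];
UIImage *filtered = [SDApplyFilter applyLookupFilter:photo lookUpImage:lut];
// If the LUT image is nil, applyLookupFilter: simply returns the original image.
```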
This method requires a lookUpImage; the LUT images I use are shown below.
(Screenshot: the lookup-table images.)
I have not yet organized the demo into a git repository. You can also try other effects, for example a Lomo-style applyLomofiFilter. Here are several methods from SDApplyFilter.m:
+ (UIImage *)applyBeautyFilter:(UIImage *)image {
    // GPUImageBeautifyFilter is a widely used third-party beauty filter built on GPUImage
    GPUImageBeautifyFilter *filter = [[GPUImageBeautifyFilter alloc] init];
    [filter forceProcessingAtSize:image.size];
    GPUImagePicture *pic = [[GPUImagePicture alloc] initWithImage:image];
    [pic addTarget:filter];
    // Request capture before processing, or imageFromCurrentFramebuffer returns nil
    [filter useNextFrameForImageCapture];
    [pic processImage];
    return [filter imageFromCurrentFramebuffer];
}
/**
 Amatorka (Rise-style) filter; flatters skin tones in portraits.

 @param image the original image
 @return the filtered image
 */
+ (UIImage *)applyAmatorkaFilter:(UIImage *)image
{
    GPUImageAmatorkaFilter *filter = [[GPUImageAmatorkaFilter alloc] init];
    [filter forceProcessingAtSize:image.size];
    GPUImagePicture *pic = [[GPUImagePicture alloc] initWithImage:image];
    [pic addTarget:filter];
    // Request capture before processing so the framebuffer is retained
    [filter useNextFrameForImageCapture];
    [pic processImage];
    return [filter imageFromCurrentFramebuffer];
}
/**
 Retro filter; the result feels like old-time Shanghai.

 @param image the original image
 @return the filtered image
 */
+ (UIImage *)applySoftEleganceFilter:(UIImage *)image
{
    GPUImageSoftEleganceFilter *filter = [[GPUImageSoftEleganceFilter alloc] init];
    [filter forceProcessingAtSize:image.size];
    GPUImagePicture *pic = [[GPUImagePicture alloc] initWithImage:image];
    [pic addTarget:filter];
    // Request capture before processing so the framebuffer is retained
    [filter useNextFrameForImageCapture];
    [pic processImage];
    return [filter imageFromCurrentFramebuffer];
}
/**
 Converts the image to black and white with heavy noise.

 @param image the original image
 @return the filtered image
 */
+ (UIImage *)applyLocalBinaryPatternFilter:(UIImage *)image
{
    GPUImageLocalBinaryPatternFilter *filter = [[GPUImageLocalBinaryPatternFilter alloc] init];
    [filter forceProcessingAtSize:image.size];
    GPUImagePicture *pic = [[GPUImagePicture alloc] initWithImage:image];
    [pic addTarget:filter];
    // Request capture before processing so the framebuffer is retained
    [filter useNextFrameForImageCapture];
    [pic processImage];
    return [filter imageFromCurrentFramebuffer];
}
/**
 Monochrome filter.

 @param image the original image
 @return the filtered image
 */
+ (UIImage *)applyMonochromeFilter:(UIImage *)image
{
    GPUImageMonochromeFilter *filter = [[GPUImageMonochromeFilter alloc] init];
    [filter forceProcessingAtSize:image.size];
    GPUImagePicture *pic = [[GPUImagePicture alloc] initWithImage:image];
    [pic addTarget:filter];
    // Request capture before processing so the framebuffer is retained
    [filter useNextFrameForImageCapture];
    [pic processImage];
    return [filter imageFromCurrentFramebuffer];
}
(Screenshot: GPUImageSoftEleganceFilter, the retro old-Shanghai look.)
(Screenshot: GPUImageLocalBinaryPatternFilter, black and white with heavy noise.)
(Screenshot: GPUImageMonochromeFilter.)
2. Applying video filters in a WebRTC audio/video call

For implementing iOS-side calls against ossrs, see: https://blog.csdn.net/gloryFlow/article/details/132262724
That post already contains the complete code; here we only make a few adjustments.
First, point RTCCameraVideoCapturer's delegate at our class:
- (RTCVideoTrack *)createVideoTrack {
    RTCVideoSource *videoSource = [self.factory videoSource];
    self.localVideoSource = videoSource;
    // The simulator has no camera, so capture from a file instead
    if (TARGET_IPHONE_SIMULATOR) {
        if (@available(iOS 10, *)) {
            self.videoCapturer = [[RTCFileVideoCapturer alloc] initWithDelegate:self];
        } else {
            // Fallback on earlier versions
        }
    } else {
        self.videoCapturer = [[RTCCameraVideoCapturer alloc] initWithDelegate:self];
    }
    RTCVideoTrack *videoTrack = [self.factory videoTrackWithSource:videoSource trackId:@"video0"];
    return videoTrack;
}
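Note that createVideoTrack only creates the capturer; capture itself still has to be started. A minimal sketch of that step, using the standard RTCCameraVideoCapturer API (device and format selection is simplified here, and the `startCapture` method name is my own):

```objectivec
// Sketch: start camera capture for the capturer created in createVideoTrack.
// Production code should pick a specific camera position, a matching format, and handle nil.
- (void)startCapture {
    RTCCameraVideoCapturer *capturer = (RTCCameraVideoCapturer *)self.videoCapturer;
    AVCaptureDevice *device = [RTCCameraVideoCapturer captureDevices].firstObject;
    AVCaptureDeviceFormat *format = [RTCCameraVideoCapturer supportedFormatsForDevice:device].firstObject;
    // Once capture starts, frames arrive in capturer:didCaptureVideoFrame: below
    [capturer startCaptureWithDevice:device format:format fps:30];
}
```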
Then implement the RTCVideoCapturerDelegate method didCaptureVideoFrame:

#pragma mark - RTCVideoCapturerDelegate
- (void)capturer:(RTCVideoCapturer *)capturer didCaptureVideoFrame:(RTCVideoFrame *)frame {
    // DebugLog(@"capturer:%@ didCaptureVideoFrame:%@", capturer, frame);
    // Hand the frame to SDWebRTCBufferFliter for filtering
    RTCVideoFrame *aFilterVideoFrame;
    if (self.delegate && [self.delegate respondsToSelector:@selector(webRTCClient:didCaptureVideoFrame:)]) {
        aFilterVideoFrame = [self.delegate webRTCClient:self didCaptureVideoFrame:frame];
    }
    // If you copy the buffer via C APIs, release it manually with CVPixelBufferRelease,
    // otherwise memory usage will balloon.
    // The raw buffer is ((RTCCVPixelBuffer *)frame.buffer).pixelBuffer
    if (!aFilterVideoFrame) {
        aFilterVideoFrame = frame;
    }
    // Forward the filtered frame; forwarding the original frame here would discard the filter
    [self.localVideoSource capturer:capturer didCaptureVideoFrame:aFilterVideoFrame];
}
The filtering itself is done by SDWebRTCBufferFliter.
It renders ((RTCCVPixelBuffer *)frame.buffer).pixelBuffer using EAGLContext and CIContext.
EAGLContext is the OpenGL drawing context; before drawing, you must use a context you have created.
CIContext renders CIImage objects, applying the filter chain attached to a CIImage back onto the raw image data. Here I need to convert between UIImage and CIImage.
The implementation is as follows:
SDWebRTCBufferFliter.h
#import <Foundation/Foundation.h>
#import "WebRTCClient.h"
@interface SDWebRTCBufferFliter : NSObject
- (RTCVideoFrame *)webRTCClient:(WebRTCClient *)client didCaptureVideoFrame:(RTCVideoFrame *)frame;
@end
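For context, here is a hedged sketch of how this filter might be wired into the call: the view controller owns one SDWebRTCBufferFliter and forwards the client's per-frame delegate callback to it (the `bufferFilter` property name and the delegate assignment are my assumptions based on the code in this post):

```objectivec
// Sketch, e.g. in the view controller that owns the call:
// self.bufferFilter = [[SDWebRTCBufferFliter alloc] init];
// self.webRTCClient.delegate = self;

// WebRTCClient calls this from capturer:didCaptureVideoFrame: for every captured frame;
// we simply delegate the per-frame work to the filter object.
- (RTCVideoFrame *)webRTCClient:(WebRTCClient *)client didCaptureVideoFrame:(RTCVideoFrame *)frame {
    return [self.bufferFilter webRTCClient:client didCaptureVideoFrame:frame];
}
```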
SDWebRTCBufferFliter.m
#import "SDWebRTCBufferFliter.h"
#import <VideoToolbox/VideoToolbox.h>
#import "SDApplyFilter.h"
@interface SDWebRTCBufferFliter ()
// Rendering contexts for the filter
@property (nonatomic, strong) EAGLContext *eaglContext;
@property (nonatomic, strong) CIContext *coreImageContext;
@property (nonatomic, strong) UIImage *lookUpImage;
@end

@implementation SDWebRTCBufferFliter

- (instancetype)init
{
    self = [super init];
    if (self) {
        self.eaglContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES3];
        if (!self.eaglContext) {
            // Fall back to OpenGL ES 2 on devices without ES3 support
            self.eaglContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
        }
        self.coreImageContext = [CIContext contextWithEAGLContext:self.eaglContext options:nil];
        self.lookUpImage = [UIImage imageNamed:@"lookup_jiari"];
    }
    return self;
}
- (RTCVideoFrame *)webRTCClient:(WebRTCClient *)client didCaptureVideoFrame:(RTCVideoFrame *)frame {
    CVPixelBufferRef pixelBufferRef = ((RTCCVPixelBuffer *)frame.buffer).pixelBuffer;
    if (pixelBufferRef) {
        // Wrap the raw buffer in a CIImage and convert it to a UIImage for GPUImage
        CIImage *inputImage = [CIImage imageWithCVPixelBuffer:pixelBufferRef];
        CGImageRef imgRef = [_coreImageContext createCGImage:inputImage fromRect:[inputImage extent]];
        UIImage *fromImage = [UIImage imageWithCGImage:imgRef];
        // Apply the filter; swap in applyLookupFilter:lookUpImage: or any other
        // SDApplyFilter method for a different effect
        UIImage *toImage = [SDApplyFilter applyMonochromeFilter:fromImage];
        if (toImage == nil) {
            toImage = fromImage;
        }
        // Render the filtered image back into the original pixel buffer
        CIImage *ciimage = [CIImage imageWithCGImage:toImage.CGImage];
        [_coreImageContext render:ciimage toCVPixelBuffer:pixelBufferRef];
        CGImageRelease(imgRef); // must be released, or memory will keep growing
    }
    RTCCVPixelBuffer *rtcPixelBuffer =
        [[RTCCVPixelBuffer alloc] initWithPixelBuffer:pixelBufferRef];
    RTCVideoFrame *filteredFrame =
        [[RTCVideoFrame alloc] initWithBuffer:rtcPixelBuffer
                                     rotation:frame.rotation
                                  timeStampNs:frame.timeStampNs];
    return filteredFrame;
}

@end
With this in place, you can see the GPUImage beauty-filter effect in a WebRTC audio/video call.

3. Summary

WebRTC audio/video calls with a GPUImage beauty-filter effect: the main idea is to use GPUImage to process each video frame's CVPixelBufferRef, wrap the processed CVPixelBufferRef in a new RTCVideoFrame, and forward it through localVideoSource's capturer:didCaptureVideoFrame: method. There is a lot of material here and some descriptions may be imprecise; apologies in advance.
Original post: https://blog.csdn.net/gloryFlow/article/details/132265842
A learning record; improving a little every day.