Support for AVSampleBufferDisplayLayer format (iOS) #8910
You can create a `CVPixelBufferRef` from OpenGL:

```swift
class RenderView: GLKView {
    var frameBuffer: SRFrameBuffer?

    func drawRect() {
        // ....
        if let pixelBuffer = frameBuffer?.target {
            glFinish()
            renderDelegate?.renderView(self, dispatchCVPixelBuffer: pixelBuffer)
        }
    }
}

extension AVSampleBufferDisplayLayer {
    func process(_ pixelBuffer: CVPixelBuffer) {
        autoreleasepool {
            var timingInfo: CMSampleTimingInfo = .invalid
            var sampleBufferOut: CMSampleBuffer?
            var formatDescription: CMFormatDescription?
            CMVideoFormatDescriptionCreateForImageBuffer(allocator: kCFAllocatorDefault,
                                                         imageBuffer: pixelBuffer,
                                                         formatDescriptionOut: &formatDescription)
            if let formatDescription = formatDescription {
                CMSampleBufferCreateForImageBuffer(allocator: kCFAllocatorDefault,
                                                   imageBuffer: pixelBuffer,
                                                   dataReady: true,
                                                   makeDataReadyCallback: nil,
                                                   refcon: nil,
                                                   formatDescription: formatDescription,
                                                   sampleTiming: &timingInfo,
                                                   sampleBufferOut: &sampleBufferOut)
                if let sampleBufferOut = sampleBufferOut,
                   let attachments = CMSampleBufferGetSampleAttachmentsArray(sampleBufferOut, createIfNecessary: true) {
                    // Mark the sample to be displayed immediately, since no timing info is attached.
                    let attachment = unsafeBitCast(CFArrayGetValueAtIndex(attachments, 0), to: CFMutableDictionary.self)
                    CFDictionarySetValue(attachment,
                                         Unmanaged.passUnretained(kCMSampleAttachmentKey_DisplayImmediately).toOpaque(),
                                         Unmanaged.passUnretained(kCFBooleanTrue).toOpaque())
                    self.enqueue(sampleBufferOut)
                    if self.status == .failed {
                        self.flush()
                        print("AVSampleBufferDisplayLayer enqueue status failed")
                        self.enqueue(sampleBufferOut)
                    }
                    if let error = self.error {
                        print("AVSampleBufferDisplayLayer enqueue error: \(error)")
                    }
                }
            }
        }
    }
}
```
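As a hedged side note (not from this thread): retrying a failed enqueue blindly can loop. Since iOS 14.1 the layer exposes `requiresFlushToResumeDecoding`, so a wrapper can flush only when the layer actually needs it. A minimal sketch in the same extension style; `enqueueIfPossible` is a hypothetical name:

```swift
import AVFoundation
import CoreMedia

extension AVSampleBufferDisplayLayer {
    // Sketch: flush only when the layer reports it must be flushed before
    // decoding can resume, then enqueue only if it can accept more data.
    func enqueueIfPossible(_ sampleBuffer: CMSampleBuffer) {
        if #available(iOS 14.1, *), requiresFlushToResumeDecoding {
            flush()
        }
        guard isReadyForMoreMediaData else { return }
        enqueue(sampleBuffer)
        if status == .failed, let error = error {
            print("AVSampleBufferDisplayLayer enqueue error: \(error)")
        }
    }
}
```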
Thanks for the tip, but I can't get it to work. The pixelBuffer is always empty; I think I have a problem while binding the framebuffer and rendering from
The fbo should use frameBuffer's fbo:

```swift
if let frameBuffer = frameBuffer {
    fboId = GLint(frameBuffer.frameBuffer)
    dimension = frameBuffer.size
    flip.pointee = 0
}
```
Yep, I'm using the fbo, but I'm missing something else (is my test OK?):

```swift
override func draw(_ rect: CGRect) {
    EAGLContext.setCurrent(self.context)
    var flip: CInt = 1
    if let frameBuffer = frameBuffer {
        glGetIntegerv(GLenum(GL_DRAW_FRAMEBUFFER_BINDING), &frameBuffer.frameBuffer)
        withUnsafeMutablePointer(to: &flip) { flip in
            if HandlerManager.shared.activePlayer().mpvRenderContext != nil {
                fbo = GLint(frameBuffer.frameBuffer)
                var data = mpv_opengl_fbo(fbo: Int32(fbo),
                                          w: Int32(self.drawableWidth),
                                          h: Int32(self.drawableHeight),
                                          internal_format: 0)
                withUnsafeMutablePointer(to: &data) { data in
                    var params: [mpv_render_param] = [
                        mpv_render_param(type: MPV_RENDER_PARAM_OPENGL_FBO, data: .init(data)),
                        mpv_render_param(type: MPV_RENDER_PARAM_FLIP_Y, data: .init(flip)),
                        mpv_render_param()
                    ]
                    mpv_render_context_render(HandlerManager.shared.activePlayer().mpvRenderContext, &params)
                }
            } else {
                glClearColor(0, 0, 0, 1)
            }
        }
    }
    glFlush()
    glBindFramebuffer(GLenum(GL_FRAMEBUFFER), 0)
    // Test if we have something in target?
    if let pixelBuffer = frameBuffer?.target {
        glFinish()
        //renderDelegate?.renderView(self, dispatchCVPixelBuffer: pixelBuffer)
        var cgImage: CGImage?
        VTCreateCGImageFromCVPixelBuffer(pixelBuffer, options: nil, imageOut: &cgImage)
        let image1 = UIImage(cgImage: cgImage!)
        let image2 = UIImage(ciImage: CIImage(cvPixelBuffer: pixelBuffer))
        let image3 = pixelBuffer.cgImage()?.image()
        print("image empty?", image1, image2, image3)
    }
}
```
I dug this up and pushed it here: tmm1@d438818
Hi @tmm1, thanks a lot! Watching the logs, I see that:
Looks like vo
Do you have any sample code or tips on how to use it?
It can only render CVPixBuf, so I think you need
Tested with an h264 video, still failing.
Yeah, it won't work if the pixel format is yuv420p. You need the pixel format to be videotoolbox_vld. I can't tell why it's not decoding with videotoolbox; there should be more info in the preceding
Alternatively, you can convert yuv420p into videotoolbox_vld using lavf hwupload. I added support for that in https://www.mail-archive.com/ffmpeg-devel@ffmpeg.org/msg121506.html
Yes, it works with the filter. Thanks!
Hi, I did everything you said @tmm1. It's working, but the video is laggy, and the log doesn't display any error. Did you have this issue @skrew? Is there something I'm missing? Thank you.
Screen.Recording.2023-02-01.at.7.55.15.PM.mov
Here are the logs:
I am also encountering the laggy video issue when using
I guess the reason the video is laggy is that the sample buffer is displayed immediately: the frame is displayed as soon as it is decoded, so frames are not synced with their actual presentation time.

```c
CFDictionarySetValue(
    (CFMutableDictionaryRef)CFArrayGetValueAtIndex(attachments, 0),
    kCMSampleAttachmentKey_DisplayImmediately,
    kCFBooleanTrue
);
```

After I comment that out and drive the layer from a control timebase instead:

```swift
// renderview.swift
let timebasePointer = UnsafeMutablePointer<CMTimebase?>.allocate(capacity: 1)
let status = CMTimebaseCreateWithSourceClock(allocator: kCFAllocatorDefault,
                                             sourceClock: CMClockGetHostTimeClock(),
                                             timebaseOut: timebasePointer)
displayLayer.controlTimebase = timebasePointer.pointee
if let controlTimeBase = displayLayer.controlTimebase, status == noErr {
    CMTimebaseSetTime(controlTimeBase, time: CMTime.zero)
    CMTimebaseSetRate(controlTimeBase, rate: 1)
}
```

```objc
// vo_avfoundation.m
CMTimebaseRef timebase = [p->displayLayer controlTimebase];
CMTime nowTime = CMTimebaseGetTime(timebase);
CVPixelBufferRef pixbuf = (CVPixelBufferRef)img->planes[3];
CMSampleTimingInfo info = {
    .presentationTimeStamp = nowTime,
    .duration = kCMTimeInvalid,
    .decodeTimeStamp = kCMTimeInvalid
};
...
// CFDictionarySetValue(
//     (CFMutableDictionaryRef)CFArrayGetValueAtIndex(attachments, 0),
//     kCMSampleAttachmentKey_DisplayImmediately,
//     kCFBooleanTrue
// );
```
Updated: here is where I ended up putting the enqueue:

```c
static void flip_page(struct vo *vo)
{
    struct priv *p = vo->priv;
    struct mp_image *img = p->next_image;
    if (!img)
        return;
    mp_image_unrefp(&p->next_image);
}

static void draw_frame(struct vo *vo, struct vo_frame *frame)
{
    struct priv *p = vo->priv;
    mp_image_t *mpi = NULL;
    if (!frame->redraw && !frame->repeat)
        mpi = mp_image_new_ref(frame->current);
    talloc_free(p->next_image);
    if (!mpi)
        return;
    CVPixelBufferRef pixbuf = (CVPixelBufferRef)mpi->planes[3];
    CMSampleTimingInfo info = {
        .presentationTimeStamp = kCMTimeInvalid,
        .duration = kCMTimeInvalid,
        .decodeTimeStamp = kCMTimeInvalid
    };
    CMSampleBufferRef buf = NULL;
    CMFormatDescriptionRef format = NULL;
    CMVideoFormatDescriptionCreateForImageBuffer(NULL, pixbuf, &format);
    CMSampleBufferCreateReadyWithImageBuffer(
        NULL,
        pixbuf,
        format,
        &info,
        &buf
    );
    CFRelease(format);
    CFArrayRef attachments = CMSampleBufferGetSampleAttachmentsArray(buf, YES);
    CFDictionarySetValue(
        (CFMutableDictionaryRef)CFArrayGetValueAtIndex(attachments, 0),
        kCMSampleAttachmentKey_DisplayImmediately,
        kCFBooleanTrue
    );
    [p->displayLayer enqueueSampleBuffer:buf];
    CFRelease(buf);
    p->next_image = mpi;
    p->next_vo_pts = frame->pts;
}
```

PS: I do not know if this is the right place to enqueueSampleBuffer.
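On that PS: AVFoundation's idiomatic pattern is to let the layer pull frames via `requestMediaDataWhenReady(on:using:)` rather than pushing unconditionally from `draw_frame`. A minimal sketch, not from this thread; the `SampleBufferFeeder` class and its `pending` queue are hypothetical names:

```swift
import AVFoundation
import CoreMedia

final class SampleBufferFeeder {
    private let layer: AVSampleBufferDisplayLayer
    private let queue = DispatchQueue(label: "sample-buffer-feeder")
    private var pending: [CMSampleBuffer] = []

    init(layer: AVSampleBufferDisplayLayer) {
        self.layer = layer
        // The layer invokes this block on `queue` whenever it can accept
        // more data; drain the backlog while it stays ready.
        layer.requestMediaDataWhenReady(on: queue) { [weak self] in
            guard let self = self else { return }
            while self.layer.isReadyForMoreMediaData, !self.pending.isEmpty {
                self.layer.enqueue(self.pending.removeFirst())
            }
        }
    }

    // The producer (e.g. the vo) just appends; pacing is left to the layer.
    func submit(_ buffer: CMSampleBuffer) {
        queue.async { self.pending.append(buffer) }
    }
}
```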
Hi @byMohamedali, do you have a subtitle display issue with this vo? External subtitles are not showing.
Yes, I have the same issue when using avfoundation. I'm investigating how we could solve it; please let me know if you find a solution :)
In #7857 @tmm1 said he was working on adding `AVSampleBufferDisplayLayer` on iOS via the `--wid` option. I would like to replace my current `GLES` implementation with `AVSampleBufferDisplayLayer`, as `OpenGL` has been deprecated by Apple for a long time now. It would be good to get the right buffer format from `libmpv` to queue frames to `CMSampleBuffer`.
What do you think?