feat: Add onStarted and onStopped events #2273
Conversation
// Android: dispatch the "cameraStarted" event to the JS side
Log.i(CameraView.TAG, "invokeOnStarted()")
val reactContext = context as ReactContext
reactContext.getJSModule(RCTEventEmitter::class.java).receiveEvent(id, "cameraStarted", null)
Hey @mrousavy, maybe we should add the native timestamp to this event?
Otherwise the JS thread may take a long time to consume these events, so anyone who needs absolute precision won't be able to get it on the JS thread, which makes this feature less useful.
Hey, why would you need the timestamp? What would that actually change?
I can think of a feature where you want to synchronize video frames with another source, e.g. Bluetooth data from a device, down to a few milliseconds of delay.
To do that, you need precise control over when the video started/stopped streaming frames, so you can match it against the device's input stream.
Without a native timestamp on these events, we can only record the timestamp on the JS thread when the event is received. If the JS thread is blocked and the event is processed later in the queue, we end up with a completely wrong timestamp.
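To illustrate the concern, a minimal sketch of consuming such an event on the JS side. The nativeTimestamp field is hypothetical, it is what this comment thread proposes and is not part of this PR; today you can only fall back to Date.now() when the callback fires.

```ts
// Hypothetical event payload – `nativeTimestamp` is NOT part of this PR,
// it is the field being proposed in this comment thread.
interface CameraStartedEvent {
  nativeTimestamp?: number // ms since epoch, captured on the native side
}

function onStarted(event?: CameraStartedEvent) {
  // Taken on the JS thread – this can lag behind the real start time
  // if the JS thread was busy when the event was queued.
  const jsTimestamp = Date.now()

  // With a native timestamp we could measure (and correct for) that lag.
  if (event?.nativeTimestamp != null) {
    console.log(`JS delay: ${jsTimestamp - event.nativeTimestamp} ms`)
  }
}
```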
Tbh I don't agree here. This should definitely be done with a Frame Processor instead, where you get the full stream of frames with their exact timestamps, and those are guaranteed to be on time because the Frame Processor runs synchronously and blocks.
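For reference, a minimal Frame Processor sketch along those lines, assuming a VisionCamera version where the Frame object exposes width, height and a timestamp property:

```ts
import { useFrameProcessor } from 'react-native-vision-camera'

// Must be called from a React component; pass the result to <Camera frameProcessor={...} />.
function useFrameTimestamps() {
  return useFrameProcessor((frame) => {
    'worklet'
    // frame.timestamp is the presentation timestamp of this exact frame,
    // captured natively – no JS-thread scheduling delay involved.
    console.log(`Got a ${frame.width}x${frame.height} frame at ${frame.timestamp}`)
  }, [])
}
```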
Is it possible to return the frame's size in the onStarted event?
No, because there is no "frame" if there is no VideoPipeline, for example. You could get the size from your format if you are not using video.
If I specify the video format as 1080p, the frame's size is 1920x1080 on Android and 1080x1920 on iOS. If this is fixed behavior, I can handle the difference per platform and don't need to get the frame size from an event. But it would be great to have an event that returns the actual frame size after the camera starts.
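As a rough illustration of the per-platform handling mentioned above, a hypothetical helper (normalizeFrameSize is not part of any library) that treats the reported size as the same format regardless of which dimension comes first:

```ts
// Hypothetical helper: normalize a reported frame size so the longer
// dimension is always `width`, matching the observation that Android
// reports 1920x1080 while iOS reports 1080x1920 for the same 1080p format.
function normalizeFrameSize(width: number, height: number): { width: number; height: number } {
  return width >= height ? { width, height } : { width: height, height: width }
}

// Example: both calls yield { width: 1920, height: 1080 }
normalizeFrameSize(1920, 1080)
normalizeFrameSize(1080, 1920)
```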
What
Adds two events:
onStarted: Called when the Camera started streaming frames
onStopped: Called when the Camera stops streaming frames
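For context, a minimal usage sketch of the two new props added by this PR (device setup omitted; useCameraDevice and the surrounding component code are assumptions based on the current VisionCamera API):

```tsx
import React from 'react'
import { StyleSheet } from 'react-native'
import { Camera, useCameraDevice } from 'react-native-vision-camera'

function CameraScreen() {
  const device = useCameraDevice('back')
  if (device == null) return null

  return (
    <Camera
      style={StyleSheet.absoluteFill}
      device={device}
      isActive={true}
      // New in this PR: fired when the Camera starts/stops streaming frames
      onStarted={() => console.log('Camera started streaming frames')}
      onStopped={() => console.log('Camera stopped streaming frames')}
    />
  )
}
```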
Changes
Tested on
Related issues