This repository is a fork of livekit-ffi to enable the use of WebRTC with the livekit.org.webrtc
prefix in the Android library. This allows integrating two different WebRTC libraries within the same project.
To get started, clone the repository and switch to the desired tag:
git clone https://github.com/DisplayNote/dn_livekit_ffi.git
cd dn_livekit_ffi
git checkout <tag-to-generate>
To get new changes from the upstream branch, create a new branch from main called support/ffi-vx.xx.xx to receive them:
git checkout -b support/ffi-v0.13.0
Get all new changes from upstream:
git fetch upstream --tags
Do a rebase:
git rebase ffi-v0.13.0
Resolve any conflicts and continue the rebase; then compile webrtc first and ffi afterwards:
git rebase --continue
Once changes are applied and webrtc is built, generate the Conan build directory, which will contain the necessary files for uploading to Conan.
To pick up the generated webrtc zip file, set the LK_ARTIFACT_WEBRTC environment variable to the path where the generated webrtc zip is located.
generate_conan_build.bat --platform windows
generate_conan_build.bat --platform android
./generate_conan_build.sh --platform android
Note: Windows builds cannot be compiled from Linux.
After execution, a livekit-ffi_conan
directory will be created at the root of the project, containing the necessary files for Conan package export and upload.
Make sure to set the correct version to upload in your conanfile.py.
To export the package for different profiles, execute the following command:
conan export-pkg . livekit-ffi/0.7.2@dn/stable -pr android.arm64-v8a.debug -f &&
conan export-pkg . livekit-ffi/0.7.2@dn/stable -pr android.arm64-v8a.release -f &&
conan export-pkg . livekit-ffi/0.7.2@dn/stable -pr android.armeabi-v7a.debug -f &&
conan export-pkg . livekit-ffi/0.7.2@dn/stable -pr android.armeabi-v7a.release -f &&
conan export-pkg . livekit-ffi/0.7.2@dn/stable -pr msvc19.x86_64.debug -f &&
conan export-pkg . livekit-ffi/0.7.2@dn/stable -pr msvc19.x86_64.release -f
Finally, upload the package to the Conan repository:
conan upload livekit-ffi/0.7.2@dn/stable -r dn --all
Replace 0.7.2
with the appropriate tag version as needed.
This fork ensures compatibility with projects requiring multiple WebRTC implementations while maintaining seamless integration with Conan package management.

Use this SDK to add real-time video, audio and data features to your Rust app. By connecting to a self- or cloud-hosted LiveKit server, you can quickly build applications like interactive live streaming or video calls with just a few lines of code.
- Receiving tracks
- Publishing tracks
- Data channels
- Simulcast
- SVC codecs (AV1/VP9)
- Adaptive Streaming
- Dynacast
- Hardware video enc/dec
  - VideoToolbox for MacOS/iOS
- Supported Platforms
  - Windows
  - MacOS
  - Linux
  - iOS
  - Android
- livekit-api: Server APIs and auth token generation
- livekit: LiveKit real-time SDK
- livekit-ffi: Internal crate, used to generate bindings for other languages
- livekit-protocol: LiveKit protocol generated code
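For example, a typical client application depends on the first two crates and imports them like this (a minimal sketch mirroring the snippets below):

```rust
use livekit::prelude::*;       // real-time SDK: Room, RoomEvent, track types, ...
use livekit_api::access_token; // server-side access token generation
```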
When adding the SDK as a dependency to your project, make sure to add the necessary rustflags to your cargo config, otherwise linking may fail.
Also, please refer to the list of supported platform toolkits.
Currently, Tokio is required to use this SDK; however, we plan to make the async executor runtime agnostic.
use livekit_api::access_token;
use std::env;
fn create_token() -> Result<String, access_token::AccessTokenError> {
    let api_key = env::var("LIVEKIT_API_KEY").expect("LIVEKIT_API_KEY is not set");
    let api_secret = env::var("LIVEKIT_API_SECRET").expect("LIVEKIT_API_SECRET is not set");
    let token = access_token::AccessToken::with_api_key(&api_key, &api_secret)
        .with_identity("rust-bot")
        .with_name("Rust Bot")
        .with_grants(access_token::VideoGrants {
            room_join: true,
            room: "my-room".to_string(),
            ..Default::default()
        })
        .to_jwt();
    return token
}
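As a quick usage sketch (not part of the original example), the function above can be called from a small binary to print a token for testing:

```rust
fn main() {
    // Uses only the create_token function defined above.
    match create_token() {
        Ok(token) => println!("access token: {token}"),
        Err(err) => eprintln!("failed to create token: {err:?}"),
    }
}
```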
use livekit_api::services::room::{CreateRoomOptions, RoomClient};
#[tokio::main]
async fn main() {
    let room_service = RoomClient::new("http://localhost:7880").unwrap();
    let room = room_service
        .create_room("my_room", CreateRoomOptions::default())
        .await
        .unwrap();
    println!("Created room: {:?}", room);
}
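The same client can manage existing rooms. The sketch below is a hedged illustration rather than the authoritative API: it assumes RoomClient also exposes list_rooms (taking a list of room names to filter by, empty for all) and delete_room (taking the room name); check the livekit-api documentation for the exact signatures in your version.

```rust
use livekit_api::services::room::RoomClient;

#[tokio::main]
async fn main() {
    // Constructed the same way as in the example above.
    let room_service = RoomClient::new("http://localhost:7880").unwrap();

    // Assumed signature: an empty name filter lists all active rooms.
    let rooms = room_service.list_rooms(vec![]).await.unwrap();
    println!("Active rooms: {rooms:?}");

    // Assumed signature: delete_room takes the room name.
    room_service.delete_room("my_room").await.unwrap();
}
```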
use livekit::prelude::*;
#[tokio::main]
async fn main() -> Result<()> {
    let (room, mut room_events) = Room::connect(&url, &token).await?;
    while let Some(event) = room_events.recv().await {
        match event {
            RoomEvent::TrackSubscribed { track, publication, participant } => {
                // ...
            }
            _ => {}
        }
    }
    Ok(())
}
...
use futures::StreamExt; // this trait is required for iterating on audio & video frames
use livekit::prelude::*;
match event {
    RoomEvent::TrackSubscribed { track, publication, participant } => {
        match track {
            RemoteTrack::Audio(audio_track) => {
                let rtc_track = audio_track.rtc_track();
                let mut audio_stream = NativeAudioStream::new(rtc_track);
                tokio::spawn(async move {
                    // Receive the audio frames in a new task
                    while let Some(audio_frame) = audio_stream.next().await {
                        log::info!("received audio frame - {audio_frame:#?}");
                    }
                });
            },
            RemoteTrack::Video(video_track) => {
                let rtc_track = video_track.rtc_track();
                let mut video_stream = NativeVideoStream::new(rtc_track);
                tokio::spawn(async move {
                    // Receive the video frames in a new task
                    while let Some(video_frame) = video_stream.next().await {
                        log::info!("received video frame - {video_frame:#?}");
                    }
                });
            },
        }
    },
    _ => {}
}
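As a rough sketch of what to do with the received data, the loop below extends the audio branch above to log a simple peak level per frame; the data and sample_rate field names are assumptions about the AudioFrame struct and may differ between SDK versions:

```rust
// Hedged sketch: inspect received audio frames inside the spawned task above.
while let Some(audio_frame) = audio_stream.next().await {
    // audio_frame.data is assumed to hold interleaved i16 PCM samples.
    let peak = audio_frame
        .data
        .iter()
        .map(|sample| sample.unsigned_abs())
        .max()
        .unwrap_or(0);
    log::info!(
        "received {} samples at {} Hz, peak amplitude {}",
        audio_frame.data.len(),
        audio_frame.sample_rate,
        peak
    );
}
```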
- basic room: simple example connecting to a room.
- wgpu_room: complete example app with video rendering using wgpu and egui.
- mobile: mobile app targeting iOS and Android
- play_from_disk: publish audio from a wav file
- save_to_disk: save received audio to a wav file
LiveKit aims to provide an open source, end-to-end WebRTC stack that works everywhere. We have two goals in mind with this SDK:
- Build a standalone, cross-platform LiveKit client SDK for Rustaceans.
- Build a common core for other platform-specific SDKs (e.g. Unity, Unreal, iOS, Android)
Regarding (2), we've already developed a number of client SDKs for several platforms and encountered a few challenges in the process:
- There's a significant amount of business/control logic in our signaling protocol and WebRTC. Currently, this logic needs to be implemented in every new platform we support.
- Interactions with media devices and encoding/decoding are specific to each platform and framework.
- For multi-platform frameworks (e.g. Unity, Flutter, React Native), the aforementioned tasks proved to be extremely painful.
Thus, we posited that a Rust SDK, something we wanted to build anyway, encapsulating all our business logic and platform-specific APIs into a clean set of abstractions, could also serve as the foundation for our other SDKs!
We'll first use it as a basis for our Unity SDK (under development), but over time, it will power our other SDKs, as well.
| LiveKit Ecosystem | |
|---|---|
| Real-time SDKs | React Components · Browser · iOS/macOS · Android · Flutter · React Native · Rust · Node.js · Python · Unity (web) · Unity (beta) |
| Server APIs | Node.js · Golang · Ruby · Java/Kotlin · Python · Rust · PHP (community) |
| Agents Frameworks | Python · Playground |
| Services | Livekit server · Egress · Ingress · SIP |
| Resources | Docs · Example apps · Cloud · Self-hosting · CLI |