Use this SDK to add realtime video, audio and data features to your Swift app. By connecting to LiveKit Cloud or a self-hosted server, you can quickly build applications such as multi-modal AI, live streaming, or video calls with just a few lines of code.
Note
Version 2 of the Swift SDK contains breaking changes from Version 1. Read the migration guide for a detailed overview of what has changed.
Docs and guides are at https://docs.livekit.io.
Full source code of an iOS/macOS SwiftUI Example App is available.
For minimal examples, see this repo 👉 Swift SDK Examples
LiveKit for Swift is available as a Swift Package.
Add the dependency to your package, and also add "LiveKit" to your target's dependencies:
let package = Package(
    ...
    dependencies: [
        .package(name: "LiveKit", url: "https://github.com/livekit/client-sdk-swift.git", .upToNextMajor(from: "2.0.19")),
    ],
    targets: [
        .target(
            name: "MyApp",
            dependencies: ["LiveKit"]
        )
    ]
)
Go to Project Settings -> Swift Packages.
Add a new package and enter: https://github.com/livekit/client-sdk-swift
LiveKit provides a UIKit-based VideoView class that renders video tracks. Subscribed audio tracks are played automatically.
import LiveKit
import UIKit

class RoomViewController: UIViewController {
    lazy var room = Room(delegate: self)

    lazy var remoteVideoView: VideoView = {
        let videoView = VideoView()
        view.addSubview(videoView)
        // Additional initialization ...
        return videoView
    }()

    lazy var localVideoView: VideoView = {
        let videoView = VideoView()
        view.addSubview(videoView)
        // Additional initialization ...
        return videoView
    }()

    override func viewDidLoad() {
        super.viewDidLoad()
        view.backgroundColor = .white

        let url = "ws://your_host"
        let token = "your_jwt_token"

        Task {
            do {
                try await room.connect(url: url, token: token)
                // Connection successful...

                // Publishing camera & mic...
                try await room.localParticipant.setCamera(enabled: true)
                try await room.localParticipant.setMicrophone(enabled: true)
            } catch {
                // Failed to connect
            }
        }
    }
}

extension RoomViewController: RoomDelegate {
    func room(_: Room, participant _: LocalParticipant, didPublishTrack publication: LocalTrackPublication) {
        guard let track = publication.track as? VideoTrack else { return }
        DispatchQueue.main.async {
            self.localVideoView.track = track
        }
    }

    func room(_: Room, participant _: RemoteParticipant, didSubscribeTrack publication: RemoteTrackPublication) {
        guard let track = publication.track as? VideoTrack else { return }
        DispatchQueue.main.async {
            self.remoteVideoView.track = track
        }
    }
}
See iOS Screen Sharing instructions.
Since VideoView is a UI component, all operations (reading and writing properties, etc.) must be performed from the main thread. Other core classes can be accessed from any thread.

Delegates will be called on the SDK's internal thread, so make sure any access to your app's UI elements happens on the main thread, for example by using @MainActor or DispatchQueue.main.async.
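For example, a delegate callback that touches the UI can hop to the main actor first. A minimal sketch, assuming the RoomDelegate method room(_:didUpdateConnectionState:from:) from SDK v2:

extension RoomViewController {
    func room(_ room: Room, didUpdateConnectionState connectionState: ConnectionState, from oldConnectionState: ConnectionState) {
        // Delegate methods arrive on the SDK's internal thread;
        // hop to the main actor before touching any UI element.
        Task { @MainActor in
            self.title = String(describing: connectionState)
        }
    }
}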
It is recommended to use weak var when storing references to objects created and managed by the SDK, such as Participant, TrackPublication, etc. These objects become invalid when the Room disconnects and will be released by the SDK; holding a strong reference to them will prevent the Room and other internal objects from being released. The VideoView.track property does not hold a strong reference, so it is not required to set it to nil.
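For instance, a cell that displays a participant can hold it weakly. A sketch (the ParticipantCell type here is illustrative, in the spirit of the visibility example further below):

class ParticipantCell: UICollectionViewCell {
    static let reuseIdentifier = "ParticipantCell"
    let videoView = VideoView()
    // Weak reference: the SDK owns this object and releases it
    // when the Room disconnects, so the cell must not keep it alive.
    weak var participant: RemoteParticipant?
}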
LiveKit will automatically manage the underlying AVAudioSession while connected. The session is set to the playback category by default; when a local stream is published, it is switched to playAndRecord. In general, it will pick sane defaults and do the right thing.

However, if you'd like to customize this behavior, you can override AudioManager.customConfigureAudioSessionFunc to manage the underlying session on your own. See the example here for the default behavior.
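A minimal sketch of such an override, assuming the closure receives the new and old AudioManager states (the exact closure signature may differ between SDK versions):

import AVFoundation
import LiveKit

AudioManager.shared.customConfigureAudioSessionFunc = { newState, oldState in
    // Example policy (an assumption, not the SDK default): always use
    // playAndRecord with speaker output, regardless of publish state.
    let session = AVAudioSession.sharedInstance()
    try? session.setCategory(.playAndRecord, options: [.defaultToSpeaker, .allowBluetooth])
    try? session.setActive(true)
}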
To integrate with CallKit for background-triggered incoming calls, LiveKit's audio session must be synchronized with CallKit's audio session:
- Add import LiveKitWebRTC to your CallProvider file.
- In your CXProviderDelegate implementation, add the following:
func provider(_ provider: CXProvider, didActivate audioSession: AVAudioSession) {
    LKRTCAudioSession.sharedInstance().audioSessionDidActivate(audioSession)
    // ...
}

func provider(_ provider: CXProvider, didDeactivate audioSession: AVAudioSession) {
    LKRTCAudioSession.sharedInstance().audioSessionDidDeactivate(audioSession)
    // ...
}
- Publishing the camera track is not supported by the iOS Simulator.
To save CPU resources, it is recommended to turn off rendering for VideoViews that have scrolled off-screen and are no longer visible, by setting the isEnabled property to false, and back to true when they re-appear.

UICollectionViewDelegate's willDisplay / didEndDisplaying callbacks have been reported to be unreliable for this purpose. Specifically, in some iOS versions didEndDisplaying can be invoked even while the cell is still visible.

The following is an alternative to using willDisplay / didEndDisplaying:
// 1. Define a weak-reference set for all cells
private var allCells = NSHashTable<ParticipantCell>.weakObjects()

// In UICollectionViewDataSource...
public func collectionView(_ collectionView: UICollectionView, cellForItemAt indexPath: IndexPath) -> UICollectionViewCell {
    let cell = collectionView.dequeueReusableCell(withReuseIdentifier: ParticipantCell.reuseIdentifier, for: indexPath)
    if let cell = cell as? ParticipantCell {
        // 2. Keep a weak reference to the cell
        allCells.add(cell)
        // Configure cell etc...
    }
    return cell
}

// 3. Define a func to re-compute and update the isEnabled property for cells whose visibility changed
func reComputeVideoViewEnabled() {
    let visibleCells = collectionView.visibleCells.compactMap { $0 as? ParticipantCell }
    let offScreenCells = allCells.allObjects.filter { !visibleCells.contains($0) }

    for cell in visibleCells.filter({ !$0.videoView.isEnabled }) {
        print("enabling cell#\(cell.hashValue)")
        cell.videoView.isEnabled = true
    }

    for cell in offScreenCells.filter({ $0.videoView.isEnabled }) {
        print("disabling cell#\(cell.hashValue)")
        cell.videoView.isEnabled = false
    }
}

// 4. Set a timer to invoke the func
self.timer = Timer.scheduledTimer(withTimeInterval: 0.1, repeats: true, block: { [weak self] _ in
    self?.reComputeVideoViewEnabled()
})

// Alternatively, you can call `reComputeVideoViewEnabled` whenever cell visibility changes (such as in scrollViewDidScroll(_:)),
// but it is harder to track all cases such as cell reloads etc.
For the full example, see 👉 UIKit Minimal Example
- Create a LocalVideoTrack by calling LocalVideoTrack.createCameraTrack(options: CameraCaptureOptions(fps: 60)).
- Publish with LocalParticipant.publish(videoTrack: track, publishOptions: VideoPublishOptions(encoding: VideoEncoding(maxFps: 60))). A combined sketch follows this list.
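Put together, the two steps look roughly like this (a sketch; depending on the SDK version, VideoEncoding may also require a maxBitrate argument, for which an illustrative value is used here):

Task {
    // Capture at 60 FPS and publish with a matching encoding cap.
    let track = LocalVideoTrack.createCameraTrack(options: CameraCaptureOptions(fps: 60))
    try await room.localParticipant.publish(
        videoTrack: track,
        // maxBitrate is an illustrative value, not a recommendation.
        publishOptions: VideoPublishOptions(encoding: VideoEncoding(maxBitrate: 3_000_000, maxFps: 60))
    )
}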
If your app is targeting macOS Catalina, make sure to do the following to avoid a crash (ReplayKit not found):
- Explicitly add "ReplayKit.framework" to the Build Phases > Link Binary with Libraries section
- Set it to Optional
- I am not sure why this is required for ReplayKit at the moment.
- If you are targeting macOS 11.0+, this is not required.
Please join us on Slack to get help from our devs and community members. We welcome your contributions (PRs), and details can be discussed there.
| LiveKit Ecosystem | |
| --- | --- |
| Realtime SDKs | React Components · Browser · Swift Components · iOS/macOS/visionOS · Android · Flutter · React Native · Rust · Node.js · Python · Unity (web) · Unity (beta) |
| Server APIs | Node.js · Golang · Ruby · Java/Kotlin · Python · Rust · PHP (community) |
| Agents Frameworks | Python · Playground |
| Services | LiveKit server · Egress · Ingress · SIP |
| Resources | Docs · Example apps · Cloud · Self-hosting · CLI |