(Ironically, a prototype itself...) 😅
Status: Work In Progress
- Make it easier to prototype basic Machine Learning apps with SwiftUI
- Provide an easy interface for commonly built views to assist with prototyping and idea validation
- Effectively a wrapper around the more complex APIs, providing a simpler interface (perhaps not all the same functionality, but enough to get you started and inspired!)
Here are a few basic examples you can use today.
- Ensure you have created your Xcode project
- Ensure you have added the PrototypeKit package to your project (instructions above -- coming soon)
- Select your project file within the project navigator.

- Ensure that your target is selected

- Select the info tab.
- Right-click within the "Custom iOS Target Properties" table, and select "Add Row"

- Use `Privacy - Camera Usage Description` for the key. Type the reason your app will use the camera as the value.
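If you prefer editing the Info.plist source directly, this entry corresponds to the `NSCameraUsageDescription` key; the description string below is only a placeholder:

```xml
<key>NSCameraUsageDescription</key>
<string>This app uses the camera to power the live prototyping views.</string>
```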

### Utilise PKCameraView

```swift
PKCameraView()
```
#### Full Example

```swift
import SwiftUI
import PrototypeKit

struct ContentView: View {
    var body: some View {
        VStack {
            PKCameraView()
        }
        .padding()
    }
}
```
- Required step: Drag your Create ML / Core ML model into Xcode.
- Change `FruitClassifier` below to the name of your model.
- You can use `latestPrediction` as you would any other state variable (for example, feeding it into other views such as a `Slider`); see the sketch after the full example below.
### Utilise ImageClassifierView

```swift
ImageClassifierView(modelURL: FruitClassifier.urlOfModelInThisBundle,
                    latestPrediction: $latestPrediction)
```
#### Full Example

```swift
import SwiftUI
import PrototypeKit

struct ImageClassifierViewSample: View {
    @State var latestPrediction: String = ""

    var body: some View {
        VStack {
            ImageClassifierView(modelURL: FruitClassifier.urlOfModelInThisBundle,
                                latestPrediction: $latestPrediction)
            Text(latestPrediction)
        }
    }
}
```
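As a rough illustration of that last point, here is a minimal sketch that drives another view from `latestPrediction`. It assumes the same hypothetical `FruitClassifier` model and that one of its labels is "Banana":

```swift
import SwiftUI
import PrototypeKit

// Sketch only: assumes a FruitClassifier model whose labels include "Banana".
struct FruitHighlightView: View {
    @State var latestPrediction: String = ""

    var body: some View {
        VStack {
            ImageClassifierView(modelURL: FruitClassifier.urlOfModelInThisBundle,
                                latestPrediction: $latestPrediction)
            Text(latestPrediction)
        }
        // latestPrediction is ordinary SwiftUI state, so any view can react to it.
        .background(latestPrediction == "Banana" ? Color.yellow : Color.clear)
    }
}
```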
### Utilise LiveTextRecognizerView

```swift
LiveTextRecognizerView(detectedText: $detectedText)
```
#### Full Example

```swift
import SwiftUI
import PrototypeKit

struct TextRecognizerView: View {
    @State var detectedText: [String] = []

    var body: some View {
        VStack {
            LiveTextRecognizerView(detectedText: $detectedText)
            ScrollView {
                ForEach(Array(detectedText.enumerated()), id: \.offset) { line, text in
                    Text(text)
                }
            }
        }
    }
}
```
### Utilise LiveBarcodeRecognizerView

```swift
LiveBarcodeRecognizerView(detectedBarcodes: $detectedBarcodes)
```
#### Full Example

```swift
import SwiftUI
import PrototypeKit

struct BarcodeRecognizerView: View {
    @State var detectedBarcodes: [String] = []

    var body: some View {
        VStack {
            LiveBarcodeRecognizerView(detectedBarcodes: $detectedBarcodes)
            ScrollView {
                ForEach(Array(detectedBarcodes.enumerated()), id: \.offset) { index, barcode in
                    Text(barcode)
                }
            }
        }
    }
}
```
### Utilise recognizeSounds

Use the `recognizeSounds` modifier to detect sounds in real time. This feature supports both the system sound classifier and custom Core ML models.

```swift
.recognizeSounds(recognizedSound: $recognizedSound)
```

For custom configuration, you can use `SoundAnalysisConfiguration`:
```swift
.recognizeSounds(
    recognizedSound: $recognizedSound,
    configuration: SoundAnalysisConfiguration(
        inferenceWindowSize: 1.5, // Window size in seconds
        overlapFactor: 0.9,       // Overlap between consecutive windows
        mlModel: yourCustomModel  // Optional custom Core ML model
    )
)
```
#### Full Example

```swift
import SwiftUI
import PrototypeKit

struct SoundRecognizerView: View {
    @State var recognizedSound: String?

    var body: some View {
        VStack {
            Text("Recognized Sound: \(recognizedSound ?? "None")")
        }
        .recognizeSounds(recognizedSound: $recognizedSound)
    }
}
```
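To use a custom Core ML sound classifier instead of the system one, pass its `MLModel` through the configuration. A minimal sketch, assuming a model dragged into the project whose generated class is named `MySoundClassifier` (a placeholder name):

```swift
import SwiftUI
import CoreML
import PrototypeKit

struct CustomSoundRecognizerView: View {
    @State var recognizedSound: String?

    // Placeholder: MySoundClassifier stands in for your own Core ML sound classifier.
    let customModel = try! MySoundClassifier(configuration: MLModelConfiguration()).model

    var body: some View {
        Text("Recognized Sound: \(recognizedSound ?? "None")")
            .recognizeSounds(
                recognizedSound: $recognizedSound,
                configuration: SoundAnalysisConfiguration(
                    inferenceWindowSize: 1.5,
                    overlapFactor: 0.9,
                    mlModel: customModel
                )
            )
    }
}
```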
### Is this production ready?

No.