
PrototypeKit


(Ironically, a prototype itself...) 😅

Status: Work In Progress

Goals 🥅

  • Make it easier to prototype basic Machine Learning apps with SwiftUI
  • Provide an easy interface for commonly built views to assist with prototyping and idea validation
  • Effectively a wrapper around the more complex APIs, providing a simpler interface (perhaps not all the same functionality, but enough to get you started and inspired!)

Examples

Here are a few basic examples you can use today.

Camera Tasks

Start Here

  1. Ensure you have created your Xcode project.
  2. Ensure you have added the PrototypeKit package to your project (instructions above; coming soon).
  3. Select your project file within the project navigator.
  4. Ensure that your target is selected.
  5. Select the Info tab.
  6. Right-click within the "Custom iOS Target Properties" table, and select "Add Row".
  7. Use Privacy - Camera Usage Description for the key (the raw Info.plist key is NSCameraUsageDescription). Type the reason your app will use the camera as the value.

Live Camera View

Utilise PKCameraView

PKCameraView()
Full Example
import SwiftUI
import PrototypeKit

struct ContentView: View {
    var body: some View {
        VStack {
            PKCameraView()
        }
        .padding()
    }
}
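
Since PKCameraView is used like an ordinary SwiftUI view above, standard modifiers should compose with it. A minimal sketch (the modifiers below are plain SwiftUI, not PrototypeKit APIs):

import SwiftUI
import PrototypeKit

struct StyledCameraView: View {
    var body: some View {
        PKCameraView()
            // Plain SwiftUI modifiers: round the preview's corners
            // and inset it from the screen edges.
            .clipShape(RoundedRectangle(cornerRadius: 16))
            .padding()
    }
}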

Live Image Classification

  1. Required step: drag your Create ML / Core ML model into Xcode.
  2. Change FruitClassifier below to the name of your model.
  3. Use latestPrediction as you would any other state variable (e.g. feed it into other views, just as you would the value of a Slider).

Utilise ImageClassifierView

ImageClassifierView(modelURL: FruitClassifier.urlOfModelInThisBundle,
                    latestPrediction: $latestPrediction)
Full Example
import SwiftUI
import PrototypeKit

struct ImageClassifierViewSample: View {
    
    @State var latestPrediction: String = ""
    
    var body: some View {
        VStack {
            ImageClassifierView(modelURL: FruitClassifier.urlOfModelInThisBundle,
                                latestPrediction: $latestPrediction)
            Text(latestPrediction)
        }
    }
}
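
Because latestPrediction is ordinary @State, you can branch on it like any other value. A small sketch, assuming the FruitClassifier model from above; the "Banana" label is an assumption about your model's classes:

import SwiftUI
import PrototypeKit

struct ReactivePredictionView: View {
    
    @State var latestPrediction: String = ""
    
    var body: some View {
        VStack {
            ImageClassifierView(modelURL: FruitClassifier.urlOfModelInThisBundle,
                                latestPrediction: $latestPrediction)
            // "Banana" is a hypothetical class label; substitute one of
            // your model's actual labels.
            Text(latestPrediction)
                .foregroundStyle(latestPrediction == "Banana" ? Color.yellow : Color.primary)
        }
    }
}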

Live Text Recognition

Utilise LiveTextRecognizerView

LiveTextRecognizerView(detectedText: $detectedText)
Full Example
import SwiftUI
import PrototypeKit

struct TextRecognizerView: View {
    
    @State var detectedText: [String] = []
    
    var body: some View {
        VStack {
            LiveTextRecognizerView(detectedText: $detectedText)
            
            ScrollView {
                ForEach(Array(detectedText.enumerated()), id: \.offset) { line, text in
                    Text(text)
                }
            }
        }
    }
}
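
If you would rather treat the result as one block of text than a list of lines, you can join the array. A sketch using the same binding as above:

import SwiftUI
import PrototypeKit

struct JoinedTextRecognizerView: View {
    
    @State var detectedText: [String] = []
    
    var body: some View {
        VStack {
            LiveTextRecognizerView(detectedText: $detectedText)
            
            ScrollView {
                // Collapse the per-line results into a single string.
                Text(detectedText.joined(separator: "\n"))
            }
        }
    }
}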

Live Barcode Recognition

Utilise LiveBarcodeRecognizerView

LiveBarcodeRecognizerView(detectedBarcodes: $detectedBarcodes)
Full Example
import SwiftUI
import PrototypeKit

struct BarcodeRecognizerView: View {
    
    @State var detectedBarcodes: [String] = []
    
    var body: some View {
        VStack {
            LiveBarcodeRecognizerView(detectedBarcodes: $detectedBarcodes)
            
            ScrollView {
                ForEach(Array(detectedBarcodes.enumerated()), id: \.offset) { index, barcode in
                    Text(barcode)
                }
            }
        }
    }
}
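
You can also react when a particular code shows up. A sketch, where the barcode value is a made-up placeholder:

import SwiftUI
import PrototypeKit

struct BarcodeLookupView: View {
    
    @State var detectedBarcodes: [String] = []
    
    var body: some View {
        VStack {
            LiveBarcodeRecognizerView(detectedBarcodes: $detectedBarcodes)
            
            // The EAN below is a made-up placeholder value.
            if detectedBarcodes.contains("0012345678905") {
                Text("Scanned the sample product!")
            }
        }
    }
}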

Live Sound Recognition

Utilise the recognizeSounds modifier to detect sounds in real time. This feature supports both the system sound classifier and custom Core ML models.

.recognizeSounds(recognizedSound: $recognizedSound)

For custom configuration, you can use the SoundAnalysisConfiguration:

.recognizeSounds(
    recognizedSound: $recognizedSound,
    configuration: SoundAnalysisConfiguration(
        inferenceWindowSize: 1.5,  // Window size in seconds
        overlapFactor: 0.9,        // Overlap between consecutive windows
        mlModel: yourCustomModel   // Optional custom Core ML model
    )
)
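
If your custom classifier was trained with Create ML, Xcode generates a class for it when you add the model file to your project. A sketch of producing the MLModel for the mlModel parameter, assuming a hypothetical model named MySoundClassifier:

import CoreML

// Xcode generates the MySoundClassifier class (hypothetical name) from
// the .mlmodel file you drag into your project; its .model property
// exposes the underlying MLModel.
let yourCustomModel = try MySoundClassifier(configuration: MLModelConfiguration()).model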
Full Example
import SwiftUI
import PrototypeKit

struct SoundRecognizerView: View {
    @State var recognizedSound: String?
    
    var body: some View {
        VStack {
            Text("Recognized Sound: \(recognizedSound ?? "None")")
        }
        .recognizeSounds(recognizedSound: $recognizedSound)
    }
}
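
Because recognizedSound is plain state, you can also react when it changes, for example to trigger haptics or logging. A sketch using SwiftUI's onChange (the iOS 17 two-parameter form; the "speech" label is an assumption about the system classifier's output):

.recognizeSounds(recognizedSound: $recognizedSound)
.onChange(of: recognizedSound) { _, newSound in
    // "speech" is an assumed label from the system classifier.
    if newSound == "speech" {
        print("Heard speech")
    }
}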

FAQs

Is this production ready?
No.