VisualActionKit

Human action classification for video, offline and natively on iOS via Core ML

VisualActionKit uses the kinetics-i3d model to classify videos into one of the 400 action classes defined in the Kinetics 400 dataset.

⚠️ This project requires Xcode 12.

Reference: See the accompanying blog post.

Installation

To install via Swift Package Manager, add VisualActionKit to the dependencies in your Package.swift file. Alternatively, add it directly from Xcode.

let package = Package(
    ...
    dependencies: [
        .package(url: "https://github.com/lukereichold/VisualActionKit.git", from: "0.1.0")
    ],
    ...
)
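
If you declare your targets manually, you will also need to list the library as a target dependency. A minimal sketch, following standard SwiftPM conventions; the target name "MyApp" is hypothetical, and this assumes the package exposes a product named "VisualActionKit":

targets: [
    .target(
        name: "MyApp", // hypothetical target name
        dependencies: ["VisualActionKit"]
    )
]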

Then import VisualActionKit wherever you’d like to use it:

import VisualActionKit

Usage

import AVFoundation

// Load a bundled sample video; Bundle.module.url(forResource:withExtension:) returns an optional URL.
guard let url = Bundle.module.url(forResource: "writing", withExtension: "mp4") else {
    fatalError("Missing sample video resource")
}
let asset = AVAsset(url: url)

try Classifier.shared.classify(asset) { predictions in
    print(predictions)
}
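
The exact shape of the predictions value depends on the library's API. The sketch below is illustrative only: it assumes predictions is a sequence of (label, probability) pairs and simply reports the highest-scoring action.

// Illustrative only: assumes `predictions` is a sequence of (label, probability)
// tuples; check the library's actual prediction type before using this.
try Classifier.shared.classify(asset) { predictions in
    if let best = predictions.max(by: { $0.1 < $1.1 }) {
        print("Top action: \(best.0) (probability: \(best.1))")
    }
}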

Contribute

Contributions are welcome. Please check out the issues.