Below are code samples showing how to run MediaPipe on both mobile and desktop. MediaPipe APIs are currently supported on mobile for both Android and iOS (Objective-C).
Hello World! on Android should be the first mobile Android example users go through in detail. It teaches the following:
- Introducing a simple MediaPipe graph that runs Sobel edge detection on mobile GPUs.
- Building a simple baseline Android application that displays "Hello World!".
- Adding camera preview support to the baseline application using the Android CameraX API.
- Incorporating the Sobel edge detection graph to process the live camera preview and display the processed video in real time (a sketch of the graph config follows this list).
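For orientation, here is a minimal sketch of the edge-detection graph the tutorial builds. It mirrors the structure of the published Hello World graph config (the LuminanceCalculator and SobelEdgesCalculator nodes and the stream names come from that tutorial), but treat it as an illustrative outline rather than the exact file shipped with the example:

```
# Illustrative MediaPipe graph (pbtxt) for GPU Sobel edge detection.
# Camera frames enter on "input_video"; the edge map leaves on "output_video".
input_stream: "input_video"
output_stream: "output_video"

# Converts the incoming RGB frame to luminance on the GPU.
node: {
  calculator: "LuminanceCalculator"
  input_stream: "input_video"
  output_stream: "luma_video"
}

# Applies the Sobel filter to the luminance image on the GPU.
node: {
  calculator: "SobelEdgesCalculator"
  input_stream: "luma_video"
  output_stream: "output_video"
}
```

In the tutorial, the Android app loads a config like this as an asset, feeds camera frames into input_video, and renders whatever arrives on output_video.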
Hello World! on iOS is the iOS version of the Sobel edge detection example.
Object Detection with GPU illustrates how to use MediaPipe with a TFLite model for object detection in a GPU-accelerated pipeline.
Object Detection with CPU illustrates using the same TFLite model in a CPU-based pipeline. This example highlights how easily graphs can be adapted to run on CPU vs. GPU (see the illustrative fragment below).
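To give a flavor of what that adaptation looks like, the hedged fragment below contrasts a GPU tensor-conversion node with its CPU counterpart. The TfLiteConverterCalculator name and the IMAGE/IMAGE_GPU and TENSORS/TENSORS_GPU stream tags follow the convention used in the published graphs, but the surrounding streams differ per example, so read this as a sketch rather than the shipped config:

```
# GPU variant: frames arrive as GpuBuffer and tensors stay on the GPU.
node {
  calculator: "TfLiteConverterCalculator"
  input_stream: "IMAGE_GPU:transformed_input_video"
  output_stream: "TENSORS_GPU:image_tensor"
}

# CPU variant: the same calculator, operating on ImageFrame and producing
# CPU tensors; downstream nodes are swapped to their CPU counterparts.
node {
  calculator: "TfLiteConverterCalculator"
  input_stream: "IMAGE:transformed_input_video"
  output_stream: "TENSORS:image_tensor"
}
```

Broadly, switching between CPU and GPU is a matter of swapping the packet types (ImageFrame vs. GpuBuffer) and the calculator variants that consume them; the overall graph topology stays the same.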
Object Detection and Tracking with GPU illustrates how to use MediaPipe for object detection and tracking.
Face Detection with GPU illustrates how to use MediaPipe with a TFLite model for face detection in a GPU-accelerated pipeline. The selfie face detection TFLite model is based on "BlazeFace: Sub-millisecond Neural Face Detection on Mobile GPUs", and model details are described in the model card.
Face Detection with CPU illustrates using the same TFLite model in a CPU-based pipeline. This example highlights how easily graphs can be adapted to run on CPU vs. GPU.
Hand Detection with GPU illustrates how to use MediaPipe with a TFLite model for hand detection in a GPU-accelerated pipeline.
Hand Tracking with GPU illustrates how to use MediaPipe with a TFLite model for hand tracking in a GPU-accelerated pipeline.
Multi-Hand Tracking with GPU illustrates how to use MediaPipe with a TFLite model for multi-hand tracking in a GPU-accelerated pipeline.
Hair Segmentation on GPU illustrates how to use MediaPipe with a TFLite model for hair segmentation in a GPU-accelerated pipeline. The selfie hair segmentation TFLite model is based on "Real-time Hair Segmentation and Recoloring on Mobile GPUs", and model details are described in the model card.
Hello World for C++ shows how to run a simple graph using the MediaPipe C++ APIs.
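The heart of that example is compact enough to sketch here: construct a CalculatorGraphConfig, run a CalculatorGraph, push packets into an input stream, and poll an output stream. The version below is condensed from the published hello_world.cc; the class names and macros (CalculatorGraph, ParseTextProtoOrDie, MP_RETURN_IF_ERROR, ASSIGN_OR_RETURN) are MediaPipe framework APIs, but header paths, status types, and macro names have shifted a bit across releases, so treat it as an approximation rather than a drop-in file:

```
#include "mediapipe/framework/calculator_graph.h"
#include "mediapipe/framework/port/logging.h"
#include "mediapipe/framework/port/parse_text_proto.h"
#include "mediapipe/framework/port/status.h"

mediapipe::Status PrintHelloWorld() {
  // A trivial graph: two chained PassThroughCalculators copy packets
  // from stream "in" to stream "out".
  mediapipe::CalculatorGraphConfig config =
      mediapipe::ParseTextProtoOrDie<mediapipe::CalculatorGraphConfig>(R"(
        input_stream: "in"
        output_stream: "out"
        node {
          calculator: "PassThroughCalculator"
          input_stream: "in"
          output_stream: "out1"
        }
        node {
          calculator: "PassThroughCalculator"
          input_stream: "out1"
          output_stream: "out"
        }
      )");

  mediapipe::CalculatorGraph graph;
  MP_RETURN_IF_ERROR(graph.Initialize(config));
  ASSIGN_OR_RETURN(mediapipe::OutputStreamPoller poller,
                   graph.AddOutputStreamPoller("out"));
  MP_RETURN_IF_ERROR(graph.StartRun({}));

  // Send ten "Hello World!" packets into the graph, then close the stream.
  for (int i = 0; i < 10; ++i) {
    MP_RETURN_IF_ERROR(graph.AddPacketToInputStream(
        "in", mediapipe::MakePacket<std::string>("Hello World!")
                  .At(mediapipe::Timestamp(i))));
  }
  MP_RETURN_IF_ERROR(graph.CloseInputStream("in"));

  // Print each packet that reaches the output stream.
  mediapipe::Packet packet;
  while (poller.Next(&packet)) {
    LOG(INFO) << packet.Get<std::string>();
  }
  return graph.WaitUntilDone();
}

int main(int argc, char** argv) {
  google::InitGoogleLogging(argv[0]);
  CHECK(PrintHelloWorld().ok());
  return 0;
}
```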
Feature Extraction and Model Inference for YouTube-8M Challenge shows how to use MediaPipe to prepare training data for the YouTube-8M Challenge and run model inference with the baseline model.
Preparing Data Sets with MediaSequence shows how to use MediaPipe for media processing to prepare video data sets for training a TensorFlow model.
AutoFlip shows how to use MediaPipe to build an automatic video cropping pipeline that can convert an input video to arbitrary aspect ratios.
Object Detection on Desktop shows how to run object detection models (TensorFlow and TFLite) using the MediaPipe C++ APIs.
Face Detection on Desktop with Webcam shows how to use MediaPipe with a TFLite model for face detection on desktop using CPU or GPU with live video from a webcam.
Hand Tracking on Desktop with Webcam shows how to use MediaPipe with a TFLite model for hand tracking on desktop using CPU or GPU with live video from a webcam.
Multi-Hand Tracking on Desktop with Webcam shows how to use MediaPipe with a TFLite model for multi-hand tracking on desktop using CPU or GPU with live video from a webcam.
Hair Segmentation on Desktop with Webcam shows how to use MediaPipe with a TFLite model for hair segmentation on desktop using GPU with live video from a webcam.
Below are code samples showing how to run MediaPipe on the Google Coral Dev Board.
Object Detection on Coral with Webcam shows how to run a quantized object detection TFLite model accelerated with the Edge TPU on the Google Coral Dev Board.
Face Detection on Coral with Webcam shows how to run a quantized face detection TFLite model accelerated with the Edge TPU on the Google Coral Dev Board.
Below are samples that can be run directly in your web browser. See MediaPipe on the Web and the Google Developers blog post for more details.