This package provides a TensorFlow.js platform adapter for React Native. It provides GPU-accelerated execution of TensorFlow.js and supports all major modes of tfjs usage, including:
- Support for both model inference and training
- GPU support with WebGL via expo-gl.
- Support for loading pretrained models (tfjs-models) from the web.
- IOHandlers to support loading models from AsyncStorage and loading models that are compiled into the app bundle (see the sketch after this list).
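For example, the asyncStorageIO handler exported by this package can be passed anywhere tfjs accepts an IOHandler. The snippet below is a minimal sketch; the key 'my-model' is just an illustrative name.

import * as tf from '@tensorflow/tfjs';
import { asyncStorageIO } from '@tensorflow/tfjs-react-native';

// Save a model under an arbitrary AsyncStorage key ('my-model' is only an
// example) and load it back later.
async function saveAndReload(model) {
  await model.save(asyncStorageIO('my-model'));
  return tf.loadLayersModel(asyncStorageIO('my-model'));
}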
These instructions (and this library) assume that you are generally familiar with react native development.
This library relies on expo-gl and expo-gl-cpp. Thus you must use a version of React Native that is supported by Expo.
Some parts of tfjs-react-native are not compatible with managed expo apps. You must use the bare workflow (or just plain react native) if you want to use the following functionality:
- Loading local models using bundleResourceIO. You can instead load models from a webserver.
You can use the React Native CLI or Expo.
On macOS (to develop iOS applications), you will also need to use CocoaPods to install these dependencies.
Note that if you are using a managed Expo app, the install instructions may be different.
- Install and configure react-native-unimodules (can be skipped if in an Expo app)
- Install and configure expo-gl-cpp and expo-gl
- Install and configure expo-camera
- Install and configure async-storage
- Install and configure react-native-fs
- Install @tensorflow/tfjs:
  npm install @tensorflow/tfjs
- Install @tensorflow/tfjs-react-native:
  npm install @tensorflow/tfjs-react-native
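For reference, a typical install of the dependencies above looks roughly like the following. The exact package names (particularly for async-storage) and versions depend on your React Native version, so follow each library's own install guide for the native configuration steps.

npm install react-native-unimodules expo-gl-cpp expo-gl expo-camera react-native-fs
npm install @react-native-community/async-storage
npm install @tensorflow/tfjs @tensorflow/tfjs-react-native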
After this point, if you are using Xcode to build for iOS, you should use the .xcworkspace file instead of the .xcodeproj.
Step 3: Configure Metro
This step is only needed if you want to use the bundleResourceIO loader.
Edit your metro.config.js to look like the following. Changes are noted in the comments below.
const { getDefaultConfig } = require('metro-config');

module.exports = (async () => {
  const defaultConfig = await getDefaultConfig();
  const { assetExts } = defaultConfig.resolver;
  return {
    resolver: {
      // Add bin to assetExts
      assetExts: [...assetExts, 'bin'],
    }
  };
})();
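With bin added to assetExts, Metro can bundle model weight files so they can be loaded with bundleResourceIO. Below is a minimal sketch; the assets/model/ paths are placeholders for wherever you store your converted model, and loadGraphModel assumes a graph model (use loadLayersModel for a layers model).

import * as tf from '@tensorflow/tfjs';
import { bundleResourceIO } from '@tensorflow/tfjs-react-native';

// Placeholder paths; point these at your converted model files.
const modelJson = require('./assets/model/model.json');
const modelWeights = require('./assets/model/group1-shard1of1.bin');

async function loadBundledModel() {
  // bundleResourceIO creates an IOHandler from assets compiled into the app.
  return tf.loadGraphModel(bundleResourceIO(modelJson, modelWeights));
}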
Before using tfjs in a React Native app, you need to call tf.ready() and wait for it to complete. This is an async function, so you might want to do this in a componentDidMount or before the app is rendered.
The example below uses a flag in the App state to indicate that TensorFlow is ready.
import React from 'react';
import * as tf from '@tensorflow/tfjs';
import '@tensorflow/tfjs-react-native';

export class App extends React.Component {
  constructor(props) {
    super(props);
    this.state = {
      isTfReady: false,
    };
  }

  async componentDidMount() {
    // Wait for tf to be ready.
    await tf.ready();
    // Signal to the app that tensorflow.js can now be used.
    this.setState({
      isTfReady: true,
    });
  }

  render() {
    // Render the rest of your app once this.state.isTfReady is true.
    return null;
  }
}
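If you prefer function components, the same pattern can be written with hooks; the following is an equivalent sketch of the class example above.

import React, { useEffect, useState } from 'react';
import * as tf from '@tensorflow/tfjs';
import '@tensorflow/tfjs-react-native';

export function App() {
  const [isTfReady, setIsTfReady] = useState(false);

  useEffect(() => {
    // Wait for tf to be ready, then signal that tensorflow.js can be used.
    tf.ready().then(() => setIsTfReady(true));
  }, []);

  // Render the rest of your app based on isTfReady.
  return null;
}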
You can take a look at integration_rn59/App.tsx
for an example of what using tfjs-react-native looks like. In the future we will add an example to the tensorflow/tfjs-examples repository.
The Webcam demo folder has an example of a style transfer app.
Many tfjs-models use web APIs for rendering or input, and these are generally not compatible with React Native. To use such models you typically need to feed a tensor directly into the model and render the model output with React Native components. If a tfjs-model has no API for passing in a tensor, feel free to file a GitHub issue.
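For example, an image can be fetched and decoded into a tensor with helpers from this package and then passed to a model that accepts tensors. The sketch below is illustrative only: the URL is a placeholder and it assumes @tensorflow-models/mobilenet is installed as an example of a tensor-accepting model.

import * as tf from '@tensorflow/tfjs';
import { fetch, decodeJpeg } from '@tensorflow/tfjs-react-native';
import * as mobilenet from '@tensorflow-models/mobilenet';

// Fetch an image as raw bytes, decode it to a tensor, and classify it.
async function classifyRemoteImage(url) {
  const response = await fetch(url, {}, { isBinary: true });
  const imageData = new Uint8Array(await response.arrayBuffer());
  const imageTensor = decodeJpeg(imageData);

  const model = await mobilenet.load();
  const predictions = await model.classify(imageTensor);
  imageTensor.dispose();
  return predictions;
}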
When reporting bugs with tfjs-react-native please include the following information:
- Is the app created using expo? If so is it a managed or bare app?
- Which version of React Native and which versions of the dependencies in the install instructions above are you using?
- What device(s) are you running on? Note that not all simulators support WebGL and thus may not work with tfjs-react-native.
- What error messages are you seeing? Are there any relevant messages in the device logs?
- How could this bug be reproduced? Is there an example repo we can use to replicate the issue?