# Examples
Complete example applications demonstrating react-litert usage.
## Image Classification
A complete image classification example using MobileNet V2.
**Location:** `examples/image-classification`
### Features
- Load and run MobileNet V2 model
- Upload and classify images
- Display predictions with confidence scores
- WebGPU/WASM acceleration
### Key Code
```tsx
import { LiteRtProvider, useLiteRtTfjsModel } from 'react-litert';
import * as tf from '@tensorflow/tfjs-core';

function App() {
  return (
    <LiteRtProvider
      config={{
        wasmRoot: '/litert-wasm/',
        preferAccelerators: ['webgpu', 'wasm'],
        tfBackend: 'webgpu',
      }}
    >
      <ImageClassifier />
    </LiteRtProvider>
  );
}

function ImageClassifier() {
  const { status, run } = useLiteRtTfjsModel({
    modelUrl: '/models/mobilenet_v2_1.0_224.tflite',
  });

  async function classifyImage(imageFile: File) {
    // Load and preprocess image
    const image = await loadImage(imageFile);
    const tensor = tf.browser
      .fromPixels(image)
      .resizeBilinear([224, 224])
      .expandDims(0)
      .div(255.0);

    // Run inference
    const output = await run(tensor);
    const predictions = await output.data();

    // Process results
    return getTopPredictions(predictions);
  }

  // ... rest of component
}
```
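The snippet above calls two helpers, `loadImage` and `getTopPredictions`, that the example app defines for itself. A rough sketch of what they could look like (illustrative only; the real example may decode images and map class indices to labels differently):

```ts
// Illustrative helpers, not part of the react-litert API.
async function loadImage(imageFile: File): Promise<HTMLImageElement> {
  const img = new Image();
  const url = URL.createObjectURL(imageFile);
  img.src = url;
  await img.decode();
  URL.revokeObjectURL(url);
  return img;
}

function getTopPredictions(scores: ArrayLike<number>, topK = 5) {
  // Rank class indices by score; mapping indices to human-readable
  // labels (e.g. ImageNet class names) is left to the app.
  return Array.from(scores)
    .map((score, index) => ({ index, score }))
    .sort((a, b) => b.score - a.score)
    .slice(0, topK);
}
```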
### Running the Example
```bash
cd examples/image-classification
npm install
npm run dev
```
## Creating Your Own Example
### Basic Setup
1. Install dependencies:

   ```bash
   npm install react-litert @tensorflow/tfjs-core
   ```
2. Set up the provider:

   ```tsx
   import { LiteRtProvider } from 'react-litert';

   function App() {
     return (
       <LiteRtProvider
         config={{
           wasmRoot: '/litert-wasm/',
           preferAccelerators: ['webgpu', 'wasm'],
         }}
       >
         <YourComponent />
       </LiteRtProvider>
     );
   }
   ```
3. Use a model:

   ```tsx
   import * as tf from '@tensorflow/tfjs-core';
   import { useLiteRtTfjsModel } from 'react-litert';

   function YourComponent() {
     const { status, run } = useLiteRtTfjsModel({
       modelUrl: '/models/your_model.tflite',
     });

     async function runInference(input: tf.Tensor) {
       const output = await run(input);
       return output;
     }

     // ... rest of component
   }
   ```
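Putting these pieces together, a minimal component body might look like the sketch below. The file-input wiring, state handling, and the `preprocessImage` helper (shown under Image Preprocessing below) are illustrative, not part of the library:

```tsx
import { useState } from 'react';
import * as tf from '@tensorflow/tfjs-core';
import { useLiteRtTfjsModel } from 'react-litert';

function YourComponent() {
  const { status, run } = useLiteRtTfjsModel({
    modelUrl: '/models/your_model.tflite',
  });
  const [scores, setScores] = useState<number[]>([]);

  async function handleFile(file: File) {
    const input = await preprocessImage(file); // see "Image Preprocessing" below
    const output = await run(input);
    setScores(Array.from(await output.data()));
    // Free the tensors once their data has been read
    input.dispose();
    output.dispose();
  }

  return (
    <div>
      <p>Model status: {status}</p>
      <input
        type="file"
        accept="image/*"
        onChange={(e) => e.target.files && handleFile(e.target.files[0])}
      />
      <pre>{JSON.stringify(scores.slice(0, 5), null, 2)}</pre>
    </div>
  );
}
```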
## Common Patterns
### Image Preprocessing
```ts
import * as tf from '@tensorflow/tfjs-core';

async function preprocessImage(imageFile: File): Promise<tf.Tensor4D> {
  const img = new Image();
  const url = URL.createObjectURL(imageFile);
  img.src = url;
  await img.decode();
  URL.revokeObjectURL(url); // release the blob URL once the image is decoded

  return tf.browser
    .fromPixels(img)
    .resizeBilinear([224, 224])
    .expandDims(0)
    .div(255.0) as tf.Tensor4D;
}
```
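The chained calls above leave intermediate tensors (the raw pixels, the resized image, the expanded batch) allocated. If you preprocess many images, one option is to wrap the synchronous tensor work in `tf.tidy`, which disposes every intermediate created inside the callback and keeps only the returned tensor; a sketch:

```ts
import * as tf from '@tensorflow/tfjs-core';

async function preprocessImageTidy(imageFile: File): Promise<tf.Tensor4D> {
  const img = new Image();
  const url = URL.createObjectURL(imageFile);
  img.src = url;
  await img.decode();
  URL.revokeObjectURL(url);

  // The callback passed to tf.tidy must be synchronous; the async image
  // decoding happens outside, only the tensor ops run inside.
  return tf.tidy(() =>
    tf.browser
      .fromPixels(img)
      .resizeBilinear([224, 224])
      .expandDims(0)
      .div(255.0) as tf.Tensor4D
  );
}
```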
### Handling Model Output
```ts
import * as tf from '@tensorflow/tfjs-core';

async function processOutput(output: tf.Tensor) {
  const data = await output.data();
  const predictions = Array.from(data);

  // Get top 5 predictions
  const top5 = predictions
    .map((score, index) => ({ index, score }))
    .sort((a, b) => b.score - a.score)
    .slice(0, 5);

  return top5;
}
```
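Input and output tensors are not garbage-collected automatically, so long-running apps should dispose of them once the results have been read. A sketch of that pattern, assuming `run` takes one input tensor and resolves to one output tensor as in the examples above (`preprocessImage` and `processOutput` are the helpers defined in this section):

```ts
import * as tf from '@tensorflow/tfjs-core';

async function classifyAndCleanUp(
  run: (input: tf.Tensor) => Promise<tf.Tensor>,
  imageFile: File
) {
  const input = await preprocessImage(imageFile);
  try {
    const output = await run(input);
    try {
      // Read the results before disposing the output tensor
      return await processOutput(output);
    } finally {
      output.dispose();
    }
  } finally {
    input.dispose();
  }
}
```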
### Error Handling
```tsx
import * as tf from '@tensorflow/tfjs-core';
import { useLiteRtTfjsModel } from 'react-litert';

function ModelComponent() {
  const { status, error, run } = useLiteRtTfjsModel({
    modelUrl: '/models/model.tflite',
  });

  if (status === 'error') {
    return <div>Error: {error?.message}</div>;
  }

  async function handleRun(input: tf.Tensor) {
    try {
      const output = await run(input);
      // Handle success
    } catch (err) {
      console.error('Inference failed:', err);
      // Handle error
    }
  }

  // ... rest of component
}
```
## More Examples
See the `examples/` directory in the repository for additional complete example applications.
## Contributing Examples
Have a great example? Consider contributing it to the repository! Examples help other developers learn how to use react-litert effectively.
## Next Steps
- Getting Started - Learn the basics
- API Reference - Detailed API documentation
- Advanced Usage - Advanced patterns and techniques