Qualcomm® AI Hub Apps

Semantic Segmentation Sample App

This sample app performs semantic segmentation on live camera input.

The app showcases how to combine a live camera stream, TFLite, and OpenCV.

Prerequisites

  1. Clone this repository with Git-LFS enabled.
  2. Download Android Studio. Version 2023.1.1 or newer is required.
  3. Enable USB debugging on your Android device.

Build the APK

  1. Download or export a compatible model from AI Hub Models.
  2. Copy the .tflite file to src/main/assets/<your_model>.tflite.
  3. In ../gradle.properties, set the value of semanticsegmentation_tfLiteModelAsset to the name of your model file (<your_model>.tflite); see the example after this list.
  4. Open the PARENT folder (android) (NOT THIS FOLDER) in Android Studio, run gradle sync, and build the SemanticSegmentation target.
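
For example, if your exported model file were named ffnet_40s.tflite (a placeholder name; substitute your own file name), the relevant line in ../gradle.properties would look like this:

```properties
# ../gradle.properties (excerpt); the file name below is a placeholder
semanticsegmentation_tfLiteModelAsset=ffnet_40s.tflite
```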

Supported Hardware (TF Lite Delegates)

By default, this app supports the QNN (NPU), GPUv2 (GPU), and TF Lite CPU delegates.

Comments have been left in TFLiteHelpers.java and AIHubDefaults.java to guide you on how to add support for additional TF Lite delegates that could target other hardware.
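
As an illustration of the kind of wiring those comments describe, here is a minimal sketch (not the app's actual TFLiteHelpers.java) that attaches a GPU delegate to a TF Lite Interpreter and falls back to plain CPU execution if delegate initialization fails. It assumes the standard TensorFlow Lite Java API plus the org.tensorflow:tensorflow-lite-gpu dependency; the class name and thread count are illustrative only.

```java
import org.tensorflow.lite.Interpreter;
import org.tensorflow.lite.gpu.GpuDelegate;

import java.nio.MappedByteBuffer;

// Illustrative helper (not part of the app): build an Interpreter with a GPU
// delegate, falling back to CPU execution if the delegate cannot be used.
public final class DelegateSketch {
    public static Interpreter createInterpreter(MappedByteBuffer model) {
        GpuDelegate gpuDelegate = null;
        try {
            gpuDelegate = new GpuDelegate();
            Interpreter.Options options = new Interpreter.Options();
            options.addDelegate(gpuDelegate);
            return new Interpreter(model, options);
        } catch (RuntimeException e) {
            // Delegate creation or model compilation failed; release the delegate
            // (if any) and fall back to the default CPU path.
            if (gpuDelegate != null) {
                gpuDelegate.close();
            }
            Interpreter.Options cpuOptions = new Interpreter.Options();
            cpuOptions.setNumThreads(4); // assumption: 4 threads as a reasonable default
            return new Interpreter(model, cpuOptions);
        }
    }
}
```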

AI Model Requirements

Model Runtime Formats

  • TensorFlow Lite (.tflite)

I/O Specification

Input

  • Image: an RGB image; shape [1, Height, Width, 3]; data type float32. Inputs are expected to be normalized by a per-channel mean and standard deviation (see the app code for details).

Output

  • Classes: Cityscapes classes; shape [1, Height', Width', 19]; data type float32 (lower-resolution class logit predictions).

Refer to the Cityscapes segmentation model.py for class label information.

The app is designed to work best with a width-to-height ratio of 2.
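
To make this I/O contract concrete, the sketch below (illustrative only, not the app's actual code) normalizes an RGB bitmap into the [1, Height, Width, 3] float32 input and reduces the [1, Height', Width', 19] logits to a per-pixel class index via argmax. The mean and standard deviation values are placeholders; the real per-channel constants are defined in the app code, as noted above.

```java
import android.graphics.Bitmap;

// Illustrative sketch (not the app's actual code). MEAN/STD below are placeholders;
// the real per-channel normalization constants are defined in the app's source.
public final class SegmentationIoSketch {
    private static final float[] MEAN = {0.485f, 0.456f, 0.406f}; // placeholder values
    private static final float[] STD  = {0.229f, 0.224f, 0.225f}; // placeholder values

    // Convert an RGB bitmap into a normalized [1, Height, Width, 3] float32 input.
    public static float[][][][] preprocess(Bitmap bitmap) {
        int h = bitmap.getHeight();
        int w = bitmap.getWidth();
        int[] pixels = new int[w * h];
        bitmap.getPixels(pixels, 0, w, 0, 0, w, h);
        float[][][][] input = new float[1][h][w][3];
        for (int y = 0; y < h; y++) {
            for (int x = 0; x < w; x++) {
                int p = pixels[y * w + x];
                input[0][y][x][0] = (((p >> 16) & 0xFF) / 255.0f - MEAN[0]) / STD[0]; // R
                input[0][y][x][1] = (((p >> 8) & 0xFF) / 255.0f - MEAN[1]) / STD[1];  // G
                input[0][y][x][2] = ((p & 0xFF) / 255.0f - MEAN[2]) / STD[2];         // B
            }
        }
        return input;
    }

    // Reduce [1, Height', Width', 19] class logits to a per-pixel class-index map.
    public static int[][] argmaxClasses(float[][][][] logits) {
        int h = logits[0].length;
        int w = logits[0][0].length;
        int numClasses = logits[0][0][0].length; // 19 Cityscapes classes
        int[][] classMap = new int[h][w];
        for (int y = 0; y < h; y++) {
            for (int x = 0; x < w; x++) {
                int best = 0;
                for (int c = 1; c < numClasses; c++) {
                    if (logits[0][y][x][c] > logits[0][y][x][best]) {
                        best = c;
                    }
                }
                classMap[y][x] = best;
            }
        }
        return classMap;
    }
}
```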

Compatible AI Hub Models

The app is currently compatible with the unquantized FFNet variants published in AI Hub Models.

Replicating an AI Hub Profile / Inference Job

Each AI Hub profile or inference job, once completed, will contain a Runtime Configuration section.

Modify TFLiteHelpers.java according to the runtime configuration applied to the job. Comment stubs are included to help guide you (search for TO REPLICATE AN AI HUB JOB...)

Note that if your job uses delegates other than QNN NPU, GPUv2, and TFLite, then you'll also need to add support for those delegates to the app.

Technologies Used by this App

This app combines a live Android camera stream, TensorFlow Lite (with the delegates described above), and OpenCV.

Expected Camera Environment

This app uses models trained on the Cityscapes Dataset. That means it will only produce valid results if the camera is pointed at street scenes! When in doubt, point the camera at the following sample image to verify accuracy:

Cityscapes-like example image

License

This app is released under the BSD-3 License found at the root of this repository.

All models from AI Hub Models are released under separate license(s). Refer to the AI Hub Models repository for details on each model.

The QNN SDK dependency is also released under a separate license. Please refer to the LICENSE file downloaded with the SDK for details.