GstInference

GstInference is an open-source project from RidgeRun Engineering that provides a framework for integrating deep learning inference into GStreamer. Either use one of the included elements to perform out-of-the-box inference with the most popular deep learning architectures, or inherit from the base classes and utilities to support your own custom architecture.

GstInference’s objective is to put inference in the hands of the developer without the need to worry about frameworks, platforms, or media handling. It also has a strong focus on performance, an area that many Python-based frameworks overlook.

GstInference makes it quick and easy to create deep learning media applications and plugins.

Application programmers can build a deep learning pipeline easily, without writing a single line of code, using the provided set of plugins. GstInference provides elements for various deep learning tasks (e.g., classification and detection) as well as overlay elements to visualize the results with minimum effort.
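For example, a detection pipeline can be assembled directly on the command line with gst-launch-1.0. The sketch below is an assumption-laden illustration, not a copy-paste recipe: it assumes the tinyyolov2 and detectionoverlay elements from GstInference are installed, and the model path, camera device, and layer names are placeholders that depend on your particular model and system.

```shell
# Hypothetical TinyYoloV2 detection pipeline. A tee feeds one scaled branch
# to the inference element's model pad and the untouched frames to its bypass
# pad; the overlay element then draws the detected boxes on the bypass stream.
# model-location and the backend layer names below are placeholders that must
# match your own trained graph.
gst-launch-1.0 \
  v4l2src device=/dev/video0 ! videoconvert ! tee name=t \
  t. ! queue ! videoscale ! net.sink_model \
  t. ! queue ! net.sink_bypass \
  tinyyolov2 name=net backend=tensorflow \
    model-location=graph_tinyyolov2_tensorflow.pb \
    backend::input-layer=input/Placeholder backend::output-layer=add_8 \
  net.src_bypass ! detectionoverlay ! videoconvert ! autovideosink
```

The same tee/bypass pattern applies to the other inference elements: the model branch carries the (possibly rescaled) frames the network consumes, while the bypass branch preserves the original video for visualization.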

Plugin programmers can use GstInference's set of extensible base classes to create new inference or overlay plugins. ONNX, TensorFlow, and TensorFlow Lite models are available for GstInference.

By purchasing this product, you will receive a link to download free binaries such as:

  • Google's EdgeTPU binaries for GstInference
  • TensorFlow GPU-optimized binaries for NVIDIA Jetson
  • TinyYoloV2 and V3 models
  • MobileNetV2, V1, and V4 models
  • Inception V1, V2, V3, and V4 models
  • FaceNetV1 and ResNet50 models

View our video about this product.

Visit our RidgeRun GitHub repository for more information.

Supported Frameworks

  • TensorFlow
  • TensorFlow Lite
  • TensorRT
  • ONNX

Supported Hardware

  • NVIDIA Jetson
  • NVIDIA GPUs
  • NXP i.MX8
  • Google Coral
  • x86-based CPUs