
TensorRT API for the DeepStream YOLO implementation

This is a work in progress, but it works for me. It is rough around the edges and there is plenty of room for improvement - feel free to contribute.

Inference time with yolov2-tiny is 85 ms/image (including resizing, NMS, and decoding of results) on a Jetson Nano in 10W mode. It should be roughly twice as fast on a Nano in MAXN mode.
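As a rough sanity check, the quoted latency can be converted into throughput; the MAXN number below simply reuses the "twice as fast" estimate above and is not a measurement:

    # Back-of-the-envelope throughput from the quoted 85 ms/image latency.
    latency_10w = 0.085              # seconds per image on a 10W Nano
    fps_10w = 1.0 / latency_10w      # ~11.8 images/s
    fps_maxn = 2.0 * fps_10w         # ~23.5 images/s, assuming the 2x estimate holds
    print(f"10W: {fps_10w:.1f} FPS, MAXN (estimate): {fps_maxn:.1f} FPS")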

Installing

  1. sudo apt-get install libgflags-dev cmake
  2. git clone https://github.com/NVIDIA-AI-IOT/deepstream_reference_apps
  3. export YOLO_ROOT=`pwd`/deepstream_reference_apps/yolo
  4. cd $YOLO_ROOT/apps/trt-yolo
  5. edit CMakeLists.txt and:
    1. change 'set(CMAKE_CXX_FLAGS_RELEASE "-O2")' to 'set(CMAKE_CXX_FLAGS_RELEASE "-O2 -fPIC")'
    2. add a line after it: 'set(CUDA_NVCC_FLAGS "${CUDA_NVCC_FLAGS} --compiler-options -fPIC")'
  6. mkdir build && cd build
  7. cmake -D CMAKE_BUILD_TYPE=Release .. (you may need sudo if the directory is read-only; alternatively, chmod +w it)
  8. make && sudo make install
  9. edit $YOLO_ROOT/config/yolov2-tiny.txt and change all the file paths to absolute paths (config_file_path, wts_file_path, labels_file_path, test_images) - see the example flagfile below this list
  10. check that the C++ app works by doing:
    1. cd $YOLO_ROOT/apps/trt-yolo/build
    2. add the paths of one or two test images to $YOLO_ROOT/data/test_images.txt
    3. ./trt-yolo-app --flagfile=$YOLO_ROOT/config/yolov2-tiny.txt
  11. cd $YOLO_ROOT/apps/trt-yolo/build/lib
  12. git clone https://github.com/mosheliv/tensortrt-yolo-python-api
  13. cd tensortrt-yolo-python-api
  14. source link_shared.sh
  15. python t.py --flagfile=$YOLO_ROOT/config/yolov2-tiny.txt your_image.jpg
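For step 9, a hypothetical example of what the edited flags in $YOLO_ROOT/config/yolov2-tiny.txt might look like; the /home/nano prefix and the file names are placeholders - substitute the actual absolute paths on your system:

    --config_file_path=/home/nano/deepstream_reference_apps/yolo/config/yolov2-tiny.cfg
    --wts_file_path=/home/nano/deepstream_reference_apps/yolo/data/yolov2-tiny.weights
    --labels_file_path=/home/nano/deepstream_reference_apps/yolo/data/labels.txt
    --test_images=/home/nano/deepstream_reference_apps/yolo/data/test_images.txt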
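For step 15, a minimal Python sketch that drives t.py as a subprocess, assuming only the command line shown above; the format of the printed detections is defined by t.py, so parsing is left to the caller:

    import subprocess
    import sys

    def detect(image_path, flagfile, script="t.py"):
        """Run the t.py CLI on one image and return its raw stdout.

        Assumes the command line documented in step 15:
            python t.py --flagfile=<config> <image>
        """
        result = subprocess.run(
            [sys.executable, script, f"--flagfile={flagfile}", image_path],
            capture_output=True,
            text=True,
            check=True,
        )
        return result.stdout

    # Example usage (placeholder paths):
    # print(detect("your_image.jpg", "/absolute/path/to/yolov2-tiny.txt"))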

About

Python API for the TensorRT implementation of YOLOv2.
