
 

[Link: https://towardsdatascience.com/how-to-deploy-onnx-models-on-nvidia-jetson-nano-using-deepstream-b2872b99a031]

git clone https://github.com/thatbrguy/Deep-Stream-ONNX.git
cd Deep-Stream-ONNX

wget https://github.com/onnx/models/blob/main/vision/object_detection_segmentation/tiny-yolov2/model/tinyyolov2-8.tar.gz
# note: the blob URL above returns GitHub's HTML page; use the repository's raw/download link to get the actual archive
# sample.tar.gz is downloaded separately from Google Drive (link in the repo README)

tar -xvf sample.tar.gz
tar -xvf tinyyolov2-8.tar.gz   # archive name should match the file downloaded above
cp tiny_yolov2/Model.onnx tiny_yolov2.onnx
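
The nvinfer config that the pipeline loads (config/config_infer_custom_yolo.txt in the log below) has to point at the copied ONNX file and at the custom parser library built in the next step. A sketch of the relevant keys, with values inferred from this particular run (FP16, batch 1); the repo's own config file is authoritative:

[property]
onnx-file=../tiny_yolov2.onnx
model-engine-file=../tiny_yolov2.onnx_b1_fp16.engine
# network-mode=2 selects FP16, matching the *_fp16.engine name in the log
network-mode=2
batch-size=1
# Tiny YOLOv2 (VOC) detects 20 classes: the 125x13x13 output is 5 anchors x (5 + 20) per grid cell
num-detected-classes=20
parse-bbox-func-name=NvDsInferParseCustomYoloV2Tiny
custom-lib-path=../custom_bbox_parser/libnvdsinfer_custom_bbox_tiny_yolo.so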

cd custom_bbox_parser/
$ git diff
diff --git a/custom_bbox_parser/Makefile b/custom_bbox_parser/Makefile
index 5bab5a4..b764725 100644
--- a/custom_bbox_parser/Makefile
+++ b/custom_bbox_parser/Makefile
@@ -1,7 +1,8 @@
 CUDA_VER:=10
 SRCFILES:=nvdsparsebbox_tiny_yolo.cpp
 TARGET_LIB:=libnvdsinfer_custom_bbox_tiny_yolo.so
-DEEPSTREAM_PATH:=/home/nano/deepstream_sdk_v4.0_jetson
+#DEEPSTREAM_PATH:=/home/nano/deepstream_sdk_v4.0_jetson
+DEEPSTREAM_PATH:=/opt/nvidia/deepstream/deepstream-6.0

 ifeq ($(CUDA_VER),)
   $(error "CUDA_VER is not set")
diff --git a/custom_bbox_parser/nvdsparsebbox_tiny_yolo.cpp b/custom_bbox_parser/nvdsparsebbox_tiny_yolo.cpp
index c6251e5..0825e68 100644
--- a/custom_bbox_parser/nvdsparsebbox_tiny_yolo.cpp
+++ b/custom_bbox_parser/nvdsparsebbox_tiny_yolo.cpp
@@ -432,7 +432,7 @@ extern "C" bool NvDsInferParseCustomYoloV2Tiny(

     // Obtaining the output layer.
     const NvDsInferLayerInfo &layer = outputLayersInfo[0];
-    assert (layer.dims.numDims == 3);
+    assert (layer.inferDims.numDims == 3);

     // Decoding the output tensor of TinyYOLOv2 to the NvDsInferParseObjectInfo format.
     std::vector<NvDsInferParseObjectInfo> objects =

 

[Link: https://github.com/thatbrguy/Deep-Stream-ONNX]
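
The two changes above adapt the repo (written for DeepStream 4.0) to this board: DEEPSTREAM_PATH now points at the DeepStream 6.0 install under /opt/nvidia/deepstream/deepstream-6.0, and the layer dimensions are read from inferDims, which replaced dims in the NvDsInferLayerInfo struct of newer DeepStream releases. After patching, build the parser library and launch the pipeline; the app config file name below follows the repo (adjust it if yours differs), and if the build cannot find CUDA, override the Makefile's CUDA_VER (e.g. make CUDA_VER=10.2 on JetPack 4.6):

make        # produces libnvdsinfer_custom_bbox_tiny_yolo.so
cd ..
deepstream-app -c config/deepstream_app_custom_yolo.txt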

 

 

 *** DeepStream: Launched RTSP Streaming at rtsp://localhost:8554/ds-test ***

Opening in BLOCKING MODE
Opening in BLOCKING MODE

Using winsys: x11
WARNING: [TRT]: Detected invalid timing cache, setup a local cache instead
INFO: [Implicit Engine Info]: layers num: 2
0   INPUT  kFLOAT image           3x416x416
1   OUTPUT kFLOAT grid            125x13x13


Runtime commands:
        h: Print this help
        q: Quit

        p: Pause
        r: Resume

NOTE: To expand a source in the 2D tiled display and view object details, left-click on the source.
      To go back to the tiled display, right-click anywhere on the window.


**PERF:  FPS 0 (Avg)    FPS 1 (Avg)     FPS 2 (Avg)     FPS 3 (Avg)
**PERF:  0.00 (0.00)    0.00 (0.00)     0.00 (0.00)     0.00 (0.00)
** INFO: <bus_callback:194>: Pipeline ready

Opening in BLOCKING MODE
Opening in BLOCKING MODE
Opening in BLOCKING MODE
Opening in BLOCKING MODE
** INFO: <bus_callback:180>: Pipeline running



-------------------------------
0:00:00.369440298  8235     0x31571330 WARN                 nvinfer gstnvinfer.cpp:635:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::initialize() <nvdsinfer_context_impl.cpp:1161> [UID = 1]: Warning, OpenCV has been deprecated. Using NMS for clustering instead of cv::groupRectangles with topK = 20 and NMS Threshold = 0.5
ERROR: Deserialize engine failed because file path: /home/jetson/work/Deep-Stream-ONNX/config/../tiny_yolov2.onnx_b1_fp16.engine open error
0:00:01.781999412  8235     0x31571330 WARN                 nvinfer gstnvinfer.cpp:635:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::deserializeEngineAndBackend() <nvdsinfer_context_impl.cpp:1889> [UID = 1]: deserialize engine from file :/home/jetson/work/Deep-Stream-ONNX/config/../tiny_yolov2.onnx_b1_fp16.engine failed
0:00:01.782126640  8235     0x31571330 WARN                 nvinfer gstnvinfer.cpp:635:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:1996> [UID = 1]: deserialize backend context from engine from file :/home/jetson/work/Deep-Stream-ONNX/config/../tiny_yolov2.onnx_b1_fp16.engine failed, try rebuild
0:00:01.782165021  8235     0x31571330 INFO                 nvinfer gstnvinfer.cpp:638:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:1914> [UID = 1]: Trying to create engine from model files
0:01:43.015792426  8235     0x31571330 INFO                 nvinfer gstnvinfer.cpp:638:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:1947> [UID = 1]: serialize cuda engine to file: /home/jetson/work/Deep-Stream-ONNX/tiny_yolov2.onnx_b1_gpu0_fp16.engine successfully
0:01:43.196589822  8235     0x31571330 INFO                 nvinfer gstnvinfer_impl.cpp:313:notifyLoadModelStatus:<primary_gie> [UID 1]: Load new model:/home/jetson/work/Deep-Stream-ONNX/config/config_infer_custom_yolo.txt sucessfully
NvMMLiteOpen : Block : BlockType = 261
NvMMLiteOpen : Block : BlockType = 261
NvMMLiteOpen : Block : BlockType = 261
NvMMLiteOpen : Block : BlockType = 261
NVMEDIA: Reading vendor.tegra.display-size : status: 6
NVMEDIA: Reading vendor.tegra.display-size : status: 6
NVMEDIA: Reading vendor.tegra.display-size : status: 6
NVMEDIA: Reading vendor.tegra.display-size : status: 6
NvMMLiteBlockCreate : Block : BlockType = 261
NvMMLiteBlockCreate : Block : BlockType = 261
NvMMLiteBlockCreate : Block : BlockType = 261
NvMMLiteBlockCreate : Block : BlockType = 261
NvMMLiteOpen : Block : BlockType = 4
NvMMLiteOpen : Block : BlockType = 4
===== NVMEDIA: NVENC =====
===== NVMEDIA: NVENC =====
NvMMLiteBlockCreate : Block : BlockType = 4
NvMMLiteBlockCreate : Block : BlockType = 4
H264: Profile = 66, Level = 0
H264: Profile = 66, Level = 0
NVMEDIA_ENC: bBlitMode is set to TRUE
NVMEDIA_ENC: bBlitMode is set to TRUE
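
Note: the "Deserialize engine failed ... open error" near the top of the log is expected on a first run — the serialized TensorRT engine does not exist yet, so nvinfer builds it from the ONNX file (the roughly 1 minute 43 second gap between the timestamps) and then serializes it for reuse. The file it writes (tiny_yolov2.onnx_b1_gpu0_fp16.engine) is not the name the config asks it to open (tiny_yolov2.onnx_b1_fp16.engine), so the engine may be rebuilt again on later runs unless model-engine-file is updated to match.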
 
