I am a student who is still learning, and I have a project that requires 720p

Hi, I have seen that some people use a higher resolution, but I can't get it to work. I asked an AI in the hope it could help me, but it did not.

I am now using the example repo GitHub - hailo-ai/hailo-rpi5-examples

with a custom-trained .hef model based on yolov8n. The one trained at 640 image size works fine with the examples, but it is really inconvenient. I tried training one at 1280 image size, but then it detects nothing with the base code.

What do I need to change to make it see better?

Hi @pascal_van_dam ,

At DeGirum (a SW partner of Hailo), we developed PySDK, a Python package, to make application development easy with Hailo devices. I understand that the model trained on 640x640 may not work well on higher-resolution images. One way to handle this is to use tiling in your inference flow.

Tiling processes smaller sections at full resolution, preserving small objects’ pixel detail and distinguishing features for accurate detection. There are several ways to implement tiling. You can find a detailed implementation guide here.
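A minimal sketch of what this can look like with degirum_tools (the class names match the imports used later in this thread, but the TileExtractorPseudoModel/TileModel constructor arguments shown here are assumptions based on the guide and may differ between versions; the model name and zoo path are placeholders):

```python
import degirum as dg
from degirum_tools import NmsBoxSelectionPolicy, NmsOptions
from degirum_tools.tile_compound_models import TileExtractorPseudoModel, TileModel

# Load the detector trained on 640x640 input (model name / zoo path are placeholders).
model = dg.load_model(
    model_name="yolov8n_coco--640x640_quant_hailort_hailo8_1",
    inference_host_address="@local",
    zoo_url="/path/to/local/zoo",
)

# NMS settings used to merge detections from overlapping tiles.
nms_options = NmsOptions(
    threshold=0.6,
    use_iou=True,
    box_select=NmsBoxSelectionPolicy.MOST_PROBABLE,
)

# Split each frame into a 2x3 grid of overlapping tiles, run the 640x640 model on
# each tile, then fuse the per-tile detections back into full-frame coordinates.
# NOTE: these constructor arguments are assumed from the tiling guide and may
# differ between degirum_tools versions -- please check the guide for the exact API.
tile_extractor = TileExtractorPseudoModel(
    cols=3, rows=2, overlap_percent=0.1, model2=model
)
tile_model = TileModel(model1=tile_extractor, model2=model, nms_options=nms_options)

result = tile_model("frame_1280x720.jpg")  # image path or numpy array
print(result)
```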

Please let me know if you face any issues.

I tried running it in Google Colab, but this error shows up:


```
ValueError                                Traceback (most recent call last)
/tmp/ipython-input-2175539740.py in <cell line: 0>()
      1 # imports and variables used in most cells
      2 import degirum as dg
----> 3 import degirum_tools as dg_tools
      4 import cv2
      5

10 frames

scipy/linalg/_cythonized_array_utils.pyx in init scipy.linalg._cythonized_array_utils()

/usr/local/lib/python3.12/dist-packages/numpy/random/_pickle.py in <module>
----> 1 from .mtrand import RandomState
      2 from ._philox import Philox
      3 from ._pcg64 import PCG64, PCG64DXSM
      4 from ._sfc64 import SFC64
      5

numpy/random/mtrand.pyx in init numpy.random.mtrand()
```


with this code:

```python
# imports and variables used in most cells
import degirum as dg
import degirum_tools as dg_tools
import cv2
from degirum_tools.tile_compound_models import TileExtractorPseudoModel, TileModel, LocalGlobalTileModel, BoxFusionTileModel, BoxFusionLocalGlobalTileModel
from degirum_tools import NmsBoxSelectionPolicy, NmsOptions

# Base NMS options.
nms_options = NmsOptions(
    threshold=0.6,
    use_iou=True,
    box_select=NmsBoxSelectionPolicy.MOST_PROBABLE,
)
```

Hey @pascal_van_dam,

Welcome to the Hailo Community!

We have a tiling app here: hailo-apps-infra/hailo_apps/hailo_app_python/apps/tiling at dev · hailo-ai/hailo-apps-infra · GitHub

Hope this helps!

Hi @pascal_van_dam, thanks for trying DeGirum PySDK. I understand the issue. This is happening because you did not restart the kernel after `!pip install degirum_tools degirum`. Note that once you install the libraries, you do not need to run the pip command again; just run `import degirum, degirum_tools` directly.
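In Colab the flow looks like this (the cell split is shown only for illustration):

```python
# Cell 1 -- run once, then restart the runtime (Runtime -> Restart runtime):
# !pip install degirum_tools degirum

# Cell 2 -- after the restart, just import; no need to reinstall:
import degirum as dg
import degirum_tools as dg_tools
```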

Please let me know if you face any issues again.

Hi @Darshil_Modi, thanks for the response. I tried running it this time on the Raspberry Pi 5 with the Hailo module, but I get this error:
```
Traceback (most recent call last):
  File "/home/jaronai/Desktop/new new test/tilling.py", line 25, in <module>
    model = dg.load_model(
            ^^^^^^^^^^^^^^
  File "/home/jaronai/Desktop/new new test/venv/lib/python3.11/site-packages/degirum/__init__.py", line 244, in load_model
    return zoo.load_model(model_name, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/jaronai/Desktop/new new test/venv/lib/python3.11/site-packages/degirum/log.py", line 92, in sync_wrap
    return f(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^
  File "/home/jaronai/Desktop/new new test/venv/lib/python3.11/site-packages/degirum/zoo_manager.py", line 330, in load_model
    setattr(model, key, value)
  File "/home/jaronai/Desktop/new new test/venv/lib/python3.11/site-packages/degirum/model.py", line 434, in device_type
    raise DegirumException(
degirum.exceptions.DegirumException: None of the device types in the list ['HAILORT/HAILO8'] is supported by model yolo11n_visdrone_person--640x640_quant_hailort_multidevice_1. Supported device types are: ['HAILORT/HAILO8L'].
```

Hey, there is no documentation; the README there is not for the tiling app but for another project.

Hi @pascal_van_dam

The example uses `device_type=HAILORT/HAILO8`, but the same model works for both Hailo8 and Hailo8L. From the error message, it looks like your system has a Hailo8L. Please change it to `device_type=HAILORT/HAILO8L` and try again.
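For example, keeping the rest of your `load_model` call unchanged (the model name is taken from your error message; the zoo URL is a placeholder):

```python
import degirum as dg

model = dg.load_model(
    model_name="yolo11n_visdrone_person--640x640_quant_hailort_multidevice_1",
    inference_host_address="@local",
    zoo_url="/path/to/zoo",         # placeholder -- keep whatever you already use
    device_type="HAILORT/HAILO8L",  # matches the Hailo-8L reported by your system
)
```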

Hello, I'm encountering an issue loading a local AI model using the DeGirum PySDK. I'm receiving the error message `degirum.exceptions.DegirumException: Model 'connectoren-v1' is not found in model zoo '/home/jaronai/Desktop/new new test/model'`. Here are the details:

* **Error Message:** `Model 'connectoren-v1' is not found in model zoo '/home/jaronai/Desktop/new new test/model'`
* **Python Code:**

```python
import degirum as dg

local_zoo_path = "/home/jaronai/Desktop/new new test/model"
model_name = "connectoren-v1"

model = dg.load_model(
    model_name=model_name,
    inference_host_address="@local",
    zoo_url=local_zoo_path
)
```

* **File Structure:** The `tree` command confirms that the files are correctly placed within the folder:

```
$ tree model/
model/
├── connectoren-v1.hef
└── connectoren-v1.json
```

* **JSON Content:** The JSON configuration appears correct and contains the `ModelType` and `ModelPath` keys that match the filenames.

```json
{
  "ConfigVersion": 10,
  "DEVICE": [
    {
      "DeviceType": "HAILO8L",
      "RuntimeAgent": "HAILORT",
      "SupportedDeviceTypes": "HAILORT/HAILO8L"
    }
  ],
  "MODEL_PARAMETERS": {
    "ModelPath": "connectoren-v1.hef"
  },
  "ModelType": "connectoren-v1"
}
```

I have checked for typos, hidden characters, and file permissions, but the problem persists. Do you have any idea what might be going on?

Hi @pascal_van_dam

Please add a field called Checksum to the JSON. The value can be any dummy value (but not empty). See below for an example:

{
  "ConfigVersion": 11,
  "Checksum": "a19c2781505c16650e5d7304283afb4c9c8c475ade8aba2f0705e3d08dfc1380",
  "DEVICE": [
    {
      "DeviceType": "HAILO8",
      "RuntimeAgent": "HAILORT",
      "SupportedDeviceTypes": "HAILORT/HAILO8"
    }
  ],
  "PRE_PROCESS": [
    {
      "InputN": 1,
      "InputH": 640,
      "InputW": 640,
      "InputC": 3,
      "InputQuantEn": true
    }
  ],
  "MODEL_PARAMETERS": [
    {
      "ModelPath": "yolov8n_coco--640x640_quant_hailort_hailo8_1.hef"
    }
  ],
  "POST_PROCESS": [
    {
      "OutputPostprocessType": "DetectionYoloHailo",
      "OutputNumClasses": 80,
      "LabelsPath": "labels_yolov8n_coco.json"
    }
  ]
}
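For reference, once the Checksum field is in connectoren-v1.json, the model should load from your local zoo folder exactly as in your snippet, e.g.:

```python
import degirum as dg

# Local folder containing connectoren-v1.hef and connectoren-v1.json
local_zoo_path = "/home/jaronai/Desktop/new new test/model"

model = dg.load_model(
    model_name="connectoren-v1",
    inference_host_address="@local",
    zoo_url=local_zoo_path,
    device_type="HAILORT/HAILO8L",  # matches SupportedDeviceTypes in your JSON
)
```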

Okay, thank you for the reply. I am going to be away for two weeks, so I will try when I come back.
