The following example shows how to evaluate an AI model on the RZ/V2H. The compiled model will run on the DRP-AI accelerator (INT8). The inference output may not be accurate out of the box, because calibration (and possibly additional training) is still needed to tune the quantized model.
For this demo we use the Darknet YOLOv2 model trained on Pascal VOC.
Install the TVM translator first (follow the RZ/V2H Docker installation guide).
TVM Translation
Step 1) Start Docker Container
mkdir data
docker run -it --name drp-ai_tvm_v2h_container_${USER} -v $(pwd)/data:/drp-ai_tvm/data drp-ai_tvm_v2h_image_${USER}
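The data directory created above is bind-mounted into the container at /drp-ai_tvm/data, so it is a convenient place to stage the Darknet YOLOv2 VOC files. The subdirectory name and copy step below are illustrative, not part of the official package; the .cfg and .weights files come from the public Darknet project.

```shell
# Stage the model files on the host; the container will see them
# under /drp-ai_tvm/data via the -v bind mount above.
mkdir -p data/yolov2-voc
# Obtain yolov2-voc.cfg and yolov2-voc.weights from the public Darknet
# project and copy them in, e.g.:
#   cp yolov2-voc.cfg yolov2-voc.weights data/yolov2-voc/
ls -ld data/yolov2-voc
```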
Step 2) Preparation - add the following so the TVM compile scripts can be found.
# Add the following paths to use the TVM scripts
export PYTHONPATH=/drp-ai_tvm/tvm/python:/drp-ai_tvm/tutorials/:${PYTHONPATH}
# Create symbolic links to the following bash scripts in the working directory
# These are required to run the TVM translator
ln -s /drp-ai_tvm/tutorials/run_* .
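A quick sanity check of the preparation step, runnable from the working directory inside the container. It re-exports PYTHONPATH so the check is self-contained; the idea that the entries should appear on Python's search path is the only assumption made here.

```shell
# Re-export so the setting survives into child processes (e.g. python3)
export PYTHONPATH=/drp-ai_tvm/tvm/python:/drp-ai_tvm/tutorials/:${PYTHONPATH}

# The entries should appear on Python's module search path even
# before any TVM module is imported
python3 -c "import sys; print([p for p in sys.path if 'drp-ai_tvm' in p])"

# List the run_* links created above (prints a note if none exist here)
ls -l run_* 2>/dev/null || echo "no run_* links in this directory"
```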