I want to run inference on my exported .joint model so that I can evaluate its INT8 COCO mAP. The only way I can see of doing this is:
pulsar run \
    my_model.joint \
    --input resnet18_export_data/images/cat.jpg \
    --output_gt inference_results
and then read the generated .npy file. I want to avoid reading from a file, since that is a very time-consuming operation. Is there a way to get the results back from the call itself, instead of going through the filesystem? Are you planning to implement this?
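For reference, this is a minimal sketch of the workaround I am using today: invoking pulsar run as a subprocess and loading whatever .npy files it writes back into memory. The output directory layout and file names here are assumptions on my part; they depend on the model's output tensor names and the pulsar version.

import subprocess
from pathlib import Path

import numpy as np

def run_joint_inference(model="my_model.joint",
                        image="resnet18_export_data/images/cat.jpg",
                        out_dir="inference_results"):
    # Same command as above; raises CalledProcessError if pulsar fails.
    subprocess.run(
        ["pulsar", "run", model, "--input", image, "--output_gt", out_dir],
        check=True,
    )
    # Collect every .npy the tool wrote, keyed by file stem
    # (assumed naming -- adjust to whatever pulsar actually emits).
    return {p.stem: np.load(p) for p in Path(out_dir).glob("**/*.npy")}

outputs = run_joint_inference()
for name, arr in outputs.items():
    print(name, arr.shape, arr.dtype)

This still pays the cost of writing and re-reading the arrays on disk for every image, which is exactly what I would like to avoid.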