
Run inference with .joint model #6

Open
@mikel-brostrom

Description


I want to run inference on my exported .joint model so that I can evaluate its INT8 COCO mAP. The only way I see to do this is:

pulsar run \
    my_model.joint \
    --input resnet18_export_data/images/cat.jpg \
    --output_gt inference_results

And then read the generated .npy file. I want to avoid reading from file, as it is a very time-consuming operation. Is there a way of returning the results from the system call itself? Are you planning to implement this?
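For reference, this is the workaround I am scripting today (a sketch only: the helper names `build_pulsar_cmd` / `run_and_load` are my own, and it assumes the CLI flags shown above and that pulsar writes .npy arrays into the output directory). It still pays the file round-trip cost on every call, which is exactly what an in-process API would avoid:

```python
import subprocess
from pathlib import Path

import numpy as np


def build_pulsar_cmd(model, image, out_dir):
    """Assemble the pulsar invocation shown above (hypothetical helper)."""
    return [
        "pulsar", "run", str(model),
        "--input", str(image),
        "--output_gt", str(out_dir),
    ]


def run_and_load(model, image, out_dir):
    """Run pulsar, then load every .npy it wrote to out_dir.

    The disk read here is the step the issue wants to eliminate; until
    the toolchain returns results in-process, each inference pays this
    write-then-read round trip.
    """
    subprocess.run(build_pulsar_cmd(model, image, out_dir), check=True)
    return {p.name: np.load(p) for p in sorted(Path(out_dir).glob("*.npy"))}


# Example usage (requires the pulsar toolchain on PATH):
#   outputs = run_and_load("my_model.joint",
#                          "resnet18_export_data/images/cat.jpg",
#                          "inference_results")
```

Looping this over a COCO validation set is what makes the per-image file I/O add up, hence the request for an API that returns the tensors directly.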
