Inference Model

```python
netspresso.np_qai.benchmarker.NPQAIBenchmarker.inference_model(input_model_path, target_device_name, inputs, job_name=None, options=InferenceOptions(), retry=True)
```

Run inference on a model on a device in QAI Hub.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `input_model_path` | `Union[str, Path]` | The path to the input model. | *required* |
| `target_device_name` | `Union[Device, List[Device]]` | The device(s) to run the inference on. | *required* |
| `inputs` | `Union[Dataset, DatasetEntries, str]` | The input data to use for the inference. | *required* |
| `job_name` | `Optional[str]` | The name of the job. | `None` |
| `options` | `Union[InferenceOptions, str]` | The options to use for the inference. | `InferenceOptions()` |
| `retry` | `bool` | Whether to retry the inference if it fails. | `True` |

Returns:

| Type | Description |
| --- | --- |
| `Union[InferenceJob, List[InferenceJob]]` | Returns an inference job object (or a list of jobs) if successful. |

Note

For details, see `submit_inference_job` in the QAI Hub API.
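
Below is a minimal usage sketch. Only the `inference_model()` signature comes from this page; the constructor argument, model path, input name (`image`), input shape, and device name are illustrative assumptions, not documented values.

```python
# Minimal usage sketch. The api_token argument, model path, input name
# ("image"), array shape, and device name are illustrative assumptions;
# only the inference_model() signature is taken from this reference.
import numpy as np
import qai_hub as hub

from netspresso.np_qai.benchmarker import NPQAIBenchmarker

benchmarker = NPQAIBenchmarker(api_token="YOUR_QAI_HUB_API_TOKEN")  # hypothetical constructor argument

# DatasetEntries-style inputs: a mapping from input name to a list of
# arrays, one array per sample.
sample_inputs = {"image": [np.random.rand(1, 3, 224, 224).astype(np.float32)]}

job = benchmarker.inference_model(
    input_model_path="./outputs/model.tflite",
    target_device_name=hub.Device("Samsung Galaxy S23 (Family)"),
    inputs=sample_inputs,
    job_name="sample-inference",
)
```

Passing a list of `Device` objects as `target_device_name` submits one job per device, in which case a `List[InferenceJob]` is returned.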