Ability to select number of CPU cores to run inference on while using OpenVINO Inference Engine as backend #14852

@shelkesagar29

Description

System information (version)
  • OpenCV => 4.1.0
  • Operating System / Platform => Ubuntu
  • Compiler => g++
Detailed description

OpenCV should provide a way to select the number of CPU cores used for inference when the OpenVINO Inference Engine is used as the backend.
Right now, this can only be achieved by calling the OpenVINO IE API directly.


In the op_inf_engine.cpp file at
https://github.com/opencv/opencv/blob/master/modules/dnn/src/op_inf_engine.cpp

the function

void InfEngineBackendNet::initPlugin(InferenceEngine::ICNNNetwork& net)

could accept a number-of-CPU-cores parameter (num_cores) when DNN_TARGET_CPU is selected.

This could then be passed to the plugin as:

std::map<std::string, std::string> cpu_config;
cpu_config[InferenceEngine::PluginConfigParams::KEY_CPU_THREADS_NUM] = std::to_string(num_cores);
netExec = plugin.LoadNetwork(net, cpu_config);
