Home
This list gives an overview of all modules available inside the contrib repository.
- ARM CPU Plugin -- enables inference of deep learning models on ARM CPUs using the OpenVINO Runtime API.
- NVIDIA GPU Plugin -- enables inference of deep learning models on NVIDIA GPUs using the OpenVINO Runtime API (a usage sketch follows this list).
- OpenVINO Java API -- provides Java wrappers for the OpenVINO Runtime API.
- PyTorch extensions for Model Optimizer -- a native PyTorch to OpenVINO IR converter.
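Once a plugin module has been built and registered, it is addressed through the usual device-selection mechanism of the OpenVINO Runtime. Below is a minimal sketch using the standard benchmark_app tool; the model path is illustrative, and the "NVIDIA" device name assumes the NVIDIA GPU Plugin from this repository is installed (on ARM machines, the ARM CPU Plugin is exposed through the regular "CPU" device):
$ benchmark_app -m <path_to_model>/model.xml -d NVIDIA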
You can build OpenVINO so that it includes the modules from this repository. Contrib modules are under constant development, so it is recommended to use them with the master branch or the latest releases of OpenVINO.
Use the following CMake commands:
$ cd <openvino_build_directory>
$ cmake -DOPENVINO_EXTRA_MODULES=<openvino_contrib>/modules <openvino_source_directory>
$ cmake --build . -j8
As a result, OpenVINO will be built in the <openvino_build_directory> with all modules from the openvino_contrib repository. To disable specific modules, use CMake's BUILD_<module_name> boolean options, as in this example:
$ cmake -DOPENVINO_EXTRA_MODULES=<openvino_contrib>/modules -DBUILD_java_api=OFF <openvino_source_directory>
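If you are not sure which BUILD_<module_name> options are available, one way to check (assuming the build directory has already been configured) is to list the cached CMake variables and filter for the BUILD_ prefix:
$ cd <openvino_build_directory>
$ cmake -N -LA . | grep "^BUILD_"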
To keep a clean overview of all contributed modules, the following files need to be created or adapted:
- Update this file: add your module with a single-line description.
- Add a README.md inside your own module folder. This README should explain which functionality (separate functions) is available and describe in more detail what the module is expected to do. If any extra requirements are needed to build the module without problems, add them there as well (an illustrative module layout follows this list).
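As a rough illustration only (everything except README.md in this layout is an assumption about a typical module, not a requirement stated above), a contributed module lives in its own folder under modules/ and keeps its documentation next to its sources:
$ ls <openvino_contrib>/modules/<your_module>
CMakeLists.txt  README.md  src/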
We welcome community contributions to the openvino_contrib repository. If you have an idea for improving the modules, please share it with us.
All guidelines for contributing to the repository can be found here.