Commit

[CAPI] remove legacy CAPI source code and documentation (openvinotoolkit#21916)

* [CAPI] remove legacy CAPI source code and documentation

* Fix missing install header file issue

riverlijunjie authored Jan 5, 2024
1 parent 64e3a4f commit d3398d2
Showing 15 changed files with 725 additions and 5,125 deletions.
7 changes: 0 additions & 7 deletions .github/workflows/job_cxx_unit_tests.yml
@@ -211,13 +211,6 @@ jobs:
--gtest_filter=*smoke* \
--gtest_output=xml:${INSTALL_TEST_DIR}/TEST-TemplateFuncTests.xml
- name: Inference Engine C API tests
if: fromJSON(inputs.affected-components).C_API.test
run: |
source ${INSTALL_DIR}/setupvars.sh
${INSTALL_TEST_DIR}/InferenceEngineCAPITests --gtest_print_time=1 \
--gtest_output=xml:${INSTALL_TEST_DIR}/TEST-InferenceEngineCAPITests.xml
- name: OpenVINO C API tests
if: fromJSON(inputs.affected-components).C_API.test
run: |
6 changes: 0 additions & 6 deletions .github/workflows/windows.yml
@@ -656,12 +656,6 @@ jobs:
run: |
call "${{ env.INSTALL_DIR }}\\setupvars.bat" && ${{ env.INSTALL_TEST_DIR }}/ov_template_func_tests --gtest_print_time=1 --gtest_filter=*smoke* --gtest_output=xml:${{ env.INSTALL_TEST_DIR }}/TEST-TemplateFuncTests.xml
- name: Inference Engine C API tests
if: fromJSON(needs.smart_ci.outputs.affected_components).C_API.test
shell: cmd
run: |
call "${{ env.INSTALL_DIR }}\\setupvars.bat" && ${{ env.INSTALL_TEST_DIR }}/InferenceEngineCAPITests --gtest_print_time=1 --gtest_output=xml:${{ env.INSTALL_TEST_DIR }}/TEST-InferenceEngineCAPITests.xml
- name: OpenVINO C API tests
if: ${{ 'false' }} # Ticket: 123594
shell: cmd
@@ -49,13 +49,6 @@ Based on the steps, the following code demonstrates how to change the application
:language: cpp
:fragment: ie:create_core

.. tab-item:: C
:sync: c

.. doxygensnippet:: docs/snippets/ie_common.c
:language: cpp
:fragment: ie:create_core

**API 2.0**

.. tab-set::
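With the legacy C tab removed, the remaining C examples in this guide use the API 2.0 C interface (openvino/c/openvino.h). A minimal sketch of the core-creation step under that assumption — the function names follow the 2.0 C headers, but signatures should be verified against the installed package:

#include <openvino/c/openvino.h>

int main(void) {
    ov_core_t* core = NULL;
    // Create the API 2.0 core object; a non-zero status means failure (0 is OK).
    if (ov_core_create(&core) != 0)
        return 1;
    // ... read, compile and run a model here (see the later sketches) ...
    ov_core_free(core);
    return 0;
}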
@@ -107,14 +100,6 @@ to write extensions. However, you can also load the old extensions to the new OpenVINO Runtime
:language: cpp
:fragment: ie:load_old_extension

.. tab-item:: C
:sync: c

.. doxygensnippet:: docs/snippets/ie_common.c
:language: cpp
:fragment: ie:load_old_extension


**API 2.0**

.. tab-set::
@@ -162,14 +147,6 @@ to write extensions. However, you can also load the old extensions to the new OpenVINO Runtime
:language: cpp
:fragment: ie:read_model

.. tab-item:: C
:sync: c

.. doxygensnippet:: docs/snippets/ie_common.c
:language: cpp
:fragment: ie:read_model


**API 2.0**

.. tab-set::
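The model-reading step in the API 2.0 C interface, continuing the sketch above (the path is a placeholder; the weights argument may be NULL for single-file formats):

ov_model_t* model = NULL;
// Read an IR model; "model.xml" is a placeholder path, NULL skips a separate weights file.
ov_core_read_model(core, "model.xml", NULL, &model);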
@@ -229,14 +206,6 @@ preprocessing may be necessary. See :doc:`preprocessing in API 2.0 <openvino_2_0
:language: cpp
:fragment: ie:compile_model

.. tab-item:: C
:sync: c

.. doxygensnippet:: docs/snippets/ie_common.c
:language: cpp
:fragment: ie:compile_model


**API 2.0**

.. tab-set::
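The corresponding compile step in the API 2.0 C interface, continuing the same sketch; "CPU" and the absence of extra properties (the 0 argument) are illustrative choices only:

ov_compiled_model_t* compiled_model = NULL;
// Compile the model for a device; the trailing 0 means no additional properties are passed.
ov_core_compile_model(core, model, "CPU", 0, &compiled_model);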
@@ -287,14 +256,6 @@ If you need to configure devices with additional parameters for OpenVINO Runtime
:language: cpp
:fragment: ie:create_infer_request

.. tab-item:: C
:sync: c

.. doxygensnippet:: docs/snippets/ie_common.c
:language: cpp
:fragment: ie:create_infer_request


**API 2.0**

.. tab-set::
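Creating an inference request from the compiled model, again as a hedged continuation of the sketch:

ov_infer_request_t* infer_request = NULL;
// One request is enough for a simple synchronous pipeline; create more for pipelined async runs.
ov_compiled_model_create_infer_request(compiled_model, &infer_request);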
@@ -349,13 +310,6 @@ The Inference Engine API fills inputs with data of the ``I32`` precision (**not** aligned with the original model)
:language: cpp
:fragment: ie:get_input_tensor

.. tab-item:: C
:sync: c

.. doxygensnippet:: docs/snippets/ie_common.c
:language: cpp
:fragment: ie:get_input_tensor

.. tab-item:: IR v11
:sync: ir-v11

@@ -375,13 +329,6 @@ The Inference Engine API fills inputs with data of the ``I32`` precision (**not** aligned with the original model)
:language: cpp
:fragment: ie:get_input_tensor

.. tab-item:: C
:sync: c

.. doxygensnippet:: docs/snippets/ie_common.c
:language: cpp
:fragment: ie:get_input_tensor

.. tab-item:: ONNX
:sync: onnx

@@ -401,13 +348,6 @@ The Inference Engine API fills inputs with data of the ``I32`` precision (**not** aligned with the original model)
:language: cpp
:fragment: ie:get_input_tensor

.. tab-item:: C
:sync: c

.. doxygensnippet:: docs/snippets/ie_common.c
:language: cpp
:fragment: ie:get_input_tensor


.. tab-item:: Model created in code
:sync: model
@@ -428,13 +368,6 @@ The Inference Engine API fills inputs with data of the ``I32`` precision (**not** aligned with the original model)
:language: cpp
:fragment: ie:get_input_tensor

.. tab-item:: C
:sync: c

.. doxygensnippet:: docs/snippets/ie_common.c
:language: cpp
:fragment: ie:get_input_tensor


**API 2.0**
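A rough API 2.0 C counterpart of the removed ie:get_input_tensor C tabs, assuming the by-index accessors from the 2.0 headers; index 0 is a placeholder:

ov_tensor_t* input_tensor = NULL;
void* input_data = NULL;
// In API 2.0 the tensor element type follows the original model (e.g. i64 for an I64 input).
ov_infer_request_get_input_tensor_by_index(infer_request, 0, &input_tensor);
ov_tensor_data(input_tensor, &input_data);
// ... fill the buffer behind input_data with input values ...
ov_tensor_free(input_tensor);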

@@ -574,13 +507,6 @@ API 2.0 fills inputs with data of the ``I64`` precision (aligned with the original model)
:language: cpp
:fragment: ie:inference

.. tab-item:: C
:sync: c

.. doxygensnippet:: docs/snippets/ie_common.c
:language: cpp
:fragment: ie:inference

.. tab-item:: Async
:sync: async

@@ -600,13 +526,6 @@ API 2.0 fills inputs with data of the ``I64`` precision (aligned with the original model)
:language: cpp
:fragment: ie:start_async_and_wait

.. tab-item:: C
:sync: c

.. doxygensnippet:: docs/snippets/ie_common.c
:language: cpp
:fragment: ie:start_async_and_wait


**API 2.0**
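The inference step itself in the API 2.0 C interface, covering both the Sync and Async tabs above (a sketch; ov_infer_request_wait blocks until the request completes):

// Synchronous inference.
ov_infer_request_infer(infer_request);

// Asynchronous inference: start the request, then wait for completion.
ov_infer_request_start_async(infer_request);
ov_infer_request_wait(infer_request);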

@@ -693,13 +612,6 @@ The Inference Engine API processes outputs as they are of the ``I32`` precision
:language: cpp
:fragment: ie:get_output_tensor

.. tab-item:: C
:sync: c

.. doxygensnippet:: docs/snippets/ie_common.c
:language: cpp
:fragment: ie:get_output_tensor

.. tab-item:: IR v11
:sync: ir-v11

@@ -719,13 +631,6 @@ The Inference Engine API processes outputs as they are of the ``I32`` precision
:language: cpp
:fragment: ie:get_output_tensor

.. tab-item:: C
:sync: c

.. doxygensnippet:: docs/snippets/ie_common.c
:language: cpp
:fragment: ie:get_output_tensor

.. tab-item:: ONNX
:sync: onnx

@@ -745,13 +650,6 @@ The Inference Engine API processes outputs as they are of the ``I32`` precision
:language: cpp
:fragment: ie:get_output_tensor

.. tab-item:: C
:sync: c

.. doxygensnippet:: docs/snippets/ie_common.c
:language: cpp
:fragment: ie:get_output_tensor


.. tab-item:: Model created in code
:sync: model
@@ -772,13 +670,6 @@ The Inference Engine API processes outputs as they are of the ``I32`` precision
:language: cpp
:fragment: ie:get_output_tensor

.. tab-item:: C
:sync: c

.. doxygensnippet:: docs/snippets/ie_common.c
:language: cpp
:fragment: ie:get_output_tensor


**API 2.0**
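Reading results with the API 2.0 C interface, as a rough counterpart of the removed ie:get_output_tensor C tabs (index 0 is a placeholder; the core from the first sketch should be freed last):

ov_tensor_t* output_tensor = NULL;
void* output_data = NULL;
ov_infer_request_get_output_tensor_by_index(infer_request, 0, &output_tensor);
ov_tensor_data(output_tensor, &output_data);
// ... read results through output_data; the element type follows the original model ...
ov_tensor_free(output_tensor);
ov_infer_request_free(infer_request);
ov_compiled_model_free(compiled_model);
ov_model_free(model);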

100 changes: 0 additions & 100 deletions docs/snippets/ie_common.c

This file was deleted.

8 changes: 0 additions & 8 deletions docs/snippets/ov_preprocessing_migration.c
@@ -145,11 +145,3 @@ int main_new() {
return 0;
}

int main_old() {
{
//! [c_api_ppp]
// No preprocessing related interfaces provided by C API 1.0
//! [c_api_ppp]
}
return 0;
}
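C API 1.0 offered no preprocessing interface (hence the deleted placeholder above), while API 2.0 does expose one in C. A hedged sketch, assuming the ov_preprocess_* functions from the 2.0 headers and using U8 input as an arbitrary example:

ov_preprocess_prepostprocessor_t* ppp = NULL;
ov_preprocess_input_info_t* input_info = NULL;
ov_preprocess_input_tensor_info_t* input_tensor_info = NULL;
ov_model_t* preprocessed_model = NULL;

// Build a PrePostProcessor around an already-read ov_model_t (see the earlier sketches).
ov_preprocess_prepostprocessor_create(model, &ppp);
ov_preprocess_prepostprocessor_get_input_info_by_index(ppp, 0, &input_info);
ov_preprocess_input_info_get_tensor_info(input_info, &input_tensor_info);
// Declare that the application supplies U8 data; the needed conversion is embedded into the model.
ov_preprocess_input_tensor_info_set_element_type(input_tensor_info, U8);
ov_preprocess_prepostprocessor_build(ppp, &preprocessed_model);

ov_preprocess_input_tensor_info_free(input_tensor_info);
ov_preprocess_input_info_free(input_info);
ov_preprocess_prepostprocessor_free(ppp);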
