Deep learning models can be trained easily on NVIDIA GPUs, since most popular frameworks support them out of the box.
One of the most convenient parts of OpenVINO is that it ships with a build of OpenCV that is already compiled with support for Intel GPUs and the Intel NCS 2.
https://link.medium.com/K6bLpTezdkb
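To see what "already compiled to support Intel hardware" means in practice, here is a minimal sketch, assuming you are running the Python environment that OpenVINO sets up (e.g. after sourcing its setupvars script). It only inspects the bundled cv2 build and the DNN constants; nothing here is specific to a model.

```python
import cv2

# Print the build configuration of the OpenCV that ships with OpenVINO.
# The "Inference Engine" / OpenVINO entries confirm that this build was
# compiled against the OpenVINO backend (Intel GPU, NCS 2 / Myriad, ...).
print(cv2.getBuildInformation())

# The DNN module exposes the corresponding backend and target constants:
print(cv2.dnn.DNN_BACKEND_INFERENCE_ENGINE)  # backend id for OpenVINO's Inference Engine
print(cv2.dnn.DNN_TARGET_OPENCL)             # Intel GPU
print(cv2.dnn.DNN_TARGET_MYRIAD)             # Intel NCS 2 (Myriad X VPU)
```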
Neither OpenCV nor OpenVINO provides tools to train a neural network.
[…] instead of directly using the trained model for inference, OpenVINO requires us to convert it into an optimized model, called an Intermediate Representation (IR), using the Model Optimizer tool.
OpenVINO optimizes running this model on specific hardware through the Inference Engine plugin, which is available for all Intel hardware (GPUs, CPUs, VPUs, FPGAs).
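To make the IR / Inference Engine relationship concrete, here is a minimal sketch using the pre-2022 Inference Engine Python API (the exact API differs across OpenVINO releases). The file names model.xml / model.bin are hypothetical placeholders for an IR produced by the Model Optimizer, and the device string selects the hardware plugin.

```python
import numpy as np
from openvino.inference_engine import IECore  # pre-2022 Inference Engine API

# Hypothetical IR files produced by the Model Optimizer
# (e.g. `mo --input_model model.onnx --output_dir ir/`).
MODEL_XML = "ir/model.xml"
MODEL_BIN = "ir/model.bin"

ie = IECore()
net = ie.read_network(model=MODEL_XML, weights=MODEL_BIN)

# The same IR can be loaded onto any Intel device plugin:
# "CPU", "GPU" (integrated graphics), "MYRIAD" (NCS 2), ...
exec_net = ie.load_network(network=net, device_name="CPU")

input_name = next(iter(net.input_info))
input_shape = net.input_info[input_name].input_data.shape  # e.g. [1, 3, H, W]

# Run inference on dummy data just to show the call pattern.
dummy = np.random.rand(*input_shape).astype(np.float32)
result = exec_net.infer(inputs={input_name: dummy})
print({name: out.shape for name, out in result.items()})
```

Switching the same IR between CPU, Intel GPU, and NCS 2 only requires changing the device_name argument; the model files stay the same.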
While OpenCV DNN is itself highly optimized, we can further increase its performance with the help of the Inference Engine.
https://learnopencv.com/using-openvino-with-opencv/
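As a sketch of what that last point looks like in code: OpenCV DNN only needs two extra calls to route inference through the Inference Engine. The IR file names, the input image, and the 300x300 input size are placeholders; the preprocessing parameters are model-specific.

```python
import cv2

# Hypothetical IR files (model.xml / model.bin) produced by the Model Optimizer.
net = cv2.dnn.readNetFromModelOptimizer("ir/model.xml", "ir/model.bin")

# Delegate execution to OpenVINO's Inference Engine, then pick the target device:
# DNN_TARGET_CPU, DNN_TARGET_OPENCL (Intel GPU) or DNN_TARGET_MYRIAD (Intel NCS 2).
net.setPreferableBackend(cv2.dnn.DNN_BACKEND_INFERENCE_ENGINE)
net.setPreferableTarget(cv2.dnn.DNN_TARGET_CPU)

image = cv2.imread("input.jpg")
blob = cv2.dnn.blobFromImage(image, size=(300, 300), swapRB=True, crop=False)
net.setInput(blob)
output = net.forward()
print(output.shape)
```

The rest of an existing OpenCV DNN pipeline (blobFromImage, setInput, forward) is unchanged; only the backend and target selection differ from a plain OpenCV setup.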