
Intel-OpenVINO-Colab

Run inference with Intel OpenVINO on Google Colab.

Tutorial

Intel OpenVINO on Google Colab

How to use

Step 1: Installing OpenVINO. Ref: AllModels.ipynb

!pip install openvino
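To verify the install, you can print the runtime version (a quick sanity check; get_version lives in the same openvino.inference_engine module used in Step 2):

from openvino.inference_engine import get_version
print(get_version())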

Step 2: Setting up environment. Ref: AllModels.ipynb

from openvino.inference_engine import IENetwork, IECore
from google.colab.patches import cv2_imshow
import warnings

### IENetwork is deprecated in recent OpenVINO releases; silence the warning
warnings.filterwarnings("ignore", category=DeprecationWarning)

def load_IR_to_IE(model_xml):
    ### Load the Inference Engine API
    plugin = IECore()
    ### Load the IR files into an IENetwork (the .bin weights path is derived from the .xml path)
    model_bin = model_xml[:-3] + "bin"
    network = IENetwork(model=model_xml, weights=model_bin)
    ### Load the network onto the CPU device
    executable_net = plugin.load_network(network, "CPU")
    print("Network successfully loaded into the Inference Engine")
    return executable_net
    
def synchronous_inference(executable_net, image):
    ### Get the input blob name for the inference request
    input_blob = next(iter(executable_net.inputs))
    ### Perform synchronous inference
    result = executable_net.infer(inputs={input_blob: image})
    return result
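
A minimal end-to-end sketch of how the two helpers fit together (the model and image paths here are placeholders; any OpenVINO IR model uploaded to the Colab session will do, and the image must be reshaped to the NCHW layout the network expects):

import cv2

executable_net = load_IR_to_IE("model.xml")  # placeholder IR path

### Reshape the image to the network's NCHW input shape
input_blob = next(iter(executable_net.inputs))
n, c, h, w = executable_net.inputs[input_blob].shape
image = cv2.imread("input.jpg")  # placeholder image path
blob = cv2.resize(image, (w, h)).transpose((2, 0, 1)).reshape(n, c, h, w)

result = synchronous_inference(executable_net, blob)
print(result.keys())  # one key per output blob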

For full use cases, refer to the notebooks below.

Notebooks

Demo 1: Inference demo

Demo 2: IR file generation and inference

References

The model descriptions have been borrowed from Intel, and the code has been adapted from this repository.
