
# Video Analytics Tool using YOLOv5 and Streamlit

## 😇 Motivation

As AI engineers, we love data and we love to see graphs and numbers! So why not project the inference data onto some platform to understand the inference better? When a model is deployed on the edge for monitoring, it requires a rigorous amount of frontend and backend development on top of the deep learning work, from ingesting the live data to displaying the correct output. So I wanted to replicate a small-scale video analytics tool to understand which features would be useful for such a tool and what its limitations might be.

## 🖼️ Demo

Demo video: `dashboard_1_local_video.mp4`

## 🔑 Features

For detailed insights, do check out my Medium blog.

1. Choose the input source: local video, RTSP stream, or webcam
2. Set the class detection threshold
3. Set the FPS drop warning threshold
4. Option to save the inference video
5. Set the class confidence threshold for drift detection
6. Option to save poorly performing frames
7. Display objects in the current frame
8. Display total objects detected so far
9. Display system stats: RAM, CPU, and GPU usage
10. Display the poorest-performing class
11. Display the minimum and maximum FPS recorded during inference
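The FPS drop warning and the min/max FPS stats (features 3 and 11) can be sketched with a small rolling-window monitor. This is a hypothetical helper, not code from the repo; the class name and API are assumptions for illustration.

```python
import time
from collections import deque
from typing import Optional, Tuple


class FPSMonitor:
    """Track rolling FPS over recent frames and flag drops below a threshold.

    Hypothetical helper illustrating the FPS-drop-warning idea; the real
    app.py may compute FPS differently.
    """

    def __init__(self, drop_threshold: float, window: int = 30):
        self.drop_threshold = drop_threshold
        self.timestamps = deque(maxlen=window)  # timestamps of recent frames
        self.min_fps = float("inf")
        self.max_fps = 0.0

    def tick(self, now: Optional[float] = None) -> Tuple[float, bool]:
        """Record one frame; return (current FPS, drop-warning flag)."""
        self.timestamps.append(time.time() if now is None else now)
        if len(self.timestamps) < 2:
            return 0.0, False
        elapsed = self.timestamps[-1] - self.timestamps[0]
        fps = (len(self.timestamps) - 1) / elapsed if elapsed > 0 else 0.0
        # Track min/max FPS for the end-of-run summary
        self.min_fps = min(self.min_fps, fps)
        self.max_fps = max(self.max_fps, fps)
        return fps, fps < self.drop_threshold
```

In the inference loop you would call `tick()` once per frame and surface the warning flag in the Streamlit UI.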

## 💫 How to use?

1. Clone this repo
2. Install all the dependencies
3. Download the DeepSort checkpoint file and place it in `deep_sort_pytorch/deep_sort/deep/checkpoint`
4. Run `streamlit run app.py`
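The steps above could look roughly like this; the repo URL and the presence of a `requirements.txt` are assumptions, so adjust to match the actual repository.

```shell
# Clone the repo (URL is a placeholder) and enter it
git clone https://github.com/<user>/<repo>.git
cd <repo>

# Install dependencies (assumes a requirements.txt exists)
pip install -r requirements.txt

# Place the downloaded DeepSort checkpoint here before running:
#   deep_sort_pytorch/deep_sort/deep/checkpoint/

# Launch the Streamlit dashboard
streamlit run app.py
```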

## ⭐ Recent changelog

1. Updated the yolov5s weight file name in `detect()` in `app.py`
2. Added a Drive link to download the DeepSort checkpoint file (45 MB)

## ❤️ Extras

Do check out the Medium article and give this repo a ⭐

> **Note**
>
> The input video should be in the same folder as `app.py`. If you want to deploy the app in the cloud and use it as a web app, download the user-uploaded video to a temporary folder and pass the path and video name to the respective function in `app.py`. This is a Streamlit limitation; see Stack Overflow.
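The temporary-folder workaround can be sketched as below. The helper name is hypothetical (it is not part of the repo); in `app.py` it would be fed from `st.file_uploader`, e.g. `path = save_uploaded_video(uploaded.read(), uploaded.name)`.

```python
import os
import tempfile


def save_uploaded_video(data: bytes, filename: str) -> str:
    """Write an uploaded video's bytes to a temp folder and return its path.

    Hypothetical helper: Streamlit's file uploader returns an in-memory
    object, so the bytes are persisted to disk first, then the resulting
    path can be passed to the inference function in app.py.
    """
    tmp_dir = tempfile.mkdtemp(prefix="streamlit_video_")
    path = os.path.join(tmp_dir, filename)
    with open(path, "wb") as f:
        f.write(data)
    return path
```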