Cloud Inference Fix #113

Merged
merged 1 commit into ROCm:master from kg/cloud-inference-fix on May 9, 2019

Conversation

kiritigowda (Collaborator)

No description provided.

@kiritigowda requested a review from hansely May 9, 2019 18:50
@kiritigowda self-assigned this May 9, 2019
@kiritigowda added the enhancement label May 9, 2019
@kiritigowda merged commit 8d48481 into ROCm:master May 9, 2019
@kiritigowda deleted the kg/cloud-inference-fix branch May 9, 2019 20:24
japarada pushed a commit to japarada/MIVisionX that referenced this pull request May 14, 2019
* nnir_to_openvx print (ROCm#103)

* Ubuntu 18.04 warnings fix (ROCm#104)

* Support for ONNX V1.3 (ROCm#101)

ONNX 1.3 support for ResNet50

* adding grouped convolution and updating readme (ROCm#105)

* Update README.md

* Update convolution_layer.cpp

* Loom Sample & Readme Updates (ROCm#106)

* Loom Readme updates

* Loom Samples Added

* Samples Readme Updates

* ROCm Version Updated

* Readme updates (ROCm#107)

* loom logo added

* logo updates for loom

* Samples readme updates

* Main readme updates

* WinML Readme updates

* OpenVX Readme updates

* loom readme updates

* Apps readme updates

* updates (ROCm#108)

* Extensions readme updates

* main readme update

* loom readme updates

* loom link updates

* updates to loom

* updates

* Set up script updates

* YoloV2 Fix (ROCm#109)

* cloud inference updates (ROCm#111)

* client app images added

* Readme for Cloud Application

* Cloud Inference Fix (ROCm#112)

* Server Rename fix

* Readme Link Fix

* Client Readme updates

* Server usage help added

* Model Compiler Path Set

* Model Compiler Scripts updated

* Default Model Compiler Path Added

* Server Readme fix

* Cloud Inference readme fix

* Server/Client bug fix

* Cloud Inference Fix (ROCm#113)

* Cloud Inference - Server Help (ROCm#114)

* Cloud Inference Application - Graphs & Enhancements (ROCm#115)

* graph added/polished

* Update annInferenceApp.pro

* Update inference_receiver.h

* Update inference_viewer.cpp

* Update inference_viewer.cpp

* Update perf_graph.ui

* Update inference_receiver.cpp

* Title update

* Update inference_receiver.cpp

* onnx_to_nnir help text fix (ROCm#118)

* bug fix

* bug fix

* bug fix

* Update perf_chart.cpp

* bug fix
Akilesh2 pushed a commit to Akilesh2/MIVisionX that referenced this pull request Nov 15, 2022
* Mods to handle RGB and BGR subpixel layouts

* Test suite mods to handle RGB/BGR subpixel layouts in color to greyscale conversions

* Rename files from rgb_to_greyscale to color_to_greyscale
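The RGB/BGR handling in the commits above comes down to applying the same luma weights in whichever channel order the input uses. A minimal NumPy sketch of that idea (illustrative only; the function name, layout flag, and BT.601 weights are assumptions, not MIVisionX code):

```python
import numpy as np

# Illustrative sketch: apply BT.601 luma weights in the input's channel order.
# The function name and "layout" flag are hypothetical, not MIVisionX internals.
def color_to_greyscale(img: np.ndarray, layout: str = "RGB") -> np.ndarray:
    """Convert an HxWx3 uint8 image to greyscale, honoring the subpixel layout."""
    weights = np.array([0.299, 0.587, 0.114], dtype=np.float32)  # weights for R, G, B
    if layout == "BGR":
        weights = weights[::-1]  # same weights, reordered for B, G, R
    grey = img.astype(np.float32) @ weights
    return np.clip(grey, 0.0, 255.0).astype(np.uint8)
```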
fiona-gladwin pushed a commit to fiona-gladwin/MIVisionX that referenced this pull request Jun 6, 2024
* cpp support for oneHotLabels

* Python changes for one hot encoding to remove cupy usage

* Resolve PR comments

* Resolve PR comments

* Resolve the PR comments

* Resolve the internal PR comments

* Resolve the PR comments

* Change to reinterpret_cast

* Merged CPU and GPU functionalities for one hot labels

* Add a special test case for one hot encoding

* Add one hot encoding as an arg
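
The commits above remove cupy usage from the one-hot encoding path. A small NumPy sketch of one-hot label encoding in that spirit (illustrative only; the function name and shapes are assumptions, not the MIVisionX/rocAL API):

```python
import numpy as np

# Illustrative sketch: one-hot encode integer class labels without cupy.
# Names and shapes are hypothetical, not the MIVisionX/rocAL API.
def one_hot_labels(labels: np.ndarray, num_classes: int) -> np.ndarray:
    """Map integer labels of shape (N,) to a float32 matrix of shape (N, num_classes)."""
    one_hot = np.zeros((labels.size, num_classes), dtype=np.float32)
    one_hot[np.arange(labels.size), labels] = 1.0
    return one_hot

# Example: labels [2, 0, 1] with 3 classes yields rows with a single 1.0 each.
print(one_hot_labels(np.array([2, 0, 1]), 3))
```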
Labels: enhancement (New feature or request)