This repository has been archived by the owner on Dec 1, 2021. It is now read-only.

Support dual inference at same time on FPGA #1102

Open
tk26eng opened this issue Jun 19, 2020 · 3 comments
Assignees
Labels
enhancement New feature or request

Comments

tk26eng (Contributor) commented Jun 19, 2020

Currently the runtime supports only one inference model on the FPGA at a time.
But sometimes it is useful to run two models.
We might need to add support for that.
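To make the request concrete, here is a minimal sketch of what dual inference could look like at the API level. The `Runtime` class, its `infer` method, and `run_dual_inference` are all hypothetical placeholders invented for illustration, not the project's actual runtime API; real FPGA resource arbitration is out of scope here.

```python
# Hypothetical sketch: submitting two model inferences concurrently.
# `Runtime` is a stand-in, NOT the real FPGA runtime API.
from concurrent.futures import ThreadPoolExecutor


class Runtime:
    """Placeholder for a runtime instance that hosts one loaded model."""

    def __init__(self, model_name):
        self.model_name = model_name

    def infer(self, x):
        # Placeholder "inference": a real runtime would drive the FPGA here.
        return [v * 2 for v in x]


def run_dual_inference(model_a, model_b, input_a, input_b):
    # Submit both inferences at once. With the current runtime these would
    # have to be serialized, since only one model can occupy the FPGA;
    # the requested feature would let both make progress concurrently.
    with ThreadPoolExecutor(max_workers=2) as pool:
        fut_a = pool.submit(model_a.infer, input_a)
        fut_b = pool.submit(model_b.infer, input_b)
        return fut_a.result(), fut_b.result()


if __name__ == "__main__":
    a = Runtime("classifier")
    b = Runtime("detector")
    print(run_dual_inference(a, b, [1, 2], [3, 4]))  # ([2, 4], [6, 8])
```

The sketch only shows the host-side call pattern; the hard part of the feature would be partitioning or time-multiplexing the FPGA fabric between the two loaded models.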

@tk26eng tk26eng self-assigned this Jun 19, 2020
@tk26eng tk26eng added the enhancement New feature or request label Jun 19, 2020
primenumber (Contributor) commented

Related issue: #666

kalpitthakkar-lm (Contributor) commented

@tk26eng @primenumber

Just a related question:
Is there a possibility of using multiple FPGAs for inference?
(Basically, if the model is large, two FPGAs could each run a different model, with the results synced to produce a combined output.)

tk26eng (Contributor, Author) commented Jun 23, 2020

@kalpitthakkar-lm

@tk26eng @primenumber

Just a related question:
Is there a possibility of using multiple FPGAs for inference?
(Basically, if the model is large, two FPGAs could each run a different model, with the results synced to produce a combined output.)

I don't think we have any plans to use multiple FPGAs for inference.
Using a bigger FPGA is an easier way to handle a large model than spreading it across multiple FPGAs.

Projects
None yet
Development

No branches or pull requests

3 participants