Suggestion: neural-network architecture with small parameter size #814
Replies: 2 comments
-
Hello,
-
Please have a look at the RESULTS.md in each recipe. You can find the training commands for each pre-trained model.
-
In the current icefall recipes, the number of parameters for the different architectures is usually larger than 1 million.
If a very small device allows a parameter size of only ~80K, the choice of neural-network architecture seems very restricted.
Currently the only option I can think of is TDNN-F from old Kaldi. Is there a more advanced method that also supports such a small parameter size?
On the other hand, is a BPE model possible on a very small device?
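As a rough sanity check, an ~80K parameter budget can be verified with simple arithmetic before training anything. The sketch below counts the weights of a tiny TDNN-style stack of 1-D convolutions; the layer sizes (40-dim features, 64 channels, 100 output units) are hypothetical, not taken from any icefall recipe:

```python
def conv1d_params(in_ch, out_ch, kernel):
    # Weights plus bias for one 1-D convolution (a basic TDNN layer).
    return out_ch * (in_ch * kernel + 1)

def linear_params(in_features, out_features):
    # Weights plus bias for a fully connected output layer.
    return out_features * (in_features + 1)

# Hypothetical tiny TDNN: 40-dim fbank input, three conv layers,
# and a 100-unit output layer (e.g. a small phone/char vocabulary).
layers = [
    conv1d_params(40, 64, 3),   # 7,744 params
    conv1d_params(64, 64, 3),   # 12,352 params
    conv1d_params(64, 64, 3),   # 12,352 params
    linear_params(64, 100),     # 6,500 params
]
total = sum(layers)
print(total)  # 38948 — comfortably under an 80K budget
```

Note that the output layer dominates quickly as the vocabulary grows, which is why a large BPE vocabulary is hard to fit into such a budget: with 500 BPE units the last layer alone would cost `linear_params(64, 500)` = 32,500 parameters.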