Replies: 1 comment, 2 replies
-
I have a similar experience with a somewhat better Nvidia GTX 1080: a little less time, maybe 15 minutes shorter per hundred epochs. Training is SLOW compared to Colab or any modern PC, but the results speak for themselves. I only had to make a few quality-of-life changes, such as increasing the learning rate to around 0.005, and I was able not only to change my voice but also to record professional-sounding music. Big plus. Thank you all.
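For anyone curious what that tweak looks like in practice, it's usually a one-line change in the training settings. A minimal sketch, assuming a simple dict-based config (the key names here are illustrative, not the fork's actual ones):

```python
# Hypothetical training settings; key names are illustrative,
# not taken from the fork's real config file.
config = {
    "batch_size": 1,
    "learning_rate": 0.0001,  # a common default for this kind of training
    "epochs": 100,
}

# The quality-of-life tweak described above:
# raise the learning rate to roughly 0.005.
config["learning_rate"] = 0.005

print(config["learning_rate"])  # → 0.005
```

A higher learning rate can speed up convergence on small datasets, but too high a value can make training unstable, so it's worth changing gradually.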
-
Hello,
This is my small contribution based on my personal experience, from a few versions ago up to the current one. I don't see many people doing this, so I decided to take the time and share.
I believe this feedback can help the developers and future users, as it relates to the minimum requirements.
It is based on my old PC and GPU (almost 10 years old).
When I first started trying this FORK version, the MAXIMUM batch size I could use without getting OOM (Out of Memory) was:
batch_size = 1
which took 168 - 174 minutes to reach 100 epochs.
Using the latest version (as of 22-04-2023) with the EXACT same PC / hardware, the MAXIMUM I can use without getting OOM is:
batch_size = 3
which takes 126 - 132 minutes to reach 100 epochs.
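Putting the two measurements above side by side, the improvement works out to roughly a 1.33x speedup, at triple the batch size. A quick check of the arithmetic:

```python
# Times reported above for 100 epochs on the same GTX 980 machine,
# averaged over the given ranges.
old_minutes = (168 + 174) / 2  # older fork version, batch_size = 1
new_minutes = (126 + 132) / 2  # 22-04-2023 version, batch_size = 3

speedup = old_minutes / new_minutes
saved = old_minutes - new_minutes
print(f"{speedup:.2f}x faster, {saved:.0f} minutes saved per 100 epochs")
# → 1.33x faster, 42 minutes saved per 100 epochs
```

And that is on top of being able to fit three samples per batch instead of one in the same GPU memory.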
Obviously this is VERY slow compared to cloud training or any modern local PC with a latest-generation GPU, but that is exactly why I'm sharing it: my PC isn't even listed in the minimum recommended hardware, yet it still works and keeps improving.
Since I'm not a programmer, I don't know HOW the developers did this magic, but I can confirm that it works with old hardware such as the GTX 980 (the classic model, not even the Ti).
I hope this contribution helps a bit; based on my experience, I can now train much faster without OOM on the same 10-year-old PC.
Please keep up the good work, much love 💙