Thanks for your great work! I am trying to train the model on the LAION-2B dataset.
My training settings are: base_size=4096, learning_rate=1e-4, and the AdamW optimizer.
After 130,000 iterations the loss curve looks as follows; it seems to have flattened out.
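For reference, here is a minimal sketch of the optimizer setup I am using, assuming a standard PyTorch AdamW configuration; the tiny stand-in model and the beta/weight-decay values below are placeholders for illustration, not my actual training code:

```python
import torch
from torch.optim import AdamW

# Stand-in module: the real model (trained with base_size=4096) is not shown here;
# a tiny Linear layer just makes the optimizer snippet runnable.
model = torch.nn.Linear(16, 16)

optimizer = AdamW(
    model.parameters(),
    lr=1e-4,             # learning rate used in the run described above
    betas=(0.9, 0.999),  # assumed PyTorch defaults, not explicitly tuned
    weight_decay=0.01,   # PyTorch's AdamW default; assumed, adjust as needed
)
```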
The sampled images (shown below) were generated with the following prompts:
A tiger in a lab coat with a 1980s Miami vibe, turning a well oiled science content machine, digital art
A group photo
a fish on a bike
star citizen aurora
Did you encounter a similar situation during training? What is the lowest loss you were able to reach? I would appreciate any suggestions!