Peer reviewed learning - How to get involved #2
Replies: 8 comments 16 replies
-
Put this feedback on my PR as well, but didn't see the discussion board: All in all, a great 'first dip into fine-tuning' for people! There are some things lacking, though, in my opinion: Will edit tomorrow morning with further thoughts as it's 2am 😴
-
Well, I too wish there were a few more things explained or written for clarification.
-
Hi. I want to ask if it is possible to add a tool like ReviewNB to make it more convenient to review or learn from other people's PRs, since most changes in this course will be made in Jupyter notebooks. Or is there already another way to do this?
-
@burtenshaw This is a great repo. Do you anticipate having any sprints for the In Progress or Planned modules? (especially the planned Vision Language Models and Synthetic Datasets modules)
-
Here are my initial thoughts from a casual Python user:
-
I saw on Bluesky that there is a Discord channel, but I'm not having any luck finding where it's been posted.
-
I tried to run SFT from Google Colab; the training run is logged to wandb with the default system resources. The process crashed twice, and I don't see anything in the logs. Any idea? Should I use a GPU?
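For anyone hitting the same crash: on Colab the default CPU runtime often runs out of memory during training, and the session is killed without writing anything useful to the logs. Below is a minimal sketch for confirming a GPU is attached before starting SFT (it assumes PyTorch is installed, as it is in the course notebooks):

```python
# Minimal sketch: confirm the Colab runtime has a GPU and report free memory
# before launching SFT. On the default CPU runtime, training a causal LM can
# exhaust RAM and the process is killed with nothing helpful in the logs.
import torch

if torch.cuda.is_available():
    free, total = torch.cuda.mem_get_info()
    print(f"GPU: {torch.cuda.get_device_name(0)}")
    print(f"Free VRAM: {free / 1e9:.1f} GB of {total / 1e9:.1f} GB")
else:
    print("No GPU detected - switch the runtime via Runtime > Change runtime type.")
```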
-
Hi, @burtenshaw. I am not sure the DPO Finetuning tutorial generates valid models. I ran the DPO fine-tuning example notebook exactly as provided in the course. However, the corresponding fine-tuned model only responds with empty assistant messages. In this example, the generated fine-tuned model is "thatupiso/SmolLM2-FT-DPO2". I also noticed the training loss goes to 0 right after the first epoch.
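For reference, here is a minimal sketch of how to check the fine-tuned checkpoint's replies outside the notebook. The model id is the one mentioned above; the prompt and generation settings are illustrative, not taken from the course notebook:

```python
# Minimal sketch: load the fine-tuned checkpoint mentioned above and check
# whether the assistant turn comes back empty. The prompt and generation
# settings are illustrative, not taken from the course notebook.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "thatupiso/SmolLM2-FT-DPO2"  # checkpoint from the comment above
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

messages = [{"role": "user", "content": "What is the capital of France?"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)
outputs = model.generate(inputs, max_new_tokens=64)

# Decode only the newly generated tokens; an empty string here reproduces the issue.
reply = tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True)
print(repr(reply))
```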
-
👋 Let's discuss the course
Participation is open, free, and starting now!
This course is open and peer reviewed. To get involved with the course, open a pull request and submit your work for review. Here are the steps:
This should help you learn and help build a community-driven course that is always improving.
Reviewing
We will review students' work as the course runs, using a voluntary, peer-review approach.
Open pull requests against the december-2024 branch of the repo from your fork. @burtenshaw will merge these when the module is finished so future students can learn.
Engagement Time
To make this course flexible and easy to follow, the modules use optional levels, extra reading, and concise written material. This means that you can go as deep or as shallow on a topic as you need, and still take away the basics. The course comes with three levels: 🐢, 🐕, and 🦁.
Here are some guidelines on commitments.
It's really up to you how seriously you take the course, but we're on board with anything.