
<FrameworkSwitchCourse {fw} />

# Fine-tuning, Check![[fine-tuning-check]]

That was fun! In the first two chapters you learned about models and tokenizers, and now you know how to fine-tune them for your own data. To recap, in this chapter you:

{#if fw === 'pt'}

- Learned about datasets in the Hub
- Learned how to load and preprocess datasets, including using dynamic padding and collators
- Implemented your own fine-tuning and evaluation of a model
- Implemented a lower-level training loop
- Used 🤗 Accelerate to easily adapt your training loop so it works for multiple GPUs or TPUs
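As a quick refresher on the second point: dynamic padding pads each batch only to the length of the longest sequence *in that batch*, instead of to one global maximum. Here is a minimal pure-Python sketch of the idea (not the actual `DataCollatorWithPadding` implementation, which also handles attention masks and tensors):

```python
def dynamic_pad(batch, pad_id=0):
    # Pad every sequence to the longest sequence in *this* batch,
    # not to a global maximum length — shorter batches waste less compute.
    max_len = max(len(seq) for seq in batch)
    return [seq + [pad_id] * (max_len - len(seq)) for seq in batch]


batch = [[101, 7592, 102], [101, 102]]
print(dynamic_pad(batch))  # [[101, 7592, 102], [101, 102, 0]]
```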

{:else}

- Learned about datasets in the Hub
- Learned how to load and preprocess datasets
- Learned how to fine-tune and evaluate a model with Keras
- Implemented a custom metric
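On the last point, the core of a custom metric is just a function from predictions and labels to a score. A minimal plain-Python sketch of an accuracy metric (illustrative only — a real Keras metric would subclass `tf.keras.metrics.Metric` and accumulate state across batches):

```python
def accuracy(predictions, labels):
    # Fraction of predictions that exactly match their labels.
    correct = sum(p == l for p, l in zip(predictions, labels))
    return correct / len(labels)


print(accuracy([1, 0, 1, 1], [1, 0, 0, 1]))  # 0.75
```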

{/if}