PoetryGenerator

Explanation: Upload a PDF of poetry, fine-tune the model on it, and then generate new poems, optionally guided by a prompt.

Libraries:

- transformers: Provides pre-trained language models (like GPT-2) and utilities for fine-tuning.
- datasets: Facilitates loading and processing of text data for training.
- torch: The underlying deep learning framework for Transformers.
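The exact install cell is not shown in this repository; in Colab it would look something like the following, where pypdf is an assumed choice for the PDF-extraction library (the README does not name one):

```python
# Colab cell: install the libraries listed above.
# pypdf is an assumption for PDF text extraction; swap in whichever library the notebook uses.
!pip install -q transformers datasets torch pypdf
```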

Model Setup:

- Loads the GPT-2 tokenizer and model (pre-trained on a large corpus of text).
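A minimal sketch of this step, assuming the standard "gpt2" checkpoint from the Hugging Face hub (the README does not show the exact code):

```python
from transformers import GPT2LMHeadModel, GPT2Tokenizer

# Load the pre-trained GPT-2 tokenizer and model.
tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

# GPT-2 has no pad token by default; reuse the end-of-text token so batched
# fine-tuning works.
tokenizer.pad_token = tokenizer.eos_token
```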

Data Preparation:

- load_pdf_data: Extracts text from the uploaded PDF.
- build_text_dataset: Creates a dataset from the text for training.
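The function names come from the notebook, but their bodies are not reproduced here; a rough sketch, assuming pypdf for extraction and a simple blank-line split into samples, might look like this:

```python
from pypdf import PdfReader
from datasets import Dataset

def load_pdf_data(pdf_path):
    """Extract the raw text from every page of the uploaded PDF."""
    reader = PdfReader(pdf_path)
    return "\n".join(page.extract_text() or "" for page in reader.pages)

def build_text_dataset(text, tokenizer, max_length=128):
    """Tokenize blank-line-separated chunks (roughly stanzas or poems) for training."""
    chunks = [c.strip() for c in text.split("\n\n") if c.strip()]
    dataset = Dataset.from_dict({"text": chunks})
    return dataset.map(
        lambda batch: tokenizer(batch["text"], truncation=True, max_length=max_length),
        batched=True,
        remove_columns=["text"],
    )
```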

Model Training:

- Fine-tunes the GPT-2 model on the dataset derived from your PDF. This teaches the model the specific style and patterns of your poetry.
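Fine-tuning can be sketched with the Trainer API; the hyperparameters below are placeholders rather than the notebook's actual settings, and train_dataset is the output of build_text_dataset above:

```python
from transformers import Trainer, TrainingArguments, DataCollatorForLanguageModeling

# Causal language modeling: the collator copies the input ids into the labels (mlm=False).
data_collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)

training_args = TrainingArguments(
    output_dir="poetry-gpt2",
    num_train_epochs=3,              # placeholder values; tune for your PDF's size
    per_device_train_batch_size=4,
    learning_rate=5e-5,
    logging_steps=50,
)

trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=train_dataset,     # from build_text_dataset(...) above
    data_collator=data_collator,
)
trainer.train()
```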

Poem Generation:

- generate_poem: Uses the fine-tuned model to generate a new poem. You can provide an optional prompt to guide the poem's theme or starting point.
- max_length controls the maximum length of the generated poem.
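The notebook's generate_poem is not reproduced here; a sketch using sampling with model.generate, under the same assumptions as above, could look like this:

```python
def generate_poem(prompt="", max_length=100):
    """Generate a poem with the fine-tuned model, optionally seeded with a prompt."""
    # Fall back to GPT-2's end-of-text token when no prompt is given.
    input_text = prompt if prompt else tokenizer.eos_token
    inputs = tokenizer(input_text, return_tensors="pt").to(model.device)
    outputs = model.generate(
        **inputs,
        max_length=max_length,
        do_sample=True,              # sampling keeps the output varied
        top_k=50,
        top_p=0.95,
        temperature=0.9,
        pad_token_id=tokenizer.eos_token_id,
    )
    return tokenizer.decode(outputs[0], skip_special_tokens=True)
```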

How to Use:

1. Upload your PDF of poems in Colab.
2. Run the code. It will fine-tune GPT-2 on your poems.
3. A poem in the style of your PDF will be printed.

Feel free to experiment with different prompts!
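Putting the sketches above together, a Colab run would look roughly like this (the file name and prompt are placeholders):

```python
text = load_pdf_data("my_poems.pdf")              # the PDF you uploaded to Colab
train_dataset = build_text_dataset(text, tokenizer)

# ...fine-tune as in the training sketch above, then:
print(generate_poem(prompt="The autumn moon", max_length=120))
```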

Key Improvements:

- Uses GPT-2: A powerful language model known for generating creative text.
- Fine-tuning: Adapts the model to your specific writing style.
- Flexibility: Allows you to provide a prompt to guide the generated poem.
