
feat: add a converter for prompt-template #817

Closed
tikikun opened this issue Dec 1, 2023 · 4 comments

@tikikun
Contributor

tikikun commented Dec 1, 2023

Refer to #754

Problem
Currently, each inference engine formats user chats with its own templating scheme, which makes the normal chat template incompatible with popular chat templates.

Success Criteria
Write a few simple functions that convert a popular template into the template format a supported engine can parse, via the prompt_template option.

@tikikun tikikun added the type: feature request A new feature label Dec 1, 2023
@tikikun tikikun self-assigned this Dec 1, 2023
@tikikun
Contributor Author

tikikun commented Dec 4, 2023

.

@dan-homebrew dan-homebrew moved this to Todo in Jan & Cortex Dec 5, 2023
@tikikun
Contributor Author

tikikun commented Dec 5, 2023

cc @vuonghoainam. Also, please help QA the function, @hahuyhoang411.

// Splits a prompt template around the {system_message} and {prompt} markers.
// Note: this assumes {system_message} appears before {prompt}; a template
// with the markers in the opposite order will split incorrectly.
function splitString(promptTemplate) {
    const systemMarker = "{system_message}";
    const promptMarker = "{prompt}";

    if (promptTemplate.includes(systemMarker) && promptTemplate.includes(promptMarker)) {
        // Find the indices of the markers
        const systemIndex = promptTemplate.indexOf(systemMarker);
        const promptIndex = promptTemplate.indexOf(promptMarker);

        // Extract the parts of the string
        const system_prompt = promptTemplate.substring(0, systemIndex);
        const user_prompt = promptTemplate.substring(systemIndex + systemMarker.length, promptIndex);
        const ai_prompt = promptTemplate.substring(promptIndex + promptMarker.length);

        // Return the split parts
        return { system_prompt, user_prompt, ai_prompt };
    } else if (promptTemplate.includes(promptMarker)) {
        // Extract the parts of the string for the case where only promptMarker is present
        const promptIndex = promptTemplate.indexOf(promptMarker);
        const user_prompt = promptTemplate.substring(0, promptIndex);
        const ai_prompt = promptTemplate.substring(promptIndex + promptMarker.length);

        // Return the split parts
        return { user_prompt, ai_prompt };
    }

    // Return an error if none of the conditions are met
    return { error: "Cannot split" };
}
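A quick way to sanity-check the converter is to run it on a sample template. The function is repeated here in compact form so the snippet runs on its own, and the Alpaca-style template string below is a made-up example, not one of our shipped templates:

```javascript
// Compact copy of splitString with the same logic as above; it assumes
// {system_message} appears before {prompt} in the template.
function splitString(promptTemplate) {
    const systemMarker = "{system_message}";
    const promptMarker = "{prompt}";

    if (promptTemplate.includes(systemMarker) && promptTemplate.includes(promptMarker)) {
        const systemIndex = promptTemplate.indexOf(systemMarker);
        const promptIndex = promptTemplate.indexOf(promptMarker);
        return {
            system_prompt: promptTemplate.substring(0, systemIndex),
            user_prompt: promptTemplate.substring(systemIndex + systemMarker.length, promptIndex),
            ai_prompt: promptTemplate.substring(promptIndex + promptMarker.length),
        };
    } else if (promptTemplate.includes(promptMarker)) {
        const promptIndex = promptTemplate.indexOf(promptMarker);
        return {
            user_prompt: promptTemplate.substring(0, promptIndex),
            ai_prompt: promptTemplate.substring(promptIndex + promptMarker.length),
        };
    }
    return { error: "Cannot split" };
}

// Example: a hypothetical Alpaca-style template
const parts = splitString("### System:\n{system_message}\n### User:\n{prompt}\n### Assistant:\n");
console.log(parts.system_prompt); // "### System:\n"
console.log(parts.user_prompt);   // "\n### User:\n"
console.log(parts.ai_prompt);     // "\n### Assistant:\n"
```

The three returned pieces are the literal text before, between, and after the markers, which is exactly what an engine needs to wrap a system message and user prompt at inference time.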

@tikikun
Contributor Author

tikikun commented Dec 5, 2023

Ideally it should convert TheBloke's format to the normal format, but some edge cases may remain. @hahuyhoang411, please run it against your own model list to see where it breaks.

@tikikun tikikun moved this from Todo to Done in Jan & Cortex Dec 6, 2023
@hahuyhoang411
Contributor

Tested with all of our models in the Hub. Thanks for the fix.
