Merge pull request #480 from coasys/improve-llm-prompt
Conversation summary LLM prompt improvements + couple of small fixes
jhweir authored Dec 17, 2024
2 parents 20eef54 + 0beebfa commit 70d54d8
Showing 9 changed files with 77 additions and 88 deletions.
Binary file modified app/build/icon.png
2 changes: 1 addition & 1 deletion app/package.json
@@ -36,7 +36,7 @@
"@coasys/flux-poll-view": "*",
"@coasys/flux-types": "*",
"@coasys/flux-ui": "*",
"@coasys/flux-utils": "0.9.0",
"@coasys/flux-utils": "0.9.1",
"@coasys/flux-vue": "*",
"@coasys/flux-webrtc-view": "0.8.1-fix.1",
"@coasys/nillion-file-store": "*",
Binary file modified app/src/assets/images/icon.png
3 changes: 3 additions & 0 deletions app/src/views/main/MainView.vue
@@ -196,6 +196,8 @@ import { Community } from "@coasys/flux-api";
import { useRoute } from "vue-router";
import { registerNotification } from "../../utils/registerMobileNotifications";
import { ad4mConnect } from "@/ad4mConnect";
import { ensureLLMTask } from "@coasys/flux-utils";
export default defineComponent({
name: "MainAppView",
@@ -247,6 +249,7 @@ export default defineComponent({
},
async mounted() {
registerNotification();
ensureLLMTask();
ad4mConnect.addEventListener("authstatechange", async (e) => {
let oldState = this.oldAuthState;
2 changes: 1 addition & 1 deletion packages/ui/meta.json


131 changes: 60 additions & 71 deletions packages/utils/src/synergy.ts
@@ -1,4 +1,4 @@
import { Ad4mClient } from "@coasys/ad4m";
import { Ad4mClient, AITask } from "@coasys/ad4m";
import { getAd4mClient } from "@coasys/ad4m-connect/utils";
import {
Conversation,
@@ -127,24 +127,10 @@ async function linkTopic(perspective, itemId, topicId, relevance) {
await relationship.save();
}

async function LLMProcessing(
newItem,
latestSubgroups,
latestSubgroupItems,
allTopics,
attemptKey?: number,
errorMessage?: string
) {
console.log(
"LLMProcessing: ",
newItem,
latestSubgroups,
latestSubgroupItems,
allTopics,
attemptKey
);

export async function ensureLLMTask(): Promise<AITask> {
const taskPrompt = `
You are here as an integrated part of a chat system - your answers will be directly parsed by JSON.parse().
So make sure to always (!) respond with valid JSON!!
I'm passing you a JSON object with the following properties: 'previousSubgroups' (string block broken up into sections by line breaks <br/>), 'previousMessages' (string array), 'newMessage' (string), and 'existingTopics' (string array).
{ previousSubgroups: [], previousMessages: [], newMessage: 'Some text', existingTopics: [] }
Firstly, analyze the 'newMessage' string and identify between 1 and 5 topics (each a single word string in lowercase) that are relevant to the content of the 'newMessage' string. If any of the topics you choose are similar to topics listed in the 'existingTopics' array, use the existing topic instead of creating a new one (e.g., if one of the new topics you picked was 'foods' and you find an existing topic 'food', use 'food' instead of creating a new topic that is just a plural version of the existing topic). For each topic, provide a relevance score between 0 and 100 (0 being irrelevant and 100 being highly relevant) that indicates how relevant the topic is to the content of the 'newMessage' string.
@@ -162,7 +148,10 @@ async function LLMProcessing(
4. **'newSubgroupSummary'**: a 1 to 3 sentence paragraph (string) summary of the contents of the conversation. If changedSubject is true, base the summary solely on the new message, otherwise base it on both the new message and the last messages. Don't reference previous conversations.
5. **'newConversationName'**: a 1 to 3 word title (string) describing the contents of the previousSubgroups plus the newSubgroupSummary. Don't reference previous conversations.
6. **'newConversationSummary'**: a 1 to 3 sentence paragraph (string) summary of the previousSubgroups plus the newSubgroupSummary. Don't reference previous conversations.
Make sure the response is in a format that can be parsed using JSON.parse(). Don't wrap it in code syntax.
Make sure the response is in a format that can be parsed using JSON.parse(). Don't wrap it in code syntax, don't append text outside of quotes, don't use the assign operator ("=").
If you make a mistake and we can't parse your output, I will give you the same input again, plus another field "jsonParseError" holding the error we got from JSON.parse().
So if you see that field, take extra care about that specific mistake and don't make it again!
Don't talk about the errors in the summaries or topics.
`;
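The response contract the task prompt describes can be sanity-checked with a small parsing sketch. This is illustrative only: the interface and `parseLLMResponse` helper are hypothetical names mirroring the fields named in the prompt, and plain `JSON.parse` stands in for the JSON5 parser the module actually uses.

```typescript
// Shape the task prompt asks the model to return (field names from the prompt above).
interface SynergyLLMResponse {
  topics: { name: string; relevance: number }[];
  changedSubject: boolean;
  newSubgroupName: string;
  newSubgroupSummary: string;
  newConversationName: string;
  newConversationSummary: string;
}

// Parse a raw model reply, falling back to empty defaults on invalid JSON,
// mirroring the defaults LLMProcessing returns when all attempts fail.
function parseLLMResponse(raw: string): SynergyLLMResponse {
  try {
    const data = JSON.parse(raw);
    return {
      topics: data.topics || [],
      changedSubject: data.changedSubject || false,
      newSubgroupName: data.newSubgroupName || "",
      newSubgroupSummary: data.newSubgroupSummary || "",
      newConversationName: data.newConversationName || "",
      newConversationSummary: data.newConversationSummary || "",
    };
  } catch {
    return {
      topics: [],
      changedSubject: false,
      newSubgroupName: "",
      newSubgroupSummary: "",
      newConversationName: "",
      newConversationSummary: "",
    };
  }
}
```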

const examples = [
@@ -188,63 +177,63 @@
const tasks = await client.ai.tasks();
let task = tasks.find((t) => t.name === "flux-synergy-task");
if (!task) task = await client.ai.addTask("flux-synergy-task", "default", taskPrompt, examples);
return task;
}
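The get-or-create pattern `ensureLLMTask` uses — list tasks, reuse the one matching the name, register it only if missing — can be sketched generically. The store and function names below are illustrative, not part of the AD4M API:

```typescript
interface Task { name: string; taskId: string }

// Generic get-or-create: look the task up by name and only register it when
// missing, so repeated calls (e.g. on every app mount) stay idempotent.
async function getOrCreateTask(
  list: () => Promise<Task[]>,
  add: (name: string) => Promise<Task>,
  name: string
): Promise<Task> {
  const tasks = await list();
  return tasks.find((t) => t.name === name) ?? (await add(name));
}
```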

let prompt = `{
previousSubgroups: [${latestSubgroups.map((s: any) => s.summary).join(" <br/> ")}],
previousMessages: [${latestSubgroupItems.map((si: any) => si.text).join(", ")}],
newMessage: '${newItem.text}',
existingTopics: [${allTopics.map((t: any) => t.name).join(", ")}]
}`;

if (errorMessage) {
prompt += `
<br/><br/>
The last request to the model failed with the following error message: ${errorMessage}.
<br/><br/>
Please try to provide a valid JSON response that avoids this error.
`;
}

const response = await client.ai.prompt(task.taskId, prompt);
console.log("LLM Response: ", response);
async function LLMProcessing(
newItem,
latestSubgroups,
latestSubgroupItems,
allTopics,
) {
let prompt = {
previousSubgroups: [latestSubgroups.map((s: any) => s.summary).join(" <br/> ")],
previousMessages: [latestSubgroupItems.map((si: any) => si.text).join(", ")],
newMessage: newItem.text,
existingTopics: [allTopics.map((t: any) => t.name).join(", ")]
};

const task = await ensureLLMTask();
const client: Ad4mClient = await getAd4mClient();
let parsedData;
try {
parsedData = JSON5.parse(response);
} catch (error) {
console.error("Failed to parse LLM response:", error);
if (!attemptKey || attemptKey < 5) {
// retry up to 5 times if LLM fails to produce valid JSON
return await LLMProcessing(
newItem,
latestSubgroups,
latestSubgroupItems,
allTopics,
attemptKey ? attemptKey + 1 : 1,
error.message
);
} else {
// give up and return empty data
console.error("Failed to parse LLM response after 5 attempts. Returning empty data.");
return {
topics: [],
changedSubject: false,
newSubgroupName: "",
newSubgroupSummary: "",
newConversationName: "",
newConversationSummary: "",
};
}
let attempts = 0;
while (!parsedData && attempts < 5) {
attempts += 1;
console.log("LLM Prompt:", prompt);
const response = await client.ai.prompt(task.taskId, JSON.stringify(prompt));
console.log("LLM Response: ", response);
// String.prototype.replace returns a new string, so keep the normalized copy
const normalized = response.replace("False", "false").replace("True", "true");
try {
parsedData = JSON5.parse(normalized);
} catch (error) {
console.error("LLM response parse error:", error);
//@ts-ignore
prompt.jsonParseError = error;
}
}

return {
topics: parsedData.topics || [],
changedSubject: parsedData.changedSubject || false,
newSubgroupName: parsedData.newSubgroupName || "",
newSubgroupSummary: parsedData.newSubgroupSummary || "",
newConversationName: parsedData.newConversationName || "",
newConversationSummary: parsedData.newConversationSummary || "",
};
if (parsedData) {
return {
topics: parsedData.topics || [],
changedSubject: parsedData.changedSubject || false,
newSubgroupName: parsedData.newSubgroupName || "",
newSubgroupSummary: parsedData.newSubgroupSummary || "",
newConversationName: parsedData.newConversationName || "",
newConversationSummary: parsedData.newConversationSummary || "",
};
} else {
// give up and return empty data
console.error("Failed to parse LLM response after 5 attempts. Returning empty data.");
return {
topics: [],
changedSubject: false,
newSubgroupName: "",
newSubgroupSummary: "",
newConversationName: "",
newConversationSummary: "",
};
}
}
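The new retry strategy above — looping with a `jsonParseError` field fed back into the prompt, instead of the old recursive re-call — can be sketched in isolation. The `promptUntilParsed` helper is a hypothetical stand-in with the model call injected, and plain `JSON.parse` stands in for JSON5:

```typescript
// Retry up to `maxAttempts` times; on a parse failure, feed the error back
// into the prompt so the model can correct itself on the next attempt.
async function promptUntilParsed<T>(
  send: (prompt: object) => Promise<string>,
  basePrompt: Record<string, unknown>,
  maxAttempts = 5
): Promise<T | undefined> {
  const prompt = { ...basePrompt };
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const response = await send(prompt);
    try {
      return JSON.parse(response) as T;
    } catch (error) {
      // Surface the parse error to the model, as the task prompt instructs.
      prompt.jsonParseError = String(error);
    }
  }
  return undefined; // caller falls back to empty data
}
```

Returning `undefined` after the attempt budget is exhausted matches the committed code's behavior of giving up and returning empty defaults.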

export function transformItem(type, item) {
6 changes: 4 additions & 2 deletions views/synergy-demo-view/src/utils/index.tsx
@@ -30,8 +30,10 @@ export async function getConvoData(perspective, channelId, match?, setMatchIndex
const subgroupItems = await getSubgroupItems(perspective, subgroup.baseExpression);
subgroup.groupType = "subgroup";
subgroup.topics = await findTopics(perspective, subgroup.baseExpression);
subgroup.start = subgroupItems[0].timestamp;
subgroup.end = subgroupItems[subgroupItems.length - 1].timestamp;
if (subgroupItems.length) {
subgroup.start = subgroupItems[0].timestamp;
subgroup.end = subgroupItems[subgroupItems.length - 1].timestamp;
}
subgroup.participants = [];
subgroup.children = await Promise.all(
subgroupItems.map(async (item: any, itemIndex) => {
13 changes: 8 additions & 5 deletions views/webrtc-view/src/components/Transcriber/Transcriber.tsx
@@ -118,13 +118,16 @@ export default function Transcriber({
const transcriptCard = document.getElementById(
`transcript-${previousId}`
);
transcriptCard.classList.add(styles.slideLeft);
setTimeout(() => {
  transcriptCard.classList.add(styles.hide);
  setTimeout(() => {
    setTranscripts((ts) => ts.filter((t) => t.id !== previousId));
  }, 500);
}, 500);
if (transcriptCard) {
  transcriptCard.classList.add(styles.slideLeft);
  setTimeout(() => {
    transcriptCard.classList.add(styles.hide);
    setTimeout(() => {
      setTranscripts((ts) => ts.filter((t) => t.id !== previousId));
    }, 500);
  }, 500);
}

// save message
// @ts-ignore
const message = (await messageRepo.create({
8 changes: 0 additions & 8 deletions yarn.lock
@@ -1379,14 +1379,6 @@
reflect-metadata "^0.1.13"
type-graphql "1.1.1"

"@coasys/flux-utils@0.9.0":
version "0.9.0"
resolved "https://registry.yarnpkg.com/@coasys/flux-utils/-/flux-utils-0.9.0.tgz#995d52dd5a50be7b18f2b2ce0fb363f0cde3386e"
integrity sha512-JDKYX7mvDbJQVZL9kM60dMUX/e1w0mhPLdhfeBiUwKXCAGUyKWg9q5Mvx30e/A4YzXSAVFRsDZzVngthg7EUOw==
dependencies:
"@coasys/flux-constants" "*"
"@coasys/flux-types" "*"

"@coasys/flux-vue@*":
version "0.7.6"
resolved "https://registry.yarnpkg.com/@coasys/flux-vue/-/flux-vue-0.7.6.tgz#8acd77dafd7ed914eb1ec609df72534dcdc90e74"
