Cody Web: Add high-level memoization for chat/transcript UI #5036
Conversation
@@ -48,6 +48,10 @@ export const Chat: React.FunctionComponent<React.PropsWithChildren<ChatboxProps>
 }) => {
     const { reload: reloadMentionProviders } = useContextProviders()
     const telemetryRecorder = useTelemetryRecorder()

+    const transcriptRef = useRef(transcript)
+    transcriptRef.current = transcript
If we are updating the ref.current each time, then how is it different from using the `transcript` var directly? Just to not have `transcript` as a dependency in `feedbackButtonsOnSubmit = useCallback`? If so, then we can simply remove it from the deps list and use eslint-disable.
I wish I had Cody in the browser to ask this question to.
> If we are updating the ref.current each time, then how is it different from using transcript var directly?

Refs don't trigger `useCallback` re-memoization, so we will still have the same function identity even though `ref.current` is changing.

> Just to not have transcript as a dependency in feedbackButtonsOnSubmit = useCallback?

In this case, the function that we're passing to `useCallback` would capture the first `transcript` value in its closure scope, and we wouldn't have the most recent value when we call this function.
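For future readers, the difference can be shown in plain TypeScript with no React at all: a memoized callback keeps the closure from the render that created it, while a ref that is mutated on every render always points at the latest value. A minimal sketch with illustrative names (`render` stands in for a component re-render):

```typescript
// Plain-TypeScript sketch (no React) of the stale-closure problem the ref solves.
type Ref<T> = { current: T }

const transcriptRef: Ref<string[]> = { current: [] }

function render(transcript: string[], memoized?: () => number): () => number {
    // Mutating the ref on every render keeps it pointing at the latest value.
    transcriptRef.current = transcript
    // Like useCallback with an empty deps list: reuse the function created on
    // the first render instead of the fresh closure.
    return memoized ?? (() => transcript.length)
}

const first = render(['hi'])
const second = render(['hi', 'hello'], first)

console.log(second())                     // 1 — stale: closed over the first render's transcript
console.log(transcriptRef.current.length) // 2 — the ref always sees the latest transcript
```

So removing `transcript` from the deps list without the ref would freeze the value; the ref keeps the function identity stable *and* the value fresh.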
    contextItems: ContextItem[] | undefined
    model?: Model['model']
    isForFirstMessage: boolean
    className?: string

    /** For use in storybooks only. */
    __storybook__initialOpen?: boolean
-}> = ({ contextItems, model, isForFirstMessage, className, __storybook__initialOpen }) => {
+}> = memo(({ contextItems, model, isForFirstMessage, className, __storybook__initialOpen }) => {
👍
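One thing worth noting for future readers: `memo()` only skips re-renders when a shallow comparison of props passes, so it helps here only as long as object/array props keep stable identities across renders. A rough sketch of the default comparison (illustrative, not React's actual implementation):

```typescript
// Approximation of the shallow props comparison React.memo performs by default.
function shallowEqualProps(
    prev: Record<string, unknown>,
    next: Record<string, unknown>
): boolean {
    const prevKeys = Object.keys(prev)
    const nextKeys = Object.keys(next)
    if (prevKeys.length !== nextKeys.length) return false
    // Object.is on each prop: objects and arrays must be referentially stable.
    return prevKeys.every(key => Object.is(prev[key], next[key]))
}

const items = [{ uri: 'file.ts' }]
console.log(shallowEqualProps(
    { contextItems: items, isForFirstMessage: true },
    { contextItems: items, isForFirstMessage: true }
)) // true — same array reference, memo skips the re-render

console.log(shallowEqualProps(
    { contextItems: [{ uri: 'file.ts' }], isForFirstMessage: true },
    { contextItems: [{ uri: 'file.ts' }], isForFirstMessage: true }
)) // false — a fresh array each render defeats memo
```

That's why this change pairs with the `useCallback`/`useMemo` fixes upstream: a parent that rebuilds `contextItems` on every render would make this `memo()` a no-op.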
@@ -296,22 +296,32 @@ export const CodyWebChatProvider: FC<PropsWithChildren<CodyWebChatProviderProps>
        [client, vscodeAPI, setLastActiveChatID]
    )

+   const contextInfo = useMemo(
Yeah, React sucks! The context API should somehow be doing this out of the box.
Yeah, they were discussing this at some point in reactjs/rfcs#150. dai-shi even got it working via a library, use-context-selector (https://github.com/dai-shi/use-context-selector?tab=readme-ov-file), but I think it never got into React itself, unfortunately.
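Until something like that lands, memoizing the context value by dependency identity is the standard workaround. A plain-TypeScript sketch of what a `useMemo(() => ({ client, vscodeAPI }), [client, vscodeAPI])` buys us (illustrative names, not the real provider code):

```typescript
// Without memoization, a fresh object is built on every render, so every
// context consumer sees a "new" value even when nothing actually changed.
let lastDeps: unknown[] | undefined
let lastValue: { client: unknown; vscodeAPI: unknown } | undefined

// Rough stand-in for useMemo(() => ({ client, vscodeAPI }), [client, vscodeAPI]).
function contextValue(client: unknown, vscodeAPI: unknown) {
    const deps = [client, vscodeAPI]
    if (lastDeps && lastValue && deps.every((dep, i) => Object.is(dep, lastDeps![i]))) {
        return lastValue // stable identity: consumers can skip re-rendering
    }
    lastDeps = deps
    lastValue = { client, vscodeAPI }
    return lastValue
}

const client = { name: 'cody-web' }
const api = { postMessage: () => {} }

const v1 = contextValue(client, api)
const v2 = contextValue(client, api)
console.log(v1 === v2) // true — same deps, cached value returned

const v3 = contextValue({ name: 'other' }, api)
console.log(v1 === v3) // false — a dep changed, so the value is recomputed
```

The selector-library approach goes further by letting consumers subscribe to a slice of the value, but identity-stable values are the part we can get with stock React.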
Fixes part of https://linear.app/sourcegraph/issue/CODY-3943/optimize-performance-for-large-chat-windows-in-cody

It looks like since #5036 we've lost memoization in a few places. As a result, our chat UI doesn't perform well with more than 4-5 messages (depending on how long these chat messages are). This PR tries to improve it by fixing memoization on expensive components (Transcript, HumanMessage, etc.)

| Before | After |
| ------- | ------- |
| <video src="https://github.com/user-attachments/assets/79e7b93c-ac82-4a8c-9b2e-6c712af3fb2c" /> | <video src="https://github.com/user-attachments/assets/223b3513-0f30-47cc-b24c-c54f4861329c" /> |

This PR also fixes a problem with one-box intention detection; previously, it produced a state where we didn't have a follow-up message UI during intent resolution.

## Test plan

- Check with the React profiler that we don't render components during message response in places where we don't need to re-render anything
- Check that intent detection in one-box doesn't produce UI flashes
Attempt to fix https://linear.app/sourcegraph/issue/SRCH-724/prinova-found-an-issue-on-cody-web
The problem that some customers have is somewhere in the React rendering pipeline. I couldn't reproduce the problem from the issue on my machine, but I noticed that even though we have `useCallback` and `useMemo` hooks in high-level components, we still don't use memoization at the component level for anything. This means the whole chat (all messages) would be re-rendered if one of the messages changed (by re-rendering I mean React reconciliation).

This PR adds `memo()` wrappers for heavy UI components such as the transcript interactions and their child UI (human input, assistant message, etc.)

Note: I chained this PR with a telemetry fix unrelated to this PR just to avoid double package versions in NPM (I will merge them in sequence as they are reviewed).
Test plan