feat(chore): Adding tags and usage for LLM Models #1894

Merged: 12 commits, Jul 18, 2024
@@ -5,12 +5,11 @@ import javax.swing.Icon

 data class ChatModelsResponse(val models: List<ChatModelProvider>) {
   data class ChatModelProvider(
-      val default: Boolean,
-      val codyProOnly: Boolean,
       val provider: String?,
       val title: String?,
       val model: String,
-      val deprecated: Boolean = false
+      val tags: MutableList<String>,
+      val usage: MutableList<String>,
   ) {
     fun getIcon(): Icon? =
         when (provider) {
@@ -34,5 +33,11 @@ data class ChatModelsResponse(val models: List<ChatModelProvider>) {
           provider?.let { append(" by $provider") }
         }
       }
+
+    public val codyProOnly: Boolean
+      get() = tags.contains("pro")
+
+    public val deprecated: Boolean
+      get() = tags.contains("deprecated")
   }
 }
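
For illustration only: with the new shape, pro and deprecated status are derived from tags rather than passed as constructor booleans. The values below are borrowed from the test changes later in this diff, not from production code:

    val provider =
        ChatModelsResponse.ChatModelProvider(
            provider = "Anthropic",
            title = "Claude 3.0",
            model = "anthropic/claude-3.0",
            tags = mutableListOf("pro"),
            usage = mutableListOf("chat", "edit"))
    check(provider.codyProOnly)  // true, because tags contains "pro"
    check(!provider.deprecated)  // false, because there is no "deprecated" tag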
src/main/kotlin/com/sourcegraph/cody/chat/AgentChatSession.kt (3 additions, 3 deletions)

@@ -356,11 +356,11 @@ private constructor(
     val chatModelProvider =
         state.llm?.let {
           ChatModelsResponse.ChatModelProvider(
-              default = it.model == null,
-              codyProOnly = false,
               provider = it.provider,
               title = it.title,
-              model = it.model ?: "")
+              model = it.model ?: "",
+              usage = it.usage,
+              tags = it.tags)
         }

     val connectionId = createNewPanel(project) { it.server.chatNew() }
src/main/kotlin/com/sourcegraph/cody/chat/ui/LlmDropdown.kt (1 addition, 2 deletions)

@@ -59,8 +59,7 @@ class LlmDropdown(
             ?: availableModels.find { it.model == selectedFromHistory?.model }

     selectedItem =
-        if (selectedModel?.codyProOnly == true && isCurrentUserFree())
-            availableModels.find { it.default }
+        if (selectedModel?.codyProOnly == true && isCurrentUserFree()) availableModels.getOrNull(0)
         else selectedModel

     val isEnterpriseAccount =
Comment from @pkukielka (Contributor), Jul 17, 2024:
@jamesmcnamara

  1. We need to remove this line and just set isVisible = true always.
  2. Also, I noticed that there is now an enterprise tag present; shouldn't we filter the LLMs displayed for enterprise users based on whether that tag is present? Or maybe for an enterprise account we are guaranteed to get a correct list? (A rough sketch of that filtering follows below.)
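
For illustration, a minimal sketch of the filtering suggested in point 2, assuming an "enterprise" tag value and an isEnterpriseAccount flag (both taken from this thread, not from the PR code):

    val visibleModels =
        if (isEnterpriseAccount) availableModels.filter { "enterprise" in it.tags }
        else availableModels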

Reply (Contributor):

I pushed the first change to your branch already.

I'm not sure if we need the second.

Reply from @jamesmcnamara (Contributor, Author), Jul 18, 2024:

I updated this because we don't want it to be visible for enterprise users who don't have multiple models enabled. However, I'm still struggling with the integration tests failing and I'm not sure how to debug them. Do you know any good tricks for figuring out why they keep timing out? I tried opening the IDE to the test file and trying the document code action, but it seemed to work fine.

To your second point, I think the current idea is that an enterprise user can use any model that the server provided (but most likely they will all be enterprise models).

Also, unrelated, but do you know how to access the generated kotlin code from the Cody repo? I don't see it referenced anywhere.

Reply (Contributor):

The problem with the integration tests was that updated recordings were missing (they require a run of ./gradlew :recordingIntegrationTest). I updated them.
Interestingly, it seems we were not reporting that fact properly; I will make sure we fix that.

> Also, unrelated, but do you know how to access the generated kotlin code from the Cody repo? I don't see it referenced anywhere.

Right now they just work as a manual reference, but that will change very soon as there is a PR from @RXminuS in flight which addresses that.

@@ -38,7 +38,7 @@ object DeprecatedChatLlmMigration {
       notify: (Set<String>) -> Unit
   ) {

-    val defaultModel = models.find { it.default } ?: return
+    val defaultModel = models.getOrNull(0) ?: return

     fun isDeprecated(modelState: LLMState): Boolean =
         models.firstOrNull { it.model == modelState.model }?.deprecated ?: false
@@ -13,12 +13,18 @@ class LLMState : BaseState() {

   @get:OptionTag(tag = "provider", nameAttribute = "") var provider: String? by string()

+  @get:OptionTag(tag = "tags", nameAttribute = "") var tags: MutableList<String> by list()
+
+  @get:OptionTag(tag = "usage", nameAttribute = "") var usage: MutableList<String> by list()

Comment on lines +16 to +19
Comment from @jamesmcnamara (Contributor, Author):
I don't know the implications of this. Where is it getting this state from? Is this going to break immediately?

Reply from @pkukielka (Contributor), Jul 9, 2024:

This is state saved in the IntelliJ XML settings files.
You will need to add a migration which will convert the old format to the new one.
For reference you can look at DeprecatedChatLlmMigration.

In practice you will most likely have to:

  1. Keep the ChatModelProvider::deprecated field but mark it as @Deprecated.
  2. In the migration function (see DeprecatedChatLlmMigration::migrate for reference):
  • read all the old chats
  • for each chat, read the content of the deprecated field
  • set tags appropriately
  • reset deprecated to null (as it won't be used anymore; we will remove it in the future)

(A rough sketch of these steps follows below.)
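
A minimal, self-contained sketch of those steps; the LegacyLlmState shape and field names here are assumptions for illustration only, since the real migration operates on the persisted LLMState (see the ChatTagsLllmMigration mentioned below):

    // Hypothetical stand-in for the persisted per-chat LLM state.
    class LegacyLlmState(
        var model: String? = null,
        var codyProOnly: Boolean? = null,
        var deprecated: Boolean? = null,
        var tags: MutableList<String> = mutableListOf(),
    )

    fun migrateLlmTags(chatLlms: List<LegacyLlmState>) {
      chatLlms.forEach { llm ->
        // 1. Translate the old boolean flags into tags.
        if (llm.deprecated == true && "deprecated" !in llm.tags) llm.tags.add("deprecated")
        if (llm.codyProOnly == true && "pro" !in llm.tags) llm.tags.add("pro")
        // 2. Reset the old fields so they are no longer used.
        llm.deprecated = null
        llm.codyProOnly = null
      }
    }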

Reply from @jamesmcnamara (Contributor, Author):

Alright. I created ChatTagsLllmMigration that I believe accomplishes this.

Reply (Contributor):

Thank you! It looks good to me now; I just added one question about the default, but otherwise I think it is mergeable.

   companion object {
     fun fromChatModel(chatModelProvider: ChatModelsResponse.ChatModelProvider): LLMState {
       return LLMState().also {
         it.model = chatModelProvider.model
         it.title = chatModelProvider.title
         it.provider = chatModelProvider.provider
+        it.tags = chatModelProvider.tags
+        it.usage = chatModelProvider.usage
       }
     }
   }
@@ -99,7 +99,11 @@ class SettingsMigrationTest : BasePlatformTestCase() {
           it.defaultLlm =
               LLMState.fromChatModel(
                   ChatModelsResponse.ChatModelProvider(
-                      true, false, "Cyberdyne", "Terminator", "T-800"))
+                      "Cyberdyne",
+                      "Terminator",
+                      "T-800",
+                      mutableListOf("pro"),
+                      mutableListOf("chat", "edit")))
           it.defaultEnhancedContext =
               EnhancedContextState().also {
                 it.isEnabled = true
@@ -144,7 +148,11 @@ class SettingsMigrationTest : BasePlatformTestCase() {
           it.llm =
               LLMState.fromChatModel(
                   ChatModelsResponse.ChatModelProvider(
-                      false, true, "Uni of IL", "HAL", "HAL 9000"))
+                      "Uni of IL",
+                      "HAL",
+                      "HAL 9000",
+                      mutableListOf("pro"),
+                      mutableListOf("chat", "edit")))
           it.messages =
               mutableListOf(
                   MessageState().also {
@@ -190,22 +198,20 @@
   fun `test DeprecatedChatLlmMigration`() {
Comment from @pkukielka (Contributor), Jul 9, 2024:

That is incorrect, I think.
That is a migration function, so it specifically migrates data from one old format to another.
It needs to still work after your changes.
Then your migration function can pick up the state after all existing migrations and migrate it further (so you will have to add a new test for it).

Reply from @jamesmcnamara (Contributor, Author):

Understood. I reverted those changes and added a new test for my migration.

     fun createLlmModel(
         version: String,
-        isDefault: Boolean,
-        isDeprecated: Boolean
+        tags: List<String>,
     ): ChatModelsResponse.ChatModelProvider {
       return ChatModelsResponse.ChatModelProvider(
-          isDefault,
-          false,
           "Anthropic",
           "Claude $version",
           "anthropic/claude-$version",
-          isDeprecated)
+          tags.toMutableList(),
+          mutableListOf("chat", "edit"))
     }

-    val claude20 = createLlmModel("2.0", isDefault = false, isDeprecated = true)
-    val claude21 = createLlmModel("2.1", isDefault = false, isDeprecated = true)
-    val claude30 = createLlmModel("3.0", isDefault = true, isDeprecated = false)
-    val models = listOf(claude20, claude21, claude30)
+    val claude20 = createLlmModel("2.0", listOf("deprecated"))
+    val claude21 = createLlmModel("2.1", listOf("deprecated"))
+    val claude30 = createLlmModel("3.0", listOf())
+    val models = listOf(claude30, claude20, claude21)

val accountData =
mutableListOf(