Use AI-Mask within Worker #19
base: main
Conversation
Someone is attempting to deploy a commit to a Personal Account owned by @jacoblee93 on Vercel. @jacoblee93 first needs to authorize it.
Hey, really sorry, I got very busy this weekend. I will look as soon as I am able!
app/worker.ts
Outdated
@@ -262,3 +307,7 @@ self.addEventListener("message", async (event: { data: any }) => {
    data: "OK",
  });
});

(async () => {
  aiMaskClient = await AIMaskClient.getWorkerClient()
Why not just `await` above as well as needed? It doesn't seem like it should be a long-running operation? This seems like it'll cause a race condition?
That is supposed to be executed at startup. I added an await on the main thread to verify this is done before doing anything else.
Agree that events in the worker are not safeguarded; it could be better implemented, but I think it's enough for here.
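For completeness, here is a minimal sketch of how the handler could be safeguarded later if we want that; the SDK import path is an assumption on my side, only `AIMaskClient.getWorkerClient()` comes from the diff:

```ts
// Assumed import path for the AI-Mask SDK; adjust if the package name differs.
import { AIMaskClient } from "@ai-mask/sdk";

let aiMaskClient: AIMaskClient | null = null;

// Start initialization at worker startup and keep the promise around.
const aiMaskReady: Promise<void> = (async () => {
  aiMaskClient = await AIMaskClient.getWorkerClient();
})();

self.addEventListener("message", async (event: { data: any }) => {
  // Messages that arrive before initialization finishes simply wait for it,
  // so they can't race the startup IIFE.
  await aiMaskReady;
  // ... existing handling that uses aiMaskClient goes here ...
});
```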
</label>
<div className="my-4 ml-auto mr-auto">
  <label htmlFor="modelSelector" className='mr-2'>Model provider</label>
  <select id="modelSelector" value={modelProvider} onChange={e => setModelProvider(e.target.value as ModelProvider)} className="p-2 bg-white text-black rounded border">
I would actually love to keep Ollama the default here since it's a lot more effective with the 7B parameter models. And I would like to keep it as a toggle too - am thinking toggle + select once `Browser Only` is selected. But I can make that change later!
Set Ollama as the default but kept the select for simplicity.
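If the toggle + select idea gets picked up later, one possible shape as a standalone sketch; `browserOnly`/`setBrowserOnly`, the `ModelProvider` union values, and the option labels here are hypothetical, only `modelProvider`/`setModelProvider`/`ModelProvider` mirror the names in the diff above:

```tsx
import { useState } from "react";

// Hypothetical union; the real ModelProvider type lives in the app.
type ModelProvider = "ollama" | "webllm" | "ai_mask";

export function ModelProviderPicker() {
  const [browserOnly, setBrowserOnly] = useState(false);
  const [modelProvider, setModelProvider] = useState<ModelProvider>("ollama");

  return (
    <div className="my-4 ml-auto mr-auto">
      <label htmlFor="browserOnly" className="mr-2">Browser only</label>
      <input
        id="browserOnly"
        type="checkbox"
        checked={browserOnly}
        onChange={(e) => {
          setBrowserOnly(e.target.checked);
          // Fall back to the Ollama default whenever browser-only mode is off.
          if (!e.target.checked) setModelProvider("ollama");
        }}
      />
      {browserOnly && (
        <select
          id="modelSelector"
          className="ml-2 p-2 bg-white text-black rounded border"
          value={modelProvider}
          onChange={(e) => setModelProvider(e.target.value as ModelProvider)}
        >
          <option value="webllm">WebLLM</option>
          <option value="ai_mask">AI-Mask</option>
        </select>
      )}
    </div>
  );
}
```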
@@ -262,3 +307,7 @@ self.addEventListener("message", async (event: { data: any }) => {
    data: "OK",
  });
});

(async () => {
  aiMaskClient = await AIMaskClient.getWorkerClient();
See below, could this just trigger on some initialization event, then send an `initializationComplete` once this `await` returns or it times out?
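Roughly what I have in mind, as a sketch; the `init`/`initializationComplete` message names, the 10-second timeout, and the SDK import path are made up here, only `AIMaskClient.getWorkerClient()` is from the diff:

```ts
// Assumed import path for the AI-Mask SDK; adjust if the package name differs.
import { AIMaskClient } from "@ai-mask/sdk";

let aiMaskClient: AIMaskClient | undefined;

// Reject if the wrapped promise hasn't settled within `ms` milliseconds.
const withTimeout = <T>(promise: Promise<T>, ms: number): Promise<T> =>
  Promise.race([
    promise,
    new Promise<T>((_, reject) =>
      setTimeout(() => reject(new Error("AI-Mask init timed out")), ms),
    ),
  ]);

self.addEventListener("message", async (event: { data: any }) => {
  if (event.data?.type === "init") {
    try {
      aiMaskClient = await withTimeout(AIMaskClient.getWorkerClient(), 10_000);
      self.postMessage({ type: "initializationComplete", ok: true });
    } catch (e) {
      self.postMessage({ type: "initializationComplete", ok: false, error: String(e) });
    }
    return;
  }
  // ... existing message types handled as before ...
});
```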
Overall this looks fantastic!!! I can make all the changes I've brought up - the bigger problem is that I just can't seem to run the extension 😅 Am I missing a step?
Hey, also sorry for the delay, just got back on this. I went through your comments and fixed most of it. My code was mostly a first throw for a working demo with workers. I'm actually wondering if there's really something to this extension, compared to Ollama, and if it's worth continuing the effort. I'd love to hear your take on what such an extension could offer that something like Ollama won't.
Replaces previous PR #16
Simple modification to support the AI-Mask extension as a model provider
I have somehow managed to communicate with the extension from within the web worker, so the computation is not blocking anymore.
Deployed here: https://fully-local-pdf-chatbot-topaz.vercel.app/
TODO: