
Fix imports and client param error in Simple Wikipedia Demo #216

Merged
merged 1 commit into from
Dec 13, 2022

Conversation

georgewritescode
Contributor

Associated with this issue: #215

  • What kind of change does this PR introduce? (Bug fix, feature, docs update, ...)
    Bug fixes for the Simple Wikipedia Demo errors:
  • Add missing imports
  • Remove the unused PyTorch import
  • Change the batch size params to align with the current Python client
  • What is the current behavior? (You can also link to an open issue here)
    See issue

  • Does this PR introduce a breaking change? (What changes might users need to make in their application due to this PR?)
    No

  • Have unit tests been run against this PR? (Has there also been any additional testing?)
    Tests ran, no additional testing

  • Please check if the PR fulfills these requirements

  • The commit message follows our guidelines
  • Tests for the changes have been added (for bug fixes/features)
  • Docs have been added / updated (for bug fixes / features)

@georgewritescode changed the title from "Fix imports and client erros in Simple Wikipedia Demo" to "Fix imports and client param error in Simple Wikipedia Demo" on Dec 11, 2022
@jn2clark
Contributor

Thanks @georgewritescode! My only suggestion: can we change the client_batch_size to something larger, like 2000? To get the most out of the server batching, it needs to receive enough documents to justify the multiple processes.

@pandu-k
Collaborator

pandu-k commented Dec 11, 2022

> Thanks @georgewritescode ! My only suggestion is can we change the client_batch_size to be something larger, like 2000? To get the most out of the server batching it needs to receive enough to justify the multiple processes.

Setting this to client_batch_size=20 and removing processes has been the most portable solution, in my experience.
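The client_batch_size being discussed controls how many documents the client sends to the server per request. A minimal sketch of that client-side chunking, under the assumption that it is a plain fixed-size split (this is a hypothetical stand-in, not the marqo client's actual implementation):

```python
def chunk_documents(documents, client_batch_size=20):
    """Split a list of documents into client-side batches.

    Each batch is sent to the server as one request. Larger batches give
    the server-side batching more to parallelize across processes;
    smaller batches are more portable across machines.
    """
    if client_batch_size < 1:
        raise ValueError("client_batch_size must be >= 1")
    return [
        documents[i:i + client_batch_size]
        for i in range(0, len(documents), client_batch_size)
    ]

# e.g. 45 documents with client_batch_size=20 -> batches of 20, 20, 5
batches = chunk_documents(list(range(45)), client_batch_size=20)
```

This makes the trade-off in the thread concrete: a larger value (like the suggested 2000) yields fewer, bigger requests that better feed server-side parallelism, while 20 is the smaller, more portable default the maintainers settled on here.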

@pandu-k pandu-k merged commit 86bfdc5 into marqo-ai:mainline Dec 13, 2022