Move deps to git submodule #89


Merged

merged 2 commits into main from hanq_dev on May 17, 2024

Conversation

qihqi
Collaborator

@qihqi qihqi commented May 17, 2024

  • Move deps into a git submodule; the hash of the pinned submodule commit is stored in git's database
  • Add a CI job that runs run_interactive.py with the tiny model
  • Fix some issues with run_interactive
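For readers unfamiliar with how submodules pin a dependency, the mechanism the first bullet describes can be sketched with a self-contained demo using throwaway local repos (here `deps/dep` stands in for this PR's `deps/JetStream`):

```shell
set -e
# Self-contained sketch: git stores the exact commit hash of a submodule
# in the superproject's tree, so the dependency is pinned, not floating.
tmp=$(mktemp -d) && cd "$tmp"

# A stand-in dependency repo (plays the role of deps/JetStream).
git init -q dep
git -C dep -c user.email=a@b -c user.name=a commit -q --allow-empty -m v1

# The superproject adds it as a submodule under deps/.
git init -q app
git -C app -c user.email=a@b -c user.name=a commit -q --allow-empty -m init
(cd app && git -c protocol.file.allow=always submodule add -q ../dep deps/dep)
git -C app -c user.email=a@b -c user.name=a commit -q -m "pin dep as submodule"

# .gitmodules records the path/url; the tree records the pinned hash.
cat app/.gitmodules
git -C app ls-tree HEAD deps/dep   # mode 160000 = "gitlink" (a pinned commit)
```

Fresh clones then fetch the submodule contents with `git clone --recurse-submodules`, or `git submodule update --init --recursive` in an existing checkout.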

@qihqi qihqi requested review from FanhaiLu1 and wang2yn84 May 17, 2024 20:42
@qihqi qihqi force-pushed the hanq_dev branch 2 times, most recently from 2ffafbd to 2a512a4 on May 17, 2024 21:15
@@ -0,0 +1,6 @@
[submodule "deps/JetStream"]
Collaborator

Does this mean the default is HEAD? For automation or integration tests, we would need a tag or commit id for the project.

Collaborator Author

The default is this hash:
[screenshot: the pinned submodule commit hash]

To update this hash, cd into the submodule directory and run git pull, then commit the updated hash in the parent repo.
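The update flow just described can be sketched end to end with the same kind of throwaway local demo (in this repo the directory would be `deps/JetStream` rather than `deps/dep`):

```shell
set -e
# Sketch of bumping a submodule pin: pull inside the submodule,
# then commit the new gitlink hash in the superproject.
tmp=$(mktemp -d) && cd "$tmp"
git init -q dep
git -C dep -c user.email=a@b -c user.name=a commit -q --allow-empty -m v1
git init -q app
git -C app -c user.email=a@b -c user.name=a commit -q --allow-empty -m init
(cd app && git -c protocol.file.allow=always submodule add -q ../dep deps/dep)
git -C app -c user.email=a@b -c user.name=a commit -q -m "pin dep"

# Upstream moves forward; the superproject still pins the old commit.
git -C dep -c user.email=a@b -c user.name=a commit -q --allow-empty -m v2

# Update the pin: pull inside the submodule, then commit in the parent.
git -C app/deps/dep pull -q
git -C app add deps/dep
git -C app -c user.email=a@b -c user.name=a commit -q -m "bump dep"
git -C app ls-tree HEAD deps/dep   # now records dep's latest commit hash
```

Until that final commit lands, CI and other clones keep building against the previously pinned hash, which is exactly the reproducibility the reviewer asked about.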

@qihqi qihqi requested a review from FanhaiLu1 May 17, 2024 21:44
@@ -158,20 +158,15 @@ def main(argv):
       decode_state, result_tokens = engine.generate(params, decode_state)
       result_tokens = result_tokens.convert_to_numpy()
       res = result_tokens.get_result_at_slot(slot)
-      stop_tokens = set(tokenizer.tokenizer.stop_tokens)
+      stop_tokens = set(tokenizer.stop_tokens)
Collaborator

JetStream didn't implement the stop_tokens for SentencePiece. You might need to add that function to JetStream.
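If JetStream's SentencePiece wrapper really lacks `stop_tokens`, one way to supply it is a small property that exposes the EOS id. This is a hedged sketch only; the class name and constructor argument are hypothetical, not JetStream's actual API:

```python
# Hypothetical sketch of a stop_tokens property for a SentencePiece-backed
# tokenizer wrapper. The class and attribute names are assumptions, not
# JetStream's real interface.
class SentencePieceTokenizer:
    def __init__(self, sp_processor):
        # sp_processor: a sentencepiece.SentencePieceProcessor (or a stub
        # exposing the same eos_id() method).
        self._sp = sp_processor

    @property
    def stop_tokens(self):
        # Default to EOS as the single stop token; models with extra
        # end-of-turn tokens would extend this set.
        return {self._sp.eos_id()}
```

With such a property in place, call sites can use `set(tokenizer.stop_tokens)` uniformly instead of reaching into the inner tokenizer object.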

Collaborator Author

# print(Fore.GREEN + output_str, end="", flush=True)

# print(Style.RESET_ALL + "\n")
# print("---- Streaming decode finished.")

print("---- All output tokens.")
print(sampled_tokens_list)
Collaborator

I tried it locally, and the decode function doesn't actually work. Not sure if it's my local environment or not.

@qihqi qihqi requested a review from wang2yn84 May 17, 2024 21:57
@qihqi qihqi merged commit d23e36c into main May 17, 2024
@qihqi qihqi deleted the hanq_dev branch May 17, 2024 22:38