Infinite transcript mode may get stuck in RAM? #530

Closed
FNsi opened this issue Mar 26, 2023 · 3 comments
Labels
generation quality (Quality of model output), need more info (The OP should provide more details about the issue)

Comments

@FNsi
Contributor

FNsi commented Mar 26, 2023

After the context grows past 2048 (or whatever is set with -c), closing the terminal sometimes leaves the transcript still running on the system.

Linux amd64, kernel 5.19, Ubuntu base.
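
For reference, whether a command-line process keeps running after its terminal closes usually comes down to how it handles SIGHUP. A minimal C sketch (an illustration of that general mechanism only, not llama.cpp's actual code) of a process that ignores SIGHUP and therefore outlives its terminal:

```c
/* Illustration only: a process that ignores SIGHUP survives its
 * controlling terminal being closed, which is one common reason a CLI
 * job appears to keep running "stuck" in the background. */
#include <signal.h>
#include <stdio.h>
#include <unistd.h>

int main(void) {
    /* With SIG_IGN the process is not terminated when the terminal
     * closes; with SIG_DFL (the default) SIGHUP would kill it. */
    signal(SIGHUP, SIG_IGN);

    for (int i = 0; ; ++i) {
        /* Output goes nowhere once the terminal is gone, but the loop
         * (and any memory it holds) keeps running. */
        printf("still running, iteration %d\n", i);
        fflush(stdout);
        sleep(1);
    }
    return 0;
}
```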

@FNsi
Contributor Author

FNsi commented Mar 26, 2023

It may also just be because of my system.

@gjmulder added the need more info and generation quality labels on Mar 26, 2023
@gjmulder
Collaborator

Please review and use the issue template

@FNsi
Contributor Author

FNsi commented Mar 27, 2023

Sorry, I can reproduce it with any software 😅😂, so it's obviously because of my system.

@FNsi closed this as not planned (won't fix, can't repro, duplicate, stale) on Mar 27, 2023
Deadsg pushed a commit to Deadsg/llama.cpp that referenced this issue on Dec 19, 2023: Bump mkdocs-material from 9.1.18 to 9.1.19