Conversation

@ochafik (Owner) commented Nov 2, 2025

Fixes #4 (See ggml-org/llama.cpp#15904 (comment))

ochafik merged commit c3e4caf into main on Nov 2, 2025

ochafik added a commit that referenced this pull request on Nov 2, 2025:
build: loudly skip tests we fail to get templates of (e.g. gated)

ochafik added a commit that referenced this pull request on Nov 2, 2025:
Fixes #4

- Fix parsing of values (nested method calls on function calls, e.g. `foo(x).bar(y)`)
- Fix tool call capability detection
- Tolerate `ensure_ascii` arg in `tojson`, with support in the Python jinja2 testing harness (supersedes google#84 - thanks @cnaples79 - & google#69 - thanks @rouseabout)
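
To make the first and third bullets concrete, here is a minimal sketch using the reference Python jinja2 library (the same reference implementation the commit's testing-harness note points at). The `Tool` class, `get_tool` function, and `tojson` override below are hypothetical stand-ins, not minja or harness code; they only illustrate the `foo(x).bar(y)` call shape and an `ensure_ascii` argument to `tojson`.

```python
# Sketch only: illustrates the template constructs the commit message refers to,
# using the reference Python jinja2 library. The names below are hypothetical.
import json
from jinja2 import Environment


class Tool:
    def __init__(self, name):
        self.name = name

    def describe(self, verbose):
        return f"{self.name} (verbose={verbose})" if verbose else self.name


def get_tool(name):
    # Stand-in for `foo(x)` in `foo(x).bar(y)`: a function whose result
    # exposes methods that a template may call directly.
    return Tool(name)


def tojson(value, ensure_ascii=True, indent=None):
    # Hypothetical harness-style override of jinja2's built-in `tojson`
    # filter so templates may pass `ensure_ascii`, which minja now tolerates.
    return json.dumps(value, ensure_ascii=ensure_ascii, indent=indent)


env = Environment()
env.globals["get_tool"] = get_tool
env.filters["tojson"] = tojson

# Nested method call on a function call: the `foo(x).bar(y)` pattern
# the parser fix addresses.
print(env.from_string("{{ get_tool('search').describe(true) }}").render())
# -> search (verbose=True)

# `tojson` with an `ensure_ascii` argument, as some chat templates pass it.
print(env.from_string("{{ msg | tojson(ensure_ascii=False) }}").render(msg={"text": "héllo"}))
# -> {"text": "héllo"}
```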
@danielhanchen

@ochafik Extremely nice work! Would this allow `--jinja` on the original GLM 4.6 template to function with tool calling, i.e. no re-conversions necessary? Would this also work out of the box with llama.cpp? I think it uses google/minja.

@ochafik (Owner, Author) commented Nov 3, 2025

> Extremely nice work! Would this allow `--jinja` on the original GLM 4.6 template to function with tool calling, i.e. no re-conversions necessary?

@danielhanchen thanks! It will, but there's also work needed in llama.cpp's chat parsers and grammar generation (ongoing; see e.g. ggml-org/llama.cpp#16932 and related PRs).

> Would this also work out of the box with llama.cpp? I think it uses google/minja.

I created google/minja while working at Google. I no longer work there; development is continuing here (and llama.cpp now syncs minja from this repo).

@danielhanchen

@ochafik Oh nice! Great work on minja as well!

Linked issue: Support GLM 4.6 (#4)