This repository was archived by the owner on Feb 16, 2026. It is now read-only.

feat: add support for max_tokens #60

Open

dennis-schadeck wants to merge 2 commits into fynnfluegge:main from dennis-schadeck:main

Conversation

@dennis-schadeck

This PR enables setting max_tokens for all LLMs.
Defaults are preserved, and max_tokens is capped by the model's context window.

