ion7-llm / chat.template

class

ion7.llm.chat.Template

_vocab : ion7.core.Vocab

Functions

Template.new

Template.new(vocab)
vocab : ion7.core.Vocab
→ ion7.llm.chat.Template

Template:render

Render messages to a prompt string.

Template:render(messages, opts)
messages : table[]  Canonical message array.
opts : table?
→ string
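
A hedged usage sketch. The `require` paths, the `Vocab` constructor, and the message shape shown here are assumptions for illustration, not documented API; only `Template.new`, `:render`, and `opts.thinking` come from this reference.

```lua
-- Assumed module paths and vocab loader; adjust to the real ion7 layout.
local Vocab    = require("ion7.core").Vocab
local Template = require("ion7.llm.chat").Template

local vocab = Vocab.load("model.vocab")   -- assumed constructor
local tpl   = Template.new(vocab)

-- Canonical message array: role/content pairs (assumed shape).
local messages = {
  { role = "system", content = "You are a helpful assistant." },
  { role = "user",   content = "Hello!" },
}

local prompt = tpl:render(messages, { thinking = false })
```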

Template:supports_thinking

True when the embedded template recognises `enable_thinking` (and therefore `opts.thinking` will have an effect).

Template:supports_thinking()
→ boolean
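
Because `opts.thinking` only takes effect when the embedded template recognises `enable_thinking`, callers can gate the option on this check. A sketch, with `tpl` and `messages` as in the example above:

```lua
local opts = {}
if tpl:supports_thinking() then
  opts.thinking = true   -- only meaningful when the template supports it
end
local prompt = tpl:render(messages, opts)
```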

Template:tokenize

Render and tokenise in one call, returning the cdata token array and its length. Convenient when the caller only needs the tokens, not the intermediate prompt string.

Template:tokenize(messages, opts)
messages : table[]
opts : table?  Same shape as `:render`.
→ cdata  `int32_t[?]` token array.
→ integer  Token count.
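
A hedged sketch of consuming the two return values. Indexing the cdata array from `0` to `n - 1` follows standard LuaJIT FFI semantics for an `int32_t[?]` array; the surrounding names (`tpl`, `messages`) are carried over from the assumed examples above.

```lua
local tokens, n = tpl:tokenize(messages, { thinking = false })
-- tokens is an int32_t[?] cdata array; valid indices are 0 .. n-1.
for i = 0, n - 1 do
  io.write(tokens[i], " ")
end
io.write("\n")
```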