
Module ion7-llm

Functions

llm.capabilities

Capability snapshot of the ion7-llm runtime, complementing `ion7.core.capabilities()`. Reflects what the chat pipeline can offer given the linked libcommon bridge.

llm.capabilities()
→ table
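
The field list of the returned snapshot is not reproduced in this reference, so a minimal sketch simply enumerates whatever the table contains (the module path passed to `require` is an assumption):

```lua
-- Usage sketch; "ion7-llm" as the require path is an assumption.
local llm = require("ion7-llm")

-- Dump the capability snapshot. Concrete field names are not listed
-- in this section, so we just iterate the table generically.
local caps = llm.capabilities()
for key, value in pairs(caps) do
  print(key, tostring(value))
end
```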

llm.pipeline

Convenience constructor: build a single-session chat pipeline in one call. Returns the manager + engine pair.

```lua
local cm, engine = llm.pipeline(ctx, vocab, {
  headroom = 256,
  default_sampler = llm.sampler.profiles.balanced(),
})
```

llm.pipeline(ctx, vocab, opts)
ctx: ion7.core.Context
vocab: ion7.core.Vocab
opts: table? Pass-through to both `kv.new` and `Engine.new`.
→ ion7.llm.kv.ContextManager
→ ion7.llm.Engine
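
Since `opts` is `table?` (optional), the minimal call omits it and takes the defaults of `kv.new` and `Engine.new`. A sketch, assuming `ctx` and `vocab` were obtained from `ion7.core` beforehand (their constructors are outside this section):

```lua
-- Minimal pipeline construction with default options; ctx and vocab
-- are assumed to come from ion7.core.
local cm, engine = llm.pipeline(ctx, vocab)
-- cm     : ion7.llm.kv.ContextManager
-- engine : ion7.llm.Engine
```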