module
model.context_factory
Functions
M.vocab
Return the model's vocabulary as an ion7.core.Vocab handle. The handle is cached on the Model, so repeated calls are O(1).
M.vocab(self)
→ ion7.core.Vocab
M.context
Create an inference context from this model.
M.context(self, opts)
opts: table?
→ ion7.core.Context
raises — When context creation fails after the retry cascade.
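A minimal usage sketch. Note the hedges: the loader `ion7.load_model`, the model path, and the option field `n_ctx` are all illustrative assumptions, not part of the documented API above; only `M.context(self, opts)` and its raise-on-failure behavior come from this reference.

```lua
local ion7 = require("ion7")

-- Hypothetical loader and path; the real entry point may differ.
local model = ion7.load_model("model.gguf")

-- opts is an optional table; the field name below is illustrative.
-- Wrap in pcall because context() raises when creation fails
-- after the retry cascade.
local ok, ctx = pcall(model.context, model, { n_ctx = 4096 })
if not ok then
  error("context creation failed: " .. tostring(ctx))
end
-- ctx is now an ion7.core.Context ready for inference.
```

Calling through `pcall` keeps a creation failure recoverable instead of unwinding the caller.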
M.embedding_context
Create an embedding context: pooling enabled, no logits, CPU-only by default. Intended for vector-search workloads, where the model emits a single vector per input rather than token-by-token output.
M.embedding_context(self, opts)
opts: table?
→ ion7.core.Context (marked with `_is_embed = true`)
raises — When creation fails.
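A hedged sketch of the embedding path. As above, the loader and model path are assumptions; the `_is_embed` marker and the raise-on-failure behavior are the only details taken from this reference.

```lua
local ion7 = require("ion7")

-- Hypothetical loader and path.
local model = ion7.load_model("embed-model.gguf")

-- Defaults per the docs: pooling on, no logits, CPU-only
-- unless overridden via opts.
local ok, ectx = pcall(model.embedding_context, model)
if not ok then
  error("embedding context creation failed: " .. tostring(ectx))
end

-- Embedding contexts are distinguishable from inference contexts.
assert(ectx._is_embed == true)
```

The `_is_embed` flag lets downstream code assert it was handed an embedding context rather than a full inference context before running a vector-search workload.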