llm
LLM call primitive for tracked model interactions.
kitaru.llm() wraps one LiteLLM completion call with Kitaru tracking.
_normalize_call_name(name) -> str
    Normalize optional user call names into ID-safe call names.
    Parameters: name (str | None)
    Returns: str
_provider_name(model) -> str | None
    Extract the provider prefix from a LiteLLM model identifier.
    Parameters: model (str)
    Returns: str | None
_provider_credential_keys(model) -> tuple[str, ...] | None
    Return known environment-variable credential keys for a model provider.
    Parameters: model (str)
    Returns: tuple[str, ...] | None
_read_secret_values(secret_name) -> dict[str, str]
    Read secret key/value pairs from ZenML for env injection.
    Parameters: secret_name (str)
    Returns: dict[str, str]
_resolve_credential_overlay(selection) -> tuple[dict[str, str], str]
    Resolve env-first credentials with optional ZenML secret fallback.
    Parameters: selection (ResolvedModelSelection)
    Returns: tuple[dict[str, str], str]
_normalize_messages(prompt, *, system) -> list[dict[str, Any]]
    Normalize string/chat prompt input into LiteLLM message format.
    Parameters: prompt (str | list[dict[str, Any]]); system (str | None)
    Returns: list[dict[str, Any]]
_extract_response_text(raw_response) -> str
    Extract the text response from a LiteLLM completion response.
    Parameters: raw_response (Any)
    Returns: str
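LiteLLM completion responses follow the OpenAI chat-completion shape, so the text lives at `choices[0].message.content`. A sketch of the extraction (the `None`-to-empty-string handling is an assumption):

```python
from types import SimpleNamespace
from typing import Any

def extract_response_text(raw_response: Any) -> str:
    """Sketch: read the first choice's message content; treat a
    missing/None content as an empty string (assumed behavior)."""
    content = raw_response.choices[0].message.content
    return content if content is not None else ""

# Stand-in object with the expected response shape, for illustration only.
fake = SimpleNamespace(
    choices=[SimpleNamespace(message=SimpleNamespace(content="hello"))]
)
print(extract_response_text(fake))  # hello
```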
_extract_usage(raw_response) -> _LLMUsage
    Extract usage/cost values from a LiteLLM completion response.
    Parameters: raw_response (Any)
    Returns: kitaru.llm._LLMUsage
_temporary_env(additions) -> Any
    Temporarily add/override environment variables for one call.
    Parameters: additions (Mapping[str, str])
    Returns: typing.Any
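This is a standard save/override/restore pattern around `os.environ`. A self-contained sketch (the function name and exact restore semantics are assumptions based on the description):

```python
import os
from contextlib import contextmanager
from typing import Iterator, Mapping

@contextmanager
def temporary_env(additions: Mapping[str, str]) -> Iterator[None]:
    """Sketch: set the given variables for the duration of the block,
    then restore each variable's previous value (or remove it)."""
    saved = {key: os.environ.get(key) for key in additions}
    os.environ.update(additions)
    try:
        yield
    finally:
        for key, old in saved.items():
            if old is None:
                os.environ.pop(key, None)
            else:
                os.environ[key] = old

with temporary_env({"EXAMPLE_API_KEY": "sk-test"}):
    print(os.environ["EXAMPLE_API_KEY"])  # sk-test
print("EXAMPLE_API_KEY" in os.environ)
```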
_execute_llm_call(request) -> str
    Execute one normalized LLM call and persist artifacts/metadata.
    Parameters: request (_LLMRequest)
    Returns: str
_llm_checkpoint_call(request) -> str
    Synthetic checkpoint used for flow-body kitaru.llm() calls.
    Parameters: request (_LLMRequest)
    Returns: str
llm(prompt, *, model=None, system=None, temperature=None, max_tokens=None, name=None) -> str
    Make a tracked LLM call.
    Parameters:
        prompt (str | list[dict[str, Any]]): User prompt text or a chat-style message list.
        model (str | None, default None): Model alias or concrete LiteLLM model identifier.
        system (str | None, default None): Optional system prompt.
        temperature (float | None, default None): Optional sampling temperature.
        max_tokens (int | None, default None): Optional maximum response tokens.
        name (str | None, default None): Optional display name for this call.
    Returns:
        str: The model response text.
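An illustrative usage sketch, assuming `kitaru` is installed and configured; the model identifier and prompts are examples, not values from this reference:

```python
import kitaru

# Simple string prompt; the model is resolved from Kitaru's
# configuration when the model argument is omitted.
answer = kitaru.llm(
    "Summarize this ticket in one sentence.",
    system="Be terse.",
)

# Chat-style message list with an explicit LiteLLM model identifier
# and sampling controls; name sets the display name for tracking.
answer = kitaru.llm(
    [{"role": "user", "content": "List three retry strategies."}],
    model="anthropic/claude-3-haiku-20240307",  # example identifier
    temperature=0.2,
    max_tokens=256,
    name="retry-brainstorm",
)
```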