These functions initialise and interact with the active LLM, backed by ellmer::chat(). The connection is configured via named presets defined in the secrets YAML under llm.presets; individual provider, model, and url arguments override the preset when supplied.

initiate_llm(
  preset = read_secret("llm.default.preset"),
  provider = NULL,
  model = NULL,
  url = NULL,
  system_prompt = read_secret("llm.system_prompt"),
  ...
)

list_presets()

new_chat(input, ...)

chat(input, new = FALSE, ...)

chat_in_browser(new = FALSE, ...)

chat_in_console(new = FALSE, ...)

get_chat_object()

get_provider()

get_system_prompt()

get_tools()

get_tokens()

Arguments

preset

Name of a preset defined under llm.presets in the secrets YAML. Defaults to llm.default.preset. Each preset must contain a provider (e.g. "ollama", "anthropic") and a model field; an optional url sets a non-default base URL (e.g. an internal Ollama server). Individual provider, model, and url arguments override the preset when supplied.
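Based on the description above, a secrets YAML matching this layout might look like the following; the preset names, models, and URL are illustrative, not shipped defaults:

```yaml
llm:
  default:
    preset: local            # read via read_secret("llm.default.preset")
  system_prompt: "You are a helpful data-analysis assistant."
  presets:
    local:
      provider: ollama
      model: llama3.1
      url: http://ollama.internal:11434   # optional; overrides the provider's base URL
    claude:
      provider: anthropic
      model: claude-sonnet-4-5            # no url: Anthropic uses its fixed endpoint
```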

provider

Override the provider from the preset.

model

Override the model from the preset.

url

Override the base URL from the preset. Passed as base_url to the provider function; omit for providers with fixed endpoints (e.g. Anthropic).

system_prompt

System prompt text. Defaults to llm.system_prompt from the secrets file.

...

Arguments passed on to ellmer::chat().

input

Message text to send to the LLM.

new

If TRUE, initiate a fresh LLM instance before sending the message.

Details

Functions:

  • initiate_llm() creates a new ellmer Chat object, appends user context to the system prompt when the user's name and job title are found in the secrets file, registers the session tools (get_df_summary, list_objects, get_colnames), and probes the model to verify actual tool support. Models that do not support tools are silently recreated without them, so the function always leaves a working instance. The active object is stored in pkg_env$chat_object.

  • list_presets() prints all configured presets with their provider, model, and URL, marking the default.

  • chat() sends a single message to the active LLM, initiating a new session first if none exists or if new = TRUE.

  • new_chat() is a shorthand that always initiates a fresh session before sending the message.

  • chat_in_browser() and chat_in_console() open an interactive session via ellmer::live_browser() and ellmer::live_console() respectively.

  • get_chat_object(), get_provider(), get_system_prompt(), get_tools(), and get_tokens() are thin accessors that expose the underlying ellmer Chat object's state for inspection.
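Putting the functions above together, a typical session might look like this; the preset name "local" and the messages are illustrative:

```r
# Start a session from a named preset, overriding its model
initiate_llm(preset = "local", model = "llama3.1:70b")

# Send a one-off message to the active session
chat("Summarise the columns of mtcars")

# Start over with a fresh session (equivalent to chat(..., new = TRUE))
new_chat("Which objects are in my workspace?")

# Inspect the state of the underlying ellmer Chat object
get_provider()
get_tools()
get_tokens()
```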