chat {ollamar} | R Documentation
Chat with Ollama models
Description
Chat with Ollama models by sending a conversation history (a list of messages) and receiving the model's next reply.
Usage
chat(
  model,
  messages,
  stream = FALSE,
  output = c("resp", "jsonlist", "raw", "df"),
  endpoint = "/api/chat"
)
Arguments
model: A character string giving the model name, such as "llama3".
messages: A list of messages to send to the model (see examples below); each message is itself a list with role and content elements.
stream: Logical; whether to stream the response. Default is FALSE.
output: The output format. Default is "resp". Other options are "jsonlist", "raw", and "df".
endpoint: The endpoint used to chat with the model. Default is "/api/chat".
Value
An httr2 response object, a JSON list, the raw response, or a data frame, depending on the output argument.
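As a rough sketch (assuming a local Ollama server is reachable and the httr2 package is installed), the default "resp" output can be parsed with httr2; the $message$content path below follows the Ollama /api/chat response format:
library(ollamar)
library(httr2)
messages <- list(list(role = "user", content = "Say hello in one word."))
resp <- chat("llama3", messages)         # httr2 response object (default output)
body <- resp_body_json(resp)             # parse the JSON body
body$message$content                     # the assistant's reply text
chat("llama3", messages, output = "df")  # or request a data frame directly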
Examples
# one message
messages <- list(
  list(role = "user", content = "How are you doing?")
)
chat("llama3", messages)
chat("llama3", messages, stream = TRUE)
chat("llama3", messages, stream = TRUE, output = "df")

# multiple messages
messages <- list(
  list(role = "user", content = "Hello!"),
  list(role = "assistant", content = "Hi! How are you?"),
  list(role = "user", content = "Who is the prime minister of the UK?"),
  list(role = "assistant", content = "Rishi Sunak"),
  list(role = "user", content = "List all the previous messages.")
)
chat("llama3", messages, stream = TRUE)
[Package ollamar version 1.1.1 Index]