System Message
A system message defines the global behavior, tone, or role of the agent. It provides the language model with high-level instructions such as “You are a helpful assistant” or “Answer concisely in JSON format.”
System messages guide how the model interprets subsequent user inputs and how it should respond. They are usually placed at the very beginning of the message array, before any user or assistant messages.
You can override the default system message by explicitly including your own as the first element of the messages array:
[
  {
    "role": "system",
    "contents": [
      {
        "type": "text",
        "text": "You are an assistant who always speaks like a pirate."
      }
    ]
  },
  {
    "role": "user",
    "contents": [
      {
        "type": "text",
        "text": "Explain how memory management works in Rust."
      }
    ]
  }
]
The model will now respond according to the pirate persona instead of the default neutral assistant style:
{
  "role": "assistant",
  "contents": [
    {
      "type": "text",
      "text": "Arrr, matey! In Rust, the seas be ruled by ownership and borrowing. Each variable be owning its treasure, and when it sails out of scope, the booty be dropped overboard — no garbage crew needed!"
    }
  ]
}
The LM uses the system message defined by the developer instead of its internal default, so you can fully customize the model’s tone, persona, and task-specific behavior.
Example
Python
import asyncio

import ailoy as ai


async def main():
    lm = await ai.LangModel.new_local("Qwen/Qwen3-0.6B")
    agent = ai.Agent(lm)

    messages = [
        ai.Message(
            role="system",
            # If `contents` contains only one text part, it can be written as a single string.
            contents="You are an assistant who always speaks like a pirate.",
        ),
        ai.Message(
            role="user",
            # In general, `contents` is written as a list of `Part`.
            contents=[
                ai.Part.Text(text="Explain how memory management works in Rust.")
            ],
        ),
    ]

    async for resp in agent.run(messages):
        if isinstance(resp.message.contents[0], ai.Part.Text):
            print(resp.message.contents[0].text)


if __name__ == "__main__":
    asyncio.run(main())
import * as ai from "ailoy-node";
async function main() {
const lm = await ai.LangModel.newLocal("Qwen/Qwen3-0.6B");
const agent = new ai.Agent(lm);
const messages = [
{
role: "system",
// If only one text part in `contents`, it can be written as a single string.
contents: "You are an assistant who always speaks like a pirate.",
},
{
role: "user",
// In general, `contents` are written as a list of `Part`.
contents: [
{ type: "text", text: "Explain how memory management works in Rust." },
],
},
] as ai.Messages;
for await (const resp of agent.run(messages)) {
if (resp.message.contents[0].type === "text") {
console.log(resp.message.contents[0].text);
}
}
}
main().catch((err) => {
console.error("Error:", err);
});
import * as ai from "ailoy-web";
async function main() {
const lm = await ai.LangModel.newLocal("Qwen/Qwen3-0.6B");
const agent = new ai.Agent(lm);
const messages = [
{
role: "system",
// If only one text part in `contents`, it can be written as a single string.
contents: "You are an assistant who always speaks like a pirate.",
},
{
role: "user",
// In general, `contents` are written as a list of `Part`.
contents: [
{ type: "text", text: "Explain how memory management works in Rust." },
],
},
] as ai.Messages;
for await (const resp of agent.run(messages)) {
if (resp.message.contents[0].type === "text") {
console.log(resp.message.contents[0].text);
}
}
}
main().catch((err) => {
console.error("Error:", err);
});
Running any of these examples produces a response in the pirate persona, for example:
Arrr, matey! In Rust, the seas be ruled by ownership and borrowing. Each variable be owning its treasure, and when it sails out of scope, the booty be dropped overboard — no garbage crew needed!
Notes
- Always place the system message as the first message in the sequence. Placing it elsewhere may cause errors or undefined behavior.
- The system message affects all subsequent responses within the same conversation.
- To reset or change the system behavior during a session, start a new conversation or explicitly include a new system message at the beginning of a new message list, as sketched below.
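For instance, here is a minimal sketch of switching personas by starting a new message list, reusing the `ai.Message`, `ai.Part.Text`, and `agent.run` API from the Python example above. The translator persona and the `translator_messages` name are illustrative, not part of the library.
# Sketch: start a fresh conversation with a different system message.
# Assumes `ai` and `agent` are already set up, inside the same async
# function as in the Python example above.
translator_messages = [
    ai.Message(
        role="system",
        # New persona for this conversation (illustrative).
        contents="You are a translator who always answers in French.",
    ),
    ai.Message(
        role="user",
        contents=[ai.Part.Text(text="Explain how memory management works in Rust.")],
    ),
]

# Running the same agent with the new list applies the new system behavior;
# the earlier pirate persona does not carry over into this conversation.
async for resp in agent.run(translator_messages):
    if isinstance(resp.message.contents[0], ai.Part.Text):
        print(resp.message.contents[0].text)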