Dynamic prompting tailors the input prompt to adaptively include only the most relevant information from prior interactions. This approach ensures that essential context is retained while keeping the token count within limits.
- Definition: the process of creating or modifying the prompt itself at runtime (dynamically).
- It focuses on how you structure and word the prompt: instructions, templates, roles, examples, reasoning steps, output format, and so on.
- The prompt (especially the system message or user message) changes based on the situation, e.g. different roles, tones, few-shot examples, or instructions depending on the task.
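Before the AutoGen version below, the core idea can be shown with a library-free sketch: keep only the most recent prior turns that fit a token budget when assembling the prompt. Everything here (the `build_dynamic_prompt` helper, the 4-characters-per-token estimate) is an illustrative assumption, not a real tokenizer or API.

```python
# Minimal sketch of dynamic prompting: include only as much recent history
# as fits a token budget. The 4-chars-per-token estimate is a crude
# assumption; a real implementation would use the model's tokenizer.

def estimate_tokens(text: str) -> int:
    return max(1, len(text) // 4)  # rough heuristic, not a tokenizer

def build_dynamic_prompt(instructions: str, history: list[str], budget: int) -> str:
    kept: list[str] = []
    used = estimate_tokens(instructions)
    # Walk history newest-first so the most recent turns win the budget
    for turn in reversed(history):
        cost = estimate_tokens(turn)
        if used + cost > budget:
            break
        kept.append(turn)
        used += cost
    kept.reverse()  # restore chronological order
    return f"{instructions}\n\nCONTEXT:\n" + "\n".join(kept)

history = ["User asked about Q1 sales.",
           "Agent summarized Europe results.",
           "User asked for recovery ideas."]
prompt = build_dynamic_prompt("You are a data analyst.", history, budget=20)
```

With this small budget, only the two most recent turns survive; the oldest one is dropped, so the prompt stays within limits while the freshest context is retained.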
system_message = f"""You are a {role}.
CONTEXT:
{context}
USER PREFERENCES:
{prefs}
TONE:
{tone}
"""References
Shekhar Agrawal, Srinivasa Sunil Chippada, and Rathish Mohan. Ultimate Agentic AI with AutoGen for Enterprise Automation: Design, Build, and Deploy Enterprise-Grade AI Agents Using LLMs and AutoGen to Power Intelligent, ... Enterprise Automation (English Edition). Orange Education Pvt Ltd, AVA™. Kindle Edition, p. 177.
Full Python code for this example:
# === Fix for Jupyter Notebook ===
import nest_asyncio
nest_asyncio.apply()  # Solves the "event loop is already running" error

from autogen_ext.models.ollama import OllamaChatCompletionClient
from autogen_agentchat.agents import AssistantAgent
from autogen_agentchat.ui import Console


def create_dynamic_llama_agent(
    name: str,
    role: str,
    context: str,
    prefs: str = "",
    tone: str = "helpful",
):
    # Build the system message at runtime from the caller's role, context,
    # preferences, and tone -- this is the "dynamic" part of the prompt.
    system_message = f"""You are a {role}.

CONTEXT:
{context}

USER PREFERENCES:
{prefs}

TONE:
{tone}

Think step by step. Be clear, accurate, and follow the user's preferences.
"""
    model_client = OllamaChatCompletionClient(
        model="llama3.2:latest",  # Use "llama3.2:3b" if you downloaded the small version
        temperature=0.7,
        num_ctx=8192,
    )
    agent = AssistantAgent(
        name=name,
        model_client=model_client,
        system_message=system_message,
    )
    print(f"✅ Created dynamic agent '{name}' with role: {role}")
    print(f"   Context length: {len(context)} characters\n")
    return agent
# ====================== Main Function ======================
async def main():
    analyst = create_dynamic_llama_agent(
        name="DataAnalyst",
        role="data analyst",
        context="The latest quarterly sales data shows a 22% drop in Europe, while Asia grew by 15%. Main product 'WidgetX' underperformed.",
        prefs="Always include chart suggestions and 2-3 actionable recommendations",
        tone="professional and data-driven",
    )

    task = "Analyze the sales situation and suggest recovery strategies."
    print("🚀 Running task with dynamic prompt...\n")

    # Streaming output with nice formatting
    stream = analyst.run_stream(task=task)
    await Console(stream)

# ====================== Run in Jupyter ======================
# Just run this cell (top-level await works in Jupyter)
await main()
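To see the dynamic part in isolation, the same template logic can be exercised without Ollama or AutoGen. The `make_system_message` helper below is a hypothetical stand-in that mirrors the factory above: one template, different prompts at runtime depending on role, context, preferences, and tone.

```python
# Library-free illustration: one template, different system messages at
# runtime. `make_system_message` is a hypothetical helper mirroring the
# create_dynamic_llama_agent factory, minus the model client and agent.
def make_system_message(role: str, context: str,
                        prefs: str = "", tone: str = "helpful") -> str:
    return f"""You are a {role}.
CONTEXT:
{context}
USER PREFERENCES:
{prefs}
TONE:
{tone}
"""

analyst_msg = make_system_message(
    "data analyst", "Q3 sales dipped in Europe.",
    prefs="Include chart suggestions", tone="data-driven")
support_msg = make_system_message(
    "support agent", "Customer reports login failures.",
    tone="empathetic")

# Two calls, two different prompts from the same template
```

Swapping in a different role, context, or tone changes the prompt without touching the agent code, which is exactly what makes the prompting "dynamic".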