Merged
12 changes: 10 additions & 2 deletions .env.example
@@ -1,5 +1,5 @@
# Azure AI Foundry Project (Required)
# Your Azure AI Foundry project endpoint — used by AzureOpenAIResponsesClient
# Azure AI Foundry Project (Required for most lessons)
# Your Azure AI Foundry project endpoint
AZURE_AI_PROJECT_ENDPOINT="https://..."

# Model deployment name in your Foundry project (e.g., gpt-4o)
@@ -8,3 +8,11 @@ AZURE_AI_MODEL_DEPLOYMENT_NAME="gpt-4o"
# Azure AI Search (Required for Lesson 05 - Agentic RAG)
AZURE_SEARCH_SERVICE_ENDPOINT="https://..."
AZURE_SEARCH_API_KEY="..."

# GitHub Models (Required for Lesson 06 and Lesson 08 GitHub Models workflows)
GITHUB_TOKEN="..."
GITHUB_ENDPOINT="https://models.inference.ai.azure.com"
GITHUB_MODEL_ID="gpt-4o-mini"

# Azure AI Bing Connection (Required for Lesson 08 - Bing grounding workflow)
BING_CONNECTION_ID="..."
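For reference, the variables defined above might be read at runtime roughly like the following sketch. It uses only the standard library; `load_github_models_config` is a hypothetical helper name, and the defaults mirror the placeholder values in `.env.example`:

```python
import os

def load_github_models_config() -> dict:
    """Read the GitHub Models settings from the environment.

    Hypothetical helper for illustration; the variable names match
    .env.example above, and the defaults mirror its placeholders.
    """
    return {
        "token": os.getenv("GITHUB_TOKEN", ""),
        "endpoint": os.getenv("GITHUB_ENDPOINT", "https://models.inference.ai.azure.com"),
        "model": os.getenv("GITHUB_MODEL_ID", "gpt-4o-mini"),
    }

config = load_github_models_config()
print(config["endpoint"])
```

In the lesson notebooks, `load_dotenv()` is called first so the `.env` file is loaded into the environment before these lookups run.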
18 changes: 18 additions & 0 deletions 00-course-setup/README.md
@@ -236,6 +236,24 @@ Lesson 5 uses **Azure AI Search** for retrieval-augmented generation. If you pla
| `AZURE_SEARCH_SERVICE_ENDPOINT` | Azure portal → your **Azure AI Search** resource → **Overview** → URL |
| `AZURE_SEARCH_API_KEY` | Azure portal → your **Azure AI Search** resource → **Settings** → **Keys** → primary admin key |

## Additional Setup for Lesson 6 and Lesson 8 (GitHub Models)

Some notebooks in lessons 6 and 8 use **GitHub Models** instead of Azure AI Foundry. If you plan to run those samples, add these variables to your `.env` file:

| Variable | Where to find it |
|----------|-----------------|
| `GITHUB_TOKEN` | GitHub → **Settings** → **Developer settings** → **Personal access tokens** |
| `GITHUB_ENDPOINT` | Use `https://models.inference.ai.azure.com` (default value) |
| `GITHUB_MODEL_ID` | Model name to use (e.g. `gpt-4o-mini`) |
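Since the endpoint above is OpenAI-compatible, a chat request built from these three variables looks roughly like the sketch below. The payload shape is an assumption based on the standard chat-completions format; the request is constructed but deliberately not sent:

```python
import json
import os

# Build (but do not send) an OpenAI-style chat-completions request
# against the GitHub Models endpoint from the table above.
endpoint = os.getenv("GITHUB_ENDPOINT", "https://models.inference.ai.azure.com")
url = f"{endpoint}/chat/completions"
headers = {
    "Authorization": f"Bearer {os.getenv('GITHUB_TOKEN', '')}",
    "Content-Type": "application/json",
}
payload = {
    "model": os.getenv("GITHUB_MODEL_ID", "gpt-4o-mini"),
    "messages": [{"role": "user", "content": "Hello!"}],
}
body = json.dumps(payload)
print(url)
```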

## Additional Setup for Lesson 8 (Bing Grounding Workflow)

The conditional workflow notebook in lesson 8 uses **Bing grounding** via Azure AI Foundry. If you plan to run that sample, add this variable to your `.env` file:

| Variable | Where to find it |
|----------|-----------------|
| `BING_CONNECTION_ID` | Azure AI Foundry portal → your project → **Management** → **Connected resources** → your Bing connection → copy the connection ID |

## Stuck Somewhere?

If you have any issues running this setup, hop into our <a href="https://discord.gg/kzRShWzttr" target="_blank">Azure AI Community Discord</a> or <a href="https://github.com/microsoft/ai-agents-for-beginners/issues?WT.mc_id=academic-105485-koreyst" target="_blank">create an issue</a>.
9 changes: 3 additions & 6 deletions 02-explore-agentic-frameworks/README.md
@@ -84,8 +84,7 @@ def book_flight(date: str, location: str) -> str:


async def main():
provider = AzureAIProjectAgentProvider(credential=AzureCliCredential()),
)
provider = AzureAIProjectAgentProvider(credential=AzureCliCredential())
agent = await provider.create_agent(
name="travel_agent",
instructions="Help the user book travel. Use the book_flight tool when ready.",
@@ -120,8 +119,7 @@ import os
from agent_framework.azure import AzureAIProjectAgentProvider
from azure.identity import AzureCliCredential

provider = AzureAIProjectAgentProvider(credential=AzureCliCredential()),
)
provider = AzureAIProjectAgentProvider(credential=AzureCliCredential())

# Data Retrieval Agent
agent_retrieve = await provider.create_agent(
@@ -177,8 +175,7 @@ Here are some important core concepts of the Microsoft Agent Framework:
from agent_framework.azure import AzureAIProjectAgentProvider
from azure.identity import AzureCliCredential

provider = AzureAIProjectAgentProvider(credential=AzureCliCredential()),
)
provider = AzureAIProjectAgentProvider(credential=AzureCliCredential())
agent = await provider.create_agent(
name="my_agent",
instructions="You are a helpful assistant.",
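The `book_flight` tool referenced in the hunks above is elided from the diff. A hypothetical body, just to make the tool-function pattern concrete (the confirmation message is invented for illustration), might look like:

```python
def book_flight(date: str, location: str) -> str:
    """Pretend to book a flight and return a confirmation message.

    Hypothetical implementation; the lesson's real body is elided in the diff.
    """
    return f"Booked a flight to {location} on {date}."

print(book_flight("2026-03-01", "Paris"))
```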
7 changes: 2 additions & 5 deletions 04-tool-use/README.md
@@ -75,7 +75,7 @@ Let's use the example of getting the current time in a city to illustrate:
```python
# Initialize the Azure OpenAI client
client = AzureOpenAI(
azure_endpoint = os.getenv("AZURE_OPENAI_ENDPOINT"),
azure_endpoint = os.getenv("AZURE_AI_PROJECT_ENDPOINT"),
Copilot AI Feb 27, 2026

AzureOpenAI(azure_endpoint=...) expects an Azure OpenAI resource endpoint (e.g., https://<resource>.openai.azure.com), not an Azure AI Foundry project endpoint. Using AZURE_AI_PROJECT_ENDPOINT here will misconfigure the client. Consider reverting this line to os.getenv("AZURE_OPENAI_ENDPOINT") (and keeping that variable in the docs for AzureOpenAI snippets), or switching the example to use the Foundry/Agent provider API consistently instead of AzureOpenAI.

Suggested change
azure_endpoint = os.getenv("AZURE_AI_PROJECT_ENDPOINT"),
azure_endpoint = os.getenv("AZURE_OPENAI_ENDPOINT"),
api_key=os.getenv("AZURE_OPENAI_API_KEY"),
api_version="2024-05-01-preview"
)
@@ -225,10 +225,7 @@ def get_current_time(location: str) -> str:
...

# Create the client
provider = AzureAIProjectAgentProvider(credential=AzureCliCredential()),
project_endpoint=os.environ["AZURE_AI_PROJECT_ENDPOINT"],
deployment_name=os.environ["AZURE_AI_MODEL_DEPLOYMENT_NAME"],
)
provider = AzureAIProjectAgentProvider(credential=AzureCliCredential())

# Create an agent and run with the tool
agent = await provider.create_agent(name="TimeAgent", instructions="Use available tools to answer questions.", tools=get_current_time)
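The body of `get_current_time` is elided (`...`) in the hunk above. One way it might be implemented with the standard library's `zoneinfo` — the city-to-timezone map here is purely illustrative, not the lesson's actual data — is:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Illustrative mapping only; the lesson's actual implementation is elided.
CITY_TIMEZONES = {
    "tokyo": "Asia/Tokyo",
    "london": "Europe/London",
    "new york": "America/New_York",
}

def get_current_time(location: str) -> str:
    """Return the current local time in the given city, or an apology."""
    tz_name = CITY_TIMEZONES.get(location.strip().lower())
    if tz_name is None:
        return f"Sorry, I don't have timezone data for {location}."
    return datetime.now(ZoneInfo(tz_name)).strftime("%H:%M")

print(get_current_time("Tokyo"))
```

Returning a plain string for unknown cities (rather than raising) lets the model read the failure and recover in conversation, which is the usual convention for tool functions.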
2 changes: 0 additions & 2 deletions 06-building-trustworthy-agents/README.md
@@ -169,8 +169,6 @@ from azure.identity import AzureCliCredential
# Create the provider with human-in-the-loop approval
provider = AzureAIProjectAgentProvider(
credential=AzureCliCredential(),
endpoint=os.getenv("AZURE_AI_PROJECT_ENDPOINT"),
model="gpt-4o-mini",
)

# Create the agent with a human approval step
10 changes: 2 additions & 8 deletions 07-planning-design/README.md
@@ -81,10 +81,7 @@ class TravelPlan(BaseModel):
subtasks: List[TravelSubTask]
is_greeting: bool

provider = AzureAIProjectAgentProvider(credential=AzureCliCredential()),
endpoint=os.getenv("AZURE_OPENAI_ENDPOINT"),
model="gpt-4o-mini",
)
provider = AzureAIProjectAgentProvider(credential=AzureCliCredential())

# Define the user message
system_prompt = """You are a planner agent.
@@ -157,10 +154,7 @@ from azure.identity import AzureCliCredential

# Create the client

provider = AzureAIProjectAgentProvider(credential=AzureCliCredential()),
endpoint=os.getenv("AZURE_OPENAI_ENDPOINT"),
model=os.getenv("AZURE_OPENAI_CHAT_DEPLOYMENT_NAME"),
)
provider = AzureAIProjectAgentProvider(credential=AzureCliCredential())

from pprint import pprint

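The `TravelPlan` model in the hunk above is only partially visible. A self-contained sketch of the structured-output schema — the `main_task` field and the `TravelSubTask` fields are assumptions, since the diff elides them — would be:

```python
from typing import List

from pydantic import BaseModel

class TravelSubTask(BaseModel):
    # Field names assumed for illustration; the diff elides this class.
    task_details: str
    assigned_agent: str

class TravelPlan(BaseModel):
    main_task: str  # assumed field; only subtasks/is_greeting appear in the diff
    subtasks: List[TravelSubTask]
    is_greeting: bool

# Validate a response shaped like the planner agent's JSON output.
plan = TravelPlan.model_validate_json(
    '{"main_task": "Plan a family trip", '
    '"subtasks": [{"task_details": "Book flights", "assigned_agent": "flight_booking"}], '
    '"is_greeting": false}'
)
print(plan.subtasks[0].assigned_agent)
```

Validating the model's raw JSON against a pydantic schema like this is what makes the planner's output safe to hand to downstream agents.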
@@ -61,10 +61,8 @@
"metadata": {},
"outputs": [],
"source": [
"# Create the Azure OpenAI Responses client using Azure CLI credentials\n",
"provider = AzureAIProjectAgentProvider(credential=AzureCliCredential()),\n",
" project_endpoint=os.environ[\"AZURE_AI_PROJECT_ENDPOINT\"],\n",
")"
"# Create the Azure AI Foundry provider\n",
"provider = AzureAIProjectAgentProvider(credential=AzureCliCredential())\n"
]
},
{
@@ -238,4 +236,4 @@
},
"nbformat": 4,
"nbformat_minor": 5
}
}
10 changes: 4 additions & 6 deletions 12-context-engineering/code_samples/12-chat_summarization.ipynb
@@ -59,14 +59,12 @@
"metadata": {},
"outputs": [],
"source": [
"# Load environment variables and create Azure OpenAI client\n",
"# Load environment variables and create Azure AI Foundry provider\n",
"load_dotenv()\n",
"\n",
"provider = AzureAIProjectAgentProvider(credential=AzureCliCredential()),\n",
" deployment_name=os.environ[\"AZURE_AI_MODEL_DEPLOYMENT_NAME\"],\n",
")\n",
"provider = AzureAIProjectAgentProvider(credential=AzureCliCredential())\n",
"\n",
"print(\"✅ Azure OpenAI client configured\")"
"print(\"✅ Azure AI Foundry provider configured\")\n"
]
},
{
@@ -303,4 +301,4 @@
},
"nbformat": 4,
"nbformat_minor": 4
}
}