Dantaylo/nov2024 merge (#152)
* add ai sdk samples for inference and rag

* remove outdated rag sample

* Update chat-template.py (#146)

change comment dashes to underscores.  Docs platform having trouble with the current ids.

* Update chat-template.py (#147)

add extra space to top

* use underscores in doc comments

* add basic projects samples for docs

* update custom rag app logging and evaluation

* update basic readme

* add search to basic samples

* remove unused import from create_search_index

* fix requirements.txt files

* update rag sample from bug bash

* README update and code fix for param

* add initial tests

* update readmes add test for rag

* readme tweaks

* run linters

* fix linter and formatting errors

* remove packages

* move scripts from chat-app into projects

* Add services to rag readme

---------

Co-authored-by: Sheri Gilley <sgilley@microsoft.com>
Co-authored-by: Leah Bar-On Simmons <lebaro@microsoft.com>
3 people authored Nov 18, 2024
1 parent c06c3c7 commit 52ab914
Showing 78 changed files with 1,527 additions and 8,545 deletions.
1 change: 1 addition & 0 deletions scenarios/projects/basic/.env.sample
@@ -0,0 +1 @@
AIPROJECT_CONNECTION_STRING=your_connection_string
100 changes: 100 additions & 0 deletions scenarios/projects/basic/README.md
@@ -0,0 +1,100 @@
---
page_type: sample
languages:
- python
products:
- ai-products
- ai-model-inference
- ai-search
- ai-evaluation
description: Hello world samples for the projects SDK client
---

## Project SDK Basic samples

### Overview

This folder contains hello world samples for the projects SDK client.

### Objective

This is meant to test out code that's used in our [SDK Overview](https://aka.ms/aifoundrysdk) page.

## Create resources

To run this sample, you'll need to create an Azure AI Project with an Azure AI Services resource connected to it. If you have an existing one, you can skip these steps and move to the next section.

### Create an AI Project and AI Services resource

First we'll create a project in Azure AI Studio:
- Navigate to [ai.azure.com](https://ai.azure.com)
- Click **New Project** on the homepage
- Enter a project name
- Click **Create new hub**, provide a hub name
- In **Customize** change the location to **East US 2** or **Sweden Central**
- Click **Create Project**

This will take about 3 minutes to complete.

### Deploy a gpt-4o-mini model

Now we'll need to deploy a model so that we can call it from code. To start, we'll use a gpt-4o-mini model because it's fast and cheap. You can experiment with using a gpt-4o model for better results.
- Go to the **Models + Endpoints** page
- Click the **+ Deploy Model** dropdown and click **Deploy a base model**
- Select **gpt-4o-mini** from the list and click **Confirm**

Repeat the steps above to deploy a **text-embedding-ada-002** model.

## Set up a local development environment

First, clone this repo locally from your favorite terminal and open this folder:
```
git clone https://github.com/azure-samples/azureai-samples
cd azureai-samples/scenarios/projects/basic
```

Then run `az login` to authenticate with Azure:
```
az login
```

### Creating a local Python environment

First, create a virtual environment. Always use one when installing packages locally, so project dependencies stay isolated from your global Python installation.

On Windows:
```
py -3 -m venv .venv
.venv\scripts\activate
```

On Linux:
```
python3 -m venv .venv
source .venv/bin/activate
```

Install the core dependencies to run the sample:
```
pip install -r requirements.txt
```

## Configure project string

Go back to the **Overview** page of your project, and in the upper right hand corner click the copy button beside the **Project connection string** field.

Create a `.env` file using the sample:
```
cp .env.sample .env
```

Open the `.env` file and paste the copied value to the right of the `AIPROJECT_CONNECTION_STRING=` variable.
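The samples read this variable with the `python-dotenv` package via `load_dotenv()`. As a rough sketch of what that call does — illustrative only; the real package also handles quoting, `export` prefixes, and variable expansion — a minimal loader looks like this:

```python
import os


def load_env_file(path: str = ".env") -> None:
    """Minimal .env loader: copies KEY=value lines into os.environ."""
    with open(path) as f:
        for line in f:
            line = line.strip()
            # skip blank lines, comments, and anything that isn't KEY=value
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            # like load_dotenv(), don't overwrite variables that are already set
            os.environ.setdefault(key.strip(), value.strip())
```

In the samples themselves, `from dotenv import load_dotenv; load_dotenv()` does this job for you.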

### Try out samples!

Run any of the Python files in this folder to try the corresponding sample:
```
python <file_name>.py
```

### Estimated Runtime: 10 mins
48 changes: 48 additions & 0 deletions scenarios/projects/basic/agents.py
@@ -0,0 +1,48 @@
# ruff: noqa: E402

import os
from azure.ai.projects import AIProjectClient
from azure.identity import DefaultAzureCredential

from dotenv import load_dotenv

load_dotenv()

project = AIProjectClient.from_connection_string(
    conn_str=os.environ["AIPROJECT_CONNECTION_STRING"], credential=DefaultAzureCredential()
)

# <create_agent>
from azure.ai.projects.models import FileSearchTool

file = project.agents.upload_file_and_poll(file_path="product_info_1.md", purpose="assistants")
vector_store = project.agents.create_vector_store_and_poll(file_ids=[file.id], name="my_vectorstore")
file_search = FileSearchTool(vector_store_ids=[vector_store.id])

# Create agent with file search tool and process the agent run
agent = project.agents.create_agent(
    model="gpt-4o-mini",
    name="my-agent",
    instructions="You are a helpful agent that can search information from uploaded files.",
    tools=file_search.definitions,
    tool_resources=file_search.resources,
)

# </create_agent>

# <run_agent>
# create and run a thread with a message
thread = project.agents.create_thread()
message = project.agents.create_message(
    thread_id=thread.id, role="user", content="Hello, what Contoso products do you know?"
)
run = project.agents.create_and_process_run(thread_id=thread.id, assistant_id=agent.id)
if run.status == "failed":
    print(f"Run failed: {run.last_error}")
    exit()

# get messages from the thread and print the response (last message)
messages = project.agents.list_messages(thread_id=thread.id)
print(f"Response: {messages[-1]}")

# </run_agent>
22 changes: 22 additions & 0 deletions scenarios/projects/basic/chat-simple.py
@@ -0,0 +1,22 @@
from azure.ai.projects import AIProjectClient
from azure.identity import DefaultAzureCredential

project_connection_string = "<your-connection-string-goes-here>"

project = AIProjectClient.from_connection_string(
    conn_str=project_connection_string, credential=DefaultAzureCredential()
)

chat = project.inference.get_chat_completions_client()
response = chat.complete(
    model="gpt-4o-mini",
    messages=[
        {
            "role": "system",
            "content": "You are an AI assistant that speaks like a techno punk rocker from 2350. Be cool but not too cool. Ya dig?",
        },
        {"role": "user", "content": "Hey, can you help me with my taxes? I'm a freelancer."},
    ],
)

print(response.choices[0].message.content)
44 changes: 44 additions & 0 deletions scenarios/projects/basic/chat-template.py
@@ -0,0 +1,44 @@
# This file is not meant to be run

# ruff: noqa: E402, ANN201, ANN001

chat = None

# <chat_function>
from azure.ai.inference.prompts import PromptTemplate


def get_chat_response(messages, context):
    # create a prompt template from an inline string (using mustache syntax)
    prompt_template = PromptTemplate.from_string(
        prompt_template="""
        system:
        You are an AI assistant that speaks like a techno punk rocker from 2350. Be cool but not too cool. Ya dig? Refer to the user by their first name, try to work their last name into a pun.
        The user's first name is {{first_name}} and their last name is {{last_name}}.
        """
    )

    # generate system message from the template, passing in the context as variables
    system_message = prompt_template.create_messages(data=context)

    # add the prompt messages to the user messages
    return chat.complete(
        model="gpt-4o-mini",
        messages=system_message + messages,
        temperature=1,
        frequency_penalty=0.5,
        presence_penalty=0.5,
    )


# </chat_function>

# <create_response>
if __name__ == "__main__":
    response = get_chat_response(
        messages=[{"role": "user", "content": "what city has the best food in the world?"}],
        context={"first_name": "Jessie", "last_name": "Irwin"},
    )
    print(response.choices[0].message.content)
# </create_response>
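The template above uses mustache-style `{{variable}}` placeholders. The substitution itself is simple; here is an illustrative standard-library stand-in (not the SDK's actual implementation) showing the idea of filling placeholders from a context dict:

```python
import re


def render_mustache(template: str, data: dict) -> str:
    # replace every {{key}} with the matching value from data;
    # unknown keys are left in place rather than raising
    return re.sub(
        r"\{\{\s*(\w+)\s*\}\}",
        lambda m: str(data.get(m.group(1), m.group(0))),
        template,
    )


rendered = render_mustache(
    "The user's first name is {{first_name}} and their last name is {{last_name}}.",
    {"first_name": "Jessie", "last_name": "Irwin"},
)
print(rendered)
# → The user's first name is Jessie and their last name is Irwin.
```

The real `PromptTemplate` additionally parses the `system:` / `user:` role markers into a list of chat messages, which is why its output can be concatenated with the user messages.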
25 changes: 25 additions & 0 deletions scenarios/projects/basic/evaluate_violence.py
@@ -0,0 +1,25 @@
# ruff: noqa: E402

import os
from azure.ai.projects import AIProjectClient
from azure.identity import DefaultAzureCredential

from dotenv import load_dotenv

load_dotenv()

project = AIProjectClient.from_connection_string(
    conn_str=os.environ["AIPROJECT_CONNECTION_STRING"], credential=DefaultAzureCredential()
)

# <evaluate_violence>
from azure.ai.evaluation import ViolenceEvaluator
from azure.identity import DefaultAzureCredential

# Initializing Violence Evaluator with project information
violence_eval = ViolenceEvaluator(azure_ai_project=project.scope, credential=DefaultAzureCredential())

# Running Violence Evaluator on single input row
violence_score = violence_eval(query="what's the capital of france", response="Paris")
print(violence_score)
# </evaluate_violence>
30 changes: 30 additions & 0 deletions scenarios/projects/basic/inference.py
@@ -0,0 +1,30 @@
import os
from azure.ai.projects import AIProjectClient
from azure.identity import DefaultAzureCredential

from dotenv import load_dotenv

load_dotenv()

project = AIProjectClient.from_connection_string(
    conn_str=os.environ["AIPROJECT_CONNECTION_STRING"], credential=DefaultAzureCredential()
)

# <chat_completion>
# get a chat inferencing client using the project's default model inferencing endpoint
chat = project.inference.get_chat_completions_client()

# run a chat completion using the inferencing client
response = chat.complete(
    model="gpt-4o-mini",
    messages=[
        {
            "role": "system",
            "content": "You are an AI assistant that speaks like a techno punk rocker from 2350. Be cool but not too cool. Ya dig?",
        },
        {"role": "user", "content": "Hey, can you help me with my taxes? I'm a freelancer."},
    ],
)

print(response.choices[0].message.content)
# </chat_completion>
17 changes: 17 additions & 0 deletions scenarios/projects/basic/myprompt.prompty
@@ -0,0 +1,17 @@
---
name: Chat Prompt
description: A Prompty template that defines a simple chat prompt for a helpful writing assistant
model:
  api: chat
  configuration:
    azure_deployment: gpt-4o-mini
  parameters:
    max_tokens: 256 # limit the output
    temperature: 0.8 # higher temperature for creative answers
---
system:
You are a helpful writing assistant.
The user's first name is {{first_name}} and their last name is {{last_name}}.

user:
Write me a short poem about flowers
24 changes: 24 additions & 0 deletions scenarios/projects/basic/openai_client.py
@@ -0,0 +1,24 @@
import os
from azure.ai.projects import AIProjectClient
from azure.identity import DefaultAzureCredential

from dotenv import load_dotenv

load_dotenv()

project = AIProjectClient.from_connection_string(
    conn_str=os.environ["AIPROJECT_CONNECTION_STRING"], credential=DefaultAzureCredential()
)

# <get_openai_client>
openai = project.inference.get_azure_openai_client(api_version="2024-06-01")
response = openai.chat.completions.create(
    model="gpt-4o",
    messages=[
        {"role": "system", "content": "You are a helpful writing assistant"},
        {"role": "user", "content": "Write me a poem about flowers"},
    ],
)

print(response.choices[0].message.content)
# </get_openai_client>
51 changes: 51 additions & 0 deletions scenarios/projects/basic/product_info_1.md
@@ -0,0 +1,51 @@
# Information about product item_number: 1

## Brand
Contoso Galaxy Innovations

## Category
Smart Eyewear

## Features
- Augmented Reality interface
- Voice-controlled AI assistant
- HD video recording with 3D audio
- UV protection and blue light filtering
- Wireless charging with extended battery life

## User Guide

### 1. Introduction
Introduction to your new SmartView Glasses

### 2. Product Overview
Overview of features and controls

### 3. Sizing and Fit
Finding your perfect fit and style adjustments

### 4. Proper Care and Maintenance
Cleaning and caring for your SmartView Glasses

### 5. Break-in Period
Adjusting to the augmented reality experience

### 6. Safety Tips
Safety guidelines for public and private spaces

### 7. Troubleshooting
Quick fixes for common issues

## Warranty Information
Two-year limited warranty on all electronic components

## Contact Information
Customer Support at support@contoso-galaxy-innovations.com

## Return Policy
30-day return policy with no questions asked

## FAQ
- How to sync your SmartView Glasses with your devices
- Troubleshooting connection issues
- Customizing your augmented reality environment
8 changes: 8 additions & 0 deletions scenarios/projects/basic/project.py
@@ -0,0 +1,8 @@
from azure.identity import DefaultAzureCredential
from azure.ai.projects import AIProjectClient

project_connection_string = "your_connection_string"

project = AIProjectClient.from_connection_string(
    conn_str=project_connection_string, credential=DefaultAzureCredential()
)