Commit

added mistral endpoint
haesleinhuepf committed Nov 30, 2024
1 parent 0394bbf commit 80f5929
Showing 4 changed files with 177 additions and 0 deletions.
1 change: 1 addition & 0 deletions docs/00_setup/environment-cpu.yml
@@ -34,4 +34,5 @@ dependencies:
- opencv-python
- pooch
- azure-ai-inference
- mistralai
prefix: C:\Users\haase\miniconda3\envs\genai-cpu
1 change: 1 addition & 0 deletions docs/00_setup/environment-gpu.yml
@@ -39,4 +39,5 @@ dependencies:
- opencv-python
- pooch
- azure-ai-inference
- mistralai
prefix: C:\Users\haase\miniconda3\envs\genai-gpu
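As a quick sanity check, the newly added dependency can be verified after recreating either environment. This is a minimal sketch using only standard-library helpers; it assumes the conda environment above has been (re)created and activated:

# Confirm that the mistralai package listed in the environment files is importable.
from importlib.metadata import version

import mistralai  # raises ImportError if the dependency is missing

print("mistralai", version("mistralai"))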
174 changes: 174 additions & 0 deletions docs/15_endpoint_apis/07_mistral_endpoint.ipynb
@@ -0,0 +1,174 @@
{
"cells": [
{
"cell_type": "markdown",
"id": "eb253b1f-443d-4f13-9935-ab33e62ebfbf",
"metadata": {},
"source": [
"# Mistral\n",
"In this notebook we access the [Mistral API](https://console.mistral.ai/api-keys/), a french service that offers free API access with certain limitations."
]
},
{
"cell_type": "code",
"execution_count": 1,
"id": "ee96e226-6f7e-4b3c-a51b-fd0aabb708b1",
"metadata": {},
"outputs": [],
"source": [
"def prompt_mistral(prompt, model=\"mistral-medium-2312\"):\n",
" import os\n",
" from mistralai import Mistral\n",
" \n",
" api_key = os.environ[\"MISTRAL_API_KEY\"]\n",
" \n",
" client = Mistral(api_key=api_key)\n",
" \n",
" chat_response = client.chat.complete(\n",
" model= model,\n",
" messages = [\n",
" {\n",
" \"role\": \"user\",\n",
" \"content\": prompt,\n",
" },\n",
" ]\n",
" )\n",
" return chat_response.choices[0].message.content"
]
},
{
"cell_type": "code",
"execution_count": 2,
"id": "91f7db17-8555-4d26-ad52-f107bf8ec616",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"\"It's subjective, but many consider Époisses de Bourgogne, a soft, pungent, and creamy cheese from Burgundy, to be one of the best French cheeses.\""
]
},
"execution_count": 2,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"prompt_mistral(\"What is the best French cheese? Answer in one sentence.\")"
]
},
{
"cell_type": "markdown",
"id": "6ab42dae-6f7c-46a0-a466-fd68c6a951fe",
"metadata": {},
"source": [
"We can also ask the API a listing of all available models:"
]
},
{
"cell_type": "code",
"execution_count": 3,
"id": "4bbf48c6-e047-471b-9809-e3d5795549e9",
"metadata": {},
"outputs": [],
"source": [
"from mistralai import Mistral\n",
"import os\n",
"\n",
"s = Mistral(\n",
" api_key=os.getenv(\"MISTRAL_API_KEY\", \"\"),\n",
")\n",
"\n",
"res = s.models.list()"
]
},
{
"cell_type": "code",
"execution_count": 4,
"id": "c0f4e67a-00c2-46ee-9bb3-3f0186ec3a74",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"['ministral-3b-2410',\n",
" 'ministral-3b-latest',\n",
" 'ministral-8b-2410',\n",
" 'ministral-8b-latest',\n",
" 'open-mistral-7b',\n",
" 'mistral-tiny',\n",
" 'mistral-tiny-2312',\n",
" 'open-mistral-nemo',\n",
" 'open-mistral-nemo-2407',\n",
" 'mistral-tiny-2407',\n",
" 'mistral-tiny-latest',\n",
" 'open-mixtral-8x7b',\n",
" 'mistral-small',\n",
" 'mistral-small-2312',\n",
" 'open-mixtral-8x22b',\n",
" 'open-mixtral-8x22b-2404',\n",
" 'mistral-small-2402',\n",
" 'mistral-small-2409',\n",
" 'mistral-small-latest',\n",
" 'mistral-medium-2312',\n",
" 'mistral-medium',\n",
" 'mistral-medium-latest',\n",
" 'mistral-large-2402',\n",
" 'mistral-large-2407',\n",
" 'mistral-large-2411',\n",
" 'mistral-large-latest',\n",
" 'pixtral-large-2411',\n",
" 'pixtral-large-latest',\n",
" 'codestral-2405',\n",
" 'codestral-latest',\n",
" 'codestral-mamba-2407',\n",
" 'open-codestral-mamba',\n",
" 'codestral-mamba-latest',\n",
" 'pixtral-12b-2409',\n",
" 'pixtral-12b',\n",
" 'pixtral-12b-latest',\n",
" 'mistral-embed',\n",
" 'mistral-moderation-2411',\n",
" 'mistral-moderation-latest']"
]
},
"execution_count": 4,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"[m.id for m in res.data]"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "07c626eb-651d-4add-ba9a-e92577f68e76",
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.11.10"
}
},
"nbformat": 4,
"nbformat_minor": 5
}
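For reference, a minimal usage sketch of the prompt_mistral helper defined in the notebook above, run after setting the MISTRAL_API_KEY environment variable; the model id and prompt below are illustrative only, with the id taken from the listing returned by client.models.list():

import os

# Minimal sketch; assumes prompt_mistral from the notebook above is defined in the
# current session and that a valid key is available in the environment.
assert "MISTRAL_API_KEY" in os.environ, "Set MISTRAL_API_KEY before calling the API."

# Any model id from the listing can be passed explicitly instead of the default
# mistral-medium-2312, e.g. the small variant used here.
answer = prompt_mistral(
    "Name one classic French dessert in one sentence.",
    model="mistral-small-latest",
)
print(answer)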
1 change: 1 addition & 0 deletions docs/_toc.yml
@@ -27,6 +27,7 @@ parts:
- file: 15_endpoint_apis/06_kisski_endpoint.ipynb
- file: 15_endpoint_apis/03_blablador_endpoint.ipynb
- file: 15_endpoint_apis/05_azure_endpoints.ipynb
- file: 15_endpoint_apis/07_mistral_endpoint.ipynb
- file: 15_endpoint_apis/10_anthropic_api.ipynb
- file: 15_endpoint_apis/20_google_gemini.ipynb
- file: 15_endpoint_apis/30_huggingface.ipynb
