Add perf tests for eventgrid #16949
Changes from 3 commits
@@ -0,0 +1,49 @@
# EventGrid Performance Tests

In order to run the performance tests, the `azure-devtools` package must be installed. This is done as part of the `dev_requirements`. Start by creating a new virtual environment for your perf tests. This will need to be a Python 3 environment, preferably >=3.7.
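For example, a new environment can be created and activated as follows on a Unix-like shell (the environment name `env` is only illustrative; on Windows, activate with `env\Scripts\activate` instead):

```cmd
~/azure-eventgrid> python -m venv env
~/azure-eventgrid> source env/bin/activate
```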
### Setup for test resources

These tests will run against a pre-configured EventGrid topic. The following environment variables will need to be set for the tests to access the live resources:
```
EG_ACCESS_KEY=<access key of your eventgrid account>
EG_TOPIC_HOSTNAME=<hostname of the eventgrid topic>
```
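For example, on a Unix-like shell the variables can be exported before running the tests (the values below are placeholders; on Windows cmd use `set` instead of `export`):

```cmd
(env) ~/azure-eventgrid> export EG_ACCESS_KEY="<access key of your eventgrid account>"
(env) ~/azure-eventgrid> export EG_TOPIC_HOSTNAME="<hostname of the eventgrid topic>"
```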
### Setup for perf test runs

Install the dev requirements and the `azure-eventgrid` package in editable mode:
```cmd
(env) ~/azure-eventgrid> pip install -r dev_requirements.txt
(env) ~/azure-eventgrid> pip install -e .
```
## Test commands

Once the environment is set up, the `perfstress` command becomes available; running it from the `tests` directory with no arguments should list the discovered perf tests:
```cmd
(env) ~/azure-eventgrid> cd tests
(env) ~/azure-eventgrid/tests> perfstress
```
### Common perf command line options
These options are available for all perf tests (a combined example follows the list):
- `--duration=10` Number of seconds to run as many operations (the "run" function) as possible. Default is 10.
- `--iterations=1` Number of test iterations to run. Default is 1.
- `--parallel=1` Number of tests to run in parallel. Default is 1.
- `--no-client-share` If set, each parallel test instance uses its own client rather than sharing a single client. Default is False (sharing).
- `--warm-up=5` Number of seconds to spend warming up the connection before measuring begins. Default is 5.
- `--sync` If set, runs the synchronous version of the tests. Default is False (async).
- `--no-cleanup` If set, newly created resources are kept after the test run. Default is False (resources are deleted).
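For instance, a run that combines several of these options with the EventGrid test might look like the following (the values are illustrative, not recommendations):

```cmd
(env) ~/azure-eventgrid/tests> perfstress EventGridPerfTest --duration=30 --parallel=2 --num-events=100
```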
### EventGrid Test options
These options are available for all EventGrid perf tests:
- `--num-events=100` Number of events to be published using the send method. Default is 100.
### T2 Tests
The tests currently written for the T2 SDK:
- `EventGridPerfTest` Publishes a list of EventGrid events.
## Example command
```cmd
(env) ~/azure-eventgrid/tests> perfstress EventGridPerfTest --num-events=100
```
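To measure the synchronous code path (the test's `run_sync` method) instead of the default async one, the same test can be run with `--sync`, for example:

```cmd
(env) ~/azure-eventgrid/tests> perfstress EventGridPerfTest --sync --num-events=100
```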
@@ -0,0 +1,69 @@
import random
import asyncio
from azure_devtools.perfstress_tests import PerfStressTest

from azure.eventgrid import EventGridPublisherClient as SyncPublisherClient, EventGridEvent
from azure.eventgrid.aio import EventGridPublisherClient as AsyncPublisherClient

from azure.core.credentials import AzureKeyCredential


class EventGridPerfTest(PerfStressTest):
    def __init__(self, arguments):
        super().__init__(arguments)

        # auth configuration
        topic_key = self.get_from_env("EG_ACCESS_KEY")
        endpoint = self.get_from_env("EG_TOPIC_HOSTNAME")

        # Create clients
        self.publisher_client = SyncPublisherClient(
            endpoint=endpoint,
            credential=AzureKeyCredential(topic_key)
        )
        self.async_publisher_client = AsyncPublisherClient(
            endpoint=endpoint,
            credential=AzureKeyCredential(topic_key)
        )
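        # Build the list of events once up front so that run_sync/run_async only
        # measure the client's send call, not event construction.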
        services = ["EventGrid", "ServiceBus", "EventHubs", "Storage"]
        self.event_list = []
        for _ in range(self.args.num_events):
            self.event_list.append(EventGridEvent(
                event_type="Contoso.Items.ItemReceived",
                data={
                    "services": random.sample(services, k=random.randint(1, 4))

**Review comment:** This will test a different data set every time - we should probably keep each test consistent with throughput. `parser.add_argument('-e', '--event-service', nargs='?', type=str, help='The event service type. Default is "EventGrid"', default='EventGrid')` But otherwise I think we can just stick to a single value.

**Reply:** Sticking to a single value should be good.

                },
                subject="Door1",
                data_version="2.0"
            ))

    async def close(self):
        """This is run after cleanup.

        Use this to close any open handles or clients.
        """
        await self.async_publisher_client.close()
        await super().close()

    def run_sync(self):
        """The synchronous perf test.

        Try to keep this minimal and focused, using only a single client API.
        Avoid any ancillary logic (e.g. generating UUIDs) here; put it in the setup/init instead
        so that we're only measuring the client API call.
        """
        self.publisher_client.send(self.event_list)

    async def run_async(self):
        """The asynchronous perf test.

        Try to keep this minimal and focused, using only a single client API.
        Avoid any ancillary logic (e.g. generating UUIDs) here; put it in the setup/init instead
        so that we're only measuring the client API call.
        """
        await self.async_publisher_client.send(self.event_list)

    @staticmethod
    def add_arguments(parser):
        super(EventGridPerfTest, EventGridPerfTest).add_arguments(parser)
        parser.add_argument('-n', '--num-events', nargs='?', type=int, help='Number of events to be sent. Defaults to 100', default=100)
**Review comment:** You can remove this option unless you want to implement sharing a single client between test instances. Probably not needed for now.