
Commit a850213

Author: あで
Message: Fix review comments
Parent: fc9ec96

16 files changed: +1533 −302 lines

.github/workflows/main.yml

Lines changed: 5 additions & 1 deletion

```diff
@@ -23,8 +23,12 @@ jobs:
         with:
           python-version: ${{ matrix.python-version }}
           cache: poetry
-      - name: Install dependencies
+      - name: Install dependencies (mainline)
+        if: ${{ matrix.python-version != 'pypy3.10' }}
         run: poetry install --no-interaction --no-root --with dev,examples --all-extras
+      - name: Install dependencies (pypy)
+        if: ${{ matrix.python-version == 'pypy3.10' }}
+        run: poetry install --no-interaction --no-root --with dev,examples --extras=exporter-otlp-proto-http
       - name: Check code formatting
         run: poetry run black .
       - name: Lint lib code
```
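The split above follows the standard GitHub Actions pattern of gating steps on matrix values with `if:` expressions. A minimal sketch of the same pattern in isolation (a hypothetical workflow, not part of this commit; job and step names are made up):

```yaml
jobs:
  test:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version: ["3.11", "pypy3.10"]
    steps:
      # Runs only on CPython entries of the matrix
      - name: Install dependencies (mainline)
        if: ${{ matrix.python-version != 'pypy3.10' }}
        run: poetry install --all-extras
      # Runs only on the pypy3.10 entry, with a reduced extras set
      - name: Install dependencies (pypy)
        if: ${{ matrix.python-version == 'pypy3.10' }}
        run: poetry install --extras=exporter-otlp-proto-http
```

Only one of the two install steps executes per matrix job, so each Python flavor gets its own dependency set without duplicating the whole job.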

README.md

Lines changed: 21 additions & 19 deletions

````diff
@@ -240,31 +240,33 @@ exemplar collection by setting `AUTOMETRICS_EXEMPLARS=true`. You also need to en
 
 ## Exporting metrics
 
-There are multiple ways to export metrics from your application, depending on your setup.
+There are multiple ways to export metrics from your application, depending on your setup. You can see examples of how to do this in the [examples/export_metrics](https://github.com/autometrics-dev/autometrics-py/tree/main/examples/export_metrics) directory of this repository.
 
 If you use the `prometheus` tracker, you have two options.
 
 1. Create a route inside your app and respond with `generate_latest()`
-```python
-# This example uses FastAPI, but you can use any web framework
-from fastapi import FastAPI, Response
-from prometheus_client import generate_latest
-
-# Set up a metrics endpoint for Prometheus to scrape
-@app.get("/metrics")
-def metrics():
-    return Response(generate_latest())
-```
+
+   ```python
+   # This example uses FastAPI, but you can use any web framework
+   from fastapi import FastAPI, Response
+   from prometheus_client import generate_latest
+
+   # Set up a metrics endpoint for Prometheus to scrape
+   @app.get("/metrics")
+   def metrics():
+       return Response(generate_latest())
+   ```
 
 2. Specify `prometheus-client` as the exporter type, and a separate server will be started to expose metrics from your app:
-```python
-exporter = {
-    "type": "prometheus-client",
-    "address": "localhost",
-    "port": 9464
-}
-init(tracker="prometheus", service_name="my-service", exporter=exporter)
-```
+
+   ```python
+   exporter = {
+       "type": "prometheus-client",
+       "address": "localhost",
+       "port": 9464
+   }
+   init(tracker="prometheus", service_name="my-service", exporter=exporter)
+   ```
 
 For the OpenTelemetry tracker, you have more options, including a custom metric reader. By default, when using this tracker, autometrics
 will export metrics to the Prometheus registry via `prometheus-client`, from which you can export via one of the ways described above.
````
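The `generate_latest()` call in the README snippet can be exercised without any web framework. A hedged sketch using `prometheus_client` directly, with an isolated registry so nothing leaks into the global default one (the counter name is made up for illustration):

```python
from prometheus_client import CollectorRegistry, Counter, generate_latest

# Isolated registry, so this example does not pollute the global default registry
registry = CollectorRegistry()

# Hypothetical counter, purely for illustration
calls = Counter("demo_calls", "Number of demo calls", registry=registry)
calls.inc()

# generate_latest() renders the registry in the Prometheus text exposition format;
# a Counter named "demo_calls" is exposed as the sample "demo_calls_total"
text = generate_latest(registry).decode("utf-8")
print("demo_calls_total 1.0" in text)  # → True
```

This is exactly the payload a `/metrics` route (option 1 above) would return as the response body.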

cat

Lines changed: 980 additions & 0 deletions
Large diffs are not rendered by default.

docker-compose.yaml

Lines changed: 5 additions & 5 deletions

```diff
@@ -1,19 +1,19 @@
-version: '3.9'
+version: "3.9"
 
 volumes:
   app-logs:
 
-
 services:
   am:
     image: autometrics/am:latest
     extra_hosts:
       - host.docker.internal:host-gateway
     ports:
       - "6789:6789"
+      - "9090:9090"
     container_name: am
-    command: "start host.docker.internal:8080"
-    environment:
+    command: "start host.docker.internal:9464"
+    environment:
       - LISTEN_ADDRESS=0.0.0.0:6789
     restart: unless-stopped
     volumes:
@@ -31,4 +31,4 @@ services:
       - "55679:55679"
     restart: unless-stopped
   push-gateway:
-    image: ghcr.io/zapier/prom-aggregation-gateway:v0.7.0
+    image: ghcr.io/zapier/prom-aggregation-gateway:latest
```
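With this change, `am start` now points at port 9464, where the `prometheus-client` exporter from the examples exposes metrics. If you ran a plain Prometheus server instead of `am`, the equivalent scrape configuration would look roughly like this (a sketch only; the job name and interval are made-up values):

```yaml
scrape_configs:
  - job_name: "my-service"          # hypothetical job name
    scrape_interval: 15s
    static_configs:
      - targets: ["localhost:9464"]  # port exposed by the prometheus-client exporter
```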
Lines changed: 26 additions & 0 deletions

```python
import time
from autometrics import autometrics, init

# Autometrics supports exporting metrics to Prometheus via OpenTelemetry.
# This example uses the Prometheus exporter; the available settings are the
# same as those of the Prometheus Python client. By default, the Prometheus
# exporter will expose metrics on port 9464. If you don't have a Prometheus
# server running, you can run Tilt or Docker Compose from the root of this
# repo to start one up.

init(
    tracker="opentelemetry",
    exporter={
        "type": "otel-prometheus",
    },
    service_name="my-service",
)


@autometrics
def my_function():
    pass


while True:
    my_function()
    time.sleep(1)
```

examples/export_metrics/otlp.py

Lines changed: 26 additions & 0 deletions

```python
import time
from autometrics import autometrics, init

# Autometrics supports exporting metrics to OTLP collectors via gRPC and HTTP
# transports. This example uses the gRPC transport; the available settings are
# similar to those of the OpenTelemetry Python SDK. By default, the OTLP
# exporter will send metrics to localhost:4317. If you don't have an OTLP
# collector running, you can run Tilt or Docker Compose to start one up.
# See the README for more details.

init(
    exporter={
        "type": "otlp-proto-grpc",
        "push_interval": 1000,
    },
    service_name="my-service",
)


@autometrics
def my_function():
    pass


while True:
    my_function()
    time.sleep(1)
```
Lines changed: 27 additions & 0 deletions

```python
import time
from autometrics import autometrics, init

# Autometrics supports exporting metrics to Prometheus via the Prometheus
# Python client; the available settings are the same as those of the client
# itself. By default, the Prometheus exporter will expose metrics on port
# 9464. If you don't have a Prometheus server running, you can run Tilt or
# Docker Compose from the root of this repo to start one up.

init(
    tracker="prometheus",
    exporter={
        "type": "prometheus-client",
        "port": 9464,
    },
    service_name="my-service",
)


@autometrics
def my_function():
    pass


while True:
    my_function()
    time.sleep(1)
```

examples/otlp/start.py

Lines changed: 0 additions & 19 deletions
This file was deleted.
