Added documentation and trying to fix the test.yml.
edavalosanaya committed Oct 30, 2023
1 parent 0260656 commit f89130f
Showing 5 changed files with 116 additions and 6 deletions.
2 changes: 1 addition & 1 deletion .github/workflows/test.yml
@@ -81,7 +81,7 @@ jobs:
    steps:
      - name: Finished
        run: |
-         ${{ steps.cp39.outputs.python-path }} -m pip install --upgrade coveralls
+         python -m pip install --upgrade coveralls
          coveralls --service=github --finish
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
26 changes: 22 additions & 4 deletions docs/bus_examples.md
@@ -1,34 +1,52 @@
# Local EventBus & EntryPoint

The front-facing APIs of both ``(EventBus, EntryPoint)`` and ``(DEventBus, DEntryPoint)`` are identical, but important differences need to be considered. For a distributed configuration, ensure that any emitted ``Event`` data is serializable -- this is not a requirement for a local setup.
For a local eventbus implementation, the API and configuration are similar to their distributed counterparts. A key note here is that serialization is not necessary but recommended, in case you are bridging messages between event buses (see the bridging sketch at the end of this example).

First, our imports are for specifically the local versions:

```python
from dataclasses import dataclass
from dataclasses_json import DataClassJsonMixin
- from aiodistbus import EntryPoint, EventBus # or DEntryPoint, DEventBus
+ from aiodistbus import EntryPoint, EventBus
```

Then we define the handler and the data type (it can be anything serializable) that will serve as the handler's input.

```python
import logging

logger = logging.getLogger("aiodistbus")

@dataclass
class ExampleEvent(DataClassJsonMixin):
    msg: str

async def handler(event: ExampleEvent):
    assert isinstance(event, ExampleEvent)
    logger.info(f"Received event {event}")
```

After this configuration, we create the necessary bus and entrypoint resources and connect them:

```python
# Create resources
- bus = EventBus()  # or DEventBus for distributed eventbus
- e1, e2 = EntryPoint(), EntryPoint()  # or DEntryPoint for distributed eventbus
+ bus = EventBus()
+ e1, e2 = EntryPoint(), EntryPoint()

# Add handlers
await e1.on('example', handler, ExampleEvent)

# Connect
await e1.connect(bus)
await e2.connect(bus)
```

With everything configured, we can now emit messages between the entrypoints:

```python
# Send message and e1's handler is executed
event = await e2.emit('example', ExampleEvent(msg="hello"))
```

Make sure to close the resources at the end of the program.

```python
# Closing (order doesn't matter)
await bus.close()
await e1.close()
await e2.close()
```
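
As noted at the top of this example, keeping event payloads serializable pays off when bridging messages between event buses. The following is a minimal, illustrative sketch using only the API shown above; the ``bus_a``/``bus_b`` names and the ``bridge`` helper are assumptions for this sketch, and ``ExampleEvent`` is the dataclass defined earlier.

```python
from aiodistbus import EntryPoint, EventBus

# Two independent local buses (hypothetical names for this sketch)
bus_a, bus_b = EventBus(), EventBus()
listener, publisher = EntryPoint(), EntryPoint()

async def bridge(event: ExampleEvent):
    # Forward anything arriving on bus_a onto bus_b; a serializable
    # payload means bus_b could later be swapped for a DEventBus.
    await publisher.emit('example', event)

# Listen on bus_a, publish on bus_b
await listener.on('example', bridge, ExampleEvent)
await listener.connect(bus_a)
await publisher.connect(bus_b)
```

Any ``ExampleEvent`` emitted on ``bus_a`` is then re-emitted on ``bus_b`` and picked up by whatever handlers are registered there.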
55 changes: 55 additions & 0 deletions docs/dbus_examples.md
@@ -1 +1,56 @@
# Distributed EventBus & DEntryPoint

For the distributed eventbus implementation, the data transmitted within the bus needs to be serializable.

!!! note
    Make sure to use ``DataClassJsonMixin`` when using a ``dataclass`` to permit serialization of the data and document the dtypes.

```python
from dataclasses import dataclass
from dataclasses_json import DataClassJsonMixin
from aiodistbus import DEntryPoint, DEventBus
```

The handlers for events are the same as in the local eventbus implementation, making interoperability between the two feasible.

```python
import logging

logger = logging.getLogger("aiodistbus")

@dataclass
class ExampleEvent(DataClassJsonMixin):
    msg: str

async def handler(event: ExampleEvent):
    assert isinstance(event, ExampleEvent)
    logger.info(f"Received event {event}")
```
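
As a quick illustration (not part of the committed docs) of what the mixin provides, the event can be round-tripped through JSON with the ``to_json``/``from_json`` helpers that ``DataClassJsonMixin`` adds:

```python
event = ExampleEvent(msg="hello")
payload = event.to_json()                  # '{"msg": "hello"}'
restored = ExampleEvent.from_json(payload)
assert restored == event
```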

After this configuration, we create the necessary server-client resources and connect them:

```python
# Create resources
dbus = DEventBus()
e1, e2 = DEntryPoint(), DEntryPoint()

# Add handlers
await e1.on('example', handler, ExampleEvent)

# Connect
await e1.connect(dbus.ip, dbus.port)
await e2.connect(dbus.ip, dbus.port)
```

With everything configured, we can now emit messages between the entrypoints:

```python
# Send message and e1's handler is executed
event = await e2.emit('example', ExampleEvent(msg="hello"))
```

Make sure to close the resources at the end of the program.


```python
# Closing (order doesn't matter)
await dbus.close()
await e1.close()
await e2.close()
```
37 changes: 37 additions & 0 deletions docs/evented_dataclass.md
@@ -0,0 +1,37 @@
# Registry: Event Handler Decorator

To link handlers to event types in a more streamlined fashion, we can use a ``Registry`` to have ``Namespaces`` dedicated to lists of handlers and their event types.

Here we perform the imports and the configuration with the registry:

```python
from dataclasses import dataclass

from aiodistbus import EntryPoint, Event, EventBus, registry

@dataclass
class ExampleEvent:
    msg: str

@registry.on("test", ExampleEvent)
async def func(event: ExampleEvent):
    assert isinstance(event, ExampleEvent)
    print(event)
```

Then we create the resources and leverage the ``use`` method of the ``EntryPoint`` to access the stored handler/event_type pairs.

```python
# Create resources
bus = EventBus()
e1, e2 = EntryPoint(), EntryPoint()

# Add the handlers stored in the registry
await e1.use(registry)

# Connect
await e1.connect(bus)
await e2.connect(bus)

# Send message and func is executed
event = await e2.emit("test", ExampleEvent(msg="hello"))
```

The reason for a registry, instead of a ``@ee.on('test', handler)`` approach like ``pyee``'s ``EventEmitter``, is that in a multiprocessing context a global eventbus or entrypoint isn't going to work properly across different multiprocessing start methods, especially Windows's "spawn".
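
To make this concrete, here is a minimal sketch (not from the committed docs) of how a spawned worker process can rebuild its handlers simply by importing the module that declares them; the ``worker.py`` module name, the ``_run``/``worker`` helpers, and the 60-second sleep are illustrative assumptions, and it presumes ``DEntryPoint`` exposes the same ``use``/``connect``/``close`` API shown above.

```python
# worker.py -- hypothetical module sketching the pattern
import asyncio
from dataclasses import dataclass

from dataclasses_json import DataClassJsonMixin

from aiodistbus import DEntryPoint, registry


@dataclass
class ExampleEvent(DataClassJsonMixin):
    msg: str


# The decorator runs at import time, so every spawned process that imports
# this module rebuilds the same handler/event_type pairs in its own registry.
@registry.on("test", ExampleEvent)
async def handler(event: ExampleEvent):
    print(event)


async def _run(ip: str, port: int):
    e = DEntryPoint()
    await e.use(registry)      # pull the handlers registered above
    await e.connect(ip, port)  # attach to the distributed bus
    await asyncio.sleep(60)    # ...do work...
    await e.close()


def worker(ip: str, port: int):
    # Target for multiprocessing.Process(target=worker, args=(dbus.ip, dbus.port)),
    # which works even under Windows's "spawn" start method.
    asyncio.run(_run(ip, port))
```

Because nothing global has to cross the process boundary, the same module behaves the same under "fork" and "spawn" alike.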
2 changes: 1 addition & 1 deletion test/test_wrapper.py
@@ -6,7 +6,7 @@
import pytest
from dataclasses_json import DataClassJsonMixin

- from aiodistbus import DataClassEvent, make_evented
+ from aiodistbus import make_evented

logger = logging.getLogger("aiodistbus")

