event streams: produce and consume #1289
Replies: 7 comments
-
Is this issue about having an event-based construct available for use in arc, or specifically about adding Kafka support (is that a service AWS offers)? My initial gut reaction is that arc already provides a built-in pub/sub construct using `@events`.
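For reference, a minimal sketch of that built-in pub/sub, assuming the standard `@architect/functions` API; the `doc-updated` event name is made up for illustration:

```javascript
// app.arc declares the topic:
//   @events
//   doc-updated
let arc = require('@architect/functions')

// Publisher: any function can publish to the topic
exports.handler = async function http (req) {
  await arc.events.publish({
    name: 'doc-updated',          // must match the @events entry
    payload: { docId: 'abc123' }  // arbitrary JSON payload
  })
  return { statusCode: 202 }
}
```

The matching subscriber lives in `src/events/doc-updated/` and wraps its handler with `arc.events.subscribe(fn)`.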
-
Also SQS via `@queues`.
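Again a sketch, assuming the standard `@queues` pragma and `@architect/functions`; the `doc-index` queue name is illustrative:

```javascript
// app.arc declares the queue:
//   @queues
//   doc-index
let arc = require('@architect/functions')

// Publish a message onto the SQS-backed queue from any function
exports.handler = async function http (req) {
  await arc.queues.publish({
    name: 'doc-index',
    payload: { docId: 'abc123' }
  })
  return { statusCode: 202 }
}
```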
-
Not sure if reusable, but for this use case, in staging and production, I'd like my streams to be persistent. That allows different consumers to consume the same topic at different speeds.
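That property is exactly what Kafka consumer groups provide: each group tracks its own offset, so two groups can read the same topic at their own pace. A sketch using `kafkajs`, with broker address and topic name as placeholders:

```javascript
const { Kafka } = require('kafkajs')

const kafka = new Kafka({ clientId: 'demo', brokers: ['localhost:9092'] })

// Each groupId keeps an independent offset on the same persistent topic
async function startConsumer (groupId, handle) {
  const consumer = kafka.consumer({ groupId })
  await consumer.connect()
  await consumer.subscribe({ topics: ['documents'], fromBeginning: true })
  await consumer.run({
    eachMessage: async ({ message }) => handle(JSON.parse(message.value.toString()))
  })
}

// A slow archiver and a fast indexer can lag or lead each other freely
startConsumer('s3-archiver', doc => { /* save to S3 */ }).catch(console.error)
startConsumer('search-indexer', doc => { /* index in Elasticsearch */ }).catch(console.error)
```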
-
So, not sure how well this maps to the events directive because of these characteristics.
-
yeah think this is def a new primitive; tho the patterns do map to SQS / SNS well!
-
FYI / update: we think this should be a plugin, and will move this to Discussions for further, er, discussion.
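A plugin would presumably hook into sandbox (to run the local broker) and deploy (to emit the CloudFormation). The hook names below reflect my reading of the plugin API and are assumptions to verify against the docs:

```javascript
// Hypothetical arc-plugin-kafka; hook names are assumptions, not a published contract
module.exports = {
  sandbox: {
    // Boot a trimmed-down local broker alongside `arc sandbox`
    start: async function ({ inventory }) { /* start local broker */ },
    end: async function () { /* stop it */ }
  },
  deploy: {
    // Add topics and Lambda event source mappings to the generated CloudFormation
    start: async function ({ cloudformation }) {
      return cloudformation
    }
  }
}
```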
-
Problem
Event-based architectures are super cool. Kafka lets me define topics into which I can produce events. These events can then be consumed by one or more consumers. A consumer for a topic can be a Lambda function.
This architecture is very convenient: from the API's point of view, mutations are simply events that are produced and that, once consumed, cause side effects in different stores. For instance, an updated document can have multiple consumers: one that saves the document to S3, another that indexes it in Elasticsearch, and another that updates the user account in a relational database.
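Concretely, the API side just produces the mutation as an event; a sketch with `kafkajs`, where the broker address, topic, and event shape are placeholders:

```javascript
const { Kafka } = require('kafkajs')

const kafka = new Kafka({ clientId: 'api', brokers: ['localhost:9092'] })
const producer = kafka.producer()

// The mutation handler only emits an event; consumers perform the side effects
async function updateDocument (doc) {
  await producer.connect()
  await producer.send({
    topic: 'documents',
    messages: [{ key: doc.id, value: JSON.stringify({ type: 'doc-updated', doc }) }]
  })
}
```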
This is all fine, but the development environment is a pain: you have to have Docker just to run Kafka, and so on.
Idea
For the simple use cases I'm using Kafka for, I'd rather just have a trimmed-down Kafka-like service running locally (as you already do for Dynamo and a bunch of other services).
In configuration, I'd define my Kafka topics, specifying the number of partitions and some other options for each one.
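Something along these lines, where the `@streams` pragma name and option syntax are invented purely for illustration:

```arc
@app
my-kafka-app

# hypothetical pragma; name and options are illustrative only
@streams
documents
  partitions 3
updates
  partitions 1
```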
I'd then define consumers for each topic as Lambda functions (AWS already supports having a Lambda consume a Kafka stream).
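With an MSK or self-managed Kafka event source mapping, Lambda receives batches keyed by topic-partition, with base64-encoded values, so a minimal consumer looks like:

```javascript
// Handler for a Kafka event source mapping (MSK / self-managed Kafka)
exports.handler = async function (event) {
  // event.records is an object keyed by 'topic-partition'
  for (const batch of Object.values(event.records)) {
    for (const record of batch) {
      const value = JSON.parse(Buffer.from(record.value, 'base64').toString('utf8'))
      console.log(record.topic, record.partition, record.offset, value)
    }
  }
}
```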
I should also be able to produce events from any Lambda function.
What do you think?
If you're interested and can provide some initial pointers, I'd consider developing a plugin that accomplishes this (plus the necessary CloudFormation generation).