Refine documentation on new reliable event handling framework built on Kafka #2679

@@ -439,4 +439,30 @@ NOTE: All the default serializers have `Ordered.LOWEST_PRECEDENCE`.
|`false`
|Whether sending external events is enabled or disabled.

|===

== Fineract Events Reliability with Kafka

A new event handling framework built on Kafka makes Fineract event delivery reliable and improves performance.

Users and customers require guaranteed, at-least-once message delivery.

=== Background

Fineract has an internal notification system for communicating events. This system can have various adapters connected to it that forward events to external systems, and Fineract now includes a Kafka (MSK) adapter. Events can be generated from any write operation, whether triggered via an API call or by COB (Close of Business) processing. Events will be lost if a fatal error hits the EC2 instance or JVM after the DB TX (database transaction) is committed but before the event is committed to MSK.
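
The failure mode described above is the classic dual-write problem: the database commit and the Kafka publish are two separate steps that cannot be made atomic together. The sketch below illustrates it; the service, repository and topic names are hypothetical and are not Fineract APIs.

[source,java]
----
// Illustrative sketch of the dual-write problem described above.
// LoanApprovalService, LoanRepository and the topic name are hypothetical.
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

interface LoanRepository {
    void markApproved(Long loanId);
}

@Service
public class LoanApprovalService {

    private final LoanRepository loanRepository;
    private final KafkaTemplate<String, String> kafkaTemplate;

    public LoanApprovalService(LoanRepository loanRepository,
                               KafkaTemplate<String, String> kafkaTemplate) {
        this.loanRepository = loanRepository;
        this.kafkaTemplate = kafkaTemplate;
    }

    @Transactional
    public void approveLoan(Long loanId) {
        // Step 1: the state change is committed with the database transaction.
        loanRepository.markApproved(loanId);
    }

    public void publishApprovedEvent(Long loanId) {
        // Step 2: the event is published only after the transaction has committed.
        // A fatal EC2/JVM failure between step 1 and step 2 loses the event.
        kafkaTemplate.send("fineract-external-events", "LoanApproved:" + loanId);
    }
}
----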

=== Engineering Solution

Events are guaranteed to be delivered at least once. Because an event may be delivered more than once, every delivery of that event must carry the same stable UUID so that consumers can deduplicate. The UUID is assigned when the event is created and stored in the DB, and the event must be committed to the DB as part of the same write operation that created it.
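
Taken together, these requirements describe a transactional outbox: the event row, with its stable UUID, is stored in the same database transaction as the business write, and a separate relay publishes stored events to Kafka, retrying until the broker acknowledges them. The following sketch illustrates the idea under those assumptions; the entity, table, repository and topic names are illustrative and do not reflect Fineract's actual schema or classes.

[source,java]
----
// Transactional-outbox sketch: entity, table, repository and topic names are illustrative.
import jakarta.persistence.Entity;
import jakarta.persistence.Id;
import jakarta.persistence.Table;
import java.time.OffsetDateTime;
import java.util.List;
import java.util.UUID;
import org.springframework.data.jpa.repository.JpaRepository;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;
import org.springframework.transaction.annotation.Propagation;
import org.springframework.transaction.annotation.Transactional;

@Entity
@Table(name = "external_event_outbox")
class OutboxEvent {
    @Id
    private UUID id;              // stable UUID used by consumers for deduplication
    private String type;          // e.g. a business event type name
    private String payload;       // serialized event body
    private OffsetDateTime createdAt;
    private boolean sent;

    protected OutboxEvent() { }

    OutboxEvent(UUID id, String type, String payload) {
        this.id = id;
        this.type = type;
        this.payload = payload;
        this.createdAt = OffsetDateTime.now();
        this.sent = false;
    }

    UUID getId() { return id; }
    String getPayload() { return payload; }
    void markSent() { this.sent = true; }
}

interface OutboxEventRepository extends JpaRepository<OutboxEvent, UUID> {
    List<OutboxEvent> findTop100BySentFalseOrderByCreatedAt();
}

@Component
class OutboxPublisher {

    private final OutboxEventRepository repository;
    private final KafkaTemplate<String, String> kafkaTemplate;

    OutboxPublisher(OutboxEventRepository repository,
                    KafkaTemplate<String, String> kafkaTemplate) {
        this.repository = repository;
        this.kafkaTemplate = kafkaTemplate;
    }

    // Called from within the business transaction so the event commits (or rolls
    // back) together with the write operation that produced it.
    @Transactional(propagation = Propagation.MANDATORY)
    public void record(String type, String payload) {
        repository.save(new OutboxEvent(UUID.randomUUID(), type, payload));
    }

    // Separate relay loop: picks up committed events and sends them to Kafka.
    // A crash here only delays delivery; unsent events are re-sent on the next
    // run, which is why consumers deduplicate by the event UUID (the message key).
    @Scheduled(fixedDelay = 1000)
    @Transactional
    public void relay() throws Exception {
        for (OutboxEvent event : repository.findTop100BySentFalseOrderByCreatedAt()) {
            // Block until the broker acknowledges the record, then mark it as sent.
            kafkaTemplate.send("fineract-external-events",
                    event.getId().toString(), event.getPayload()).get();
            event.markSent();
        }
    }
}
----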

== Fineract Events Performance Enhancements with Kafka

Users and customers require events for all state changes in Fineract in order to maintain eventually consistent datasets in the Credit Platform. The volume of events, from both API writes and COB, must scale without impacting Fineract's overall performance.

=== Background

Fineract already has adapter(s) to publish events to webhooks, but publishing to Apache Kafka yields the best results.

=== Engineering Implementation

AWS MSK is now used with a new Fineract adapter for Kafka. The adapter is hardened and productionized, with guaranteed message delivery.
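
As a rough illustration of producer settings that combine guaranteed delivery with throughput (acknowledgement by all in-sync replicas, idempotence, batching, compression), here is a sketch using the plain Apache Kafka client. The broker address, topic and key are placeholders, and the settings actually used by the Fineract MSK adapter may differ.

[source,java]
----
// Sketch of a producer tuned for reliable, high-throughput delivery.
// Broker address, topic name and event key are placeholders.
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class ReliableProducerExample {

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "msk-broker-1:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        // Reliability: wait for all in-sync replicas and avoid duplicates from retries.
        props.put(ProducerConfig.ACKS_CONFIG, "all");
        props.put(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, "true");
        props.put(ProducerConfig.RETRIES_CONFIG, Integer.toString(Integer.MAX_VALUE));

        // Throughput: batch and compress records instead of sending them one by one.
        props.put(ProducerConfig.LINGER_MS_CONFIG, "20");
        props.put(ProducerConfig.BATCH_SIZE_CONFIG, Integer.toString(64 * 1024));
        props.put(ProducerConfig.COMPRESSION_TYPE_CONFIG, "lz4");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // The event UUID is used as the message key so downstream consumers
            // can deduplicate redelivered events.
            producer.send(new ProducerRecord<>("fineract-external-events",
                    "3f2a6f2e-0000-0000-0000-000000000000", "{\"type\":\"SampleEvent\"}"));
            producer.flush();
        }
    }
}
----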