Conversation

Collaborator

@LeoPatOZ LeoPatOZ commented Nov 11, 2025

Resolves #169

Our scanner already supports multiple event listeners internally, so this was mostly a matter of updating the API to expose that.

Question: what should our scanner do if we ask for a count of two events? Return the first one that reaches that count? (I forget.)

  impl<M, N: Network> EventScanner<M, N> {
      #[must_use]
-     pub fn subscribe(&mut self, filter: EventFilter) -> ReceiverStream<Message> {
+     pub fn subscribe(&mut self, filters: impl Into<Vec<EventFilter>>) -> ReceiverStream<Message> {
Collaborator Author

impl Into<Vec<EventFilter>> is quick and dirty. I think we could make it more comprehensive to accept more "array-like" types (iterators, slices, etc.), but I thought this was good enough. Let me know.

Collaborator

We should use IntoIterator here
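
A minimal sketch of the `IntoIterator` suggestion. `EventFilter` and `EventScanner` here are stand-ins for the real types in the scanner crate, and the real `subscribe` also wires up a sender and returns a `ReceiverStream`; the point is only the generic bound, which accepts `Vec`s, arrays, and any other iterable without an intermediate `Vec` at the call site:

```rust
// Hypothetical stand-in for the scanner crate's filter type.
#[derive(Debug, Clone)]
struct EventFilter {
    name: String,
}

struct EventScanner {
    listeners: Vec<EventFilter>,
}

impl EventScanner {
    // Bounding on IntoIterator instead of Into<Vec<_>> accepts Vecs,
    // fixed-size arrays, and anything else iterable, with no conversion
    // required by the caller.
    fn subscribe<I>(&mut self, filters: I)
    where
        I: IntoIterator<Item = EventFilter>,
    {
        for filter in filters {
            self.listeners.push(filter);
        }
    }
}

fn main() {
    let mut scanner = EventScanner { listeners: Vec::new() };
    // A Vec works...
    scanner.subscribe(vec![EventFilter { name: "Transfer".into() }]);
    // ...and so does a fixed-size array.
    scanner.subscribe([
        EventFilter { name: "Approval".into() },
        EventFilter { name: "Mint".into() },
    ]);
    println!("{}", scanner.listeners.len()); // 3
}
```

Note that `impl Into<Vec<EventFilter>>` already covers arrays and `Vec`, but `IntoIterator` also covers iterator chains and avoids requiring an owned `Vec` up front.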

Comment on lines +411 to +415
let filters: Vec<EventFilter> = filters.into();

for filter in filters {
self.listeners.push(EventListener { filter, sender: sender.clone() });
}
Collaborator

With the current setup, every EventListener will run in its own tokio task, but still send to the appropriate stream, which is good.
The potential problem is that event listeners will process block ranges independently and will likely stream their respective logs out of order.

Example showing what I'd consider intuitive:

  • event listeners: A and B
  • block ranges being processed: 1-10, 11-20, 21-30
  • expected: streaming events A and B in chronological order first in block range 1-10, then in 11-20, then in 21-30...
  • actual: streaming first all events A in ranges 1-10, 11-20, 21-30, then all events B in the same ranges

Does this make sense?
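
The concern above can be illustrated with a small sketch. `Event` is a hypothetical stand-in for the scanner's message type; concatenating each listener's output reproduces the "actual" behaviour, while a merge by block number gives the "expected" single chronological stream:

```rust
// Hypothetical event type: which listener produced it, and at which block.
#[derive(Debug, Clone, PartialEq)]
struct Event {
    listener: char,
    block: u64,
}

// "Actual" behaviour: all of listener A's events across every range,
// then all of listener B's.
fn concatenated(a: &[Event], b: &[Event]) -> Vec<Event> {
    a.iter().chain(b.iter()).cloned().collect()
}

// "Expected" behaviour: one stream ordered by block number.
fn merged(a: &[Event], b: &[Event]) -> Vec<Event> {
    let mut out = concatenated(a, b);
    out.sort_by_key(|e| e.block);
    out
}

fn main() {
    // Listener A and B events spread across ranges 1-10, 11-20, 21-30.
    let a = [Event { listener: 'A', block: 5 }, Event { listener: 'A', block: 15 }];
    let b = [Event { listener: 'B', block: 3 }, Event { listener: 'B', block: 25 }];

    let blocks: Vec<u64> = merged(&a, &b).iter().map(|e| e.block).collect();
    println!("{:?}", blocks); // [3, 5, 15, 25]
}
```

In a real streaming setting this sort cannot happen after the fact on unbounded streams; the scanner would instead need to merge listener outputs per block range before emitting them.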

Collaborator Author

Yes, I realised that, but I thought it was a necessary by-product of processing them in parallel. Otherwise you would need to merge the event filters into one event listener (and miss out on some concurrency "gains"?).

Maybe I'm missing something with my logic here.

I think if the user wanted them in order, they could sort by timestamp on their end.

Collaborator

> I think if the user wanted them in order they could do that on their end by sorting by timestamp

They wouldn't need a single stream for different event types then, they could just start multiple streams.

The point of this feature is to do the "sorting by timestamps" on the scanner's end. Will update the issue description to make that clearer.

Collaborator Author

> They wouldn't need a single stream for different event types then, they could just start multiple streams.

But then they would need to run multiple streams, which means opening multiple connections to the same provider, iterating over the same blocks, and so on. Here the same block numbers are handled in parallel.

But yes, I'm just playing devil's advocate; I see your point. Closing the PR for later. I just thought it was worth mentioning.

@LeoPatOZ LeoPatOZ closed this Nov 14, 2025


Development

Successfully merging this pull request may close these issues.

feat: Support Multiple Event Filters for a Single Event Stream

3 participants