Strange behaviour on Kafka multi consumers #470

Closed
Gnucki opened this issue Jun 27, 2018 · 2 comments

Comments

Gnucki (Contributor) commented Jun 27, 2018

Hi,

I'm a bit confused about how to use rdkafka.
I start 2 consumers and a producer with the following config:

enqueue:
    async_events:
        enabled: false
    transport:
        default: rdkafka
        rdkafka:
            global:
                group.id: '%app.name%'
                offset.store.method: broker
                metadata.broker.list: '%env(KAFKA_BROKER_LIST)%'
            topic:
                offset.store.method: broker
    client: ~

I publish a message on a topic that has only 1 partition.
I expect only 1 consumer to process the message.
However, both consumers process all messages.
Moreover, when I start a new consumer, it ignores the currently committed offset and processes all messages in the stream again.
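
To spell out the behaviour I expected: consumers that share a group.id should split the topic's partitions between them (so with a single partition only one of them receives a given message), and a restarted consumer should resume from the last committed offset. Here is a minimal sketch of that expectation written against plain php-rdkafka; the broker address, topic name and group.id are placeholders, not values from my project:

<?php
// Minimal php-rdkafka high-level (group) consumer, for comparison with the enqueue transport.
$conf = new RdKafka\Conf();
$conf->set('group.id', 'my-app');                     // every consumer instance uses the same group.id
$conf->set('metadata.broker.list', '127.0.0.1:9092'); // placeholder broker address

$consumer = new RdKafka\KafkaConsumer($conf);
$consumer->subscribe(['my-topic']);                   // placeholder topic name

while (true) {
    $message = $consumer->consume(120 * 1000);
    switch ($message->err) {
        case RD_KAFKA_RESP_ERR_NO_ERROR:
            // With a shared group.id and a single partition, only one of the
            // running consumers should reach this branch for a given message.
            var_dump($message->payload);
            $consumer->commit($message);              // commit the offset so a restart resumes from here
            break;
        case RD_KAFKA_RESP_ERR__PARTITION_EOF:
        case RD_KAFKA_RESP_ERR__TIMED_OUT:
            break;
        default:
            throw new \Exception($message->errstr(), $message->err);
    }
}

That is the behaviour I expected the rdkafka transport to reproduce with the config above.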

Did I miss something?

Thanks for your help.

Note that I use the Symfony 4.1 Messenger component (so php-enqueue/messenger-adapter).

makasim (Member) commented Oct 19, 2018

I think #508 should fix this issue, and I have proposed backporting the fix to the 0.8 branch.

Gnucki (Contributor, Author) commented Oct 22, 2018

Thanks!
