Comment by yesnomaybe 11 hours ago

0 replies

Been on Kafka (MSK) for a couple of years. To my surprise, I find the programming model, and getting everything set up just right, to sit behind a steep learning curve. For example, at some point I had a timestamp header but only realised much later that header values end up as number[] (raw bytes) on the consumer side, so I lost data. My fault, but still. I came to the realisation that the programming model, especially in MSK, is rather unintuitive.
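Roughly the kind of thing that bit me, as a sketch. The record shape and header name below are illustrative, not my exact setup; the point is just that with an MSK event trigger the header values show up as byte arrays:

```typescript
// Hypothetical record shape: via an MSK -> Lambda event trigger, header values
// arrive as arrays of raw bytes (number[]), not the strings you produced.
interface ConsumedRecord {
  headers: Array<Record<string, number[]>>;
  value: string; // base64-encoded payload
}

// Turn a byte-array header back into the string it was produced as.
function decodeHeader(bytes: number[]): string {
  return Buffer.from(bytes).toString("utf8");
}

const record: ConsumedRecord = {
  headers: [{ timestamp: [...Buffer.from("2024-05-01T12:00:00Z")] }],
  value: Buffer.from(JSON.stringify({ ok: true })).toString("base64"),
};

const header = record.headers.find((h) => "timestamp" in h);
// Without the decode step this is just [50, 48, 50, 52, ...] and easy to mishandle.
const timestamp = header ? decodeHeader(header.timestamp) : undefined;
console.log(timestamp); // "2024-05-01T12:00:00Z"
```

Producing the header as a string and forgetting it comes back as bytes is exactly the kind of mismatch I mean.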

I found it hard to shift mentally from MSK and its event triggers back to regular consumers spun up in containers etc., but that is also more an MSK thing than a Kafka thing.
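For contrast, the "regular consumer in a container" version is just a long-running loop, e.g. with kafkajs (broker, topic and group id below are made up):

```typescript
import { Kafka } from "kafkajs";

const kafka = new Kafka({ clientId: "my-app", brokers: ["broker-1:9092"] });
const consumer = kafka.consumer({ groupId: "core-functions" });

async function run() {
  await consumer.connect();
  await consumer.subscribe({ topics: ["events"], fromBeginning: false });
  await consumer.run({
    eachMessage: async ({ message }) => {
      // Headers come back as Buffers here too, not the strings you produced.
      const ts = message.headers?.timestamp?.toString();
      console.log(ts, message.value?.toString());
    },
  });
}

run().catch(console.error);
```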

I am currently swapping out the whole pub/sub layer for MongoDB change streams, which I have found to work really well. For queueing, consumers attempt to lock on read, so I can scale them out with retry / backoff etc. Broadcast is simple and lock-free, with auto-delete handled in Mongo.
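A rough sketch of the two patterns with the Node Mongo driver; the collection name and the lock/retry fields are illustrative, not my actual schema:

```typescript
import { MongoClient } from "mongodb";

const client = new MongoClient("mongodb://localhost:27017");
const jobs = client.db("app").collection("jobs");

// Broadcast: each subscriber opens its own change stream; no locking involved.
// (Change streams need a replica set / Atlas.)
async function broadcastListener(): Promise<void> {
  const stream = jobs.watch([{ $match: { operationType: "insert" } }]);
  for await (const change of stream) {
    console.log("broadcast event:", change);
  }
}

// Queueing: atomically claim one pending document so only one consumer
// processes it, with a lock timeout so crashed consumers get retried later.
async function claimNextJob(consumerId: string) {
  const now = new Date();
  return jobs.findOneAndUpdate(
    {
      status: "pending",
      $or: [{ lockedUntil: { $exists: false } }, { lockedUntil: { $lt: now } }],
    },
    {
      $set: {
        status: "processing",
        lockedBy: consumerId,
        lockedUntil: new Date(now.getTime() + 30_000), // retry window
      },
      $inc: { attempts: 1 },
    },
    { sort: { createdAt: 1 }, returnDocument: "after" }
  );
}

async function main() {
  await client.connect();
  void broadcastListener();
  const job = await claimNextJob("worker-1");
  console.log("claimed:", job);
}

main().catch(console.error);
```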

I will have to see how it really scales, and I'm sure I'm trading one problem for another, but it definitely helps to remove a moving part. Overall the app is rather low volume with the occasional spike. I would have stayed with Kafka if the core functions were doing, say, >100 rpm.